Talk:Guidebook: Difference between revisions
Revision as of 14:01, 11 February 2008
Processes and products in guidebook
Fact discussion:
Opening statement:
Closing statement: Resolution not yet found. (A closing statement, when resolved, should be updated to the main page.)
Argumentation:
←--#2:: . On the level of actual assessment there are the processes themselves and the products (structured descriptions of reality) themselves. A guidebook should be about the processes and about the products, i.e. it should contain process descriptions and product descriptions. Because the types of products that different processes produce are already described under the structure of the process/output format attribute, there is no need to make separate product descriptions. --Mikko Pohjola 13:15, 18 January 2008 (EET) (type: truth; paradigms: science: defence)
⇤--#3:: . When there is a clear 1 – 1 relationship between process and product (e.g. DALY process – DALY product), we will only ask for a description of either the process or the product, in order to avoid confusion. In most cases, the process is the important object that needs to be described, and in practice the methods etc. described in the guidebook will be process descriptions. When there is no such direct 1 – 1 relationship, we will ask for separate descriptions of process and product. An example of this is meta-analysis (process) and the exposure-response function (ERF) (product). Even though the process of meta-analysis can lead to an estimation of the ERF, the meta-analysis can also lead to an estimation of another product (e.g. severity weight), and an ERF (the product) can also be derived from another process (e.g. expert judgement). --Anne Knol (type: truth; paradigms: science: attack)
----#1:: . I think the processes and products should be put together and not separated too much. --Alexandra Kuhn 17:29, 14 January 2008 (EET) (type: truth; paradigms: science: comment)
Frameworks in guidebook
Fact discussion:
Opening statement:
Closing statement: Resolution not yet found. (A closing statement, when resolved, should be updated to the main page.)
Argumentation:
⇤--#1:: . Why is cost-benefit analysis "another framework" in the guidebook? --Alexandra Kuhn 10:27, 17 January 2008 (EET) (type: truth; paradigms: science: attack)
Tools in guidebook
Fact discussion:
Opening statement:
Closing statement: Resolution not yet found. (A closing statement, when resolved, should be updated to the main page.)
Argumentation:
Step-specific methods
Fact discussion:
Opening statement:
Closing statement: Resolution not yet found. (A closing statement, when resolved, should be updated to the main page.)
Argumentation:
⇤--#1:: . Step-specific information to be included in an assessment can also be obtained by means other than modeling, e.g. direct observation or measurement. Of course, if we use modeling in a very broad sense to mean all kinds of information collection and manipulation for the purpose of synthesizing it as a part of a particular assessment, the term is acceptable. Restricting the guidebook to cover only modeling methods related to specific steps, in the way modeling is commonly understood, would be too constrained. --Mikko Pohjola 13:01, 1 February 2008 (EET) (type: truth; paradigms: science: attack)
Information processing methods by sub-processes
Fact discussion:
Opening statement:
Closing statement: Resolution not yet found. (A closing statement, when resolved, should be updated to the main page.)
Argumentation:
←--#1:: . Making a distinction between processes that relate to collecting, manipulating or creating information in a non-predefined format (as defined by the general assessment framework) and processes that relate to synthesizing the information into the predefined format could bring some clarity to understanding their roles in relation to the products of the assessment, as well as their mutual roles in the overall assessment process. --Mikko Pohjola 13:41, 1 February 2008 (EET) (type: truth; paradigms: science: defence)
Assessing uncertainty
----#1:: . Will a paragraph on uncertainty be included at all relevant processes, e.g. emission calculation, ERFs etc.? --Alexandra Kuhn 15:13, 11 February 2008 (EET) (type: truth; paradigms: science: comment)
Discussion on some particular page contents
Assessment (universal product)
- Scope. What is the use purpose of an (impact) assessment? (To answer a policy information need) 3, 6, 12
- Definition
- What is an impact assessment
- Different assessments: HIA, RA, IA... 4-5 (possibly own articles)
- ----#(number):: . This part describes the process of performing an impact assessment. It does not go into detail about the methodologies. --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment) ⇤--#(number):: . No, this is an overview. --Jouni 23:05, 15 January 2008 (EET) (type: truth; paradigms: science: attack)
Performing an impact assessment (process description:assessment framework) 10
- Scope: Purpose of making an impact assessment is to produce an assessment product. ----#(number):: . What would this be? A general purpose? Something like policy consulting? --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment)
- Definition
- General methodology 10 (----#(number):: . Would this be the same as the assessment framework? It equals dimension "work environment", number 3. --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment))
- description of methodology used 11
- Result
- Inputs
- Procedure: Phases of an impact assessment 16
- Scoping an impact assessment 26
- Selecting indicators 50
- Applying general information
- Drawing a causal diagram 34 Links: Help:Causal diagram | Links to alternatives: Causal chain, impact pathway, DPSEEA, DPSIR
- ----#(number):: . Discussing with some colleagues here at USTUTT, they said it would be good to describe the differences and commonalities between causal chain, impact pathway approach, DPSEEA, DPSIR. Where would this belong? --Alexandra Kuhn 18:06, 14 January 2008 (EET) (type: truth; paradigms: science: comment)
- ----#(number):: . Perhaps these descriptions should be included in the links to the different alternatives? E.g. give the differences and commonalities between causal diagram and impact pathway in the linked section on impact pathway, and so on. --Kb 13:43, 18 January 2008 (EET) (type: truth; paradigms: science: comment)
- Designing variables
- Executing variables and analyses
- Reporting an assessment
- Scoping an impact assessment 26
- Outputs
Reporting an assessment (process description) 67
- Scope
- Definition: different approaches
- Result
- Reporting uncertainties 70, 73 (incl. qualitative and quantitative uncertainties)
Stakeholder involvement (process description) 68
Issue framing (process description:issue framing)
- ----#(number):: . Where would be the boundaries to "process: assessment framework?" --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment)
- Scope:
- Purpose, questions 27
- Indicator selection 50
- Boundaries 29
- Scenarios 30-33
- Definition
- Variables
Emission modelling (process description) 36-39
- Scope: purpose of emission modelling
- Definition: background
- Result:
- How to model 37
- Sectoral, spatial, and temporal resolution 38
- Uncertainties 39
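The "How to model" step above usually reduces to combining activity data with emission factors per sector. A minimal sketch of that calculation; the sector names, activity levels, and factor values are illustrative assumptions, not values from the guidebook:

```python
# Minimal emission-calculation sketch: emissions = activity * emission factor.
# Sector names, activity data, and emission factors are illustrative
# assumptions only.
activity = {"traffic": 1.2e9, "heating": 4.0e8}       # e.g. vehicle-km, kWh
emission_factor = {"traffic": 0.03, "heating": 0.2}   # g PM2.5 per unit activity

emissions_g = {s: activity[s] * emission_factor[s] for s in activity}
total_g = sum(emissions_g.values())
```

Sectoral, spatial, and temporal resolution then determine how finely the `activity` and `emission_factor` inputs are disaggregated.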
Source-to-exposure modelling (process description) 40
- Scope: purpose
- Definition: Different types 41
- See also: pointers to resource centre 42
- Direct approach: measure data (----#(number):: . e.g. biomarkers, concentrations... --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment))
- Uncertainties 43
Exposure-response function modelling (process description)
- Scope 45
- Definition:
- Different types 46
- How can they be derived? 47-48
- Uncertainties 49
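One common way an ERF is applied (not the only type listed above) is a log-linear relative-risk function combined with a population attributable fraction. A sketch under that assumption; the coefficient and baseline figures are illustrative, not guidebook values:

```python
import math

# Log-linear ERF sketch: RR(dC) = exp(beta * dC), attributable cases via the
# population attributable fraction PAF = (RR - 1) / RR.
# The assumed RR of 1.06 per 10 ug/m3 and all numbers below are illustrative.
beta = math.log(1.06) / 10.0  # per ug/m3

def attributable_cases(delta_c, baseline_cases):
    """Cases attributable to a concentration increase of delta_c ug/m3."""
    rr = math.exp(beta * delta_c)
    paf = (rr - 1.0) / rr
    return baseline_cases * paf

cases = attributable_cases(delta_c=5.0, baseline_cases=1000.0)
```

Other ERF types (thresholded, linear in absolute risk) would replace the `rr`/`paf` step accordingly.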
Risk characterisation (process description) 51
- Scope
- Definition:
- ----#(number):: . Maybe we could summarise DALYs/QALYs and monetary valuation under "aggregation". But I don't know how to do this at the moment. --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: comment)
Disability-adjusted life year (process description) 52
- Scope
- Definition:
- How are they derived 54
- Alternatives 53
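The basic derivation behind "How are they derived" is the sum of years of life lost (YLL) and years lived with disability (YLD). A minimal sketch without age weighting or discounting; all input numbers are illustrative assumptions:

```python
# DALY sketch: DALY = YLL + YLD, no age weighting or discounting.
# All parameter values below are illustrative assumptions.
def daly(deaths, life_years_lost_per_death, cases, duration_years, disability_weight):
    yll = deaths * life_years_lost_per_death        # years of life lost
    yld = cases * duration_years * disability_weight  # years lived with disability
    return yll + yld

total = daly(deaths=10, life_years_lost_per_death=12.0,
             cases=200, duration_years=0.5, disability_weight=0.4)
# 10*12 + 200*0.5*0.4 = 120 + 40 = 160
```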
Monetary valuation (process description) 59
- Scope: Why do we need monetary values 60
- Definition
- Why do we choose monetary values and not utility points? 61
- Result
- How are monetary values derived 63
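For "How are monetary values derived", the aggregation step is typically a multiplication of physical impacts by unit values (e.g. value of a life year). A sketch; the endpoint names and unit values are illustrative assumptions:

```python
# Monetary valuation sketch: damage = sum over endpoints of
# (physical impact * unit value). Endpoint names and EUR values are
# illustrative assumptions, not guidebook figures.
unit_value_eur = {"life_year_lost": 50_000, "restricted_activity_day": 100}
impacts = {"life_year_lost": 120, "restricted_activity_day": 3000}

damage_eur = sum(impacts[k] * unit_value_eur[k] for k in impacts)
```

Deriving the unit values themselves (e.g. from willingness-to-pay studies) is the substantive part covered by the pages referenced above.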
Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69
- Scope: Purpose of uncertainty assessment
- Definition: Different approaches
- Qualitative methods eg pedigree matrix 71
- Quantitative methods 72-73
- When to use which method? 73
- Result
- Uncertainty of the result: parameter uncertainty
- Uncertainty of the definition: model uncertainty
- Uncertainty of the scope: relevance
Uncertainty tools (process: tool) 76
- ⇤--#(number):: . This does not belong in the Guidebook, but it is good to keep it in mind. --Alexandra Kuhn 18:02, 14 January 2008 (EET) (type: truth; paradigms: science: attack)
Propagating uncertainties (process description) 72
- Scope
- Definition: approaches
- Monte Carlo 72
- Bayesian analysis 72
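The Monte Carlo approach listed above can be sketched in a few lines: sample each uncertain input, push every sample through the impact model, and summarise the resulting output distribution. The distributions and the trivial one-line model here are illustrative assumptions:

```python
import random

# Monte Carlo uncertainty propagation sketch. The lognormal/normal input
# distributions and the placeholder model are illustrative assumptions.
random.seed(1)

def model(emission, erf_slope):
    return emission * erf_slope  # placeholder impact model

samples = [
    model(random.lognormvariate(0.0, 0.3),   # uncertain emission (relative scale)
          random.normalvariate(1.0, 0.1))    # uncertain ERF slope
    for _ in range(10_000)
]
samples.sort()
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]      # 95th percentile of the impact
```

A Bayesian analysis would instead update the input distributions with data before (or while) propagating them.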
Impact assessment (product:assessment)
- ⇤--#(number):: . What should this be? Why should we have it? Scenarios etc. should be positioned under the process, as the user should be shown how to build a scenario. --Alexandra Kuhn 17:26, 14 January 2008 (EET) (type: truth; paradigms: science: attack)
- Scope:
- Purpose, questions 27
- Boundaries 29
- Scenarios 30-33
- Definition
- Variables
- Analyses
- Result
- Results
- Conclusions