Talk:Guidebook

From Opasnet

'''Brief guidance to the guidebook page content'''
The [[Guidebook]] page is a table of contents for the actual Intarese guidebook, which consists of dozens of separate pages linked from the Guidebook page. A typical item in the table of contents looks like this:
;Method article name WPX.X: Purpose of the method. | Categories: [[:Category:Method]] | Tools: [[Reporting tool]]
* Method article name is the name of the method to be described, including the link to the actual page.
* WPX.X is the workpackage that is responsible for producing the article.
* Purpose is the essence of the method: why it is performed. The whole content of the article is conditional on the purpose of the method.
* Categories link to pages that have applied this method. Usually they are products that are produced by using the method.
* Tools are practical software or other tools that make it easier to apply the method.
The following types of links are useful in the guidebook:
* <nowiki>[http://www.site.org/page link] link to an external source</nowiki>
* <nowiki>[[Link|Link name]] link to a page in Heande main namespace ("Link name" shown on the page)</nowiki>
* <nowiki>[[Help:Link|]] Link to a page in another namespace (the namespace is not shown)</nowiki>
* <nowiki>[[:intarese:Link|]] link to a page in Intarese wiki</nowiki>
* <nowiki>[[:en:Link|]] link to a page in the English Wikipedia</nowiki>
==Processes and products in guidebook==


{{discussion
|Statements= All content in the guidebook, excluding theoretical foundations and glue pages, should be process descriptions
|Resolution=
|Argumentation =
{{defend|2|On the level of actual assessment there are the processes themselves and the products (structured descriptions of reality) themselves. A guidebook should be about the processes and about the products, i.e. it should contain process descriptions and product descriptions. Because the types of products that different processes produce are already described under the ''structure of the process/output format'' attribute, there is no need to make separate product descriptions.|--[[User:Mikko Pohjola|Mikko Pohjola]] 13:15, 18 January 2008 (EET)}}


{{attack|3|When there is a clear 1 – 1 relationship between process and product (e.g. DALY process – DALY product), we will only ask for a description of either process or product, in order to avoid confusion. In most cases, the process is the important object that needs to be described, and in practice the methods etc. described in the guidebook will be process descriptions. When there is no such direct 1 – 1 relationship, we will ask for separate descriptions of process and product. An example of this is meta-analysis (process) and the exposure-response function (ERF) (product). Even though the process of meta-analysis can lead to an estimation of the ERF, the meta-analysis can also lead to an estimation of another product (e.g. severity weight), and an ERF (the product) can also be derived from another process (e.g. expert judgement).|--[[User:Anne.knol | Anne Knol]]}}


{{comment|1|I think the processes and products should be put together and not separated too much.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 17:29, 14 January 2008 (EET)}}
{{comment|4|Well, it seems there are no products as such... At least not in the methods parts. Maybe in the Intarese framework part, but that's fine, because one special framework is described there and the methods in the methods chapters are generic for all frameworks.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 09:31, 18 March 2008 (EET)}}
}}


==Frameworks in guidebook==


{{discussion
|Statements= Guidebook contains descriptions of a general assessment framework and the Intarese framework
|Resolution= Accepted.
|Argumentation =
{{attack_invalid|1|Why is cost-benefit analysis "another framework" in the guidebook?|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 10:27, 17 January 2008 (EET)}}
:{{attack|2|Intarese framework is not limited to CBA, and therefore CBA is another framework with a more limited scope.|--[[User:Jouni|Jouni]] 00:41, 8 March 2008 (EET)}}
}}


==Tools in guidebook==


{{discussion
|Statements= Tools are operational descriptions of managing particular sub-processes and they are mostly described together with the particular methods they relate to
|Resolution=
|Argumentation =
{{comment|1|Result database is described as a tool in two places|--[[User:Jouni|Jouni]] 16:22, 30 January 2008 (EET)}}
}}


==Step-specific methods==


{{discussion
|Statements= Step-specific methods are all modelling methods
|Resolution= Not accepted.
|Argumentation =
{{attack|1|Step-specific information to be included in an assessment can also be obtained by other means than modeling, e.g. direct observation or measurement. Of course, if we use modeling in a very broad sense to mean all kinds of information collection and manipulation for the purpose of synthesizing it as a part of a particular assessment, the term is acceptable. Restricting the guidebook to cover only modeling methods related to specific steps, in the way modeling is commonly understood, would be too constrained.|--[[User:Mikko Pohjola|Mikko Pohjola]] 13:01, 1 February 2008 (EET)}}
:{{defend|2|I agree with Mikko's comment and we should also include other means, e.g. biomonitoring.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 09:55, 7 March 2008 (EET)}}
{{comment|3|But is it still a process to collect and provide biomonitoring data?|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 10:35, 7 March 2008 (EET)}}
:{{defend|4|Yes, these are processes that should be described in the Guidebook.|--[[User:Jouni|Jouni]] 00:41, 8 March 2008 (EET)}}
}}


==Information processing methods by sub-processes==


{{discussion
|Statements= Information processing methods (other than assessment process management methods) should be separated into information collection and information synthesis methods
|Resolution= Accepted.
|Argumentation =
{{defend|1|Making a distinction between processes that relate to collecting, manipulating or creating information in a non-predefined format (as defined by the general assessment framework) and processes that relate to synthesizing the information into the predefined format could bring some clarity to understanding their roles in relation to the products of the assessment as well as their mutual roles in the overall assessment process.|--[[User:Mikko Pohjola|Mikko Pohjola]] 13:41, 1 February 2008 (EET)}}
:{{defend|2|I agree. How/where would you do that? Will you do that? We could add (later in the Guidebook) the picture you presented in Kuopio with the information processes and the difference between synthesising and manipulation.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 09:58, 7 March 2008 (EET)}}
}}
==Assessing uncertainty==
{{comment|1|Will a paragraph on uncertainty be included at all relevant processes, e.g. emission calculation, ERFs etc.?|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 15:13, 11 February 2008 (EET)}}
:{{comment|2|Assessing uncertainty will be about the general concepts, and they will apply to all variables, including those that you mention.|--[[User:Jouni|Jouni]] 00:41, 8 March 2008 (EET)}}
{{attack|3|But the actual application of the general methods should be explained at the relevant processes. E.g. in emission calculation it would also say that the emission factor is more uncertain than the activities, or something in that direction; something specific for THIS purpose.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 09:28, 10 March 2008 (EET)}}
==Discussion on some particular page contents==
'''Assessment (universal product)'''
* Scope. What is the use purpose of an (impact) assessment? (To answer a policy information need) 3, 6, 12
*Definition
**What is an impact assessment
**Different assessments: HIA, RA, IA... 4-5 (possibly own articles)
:{{comment|4|This part describes the process of performing an impact assessment. It does not go into details about the methodologies.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}} {{attack|#(number): |No, this is an overview.|--[[User:Jouni|Jouni]] 23:05, 15 January 2008 (EET)}}
'''Performing an impact assessment (process description:assessment framework) 10'''
*Scope: Purpose of making an impact assessment is to produce an assessment product.
{{comment|1|What would this be? A general purpose? something like policy consulting???|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}}
*Definition
**General methodology 10 ({{comment|2|would be the same as the assessment framework? equals dimension "work environment" number 3.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}})
**description of methodology used 11
*Result
**Inputs
**Procedure: Phases of an impact assessment 16
***Scoping an impact assessment 26
****Selecting indicators 50
***Applying general information
***Drawing a causal diagram 34 Links: [[Help:Causal diagram]] | Links to alternatives: Causal chain, impact pathway, DPSEEA, DPSIR
***: {{comment|3|Discussing with some colleagues here at USTUTT, they said it would be good to describe the differences and commonalities between causal chain, impact pathway approach, DPSEEA and DPSIR. Where would this belong?|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:06, 14 January 2008 (EET)}}
***: {{comment|5|Perhaps these descriptions should be included in the links to the different alternatives? E.g. give the differences and commonalities between causal diagram and impact pathway in the linked section of impact pathway, and so on?|--[[User:Kb|Kb]] 13:43, 18 January 2008 (EET)}}
***Designing variables
***Executing variables and analyses
***[[Reporting an assessment]]
**Outputs
'''Reporting an assessment (process description) 67'''
*Scope
*Definition: different approaches
*Result
**Reporting uncertainties 70, 73 (incl. qualitative and quantitative uncertainties)
'''Stakeholder involvement (process description) 68'''
'''Issue framing (process description:issue framing)'''
:{{comment|1|Where would the boundaries to "process: assessment framework" be?|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}}
* Scope:
**Purpose, questions 27
**Indicator selection 50
**Boundaries 29
**Scenarios 30-33
*Definition
**Variables
'''Emission modelling (process description) 36-39'''
*Scope: purpose of emission modelling
*Definition: background
*Result:
**How to model 37
**Sectoral, spatial, and temporal resolution 38
**Uncertainties 39
'''Source-to-exposure modelling (process description) 40'''
*Scope: purpose
*Definition: Different types 41
*See also: pointers to resource centre 42
*Direct approach: measure data
({{comment|1|whatever. biomarkers, concentrations...|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}})
*Uncertainties 43
'''Exposure-response function modelling (process description)'''
*Scope 45
*Definition:
**Different types 46
**How can they be derived? 47-48
**Uncertainties 49
'''Risk characterisation (process description) 51'''
*Scope
*Definition:
:{{comment|1|Maybe we could summarise "DALYs/QALYs" and monetary valuation under "aggregation". But I don't know how to do this at the moment.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}}
'''Disability-adjusted life year (process description) 52'''
*Scope
*Definition:
** How are they derived 54 (a commonly used summary formulation is noted below this outline)
**Alternatives 53
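The "How are they derived" item above only points to the relevant guidebook section; as general background (standard burden-of-disease methodology, not taken from the guidebook pages referred to), a commonly used summary formulation is:

<math>\text{DALY} = \text{YLL} + \text{YLD}, \qquad \text{YLL} = N \cdot L, \qquad \text{YLD} = I \cdot DW \cdot D</math>

where ''N'' is the number of deaths, ''L'' the standard life expectancy at the age of death, ''I'' the number of incident cases, ''DW'' the disability weight and ''D'' the average duration of the condition. Exact conventions (age weighting, discounting) vary between applications, so this should be read only as an orientation.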
'''Monetary valuation (process description) 59'''
*Scope: Why do we need monetary values 60
*Definition
** Why do we choose monetary values and not utility points? 61
*Result
** How are monetary values derived 63
'''Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69'''
*Scope: Purpose of uncertainty assessment
*Definition: Different approaches
**Qualitative methods, e.g. pedigree matrix 71
**Quantitative methods 72-73
**When to use which method? 73
*Result
**Uncertainty of the result: parameter uncertainty
**Uncertainty of the definition: model uncertainty
**Uncertainty of the scope: relevance
'''Uncertainty tools (process: tool) 76'''
:{{attack|1|This does not belong in the Guidebook but it is good to keep it in mind.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 18:02, 14 January 2008 (EET)}}
'''Propagating uncertainties (process description) 72'''
*Scope
*Definition: approaches
**Monte Carlo 72 (an illustrative sketch follows this outline)
**Bayesian analysis 72
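The outline above only names the approaches; as an illustration of what Monte Carlo propagation means in practice, here is a minimal Python sketch. It is not part of the guidebook text, and all distributions, parameter values and variable names are hypothetical placeholders rather than values from any Intarese assessment.

<pre>
# Minimal Monte Carlo uncertainty propagation sketch (illustrative only).
# Two hypothetical uncertain inputs -- an emission estimate and an
# exposure-response slope -- are sampled repeatedly and propagated through
# a simple impact calculation; the spread of the sampled results describes
# the uncertainty of the impact estimate.
import random

N = 10000  # number of Monte Carlo iterations

impacts = []
for _ in range(N):
    emission = random.lognormvariate(2.0, 0.3)  # hypothetical emission (e.g. tonnes/year)
    slope = random.normalvariate(0.05, 0.01)    # hypothetical ERF slope (e.g. cases/tonne)
    impacts.append(emission * slope)

impacts.sort()
mean = sum(impacts) / N
low, high = impacts[int(0.025 * N)], impacts[int(0.975 * N)]
print("Mean impact: %.2f, 95%% interval: %.2f - %.2f" % (mean, low, high))
</pre>

In an actual assessment this kind of propagation would presumably be done with the uncertainty tools mentioned above rather than with hand-written code.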
'''Impact assessment (product:assessment)'''
:{{attack|1|What should this be? Why should we have that? Scenarios etc. should be positioned under the process, as it should be explained to the user how to build a scenario.|--[[User:Alexandra Kuhn|Alexandra Kuhn]] 17:26, 14 January 2008 (EET)}}
* Scope:
**Purpose, questions 27
**Boundaries 29
**Scenarios 30-33
*Definition
**Variables
**Analyses
*Result
**Results
**Conclusions
==The pages of the Pyrkilo guide 2==
The tables below list all pages that are attached to the [[Intarese:Help:Pyrkilo guide 2]] page. The second column describes where the current content of each page should go (or, for pages already moved, where it now is). It does not imply that the scopes of the old and the new page should be the same.
{| {{prettytable}}
|+In the Guidebook
!The location in Intarese (links redirect to the current page)
!Where it is now
|-
| [[Intarese:Help:Analysis tool]]||Tool:Analysis tool
|-
| [[Intarese:Help:Applying general information]] ||Method:Applying general information
|-
| [[Intarese:Help:Class]] ||Universal products:Class
|-
| [[Intarese:Help:Collaborative workspace]] ||Tool:Collaborative workspace
|-
| [[Intarese:Help:Designing variables]]|| Method:Designing variables
|-
| [[Intarese:Help:Dispute]]|| Method:Dealing with disputes
|-
| [[Intarese:Help:Drawing a causal diagram]]|| Method:Drawing a causal diagram
|-
| [[Intarese:Help:Executing variables]]|| Method:Executing variables
|-
| [[Intarese:Help:External model]]|| Tool:External model
|-
| [[Intarese:Help:Extracted model]]|| Tool:Extracted model
|-
| [[Intarese:Help:Issue framing tool]]|| Tool:Issue framing tool
|-
| [[Intarese:Help:Mass collaboration]]|| Method:Mass collaboration
|-
| [[Intarese:Help:Model extractor]]|| Tool:Extracted model tool
|-
| [[Intarese:Help:Object design tool]]|| Tool:Object design tool
|-
| [[Intarese:Help:Purpose of a risk assessment]]|| Method:Defining the purpose of an assessment
|-
| [[Intarese:Help:Reporting a risk assessment]]|| Method:Reporting an assessment
|-
| [[Intarese:Help:Reporting tool]]|| Tool:Reporting tool
|-
| [[Help:Result database]]|| Tool:Result database
|-
| [[Intarese:Help:Risk assessment structure]]|| Universal product:Assessment
|-
| [[Intarese:Help:Scoping a risk assessment]]|| Method:Scoping an assessment
|-
| [[Intarese:Help:The phases of a risk assessment]]|| Method:Performing an assessment
|-
| [[Intarese:Help:Universal information structure for risk assessment products]]|| Theoretical foundation:Universal products
|-
| [[Intarese:Help:Users of the pyrkilo method]]|| Method:Defining the users of an assessment
|-
| [[Intarese:Help:Variable]]|| Universal products:Variable
|-
| [[Intarese:Help:Variable transfer protocol]] || Method:Variable transfer protocol
|-
| [[Intarese:Help:Managing stakeholder involvement]]|| Method:Organizing stakeholder information
|-
| [[Intarese:Help:Variable definition]]|| Universal products:Variable
|-
| [[Intarese:Help:Plausibility tests]]|| Universal products:Variable
|-
| [[Intarese:Help:General properties of good risk assessments]]|| General assessment framework:Purpose and properties of good assessments
|-
| [[Intarese:Help:Performance]]|| General assessment framework:Evaluating assessment performance
|-
| [[Intarese:Help:Societal role of risk assessment]]|| General assessment framework:Societal context of assessments
|-
| [[Intarese:Help:Open participation in risk assessment]]|| Method:Participating in assessments
|-
| [[Intarese:Help:Risk assessment process]]|| Theoretical foundation:General processes
|-
| [[Intarese:Help:Uncertainty]]|| Method:Assessing uncertainty
|-
| [[Intarese:Help:Contributing to a discussion]]|| Method:Dealing with disputes
|}
{| {{prettytable}}
|+In other parts of the Guidance system
!The location in Intarese (links redirect to the current page)
!Where it is now
|-
| [[Intarese:Help:Causality]] ||Glossary:Causality
|-
| [[Intarese:Help:Decision variable]]|| Glossary:Decision variable
|-
| [[Intarese:Help:Value judgement]]|| Glossary:Value judgement
|}
{| {{prettytable}}
|+The new location not yet clear
!The current location in Intarese
!Where it should go
|-
| [[Intarese:Help:Copyright issues]]|| Method:Dealing with copyright issues
|-
| [[Intarese:Help:Data gateway]]|| Tool:Data gateway
|-
| [[Intarese:Help:Encyclopedia on environmental health]]|| Glossary:Encyclopedia on environmental health
|-
| [[Intarese:Help:Guidebook on risk assessment]]|| Glossary:Guidebook
|-
| [[Intarese:Help:Integrated resource platform]]|| Glossary:Resource platform
|-
| [[Intarese:Help:Open risk assessment (toolbox)]]|| Glossary:Open assessment
|-
| [[Intarese:Help:Pyrkilo method (summary)]]|| Glossary:Pyrkilo method
|}
