User:Paula Maatela
Latest revision as of 11:10, 19 May 2015
Homework 1
4. What are co-creation skills?
Co-creation skills are divided into four categories:
• Encouragement (skills for helping people to participate in a decision process, produce useful information and learn from others)
• Synthesis (skills to synthesize the information obtained into a more structured and useful format)
• Open data (skills for converting data into machine-readable formats to be used in assessment models)
• Modelling (skills for developing assessment models based on generic methods and case-specific data)
9. What are dimensions of openness?
Dimensions of openness are:
• Scope of participation (who is allowed to participate)
• Access to information (what information is made available to participants)
• Timing of openness (when participants are allowed or invited to participate)
• Scope of contribution (which aspects of the issue participants are invited or allowed to contribute to)
• Impact of contribution (how much weight is given to participant contributions)
18. What parts does the open policy practice consist of?
Open policy practice consists of four parts:
• Shared understanding (main target of the work)
• Execution of decision support (consists of six principles: intentionality, shared information objects, causality, critique, openness and reuse)
• Evaluation and management
• Co-creation skills and facilitation
Homework 2
Link to Paula's training page: Paula's training page. Some kind of error seems to have occurred, and the page does not exist anymore.
Homework 3
I need clarification on universal objects, PSSP, and Bayesian networks.
Homework 4
Homework 5
Homework 6
Homework 7
Homework 8
Started a discussion on the main page with the title • Climate change adaptation is more important than mitigation on the city level.
⇤--#: . Climate change and air quality are strongly interconnected, and two major health hazards, tropospheric ozone and particulate matter, are highly affected by regional emissions, which can be reduced by city-level mitigation actions. --Paula M (type: truth; paradigms: science: attack)
Homework 9
User:Mohammad Shahidehnia
Homework 10
{|{{prettytable}}
|+ Evaluation of the assessment of the health impacts of H1N1 vaccination
! Category
! Property
! Guiding question
! Evaluation
|-----
| Quality of content
| Informativeness
| How many possible worlds does the answer rule out? How few possible interpretations are there for the answer?
| The answer is univocal.
|-----
| Quality of content
| Calibration
| How close is the answer to reality or the real value?
| The assessment was mainly based on real cases, and in that way represents reality. The disability weights used for DALYs were estimates.
|-----
| Quality of content
| Coherence
| How completely does the answer address the assessment question? Is everything addressed? Is something unnecessary?
| Two questions were addressed, and the latter question was answered. The overall health impact of the H1N1 vaccination was included in the answer.
|-----
| Applicability
| Relevance
| How well does the information provided by the assessment serve the needs of the users? Is the assessment question good?
| The needs of the users are served well by the assessment information provided. The assessment question could be framed better.
|-----
| Applicability
| Availability
| Is the information provided by the assessment available when, where and to whom it is needed?
| In principle, the information of the assessment is available at all times to those (decision makers, researchers, etc.) who know where to search for it. A Google query for the assessment by its title did not return any results linking to Opasnet. This reveals that participation is restricted to those who know of the existence of Opasnet and the assessments included in it.
|-----
| Applicability
| Usability
| Can the users perceive and internalize the information provided by the assessment? Does the users' understanding of the assessed issue increase?
| Yes. The information was clarified with graphs, and with comparative information on two harmful substances common in the environment and one neurological disease. The users' understanding of the H1N1 issue increases.
|-----
| Applicability
| Acceptability
| Is the assessment result (output), and the way it is obtained and delivered for use, perceived as acceptable by the users?
| The assessment results were logically derived from the data, and therefore should be acceptable to the users.
|-----
| Efficiency
| Intra-assessment efficiency
| How much effort is spent in the making of an assessment?
| The assessment covers all the relevant points concerning the H1N1 vaccination (sufficient effort).
|-----
| Efficiency
| Inter-assessment efficiency
| If another (somewhat similar) assessment were made, how much (less) effort would be needed?
| It is expected that the model for the assessment has already been developed and the raw data collected. The usability of some parts of the old raw data depends on the time elapsed between the previous assessment and the new one. Most of the raw data for a new assessment would have to be gathered anew. An estimated 50–60% less effort would be needed for a new assessment.
|}
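The Calibration row above notes that the disability weights used for DALYs were estimates. As a minimal sketch of what such a calculation involves (the function names and all numbers below are hypothetical illustrations, not figures from the H1N1 assessment), the standard DALY decomposition is the sum of years of life lost (YLL) and years lived with disability (YLD):

```python
def years_of_life_lost(deaths, standard_life_expectancy, mean_age_at_death):
    """YLL: deaths multiplied by the remaining life expectancy at the age of death."""
    return deaths * (standard_life_expectancy - mean_age_at_death)

def years_lived_with_disability(cases, disability_weight, mean_duration_years):
    """YLD: cases multiplied by a disability weight (0..1) and the average illness duration."""
    return cases * disability_weight * mean_duration_years

def dalys(deaths, standard_life_expectancy, mean_age_at_death,
          cases, disability_weight, mean_duration_years):
    """DALY = YLL + YLD."""
    return (years_of_life_lost(deaths, standard_life_expectancy, mean_age_at_death)
            + years_lived_with_disability(cases, disability_weight, mean_duration_years))

# Hypothetical example: 10 deaths at a mean age of 50 against a standard life
# expectancy of 80, plus 1000 cases with disability weight 0.05 lasting 0.02 years.
print(dalys(10, 80, 50, 1000, 0.05, 0.02))  # 10*30 + 1000*0.05*0.02 = 301.0
```

Because the disability weights are estimates, the YLD term (and hence the DALY total) carries the uncertainty the Calibration row points out.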