Testiwiki:Guidebook specification
Revision as of 12:58, 16 January 2008

Guidebook specification describes the contents of the guidebook (to be developed). In addition, it describes the general hierarchy and organisation of the major types of objects.

Table of Contents for the Guidebook

Comment: I think the processes and products should be put together and not separated too much. --Alexandra Kuhn 17:29, 14 January 2008 (EET)

Title list

  1. Guidebook (product) 1
  2. Universal products
    1. Policy context (universal product)
    2. Scientific context (universal product)
    3. Assessment (universal product)
    4. Causal chain (universal product)
    5. Variable (universal product)
    6. Class (universal product)
    7. Process description (?) (universal product)
    8. Assessment framework (universal product)
    9. Method (universal product)
    10. Impact assessment (product:assessment)
  3. General processes
    1. Observing (process) [actually belongs to basic science but is described here for completeness]
    2. Information collection (process)
    3. Information synthesis (process)
    4. Process management (process)
    5. Description development (process)
  4. Performing an impact assessment (process description: assessment framework) 10 [Necessary phases of performing an assessment]
    1. Issue framing (process description)
      1. Scoping (process description)
      2. Applying general information (process description)
      3. Causal diagram (process description) 34
    2. Designing variables (process description)
    3. Executing variables and analyses (process description)
    4. Reporting an assessment (process description) 67
  5. Methods related to a particular step in the causal chain
    1. Emission modelling (process description) 36-39
    2. Exposure modelling (process description)
      1. Source-to-exposure modelling (process description) 40
        1. Atmospheric models (process description)
        2. Aquatic models (process description)
        3. Multimedia models (process description)
      2. Source apportionment
      3. Intake fraction (process description)
    3. Exposure-response function modelling (process description)
      1. Performing meta-analysis (process description) 48
      2. Combining toxicological and epidemiological information (process description)
    4. Risk characterisation (process description) 51
      1. Risk appraisal method (process description)
        1. Distance to (regulatory) target (process description)
        2. Impact estimation (process description)
        3. Monetary estimation (process description)
        4. Risk perception and acceptability (process description)
        5. Equity estimation (process description)
      2. Disability-adjusted life year (process description) 52
      3. Quality-adjusted life year (process description) 52
      4. Monetary valuation (process description) 59
      5. Discounting (process description) 64
      6. Risk perception (process description) 55
      7. Value judgement (process description) 56
      8. Equity issues
  6. Processes that are useful or necessary and cover several phases or steps
    1. Open participation in (risk) assessment (process description) 8
    2. Stakeholder involvement (process description) 68
    3. Expert panel / elicitation
    4. Multiple-bias modelling
    5. GIS and spatial issues
    6. Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69
      1. Estimating uncertainties (process description)
      2. Propagating uncertainties (process description) 72
      3. Value-of-information analysis (process description) 57
      4. Uncertainty tools (process: tool) 76
    7. Collective structured learning
    8. Mass collaboration
    9. Dealing with disputes
  7. Introduction to important topics in the Resource Centre
    1. Health impacts (universal product)
    2. Emissions (product: class)
    3. Exposures (product: class)
    4. Exposure-response function (product: class) 44
    5. Impacts (product: class) 77
  8. Other assessment frameworks
    1. Cost-benefit analysis (process description) 62
    2. Cost-effectiveness analysis
    3. Multi-attribute utility analysis (process description) 55
    4. Important issues that are outside the (health) impact assessment
      1. Global warming 78
      2. Accidents 79
      3. Ecosystems and biodiversity 80

Page contents in more detail

This chapter only contains pages that have more description than the title!

Assessment (universal product)

  • Scope: What is the intended purpose of an (impact) assessment? (To answer a policy information need) 3, 6, 12
  • Definition
    • What is an impact assessment
    • Different assessments: HIA, RA, IA... 4-5 (possibly own articles)
Comment: This part describes the process of performing an impact assessment. It does not go into detail about the methodologies. --Alexandra Kuhn 18:02, 14 January 2008 (EET)
Reply: No, this is an overview. --Jouni 23:05, 15 January 2008 (EET)

Performing an impact assessment (process description:assessment framework) 10

  • Scope: Purpose of making an impact assessment is to produce an assessment product. (Comment: What would this be? A general purpose? Something like policy consulting? --Alexandra Kuhn 18:02, 14 January 2008 (EET))
  • Definition
    • General methodology 10 (Comment: Would this be the same as the assessment framework? Equals dimension "work environment" number 3. --Alexandra Kuhn 18:02, 14 January 2008 (EET))
    • Description of the methodology used 11
  • Result
    • Inputs
    • Procedure: Phases of an impact assessment 16
      • Scoping an impact assessment 26
        • Selecting indicators 50
      • Applying general information
      • Drawing a causal diagram 34 Links: Help:Causal diagram | Links to alternatives: Causal chain, impact pathway, DPSEEA, DPSIR
        Comment: When I discussed this with some colleagues here at USTUTT, they said it would be good to describe the differences and commonalities between causal chain, impact pathway approach, DPSEEA, and DPSIR. Where would this belong? --Alexandra Kuhn 18:06, 14 January 2008 (EET)
      • Designing variables
      • Executing variables and analyses
      • Reporting an assessment
    • Outputs
Comment: Processes needed for conducting the impact assessment. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Reporting an assessment (process description) 67

  • Scope
  • Definition: different approaches
  • Result
    • Reporting uncertainties 70, 73 (incl. qualitative and quantitative uncertainties)

Stakeholder involvement (process description) 68

Comment: Processes needed to help to specify the results of the variables. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Issue framing (process description:issue framing)

Comment: Where would the boundaries to "process: assessment framework" be? --Alexandra Kuhn 18:02, 14 January 2008 (EET)
  • Scope:
    • Purpose, questions 27
    • Indicator selection 50
    • Boundaries 29
    • Scenarios 30-33
  • Definition
    • Variables

Emission modelling (process description) 36-39

  • Scope: purpose of emission modelling
  • Definition: background
  • Result:
    • How to model 37
    • Sectoral, spatial, and temporal resolution 38
    • Uncertainties 39

Source-to-exposure modelling (process description) 40

  • Scope: purpose
  • Definition: Different types 41
  • See also: pointers to resource centre 42
  • Direct approach: measured data (Comment: whatever is available: biomarkers, concentrations... --Alexandra Kuhn 18:02, 14 January 2008 (EET))
  • Uncertainties 43

Exposure-response function modelling (process description)

  • Scope 45
  • Definition:
    • Different types 46
    • How can they be derived? 47-48
    • Uncertainties 49

Risk characterisation (process description) 51

  • Scope
  • Definition:
Comment: Maybe we could summarise DALYs/QALYs and monetary valuation under "aggregation", but I don't know how to do this at the moment. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Disability-adjusted life year (process description) 52

  • Scope
  • Definition:
    • How are they derived 54
    • Alternatives 53

Monetary valuation (process description) 59

  • Scope: Why do we need monetary values 60
  • Definition
    • Why do we choose monetary values and not utility points? 61
  • Result
    • How are monetary values derived 63

Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69

  • Scope: Purpose of uncertainty assessment
  • Definition: Different approaches
    • Qualitative methods eg pedigree matrix 71
    • Quantitative methods 72-73
    • When to use which method? 73
  • Result
    • Uncertainty of the result: parameter uncertainty
    • Uncertainty of the definition: model uncertainty
    • Uncertainty of the scope: relevance

Uncertainty tools (process: tool) 76

Comment: This does not belong in the Guidebook, but it is good to keep it in mind. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Propagating uncertainties (process description) 72

  • Scope
  • Definition: approaches
    • Monte Carlo 72 (see the sketch after this list)
    • Bayesian analysis 72
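The approaches listed above are only named, not described, so a minimal Monte Carlo propagation sketch may help as a reading aid. It is illustrative only: the distributions, the parameter names (emission, intake_fraction, erf_slope) and the scale factor are assumptions made for this example and do not come from the Intarese method or any referenced tool.

```python
# Minimal Monte Carlo uncertainty propagation sketch (illustrative only).
# All distributions and parameter names are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000  # number of Monte Carlo samples

# Uncertain inputs, each described by a probability distribution.
emission = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=n)   # e.g. tonnes/year
intake_fraction = rng.uniform(1e-6, 3e-6, size=n)                 # dimensionless
erf_slope = rng.normal(loc=0.008, scale=0.002, size=n)            # cases per unit intake

# Propagate the uncertainty by evaluating the model on every sample.
cases = emission * intake_fraction * erf_slope * 1e6  # the scale factor is arbitrary here

# Summarise the output distribution instead of reporting a single number.
print(f"mean = {cases.mean():.2f}")
print(f"95% interval = {np.percentile(cases, [2.5, 97.5])}")
```

The same pattern applies to any variable formula: sample every uncertain input, evaluate the formula for each sample, and report the resulting distribution rather than a point estimate.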

Impact assessment (product:assessment)

Comment: What should this be? Why should we have it? Scenarios etc. should be positioned under the process, as the user should be shown how to build a scenario. --Alexandra Kuhn 17:26, 14 January 2008 (EET)
  • Scope:
    • Purpose, questions 27
    • Boundaries 29
    • Scenarios 30-33
  • Definition
    • Variables
    • Analyses
  • Result
    • Results
    • Conclusions

Article templates

What are processes and products?

The guidance system will be composed of pages related to processes, and pages related to products, perhaps complemented with a few so-called ‘glue pages’, which could contain information that is not easily captured in the process/product structure. These glue pages would sit at the top level of the Guidebook and give a short overview of the topic, linking to the respective process and product articles.

A process is a method (tool, model, calculation, formal discussion, etc.) in which various inputs are formed into a new product: the output. A process is “something you do”.

A product is both the input as well as the output of a process, and is a representation of reality (it can – at least in theory – be validated against reality).

Processes can lead to various products, and products can be developed in various processes (there is not necessarily a 1-to-1 relationship between processes and products).

(I did not think at all about nice words for these definitions, because I’m quite sure KTL has them already!)

Comment: So what would this mean for a variable? Is the formula in the variable the process, and the result of the variable the product? What would this mean for a model? Is the model the process and the result the product? --Alexandra Kuhn 12:50, 14 January 2008 (EET)


Process/product structure in Intarese

Elements of the Intarese method as developed in SP1 relate to either a process, or a product, or both. The graph below gives an idea of processes and products as developed in the Intarese project (the graph should be improved/updated if we want to present such a graph!). For all processes and products, information needs to be provided in the guidance system.

When there is a clear 1-to-1 relationship between process and product (e.g. DALY process and DALY product), we will only ask for a description of either the process or the product, in order to avoid confusion. When there is no such direct 1-to-1 relationship, we will ask for separate descriptions of process and product. An example of this is the process meta-analysis and the product exposure-response function (ERF). Even though the process of meta-analysis can lead to an estimation of the ERF, the meta-analysis can also lead to an estimation of another product (e.g. severity weight), and an ERF (the product) can also be derived from another process (e.g. expert judgment).

How should an article about disability-adjusted life years be structured? The question is not at all obvious, so the different options are tested here. Discussion about the merits of the approaches is welcome. The following parts are repeated several times in the comparison; therefore, they are listed here and only referred to below.

Purpose
DALY translates the impacts of diseases into life years based on their severity and duration, so that different diseases can be measured using a single currency, the life year. DALYs are based on disease-specific weights. (In contrast, QALYs evaluate the quality of life in a certain health state, not disease.)
Formula
DALY = YLL (life-years lost due to mortality) + YLD (life-years lived with disease), where YLD = number of disease cases * severity weight of the disease * duration of the disease.
The severity weights for diseases come from the variable Variable:Disability-adjusted weights for diseases.
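As a reading aid, the formula above can be written out as a small worked example. This is a hedged sketch only: the function name and all input numbers are hypothetical and are not taken from Variable:Disability-adjusted weights for diseases.

```python
# Minimal DALY calculation following the formula above.
# All input numbers are hypothetical, chosen only for illustration.

def daly(yll, cases, severity_weight, duration_years):
    """DALY = YLL + YLD, where YLD = cases * severity weight * duration."""
    yld = cases * severity_weight * duration_years
    return yll + yld

# Hypothetical example: 200 life-years lost, 1000 disease cases with
# a severity weight of 0.1 lasting 2 years on average.
print(daly(yll=200.0, cases=1000, severity_weight=0.1, duration_years=2.0))
# -> 200 + 1000 * 0.1 * 2 = 400.0 DALYs
```

In an actual assessment the severity weights and durations would come from the variable mentioned above rather than being typed in by hand.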

In this ontology, the scope can be seen as a research question. The answer to this question is the result, and the definition tells how this result can be achieved.

Variable (product object)
  Scope: DALY is a summary measure of burden of disease adding up increased mortality (years of life lost) and years lived with disability. Purpose: the research question is "What is the average DALY per person?"
  Definition: Sub-attributes: Data tells what data there exists about DALY estimates and gives links to the most important publications. Causality lists the upstream variables mentioned in the formula (see above). Unit is life-year. Formula is the formula.
  Result: Gives estimates for DALYs. However, the research question does not fulfil the clairvoyant criteria (or, it can be thought to apply to each individual in the world!). Therefore, only a few of the most important results (if any) can be mentioned here for illustration.

Method (process object)
  Scope: DALY estimation is a process for measuring summarised burden of disease adding up increased mortality (years of life lost) and years lived with disability. Purpose: the research question is "What is a good way to estimate DALYs?"
  Definition: Contains the reasoning and motivation for calculating DALYs. It compares and discusses different alternatives. It contains links to methodology articles. It is also open for comments and further developments.
  Result: Describes the state-of-the-art method and formulas such as the formula. The result is based on the content of the definition. Sub-attributes include input (the upstream variables), procedure (the calculations), and output (format of the process output).

Class (product object)
  Scope: This DALY class describes the common properties of all objects that share this key property: the object is a summary measure of burden of disease adding up increased mortality (years of life lost) and years lived with disability. Purpose: the research question is "What are the common properties of all objects that fulfil the key property?"
  Definition: Describes all other common properties of these objects. It also includes the discussions about the properties and whether they really are common to the objects. The most important property is the formula.
  Result: Contains a list of all objects (mainly variables) that belong to this class, i.e. variables that use DALYs to measure summary health impacts.

Universal (product object)
  Scope: The kind of DALY describes the essence of DALY objects: a DALY object is a summary measure of burden of disease adding up increased mortality (years of life lost) and years lived with disability.
  Comment: Maybe a universal (as defined by Loew) is actually the scope and definition attributes of a class, while the class (as defined mathematically as a particular kind of set) is the result attribute of the same object. In this way, Loew is happy, because we separate universals and classes and give the universal a stronger epistemological weight. --Jouni 00:45, 13 January 2008 (EET)
  Definition: Describes other properties of the "kind of DALY". The most important property is the formula.
  Comment: With universals, the distinction between the scope and definition is not clear? --Jouni 00:45, 13 January 2008 (EET)
  Result: Not relevant for universals?

Conclusions

  1. Of these possibilities, the process (method) seems to be the most suitable object for DALY. (Comment: This would mean that in the definition part we describe several alternatives and in the result part we describe the one that is most suitable? What happens if there is no alternative, e.g. if we describe Monte Carlo analysis? Then the definition would be the same as the result. --Alexandra Kuhn 14:21, 14 January 2008 (EET)) (Comment: Should we also, in addition to the process object, have the product object? Or are the parts too redundant? I suppose so... --Alexandra Kuhn 14:21, 14 January 2008 (EET))
  2. This previous idea was not a good one: the idea that there is a general product object which contains the essence of DALY, and then a process object with only the formula to calculate the DALYs (i.e. that the process object is trivial given the product object).
  3. Most of the content in the guidebook is at the first meta level: how to perform the process of an assessment. Thus, the content is not about the real world (the "zero meta level").
  4. The purpose of all processes is to describe a good process for achieving the outcome described in the scope. Thus, the scope belongs to the purpose of a particular process object.
  5. The definition of a process describes the information you need to understand whether some process is suitable and good for the purpose. The parts of the definition together form the structure of the process (in the same way as with variables). (Comment: ??? We should only put such method descriptions into the definition part that are helpful for achieving the purpose anyway? --Alexandra Kuhn 14:21, 14 January 2008 (EET))
  6. The result describes the state-of-the-art method for fulfilling the purpose. (Comment: What happens if there are several, e.g. exposure-response function estimation (process object): modelling, meta-analyses, epi-studies...? Or would we then have several process objects, one for each of the methods? We could have generic meta-analysis as a separate process object and then apply this method/process to the ERFs and put this into the definition part of the ERF estimation? But there is still not ONE method that would prove valid for the results part. --Alexandra Kuhn 14:21, 14 January 2008 (EET))
  7. The class cannot be used here, because it is not clear how that would be applied in practical assessments. The common properties in the definition do not necessarily form a whole method that can be applied.

Example: products and processes in Intarese guidance system

Template: process

The process template is the most common template for the guidebook contents.

Summary
The summary of a process is a very short overview of the process, and may contain all types of information that are considered relevant for this specific process (max words: 250?)
Name
The name of the process should be unique (there should not be two objects (processes or products) with identical names). The name should be chosen so that it is descriptive, unambiguous and not easily confused with other products/ processes. (max words: 20?)
Scope
The scope of the process describes its purpose: what is the process to be used for? This should contain all relevant information needed to distinguish the process from other processes. (max words: 400?)
Definition
This section may contain a link (or a "more" button) to further background information about the process (e.g. its history, current practice, etc.) (or where do we put this background info?). It also contains links to other processes or products that are related, or function as input to the process, as well as to the resource centre. (max words: 2000) Comment: The background information should be positioned in one or several background articles. These may be unstructured objects (although I think that our structure is so general that it can be made applicable to nearly anything). If the information is necessary to understand or apply the process, it should be added in the process article itself. The “more” part is not meant to hide irrelevant background information; the “more” should just be in the article to enhance readability. --Alexandra Kuhn 12:50, 14 January 2008 (EET)
Results
The result section has three sub-attributes. (max words: 3000?)
Input
The input variables or parameters, their syntax, and other relevant information is described here.
Procedure
The actual process – how it works – is described here. It consists of e.g. the mathematical formula to calculate the result. The procedure uses algebra or other explicit methods if possible.
Output
Briefly describes the product of the process, which should fulfil the purpose as described in the scope. It furthermore links – if applicable – to the product object.
See also
See also links to pages (both internal guidebook pages and external) which relate to the process subject. All subjects that could be relevant for readers of this page can be listed here. (max links: 20?)
References
All references, as used in the texts above. (max references: 30?)
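As a reading aid, the attribute structure of the process template described above can be sketched as a simple data object. This is an illustrative sketch, not part of the Intarese specification: the class names and types are assumptions; only the attribute names follow the template headings above.

```python
# Illustrative sketch of the process template as a data structure.
# The class names and types are assumptions; only the field names
# come from the template headings above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessResult:
    input: str = ""       # input variables or parameters and their syntax
    procedure: str = ""   # how the process works, e.g. the formula used
    output: str = ""      # short description of the product of the process

@dataclass
class ProcessPage:
    name: str                      # unique, descriptive name (max ~20 words)
    summary: str = ""              # very short overview (max ~250 words)
    scope: str = ""                # purpose: what is the process used for?
    definition: str = ""           # background and links to related objects
    result: ProcessResult = field(default_factory=ProcessResult)
    see_also: List[str] = field(default_factory=list)    # related pages (max ~20 links)
    references: List[str] = field(default_factory=list)  # cited references (max ~30)

# Hypothetical usage with the DALY estimation process discussed earlier.
daly_page = ProcessPage(
    name="Disability-adjusted life year",
    summary="Summary measure of burden of disease.",
    scope="What is a good way to estimate DALYs?",
)
print(daly_page.name, "-", daly_page.scope)
```

A corresponding sketch for the product template would replace the result sub-attributes with the Data, Causality, Unit, and Formula sub-attributes described below.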

Template: product

Product is common in the resource centre, as the products describe real-world entities ("zero meta level") and they are products of the assessment (sub-)processes.

Summary
The summary of a product is a very short description of the product, and may contain all types of information that are considered relevant for this specific product (max words: 250?)
Name
The name of the product should be unique (there should not be two products or processes with identical names). The name should be chosen so that it is descriptive, unambiguous and not easily confused with other products/ processes. (max words: 20?)
Scope
The scope of the product gives a research question that this product object aims to answer. The scope includes – if applicable – its spatial, temporal, or other limits (system boundaries). (max words: 400?) Comment: Must this be a research question? If we have something like a DALY object or an ERF object, to which research question would these apply? Could you give an example of a research question? Maybe, if we decide that for DALYs, ERFs and the like the process is more important than the product, and if the product is also included in the results part of the project, would we then end up with variables as the only products? These would never be generic as guidance for the guidebook but would always be used in assessments and therefore would belong to the resource centre. Did I forget something? --Alexandra Kuhn 14:26, 14 January 2008 (EET)
Definition
The definition describes the data and reasoning that tell us what the answer to the question in the scope is. In the case of a variable, it has four sub-attributes: Data, Causality, Unit, and Formula. This section may contain a link (or a "more" button) to further background information about the product (e.g. its history, current practice, etc.) (or where do we put this background info?). Comment: See my comment about this in the Process template. --Alexandra Kuhn 12:50, 14 January 2008 (EET) The definition may link to the process(es) that lead to this product, briefly explaining why these processes are relevant. (max words: 3000?)
Results
Contains the answer to the question presented in the scope. Usually the text is short (max 500 words), but the result tables or figures may be extensive; there is no upper limit.
See also
See also links to pages (both internal resources centre pages and external) which relate to the product. All subjects that could be relevant for readers of this page can be listed here. (max links: 20?)
References
All references, as used in the texts above. (max references: 30?)

Practical information for both templates

Comment: General information on how the sentences should be structured etc. (see also my first version and the project that RIVM has already conducted). --Alexandra Kuhn 12:50, 14 January 2008 (EET)

Dimensions and hierarchies

The guidance system has several dimensions and hierarchies. It can therefore be confusing to browse through different parts. This is an attempt to clarify these issues. The hierarchies are listed from bottom to top for each dimension.

  • Dimension: classes
    1. Item: an object that belongs to a particular class. For example, a pollutant concentration in a particular location at a particular time.
    2. Class: a set of all objects that share a key property, defined in the scope of the class. For example, all "dioxin concentration objects" (in any spatio-temporal location) form a class that can be useful when utilising the full-chain approach for a dioxin assessment.
  • Dimension: real world (or product object dimension?)
    1. Variable: this level looks at individual product objects (typically variables) that describe particular pieces of reality.
    2. Assessment: this level looks at groups of variables that together form a synthesis of a particular problem at hand.
    3. Policy context: this level looks at a particular problem in relation to the policy context in which it is assessed. This level is usually not described explicitly.
  • Dimension: work environment (or process object dimension?)
    1. Tool: a practical operationalisation (a software or a program) of a particular method
    2. Method: a procedure for manipulating information (about the real world) into a useful form for a particular place in an assessment.
    3. Assessment framework: a distinct set of methods that, when used together, form a scientifically coherent and efficient way of manipulating all the information that is needed to perform a particular kind of assessment.
    4. Scientific context: this level looks at assessment frameworks in the scientific context in which they are used. This level is usually not described explicitly.

  • Dimension:Guidebook articles
    1. Maybe: "glue" articles that are a summary of all the other articles concerning one big topic
    2. Process articles (descriptions of methodologies) and Product articles
    3. Background articles (unstructured objects???) describing the background/history of information used in the process and product articles.
  • Dimension:Resource Centre
    1. a) Tools/models: a software or a program b) Core data set (incl. description)
    2. a) Links to tools/models and description of the tools/models (metadata). b) Links to data and description of the data (metadata).
    3. a) Links to tool/model databases and description of the tool/model databases (meta-metadata). b) Links to databases and description of the databases (metadata).
    4. Links to data and description of the data (meta-metadata).
  • Dimension:Tools/Models
    1. Tools/Models that help to specify the result of the variables (so dealing with the product of the assessment). These would be e.g. Chimère, WATSON... (Comment: Jouni, do you see the difference between a model and a tool? --Alexandra Kuhn 14:56, 14 January 2008 (EET))
    2. Tools that help to perform the assessment as such (so dealing with the process of an assessment). These would be e.g. a tool for stakeholder involvement. (Comment: I'm not quite sure where to put tools for issue framing and uncertainty analysis, e.g. a Monte Carlo analysis tool. I would spontaneously see them as tools for the process of the assessment, but they also have an influence on (single) variables... --Alexandra Kuhn 14:56, 14 January 2008 (EET))

  • Dimension: meta
    1. Zero meta level asks: "What does the real world look like?"
    2. First meta level asks: "How do we do assessments to find out what the real world looks like?"
    3. Second meta level asks: "How can we know how we should do the assessments?"
    4. Third meta level asks: "How can we know how we can know about good assessments?" (rarely needed or described)