Testiwiki:Guidebook specification

Revision as of 22:08, 16 January 2008

<accesscontrol>Members of projects</accesscontrol> Guidebook specification describes the contents of the guidebook (to be developed). In addition, it describes the general hierarchy and organisation of the major types of objects.

Table of Contents for the Guidebook

Comment: I think the processes and products should be put together and not separated too much. --Alexandra Kuhn 17:29, 14 January 2008 (EET)

Title list

  1. Guidebook (product) 1
  2. Universal products
    1. Policy context (universal product)
    2. Scientific context (universal product)
    3. Assessment (universal product)
    4. Causal chain (universal product)
    5. Variable (universal product)
    6. Class (universal product)
    7. Process description (?) (universal product)
    8. Assessment framework (universal product)
    9. Method (universal product)
    10. Impact assessment (product:assessment)
  3. General processes
    1. Observing (process) [actually belongs to basic science but is described here for completeness]
    2. Information collection (process)
    3. Information synthesis (process)
    4. Process management (process)
    5. Description development (process)
  4. Performing an impact assessment (process description: assessment framework) 10 [Necessary phases of performing an assessment]
    1. Issue framing (process description)
      1. Scoping (process description)
      2. Applying general information (process description)
      3. Causal diagram (process description) 34
    2. Designing variables (process description)
    3. Executing variables and analyses (process description)
    4. Reporting an assessment (process description) 67
  5. Methods related to a particular step in the causal chain
    1. Emission modelling (process description) 36-39
    2. Exposure modelling (process description)
      1. Source-to-exposure modelling (process description) 40
        1. Atmospheric models (process description)
        2. Aquatic models (process description)
        3. Multimedia models (process description)
      2. Source apportionment
      3. Intake fraction (process description)
    3. Exposure-response function modelling (process description)
      1. Performing meta-analysis (process description) 48
      2. Combining toxicological and epidemiological information (process description)
    4. Risk characterisation (process description) 51
      1. Risk appraisal method (process description)
        1. Distance to (regulatory) target (process description)
        2. Impact estimation (process description)
        3. Monetary estimation (process description)
        4. Risk perception and acceptability (process description)
        5. Equity estimation (process description)
      2. Disability-adjusted life year (process description) 52
      3. Quality-adjusted life year (process description) 52
      4. Monetary valuation (process description) 59
      5. Discounting (process description) 64
      6. Risk perception (process description) 55
      7. Value judgement (process description) 56
      8. Equity issues
  6. Processes that are useful or necessary and cover several phases or steps
    1. Open participation in (risk) assessment (process description) 8
    2. Stakeholder involvement (process description) 68
    3. Expert panel / elicitation
    4. Multiple-bias modelling
    5. GIS and spatial issues
    6. Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69
      1. Estimating uncertainties (process description)
      2. Propagating uncertainties (process description) 72
      3. Value-of-information analysis (process description) 57
      4. Uncertainty tools (process: tool) 76
    7. Collective structured learning
    8. Mass collaboration
    9. Dealing with disputes
  7. Introduction to important topics in the Resource Centre
    1. Health impacts (universal product)
    2. Emissions (product: class)
    3. Exposures (product: class)
    4. Exposure-response function (product: class) 44
    5. Impacts (product: class) 77
  8. Other assessment frameworks
    1. Cost-benefit analysis (process description) 62
    2. Cost-effectiveness analysis
    3. Multi-attribute utility analysis (process description) 55
    4. Important issues that are outside the (health) impact assessment
      1. Global warming 78
      2. Accidents 79
      3. Ecosystems and biodiversity 80

Page contents in more detail

This chapter only contains pages that have more description than the title!

Assessment (universal product)

  • Scope. What is the use purpose of an (impact) assessment? (To answer a policy information need) 3, 6, 12
  • Definition
    • What is an impact assessment
    • Different assessments: HIA, RA, IA... 4-5 (possibly own articles)
Comment: This part describes the process of performing an impact assessment. It does not go into detail about the methodologies. --Alexandra Kuhn 18:02, 14 January 2008 (EET) Reply: No, this is an overview. --Jouni 23:05, 15 January 2008 (EET)

Performing an impact assessment (process description:assessment framework) 10

  • Scope: Purpose of making an impact assessment is to produce an assessment product. (Comment: What would this be? A general purpose? Something like policy consulting? --Alexandra Kuhn 18:02, 14 January 2008 (EET))
  • Definition
    • General methodology 10 (Comment: Would this be the same as the assessment framework? It equals level 3 of the "work environment" dimension. --Alexandra Kuhn 18:02, 14 January 2008 (EET))
    • description of methodology used 11
  • Result
    • Inputs
    • Procedure: Phases of an impact assessment 16
      • Scoping an impact assessment 26
        • Selecting indicators 50
      • Applying general information
      • Drawing a causal diagram 34 Links: Help:Causal diagram | Links to alternatives: Causal chain, impact pathway, DPSEEA, DPSIR
        Comment: Discussing with some colleagues here at USTUTT, they said it would be good to describe the differences and commonalities between the causal chain, the impact pathway approach, DPSEEA, and DPSIR. Where would this belong? --Alexandra Kuhn 18:06, 14 January 2008 (EET)
      • Designing variables
      • Executing variables and analyses
      • Reporting an assessment
    • Outputs
Comment: Processes needed for conducting the impact assessment. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Reporting an assessment (process description) 67

  • Scope
  • Definition: different approaches
  • Result
    • Reporting uncertainties 70, 73 (incl. qualitative and quantitative uncertainties)

Stakeholder involvement (process description) 68

Comment: Processes needed to help to specify the results of the variables. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Issue framing (process description:issue framing)

Comment: Where would the boundaries to "process: assessment framework" be? --Alexandra Kuhn 18:02, 14 January 2008 (EET)
  • Scope:
    • Purpose, questions 27
    • Indicator selection 50
    • Boundaries 29
    • Scenarios 30-33
  • Definition
    • Variables

Emission modelling (process description) 36-39

  • Scope: purpose of emission modelling
  • Definition: background
  • Result:
    • How to model 37
    • Sectoral, spatial, and temporal resolution 38
    • Uncertainties 39

Source-to-exposure modelling (process description) 40

  • Scope: purpose
  • Definition: Different types 41
  • See also: pointers to resource centre 42
  • Direct approach: measure data (Comment: whatever: biomarkers, concentrations... --Alexandra Kuhn 18:02, 14 January 2008 (EET))
  • Uncertainties 43

Exposure-response function modelling (process description)

  • Scope 45
  • Definition:
    • Different types 46
    • How can they be derived? 47-48
    • Uncertainties 49

Risk characterisation (process description) 51

  • Scope
  • Definition:
Comment: Maybe we could summarise DALYs/QALYs and monetary valuation under "aggregation", but I don't know how to do this at the moment. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Disability-adjusted life year (process description) 52

  • Scope
  • Definition:
    • How are they derived 54
    • Alternatives 53

Monetary valuation (process description) 59

  • Scope: Why do we need monetary values 60
  • Definition
    • Why do we choose monetary values and not utility points? 61
  • Result
    • How are monetary values derived 63

Uncertainty assessment (process description) 39, 43, 49, 58, 65, 69

  • Scope: Purpose of uncertainty assessment
  • Definition: Different approaches
    • Qualitative methods, e.g. pedigree matrix 71
    • Quantitative methods 72-73
    • When to use which method? 73
  • Result
    • Uncertainty of the result: parameter uncertainty
    • Uncertainty of the definition: model uncertainty
    • Uncertainty of the scope: relevance

Uncertainty tools (process: tool) 76

Comment: This does not belong in the Guidebook, but it is good to keep in mind. --Alexandra Kuhn 18:02, 14 January 2008 (EET)

Propagating uncertainties (process description) 72

  • Scope
  • Definition: approaches
    • Monte Carlo 72
    • Bayesian analysis 72
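As an illustration of the Monte Carlo approach named above, the sketch below propagates parameter uncertainty through a simple impact calculation (impact = exposure * exposure-response slope * population). All distributions, parameter values, and names are invented for the example; they are not values or structures from the guidebook.

```python
import random

# Monte Carlo sketch of uncertainty propagation. The lognormal exposure
# and normal slope distributions are illustrative assumptions only.

def propagate(n_samples=10000, seed=1):
    rng = random.Random(seed)
    population = 100_000
    impacts = []
    for _ in range(n_samples):
        exposure = rng.lognormvariate(2.0, 0.3)    # e.g. ug/m3
        slope = rng.normalvariate(0.001, 0.0002)   # cases per person per ug/m3
        impacts.append(exposure * slope * population)
    impacts.sort()
    return {
        "p2.5": impacts[int(0.025 * n_samples)],
        "median": impacts[n_samples // 2],
        "p97.5": impacts[int(0.975 * n_samples)],
    }

summary = propagate()
# The spread between p2.5 and p97.5 describes the parameter uncertainty
# of the result; the sampled distributions are the "propagated" inputs.
```

The same loop structure applies to any explicit formula in a variable's Definition: sample the uncertain inputs, evaluate the formula, and summarise the resulting distribution.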

Impact assessment (product:assessment)

Comment: What should this be? Why should we have it? Scenarios etc. should be positioned under the process, as the user should be shown how to build a scenario. --Alexandra Kuhn 17:26, 14 January 2008 (EET)
  • Scope:
    • Purpose, questions 27
    • Boundaries 29
    • Scenarios 30-33
  • Definition
    • Variables
    • Analyses
  • Result
    • Results
    • Conclusions

Article templates

What are processes and products?

The guidance system will be composed of pages related to processes and pages related to products, perhaps complemented with a few so-called 'glue pages'. Glue pages could contain information that is not easily captured in the process/product structure; they sit at the top level of the Guidebook and give a short overview of the topic, linking to the respective process and product articles.

A process is a method (tool, model, calculation, formal discussion, etc) in which various pieces of input information are manipulated and formed into a new product of information; the output. A process is “something you do”.

A product is a piece of information that describes a particular piece of reality. It can – at least in theory – be validated against reality. It can act as an input as well as an output of a process. A variable is a typical product object. In a variable, there is the Definition/Formula attribute, which tells how the result of the variable can be derived or computed. If the formula is a large piece of text, or generally usable in many variables, it might be useful to extract that piece out of the variable and make it a separate process object. This process of creating new objects from within an existing object is called budding. In this way, models can be seen as ancient Formula attributes that were created from a variable using budding during the evolution of the system.

Processes can lead to various products, and products can be developed in various processes (there is not necessarily a 1-to-1 relationship between processes and products).


Process/ product structure in Intarese

Elements of the Intarese method as developed in SP1 relate to a process, a product, or both. The graph below gives an idea of processes and products as developed in the Intarese project (the graph should be improved and updated if we want to present such a graph). For all processes and products, information needs to be provided in the guidance system.

When there is a clear 1-to-1 relationship between process and product (e.g. the DALY process and the DALY product), we will only ask for a description of either the process or the product, in order to avoid confusion. In most cases, the process is the important object that needs to be described in the method guidebook. When there is no such direct 1-to-1 relationship, we will ask for separate descriptions of the process and the product. An example of this is the process meta-analysis and the product exposure-response function (ERF). Even though the process of meta-analysis can lead to an estimation of the ERF, the meta-analysis can also lead to an estimation of another product (e.g. a severity weight), and an ERF (the product) can also be derived from another process (e.g. expert judgement).

How should an article about disability-adjusted life years be structured? The question is not at all obvious, so the different options were tested in a previous version of this page. Only the conclusions are presented here.

In this ontology, the scope can be seen as a research question. The answer to this question is the result, and the definition tells how this result can be or was achieved.

Approach: Method (process description object)

Scope: DALY estimation is a process for measuring the summarised burden of disease, adding up increased mortality (years of life lost) and years lived with disability. Purpose: DALY translates the impacts of diseases into life years based on their severity and duration, so that different diseases can be measured using a single currency, the life year. DALYs are based on disease-specific weights. (In contrast, QALYs evaluate the quality of life in a certain health state, not disease.) The research question: What is a good way to estimate DALYs?

Definition: The definition contains the reasoning and motivation for calculating DALYs. It compares and discusses different alternatives. It contains links to methodology articles. It is also open for comments and further developments.

Result: The result describes the state-of-the-art method and formulas. Formula: DALY = life-years lost due to mortality (YLL) + life-years lived with disease (YLD), where YLD = number of disease cases * severity weight of the disease * duration of the disease. The severity weights for diseases come from the variable Variable:Disability-adjusted weights for diseases. The result is motivated by the content of the definition. Sub-attributes of the result include input (the upstream variables), procedure (the calculations), and output (format of the process output).
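As a sketch, the DALY formula above translates directly into code. The parameter names are my own shorthand, not attributes defined by the guidebook, and the example numbers are invented.

```python
# DALY = YLL + YLD, following the formula in the Result attribute above.
# Parameter names and example values are illustrative assumptions.

def daly(years_of_life_lost, cases, severity_weight, duration_years):
    """Disability-adjusted life years: life-years lost due to mortality
    (YLL) plus life-years lived with disability (YLD), where
    YLD = number of disease cases * severity weight * duration."""
    yld = cases * severity_weight * duration_years
    return years_of_life_lost + yld

# e.g. 120 life-years lost, plus 300 cases with severity weight 0.25
# lasting 5 years each: 120 + 300 * 0.25 * 5 = 495 DALYs
```

In the full chain, the severity weights would come from the variable Variable:Disability-adjusted weights for diseases rather than being passed in as bare numbers.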

Conclusions:

  1. The process description (method) seems to be the most suitable object for DALY. The result attribute of the DALY process description describes the state-of-the-art procedure for estimating DALYs. If there is not a single best method, several methods can be described. The definition would then discuss the good and bad properties of each method, and their limitations. Usually a process is a general one, and therefore it will produce a large number of products for specific purposes in several risk assessments. There is no need to describe a "generalized product", as all general information is already described in the process description.
  2. The purpose of all process descriptions is to describe a good process for achieving the outcome described in the scope. Thus, scope belongs to the purpose of a particular process description object.
  3. The definition of a process description describes the information you need to understand whether some procedure is suitable and good for the purpose. The definition gives rationale for the procedure selected.
  4. To be precise, a process description is a product object that describes how to actually do the work. The doing itself is the process object, but that is something that vanishes as soon as the work is done and the product has been produced. We try to be precise when talking about the process or process description. However, we do not emphasize their differential nature (process vs. product, respectively), because the difference between doing and talking about doing is probably clear to the reader until someone tries to define that difference using metaphysical terminology.

Example: products and processes in the Intarese guidance system

Template: process

The process template is the most common template for the guidebook contents.

Summary
The summary of a process is a very short overview of the process, and may contain all types of information that are considered relevant for this specific process (max words: 250?)
Name
The name of the process should be unique (there should not be two objects (processes or products) with identical names). The name should be chosen so that it is descriptive, unambiguous and not easily confused with other products/ processes. (max words: 20?)
Purpose
(This is the same as Scope in the general object attribute list.) The general purpose of every process is to manipulate information with the aim of producing a particular information product. The process-specific purpose describes the intended product of this particular process. This should contain all relevant information needed to distinguish the process from other processes. (max words: 400?)
Structure of the process
(This is the same as the Result (of the process description) in the general object attribute list.) The result section has three sub-attributes. (max words: 3000?)
Input format
The input variable or parameter types, their syntax, and other relevant information is described here.
Procedure
The actual process – how it works and how to do it – is described here. It consists of e.g. the mathematical formula used to calculate the result. The procedure uses algebra or other explicit methods if possible.
Management
A complex process may require that the management process of the procedure also be described. The procedure may for example be a mathematical algorithm. The management process is then the computer software that runs the algorithm, so that when you have the software, the management is a trivial task. If you do not have the management process – well, good luck to you and your calculator. So, actually, models and software are procedures that have the management process packed into the same neat package. Another example of a management process is guidance by the U.S. EPA about how to plan and organise stakeholder meetings; the actual procedure there is to obtain information and feedback from the stakeholders.
Output format
Briefly describes the format of the product of the process, which should fulfil the purpose described in the scope. It furthermore links – if applicable – to the product object.
Rationale
(This is the same as the Definition (of the process description) in the general object attribute list.) This attribute answers the following questions: What is known about a good process for this purpose? How do you know that the procedure described is good? (max words: 2000)
See also
See also links to pages (both internal guidebook pages and external) which relate to the process subject. All subjects that could be relevant for readers of this page can be listed here. (max links: 20?)
References
All references, as used in the texts above. (max references: 30?)
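To make the attribute structure concrete, here is a sketch of the process template as a data structure. The field names follow the headings above, with the Result sub-attributes grouped into their own class; this is an illustration, not an actual schema used by the wiki, and the example object is invented.

```python
from dataclasses import dataclass, field

# Sketch of the process template. Field names follow the attribute
# headings above; nothing here is a real wiki schema.

@dataclass
class ProcessResult:
    input_format: str = ""    # input variable/parameter types and syntax
    procedure: str = ""       # how the process works, e.g. a formula
    management: str = ""      # how to run the procedure, if non-trivial
    output_format: str = ""   # format of the product of the process

@dataclass
class Process:
    name: str                 # unique among all processes and products
    summary: str = ""         # very short overview (max ~250 words)
    purpose: str = ""         # the Scope of the general attribute list
    structure: ProcessResult = field(default_factory=ProcessResult)
    rationale: str = ""       # the Definition of the general attribute list
    see_also: list = field(default_factory=list)
    references: list = field(default_factory=list)

dalys = Process(name="Disability-adjusted life year",
                purpose="Translate disease impacts into life years.")
```

A page using the template would simply fill in these attributes as wiki sections in the same order.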

Template: product

The product template is common in the Resource Centre, as products describe real-world entities ("zero meta level") and are products of the assessment (sub-)processes.

Summary
The summary of a product is a very short description of the product, and may contain all types of information that are considered relevant for this specific product (max words: 250?)
Name
The name of the product should be unique (there should not be two products or processes with identical names). The name should be chosen so that it is descriptive, unambiguous and not easily confused with other products or processes. (max words: 20?)
Scope
The scope of the product gives a research question that this product object aims to answer. The scope includes – if applicable – its spatial, temporal, or other limits (system boundaries). A product object may also be a generic one applying to any spatio-temporal location. Whether such products are just variables, and how they are used in the Guidebook, is under study. (max words: 400?)
Definition
The definition describes the data and reasoning that tells us what the answer to the question in the scope is. In the case of a variable, it has four sub-attributes:
Data
What data or observations are available about this object?
Causality
What variables affect the result of this object when they change (i.e., which variables are causally related to this one)?
Unit
What is the measurement unit of the result?
Formula
How is the result derived or computed?
Definition may link to the process(es) that lead to this product, explaining shortly why these processes are relevant. (max words: 3000?)
Results
Contains the answer to the question presented in the scope. Usually the text is short (max 500 words), but the result tables or figures may be extensive; there is no upper limit.
See also
See also links to pages (both internal resources centre pages and external) which relate to the product. All subjects that could be relevant for readers of this page can be listed here. (max links: 20?)
References
All references, as used in the texts above. (max references: 30?)
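The product template can be sketched the same way, with the four Definition sub-attributes of a variable grouped into their own class. Again, the field names follow the headings above, and the example object (a dioxin concentration variable) is invented for illustration.

```python
from dataclasses import dataclass, field

# Sketch of the product template. Field names follow the attribute
# headings above; this is an illustration, not a real wiki schema.

@dataclass
class VariableDefinition:
    data: str = ""                                 # available observations
    causality: list = field(default_factory=list)  # causally related variables
    unit: str = ""                                 # measurement unit of the result
    formula: str = ""                              # how the result is derived

@dataclass
class Product:
    name: str                 # unique among all processes and products
    summary: str = ""         # very short description (max ~250 words)
    scope: str = ""           # the research question the product answers
    definition: VariableDefinition = field(default_factory=VariableDefinition)
    results: str = ""         # the answer to the question in the scope
    see_also: list = field(default_factory=list)
    references: list = field(default_factory=list)

conc = Product(name="Dioxin concentration (example variable)",
               scope="What is the dioxin concentration in the study area?")
conc.definition.unit = "pg/m3"
```

Comparing the two sketches shows the symmetry the templates aim for: the process's Purpose/Structure/Rationale mirror the product's Scope/Results/Definition.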

Practical information for both templates

--#(number): : General information on how the sentences should be structured etc. (see also my first version and the project that RIVM already have conducted. --Alexandra Kuhn 12:50, 14 January 2008 (EET)

Any section may contain a "more" button leading to more detailed information that is hidden by default to increase readability. The content is still directly relevant for the section. In addition, if there is a lot of further background information available about the object (e.g. its history, current practice, etc.) which is useful but not required to utilise the object, a background article can be created and linked to from the object. A background article can be freely structured, but an established encyclopedia article structure is recommended.

Dimensions and hierarchies

The guidance system has two distinct dimensions. It can be confusing to browse through different parts, so this is an attempt to clarify these issues. The hierarchy levels are listed from bottom to top for each dimension. The hierarchy list has changed dramatically; the original list was much longer and more complex. However, the old "dimensions" are captured in the new two dimensions. "Class", "real world", and "work environment" are actually examples of structures in the complexity dimension. "Meta" still exists, but in a richer form. "Guidebook articles", covering glue (summary) articles, structured (according to the method) articles, and background articles, is a practical grouping of objects inside and outside the method. "Resource Centre" is an application of the meta dimension. "Tools/Models" is an application of the complexity dimension.

  • Dimension: Meta (or Abstraction level)
    1. Zero meta level is the real world: the real objects and events.
    2. First meta level contains the descriptions of the real world (such as variables and assessments) and the collection and synthesis processes of the first meta level information.
    3. Second meta level contains the general structures of the first-level objects, and the descriptions of these objects (such as process descriptions).
    4. Third meta level describes the development processes and the descriptions of the second-level objects.
    • For objects in the Resource Centre
      1. (Nothing exists from the zero meta level)
      2. Tools/models (software or programs) and core data sets
      3. Descriptions of tools, models, and core data sets, including links to those that are not within the Resource Centre.
      4. Descriptions and links to tool, model, and data databases (meta-metadata).


  • Dimension: Complexity
    • For product objects
      1. Variable: this level looks at individual product objects (typically variables) that describe particular pieces of reality.
      2. Assessment: this level looks at groups of variables that together form a synthesis of a particular problem at hand.
      3. Policy context: this level looks at a particular problem in its relation to the policy context in which it is assessed. This level is usually not described explicitly.
    • For process objects
      1. Method: a procedure for manipulating information (about the real world) into a useful form for a particular place in an assessment.
      2. Assessment framework: A distinct set of methods that, when used together, form a scientifically coherent and efficient way for manipulating all information that is needed to perform a particular kind of an assessment.
      3. Scientific context: this level looks at the assessment frameworks in the scientific context in which they are used. This level is usually not described explicitly.
    • For Classes
      1. Item: an object (typically a variable or a class) that belongs to a particular class. For example, a pollutant concentration in a particular location at a particular time.
      2. Class: a set of all objects that share a key property, defined in the scope of the class. For example, a class of "dioxin concentration objects" (in any spatio-temporal location) forms a class that can be useful when utilising the full chain approach for a dioxin assessment.
    • For tools:
      1. Tools/Models that help to specify the result of the variables (e.g. Chimère, WATSON)
      2. Tools that help to perform the assessment as such (so dealing with the process of an assessment). These would be e.g. a tool for stakeholder involvement.
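One way to read the two dimensions is as a pair of coordinates for every object: a meta level and a complexity level within the object's group. The sketch below tags a few example objects this way; the dictionary keys and level numbering are my own shorthand for the lists above, not terms defined by the guidebook.

```python
# Locate an object by (meta level, complexity level). Level numbers
# count from the bottom of each list above; the group names and keys
# are illustrative shorthand, not guidebook terminology.

COMPLEXITY = {
    "product": ["variable", "assessment", "policy context"],
    "process": ["method", "assessment framework", "scientific context"],
    "class":   ["item", "class"],
    "tool":    ["variable tool", "assessment tool"],
}

def locate(group, level_name, meta_level):
    """Return the coordinates of an object in the two dimensions."""
    complexity = COMPLEXITY[group].index(level_name) + 1  # bottom level = 1
    return {"group": group, "complexity": complexity, "meta": meta_level}

# A variable is a first-meta-level description of the real world:
variable = locate("product", "variable", 1)
# A method (process description) sits on the second meta level:
method = locate("process", "method", 2)
```

Under this reading, the Resource Centre lists the meta levels 1 to 3 of the tool group, which is why nothing in it exists at meta level zero.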