Quality assurance and quality control
<accesscontrol>members of projects</accesscontrol>
 
{{method|moderator=Jouni}}

[[Category:Quality control]]

'''Quality control measures''' are needed to ensure that risk assessments reflect the current best scientific knowledge and state-of-the-art methods. Quality control affects both the process of making an assessment and the contents of the product. In [[open assessment]]s this is especially challenging, because the contents of the product may be changed by any participant. Special procedures are needed to make this openness possible while still assuring quality.
  
==Scope==

==Definition==

===Data===

===Dependencies===

* [[Quality evaluation criteria]]

==Result==
All quality measures apply to a particular version of a page and are therefore time-dependent. However, editing of the page may continue after a quality assessment has been made. Readers should therefore treat quality assessments with care and always check the date of the assessment and any changes made since.
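
To make this time-dependence concrete, a quality assessment can be thought of as a record that always points to one specific revision of a page. The sketch below (in Python, with invented field names and revision numbers) is only an illustration of this idea, not a description of the Opasnet software:

<pre>
# Illustration only: a quality assessment refers to one specific page revision.
from dataclasses import dataclass
from datetime import date

@dataclass
class QualityAssessment:
    page: str          # page title
    revision_id: int   # the revision that was assessed (number invented here)
    assessed_on: date  # date of the assessment
    grade: str         # e.g. "up to the standard"

    def applies_to(self, current_revision_id: int) -> bool:
        """An assessment speaks only for the revision it was made on."""
        return self.revision_id == current_revision_id

qa = QualityAssessment("Quality assurance and quality control",
                       revision_id=1234, assessed_on=date(2009, 11, 4),
                       grade="up to the standard")
print(qa.applies_to(current_revision_id=1240))  # False: the page has changed since
</pre>
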
===Quality controls of the content===

The content of an assessment is evaluated for each object (assessment, variable) separately. Quality controls can typically be performed for all formal objects and all attributes. However, there are a few exceptions (see the sketch after the list):
* A systematic literature review can be performed for the definition attribute of a variable.
* An uncertainty analysis can be performed for the result attribute of a variable.
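
As a rough illustration of how the controls map to attributes, they could be tabulated per attribute. The grouping into "general" controls and the attribute names used below are only one reading of the rules above, not a definitive list:

<pre>
# Illustration only: which quality controls apply to which attributes of a variable.
GENERAL_CONTROLS = ["open participatory discussion", "technical quality check"]

controls_by_attribute = {
    "scope":      list(GENERAL_CONTROLS),
    "definition": GENERAL_CONTROLS + ["systematic literature review"],  # exception above
    "result":     GENERAL_CONTROLS + ["uncertainty analysis"],          # exception above
}

for attribute, controls in controls_by_attribute.items():
    print(attribute, "->", ", ".join(controls))
</pre>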
  
 
===Systematic literature review===

A systematic literature review is performed according to the rules of the [[:en:Cochrane Collaboration|Cochrane Collaboration]]. The options are
* not performed (default)
* performed.
  
 
===Open participatory discussion===

An open participatory discussion is performed according to pragma-dialectical argumentation theory<ref>van Eemeren and Grootendorst, 2004</ref>.

A discussion is considered resolved when its outcome has been synthesised from all arguments in the discussion and the resolution has been incorporated into the relevant place on the main page. Note that a resolution does not mean that the discussants agree on the topic; it means that they agree on what the agreements and disagreements are, and this is reflected in the discussion as a lack of new contributions. The options are
* no discussion about the topic (default)
* the topic under discussion
* discussion about the topic resolved.
  
 
===Technical quality check===

A technical quality check is performed by the moderator of a page. The technical quality grade is her personal opinion. The options are
* not evaluated (default)
* intermediate
* up to the standard.
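
The option lists of the three controls above can be read as status fields with small sets of allowed values. A minimal sketch, purely for illustration and not part of the assessment method itself:

<pre>
# Illustration only: the status options of the three quality controls above.
from enum import Enum

class LiteratureReview(Enum):
    NOT_PERFORMED = "not performed"                    # default
    PERFORMED = "performed"

class Discussion(Enum):
    NO_DISCUSSION = "no discussion about the topic"    # default
    UNDER_DISCUSSION = "the topic under discussion"
    RESOLVED = "discussion about the topic resolved"

class TechnicalCheck(Enum):
    NOT_EVALUATED = "not evaluated"                    # default
    INTERMEDIATE = "intermediate"
    UP_TO_THE_STANDARD = "up to the standard"

# Defaults for a newly created attribute:
default_status = (LiteratureReview.NOT_PERFORMED,
                  Discussion.NO_DISCUSSION,
                  TechnicalCheck.NOT_EVALUATED)
print(default_status)
</pre>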
  
 
===Uncertainty analysis===

The result of a variable is a probability distribution based on all information in the definition, and in theory it reflects the best current understanding of the uncertainty about the variable. However, this is only true for variables that have gone through a process where a large group of scientific experts have contributed and shown their acceptance of the validity of the estimates.<ref>In a [http://en.opasnet.org/en-opwiki/index.php?title=Quality_assurance_and_quality_control&oldid=7251 previous version] we promoted the idea of explicating the absolute amount of data used for deriving a probability distribution, but we abandoned it.</ref> Thus, the probability distribution, as a subjective probability, is always subject to acceptance by the individuals that are considered experts and, ultimately, by oneself.
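
In practice such a result is often handled as a Monte Carlo sample drawn from the distribution. The snippet below is a generic illustration of that idea with an invented distribution and parameters; it is not the Opasnet implementation:

<pre>
# Illustration only: a variable's result handled as a Monte Carlo sample.
import random

random.seed(1)  # reproducible example

# Invented example: a lognormal guess for some uncertain quantity.
sample = sorted(random.lognormvariate(0.0, 0.5) for _ in range(10000))

mean = sum(sample) / len(sample)
lower, upper = sample[249], sample[9749]  # approximate 95 % interval
print(f"mean {mean:.2f}, 95 % interval [{lower:.2f}, {upper:.2f}]")
</pre>
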
===Development, evaluation, and application of environmental models===

There is an EPA report about this topic.<ref>The Council for Regulatory Environmental Modeling: '''Draft Guidance on the Development, Evaluation, and Application of Regulatory Environmental Models'''. Models Guidance Draft - November 2003. [http://www.epa.gov/crem/library/CREM%20Guidance%20Draft%2012_03.pdf]</ref>

"This Guidance recommends best practices to help determine when a model, despite its uncertainties, can be appropriately used to inform a decision. Specifically, it recommends that model developers and users: (a) subject their model to credible, objective peer review; (b) assess the quality of the data they use; (c) corroborate their model by evaluating the degree to which it corresponds to the system being modeled; and (d) perform sensitivity and uncertainty analyses. Sensitivity analysis evaluates the effect of changes in input values or assumptions on a model's results. Uncertainty analysis investigates the effects of lack of knowledge and other potential sources of error in the model (e.g., the “uncertainty” associated with model parameter values) and when conducted in combination with sensitivity analysis allows a model user to be more informed about the confidence that can be placed in model results. A model’s quality to support a decision becomes known when information is available to assess these factors."
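
Recommendation (d) of the quoted guidance can be illustrated with a toy model. The model, its inputs, and their distributions below are invented purely for illustration:

<pre>
# Illustration only: sensitivity and uncertainty analysis on an invented toy model.
import random

def dose(concentration, intake_rate):
    """Toy model: daily dose = concentration * intake rate (invented for illustration)."""
    return concentration * intake_rate

# Sensitivity analysis: vary one input at a time by +/-10 % around a base case.
base = {"concentration": 2.0, "intake_rate": 1.5}
for name in base:
    for factor in (0.9, 1.1):
        inputs = dict(base, **{name: base[name] * factor})
        print(f"{name} x{factor}: dose = {dose(**inputs):.2f}")

# Uncertainty analysis: propagate assumed input distributions through the model.
random.seed(1)
doses = sorted(dose(random.gauss(2.0, 0.3), random.uniform(1.0, 2.0))
               for _ in range(10000))
print(f"median {doses[len(doses) // 2]:.2f}, 95th percentile {doses[9500]:.2f}")
</pre>
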
===Maturity of a page===

Each page in [[Opasnet]] is at a different level of maturity. More mature pages are protected more strictly to prevent degradation of quality due to uninformed edits. The levels of maturity (or page phases) are classified into three groups (a sketch of the resulting edit permissions follows the lists):

'''Maturity not determined.'''
* This level applies to all new pages by default. A page stays on this level until there is a specific need to apply stricter protection (see below).
* If users want to edit pages, they must register and give their real name either in their user name or on their own user page.
* Edits must follow good editing practice. However, this is not enforced by technical means.
* Users may also use the discussion page if they do not want to edit the page content directly.
* Moderators will check new edits and remove material that is against [[:en:Wikipedia:Policies and guidelines|policies or guidelines]].
* For major edits, a discussion must be held on the respective talk page before the edits are made.

'''Intermediate maturity.'''
* This level applies to pages that are so a) large, b) complex, c) interconnected, or d) contradictory that it is (in the moderator's opinion) likely that the page will suffer damage from poor edits. The moderator should give a motivation for why she thinks that this particular page is at greater risk of damage than other pages.
* Edits by users other than the moderator of the page are technically prevented.
* Users contribute using the talk page. They place their arguments in the respective discussions, or they start new discussions.
* The moderator will update the main page based on resolved discussions. She can also make other changes to the main page, but nothing that goes against the resolutions.

'''Mature pages.''' (Not in use at the moment)
* The talk page is also protected from users. They contribute using a contribution tool with functionalities similar to a blog discussion forum. Users must refer to an existing argument or statement, or they can start a new discussion by making a new statement.
* The moderator will take all new substantive material from the contributions and place it in a relevant formal or informal discussion on the talk page. The resolutions of the discussions are inserted into the main page.
* All intermediate pages must be [[peer review]]ed before their status is upgraded to mature.{{reslink|Peer review level}}
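
The protection rules described above could be restated roughly as follows. This is only a sketch of the logic in the bullets, with a hypothetical user name; the actual protection is presumably implemented with the wiki's page protection features:

<pre>
# Illustration only: edit permissions implied by the maturity levels above.
def may_edit_main_page(maturity, user, moderator):
    """Rough restatement of the protection rules; not the real implementation."""
    if maturity == "not determined":
        return True                # any registered user may edit directly
    return user == moderator       # intermediate and mature pages: moderator only

def may_edit_talk_page(maturity, user, moderator):
    if maturity == "mature":       # level not in use at the moment
        return user == moderator   # others would use a separate contribution tool
    return True                    # talk page open on the other levels

print(may_edit_main_page("intermediate", user="Maija", moderator="Jouni"))  # False
</pre>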
  
 
== See also ==

* [http://www.wikisym.org/ws2008/index.php/Measuring_Author_Contributions_to_the_Wikipedia Measuring Author Contributions to the Wikipedia]
* [http://www.wikisym.org/ws2008/index.php/A_Method_for_Measuring_Co-authorship_Relationships_in_MediaWiki A Method for Measuring Co-authorship Relationships in MediaWiki]
* [[:Template:Quality assessment]]
* [[:Category:Pages of assessed quality]]
* [[:Heande:Quality assessment of a page]]

Some formal procedures for ensuring the quality of a scientific article are listed below and may be considered:
* CONSORT statement - all randomised controlled trials [http://www.consort-statement.org]
* QUOROM statement - all systematic reviews [http://www.consort-statement.org/evidence.html#quorom]
* EVEREST statement - all economic evaluations [http://bmj.com/advice/checklists.shtml#eco]
* STARD statement - all diagnostic research papers [http://www.consort-statement.org/stardstatement.htm]

==References==

<references/>
