
<section begin=glossary />

Performance is the measure of how well an object fulfills its purpose. In this context, the objects are those used in assessments, either for describing reality or as methods for producing such descriptions.

<section end=glossary />

Research question about performance structure
What is a structure for performance such that it
  • covers all aspects of the performance of an assessment,
  • is a reflection of the purposes of an assessment,
  • can be applied to all attributes of all objects,
  • defines an external standard against which the performance can be evaluated,
  • defines by whom the performance should be evaluated,
  • complies with the PSSP ontology?

Attribute | Property evaluated | Reference | Evaluator | Comments
----------|--------------------|-----------|-----------|---------
Name | (not evaluated) | | | Not interesting to be evaluated
Scope | Relevance | Object's use purpose in its context | User (?) |
Structure | Model uncertainty | State-of-the-art | Peer-reviewer |
State | Work done | Work load needed | Participants |
Result | Truthlikeness | A golden standard | Expert | Can be divided into calibration and informativeness.

Note that the result of evaluation may change if a different reference is chosen.
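The evaluation scheme in the table above can be expressed as a small data structure. This is a hypothetical sketch, not an actual Opasnet API: the dictionary keys and field names (`property`, `reference`, `evaluator`, `comment`) are illustrative assumptions.

```python
# Hypothetical sketch of the attribute/evaluation table as Python data.
# Field names are illustrative assumptions, not an Opasnet schema.
EVALUATION_SCHEME = {
    "Name":      {"property": None, "reference": None, "evaluator": None,
                  "comment": "Not interesting to be evaluated"},
    "Scope":     {"property": "Relevance",
                  "reference": "Object's use purpose in its context",
                  "evaluator": "User"},
    "Structure": {"property": "Model uncertainty",
                  "reference": "State-of-the-art",
                  "evaluator": "Peer-reviewer"},
    "State":     {"property": "Work done",
                  "reference": "Work load needed",
                  "evaluator": "Participants"},
    "Result":    {"property": "Truthlikeness",
                  "reference": "A golden standard",
                  "evaluator": "Expert",
                  "comment": "Can be divided into calibration and informativeness"},
}

def evaluator_for(attribute):
    """Return who should evaluate the given attribute, or None if it is not evaluated."""
    return EVALUATION_SCHEME[attribute]["evaluator"]
```

Keeping the reference explicit in each row makes the note above concrete: swapping, say, the golden standard used for Result changes the outcome of the evaluation even though the object itself is unchanged.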

Three dimensions: towards the use purpose, the making, and the truth

  • Relevance is evaluated in the narrative description of the upper-level object. E.g. the scope of a variable is evaluated in the Definition/narrative description of an assessment.
  • State of the description: a participant evaluates the state against the resources available or needed. This evaluation goes into the narrative description of the Definition. The state can also be evaluated by an external reviewer based on the methodology used; that evaluation likewise goes into the Definition/narrative description.
  • The result is evaluated against the truth. The evaluation goes into the narrative description and is performed by an outside expert against some external golden standard.

Performance evaluation by users

Users are the ultimate evaluators of an object. To facilitate this, it is useful to have a general set of questions that apply to any information object in Opasnet. It will be placed at the end of each wiki page:

Please evaluate this page from your own point of view. Your evaluation will be used as an important piece of information for giving merit, and also financial compensation, to the people who have made this page possible. It only takes a few minutes of your time, but it is very important for us. We are happy even if you answer only some of the questions, but please answer every question on which you have an opinion.

  1. Overall rating of the page and its information.
    • I like it.
    • I don't like it because _________________.
    • I don't know/I don't want to tell.
  2. Relevance: Was this page about the topic you were looking for?
    • Yes.
    • No, but it was interesting anyway.
    • No. I was looking for __________________.
    • I don't know/I don't want to tell.
  3. Informativeness: Was the page informative?
    • Yes.
    • No, it was vague.
    • I don't know/I don't want to tell.
  4. Do you think that the information reflects the truth?
    • Yes.
    • No. I believe that _____________.
    • I don't know/I don't want to tell.
  5. Acceptability: Was the information convincing to you?
    • Yes.
    • No, because ____________________.
    • I don't know/I don't want to tell.
  6. Applicability: Can you apply the information in your problem?
    • Yes.
    • No, because ____________________.
    • I don't know/I don't want to tell.
  7. Usability: Was the information easy to find and understand?
    • Yes.
    • No, because ____________________.
    • I don't know/I don't want to tell.
  8. Tell us about yourself.
    • I am an expert of this topic.
    • I am well informed about this topic.
    • I have general information about this topic.
    • I don't know much about the topic.
    • I don't know/I don't want to tell.

See also