
HED Schema V. 7.0.5 (HED 2G Deprecated)


HED version: 7.0.5

Changelog

  • 9/29/2019: v. 7.0.5: Added 'Visual/Rendering type/Screen/Head-mounted display'. Added 'extensionAllowed' to Sensory presentation. Added 'Tone' and 'Genre' under 'Sensory presentation/Auditory/Music'.
  • 8/18/2019: v. 7.0.4: Added 'extensionAllowed' to 'Custom' and removed 'extensionAllowed' from nodes under 'Attribute' and 'Item/Object'. Also replaced 'per-sec' with 'per-s' in units.
  • 7/8/2019: v. 7.0.3: Removed the extra 'Action/Turn' node.
  • 5/18/2019: v. 7.0.2: Added centisecond and millisecond as valid time units.
  • 3/7/2018: v. 7.0.1: Added Paradigm/Psychomotor vigilance task.
  • 2/15/2018: v. 7.0.0: Removed parentheses from some paradigm tags (used hyphens instead)
  • 1/11/2018: v. 6.0.5: Changed unit class for a number of values from 'velocity' to 'speed'.
  • 8/4/2017: v. 6.0.4: Added 'Machine failure detection task' to paradigms list.
  • 7/20/2017: v. 6.0.3: Removed 'Action observation paradigm' since there is already 'Action observation task'.
  • 7/17/2017: v. 6.0.2: Added 'Attribute/Object control/Accelerate' and 'Attribute/Object control/Decelerate'. Also added a description for extensionAllowed.
  • 7/14/2017: v. 6.0.1: Replaced 'Item/Natural scene/Arial' with 'Item/Natural scene/Aerial'.
  • 7/13/2017: v. 6.0.0: Replaced XXX in Event/Category/Experimental stimulus/Instruction/XXX with Event/Category/Experimental stimulus plus (Attribute/Instruction, Action/XXX). Moved a number of instructions under Action/. Also added 'Rest eyes open', 'Rest eyes closed' and 'Imagined emotion' paradigms. Removed Action/Imagine and instead one should use Attribute/Imagined.
  • 6/22/2017: v. 5.2.3: Added clarification for Attribute/Background (stimuli to be ignored)
  • 3/22/2017: v. 5.2.2: Fixed the missing [in]Attribute/Imagined.
  • 3/16/2017: v. 5.2.1: Added Action/Interact, Action/Interact/With human, Action/Take survey, Action/Eye blink/Left base, Left zero, Left half height, Right half height, Right zero, Right base.
  • 3/7/2017, v 5.1.1: Added Attribute/Blink/Duration and fixed some issues.
  • 2/16/2017, v 5.0.0: Changed Attribute/Presentation fraction to Attribute/Presentation/Fraction and added a few other attributes under Attribute/Presentation. Removed Participant/Role and added Attribute/Role.
  • 2/9/2017, v 4.0.5: added Paradigm/ID screening tag and moved language n-back under n-back paradigm. Also moved Flickering/rate to Attribute/Temporal rate.
  • 2/2/2017, v 4.0.4: moved Height and Width from Experiment/Context/Fixed screen/ to under Attribute/
  • 1/31/2017, v 4.0.3: added Attribute/Condition.
  • 1/5/2017, v. 4.0.2: fixed description in Sentence and Paragraph
  • 12/1/2016, v. 4.0.1: added Blink action and attributes.
  • 11/10/2016, v. 4.0.0: changed "Event/Sequence group id" to "Event/Group ID". Added "Paradigm/Instructed movement", "Attribute/Response start delay" and "Attribute/Response end delay".
  • 9/19/2016: Typo in Custom tag description fixed.
  • 9/8/2016: Added Attribute/Action judgment/Indeterminate.
  • 9/6/2016: Added Attribute/Imagined
  • 9/6/2016: Added back Experiment control/Activity/Participant action.
  • 9/1/2016: Added Eyes open/Keep, Eyes close/Keep. Added Action/Make fist and Action/Curl toes. MAJOR: moved Participant action/... from Event/Category/Experiment control/Activity to under Attribute/Activity judgment/. This is a backward incompatible change.
  • 8/4/2016: Added Attribute/Participant indication tag.
  • 7/21/2016: Added Attribute/Intended effect tag. This can be used e.g. to specify an image in an RSVP experiment where it was intended to be perceived as a target (but it may not have by the subject).
  • 6/22/2016: Added Attribute/Temporal uncertainty tag along with its child nodes. Also required a child for Attribute/Presentation fraction tag.
  • 6/16/2016: Added sleep stages.
  • 5/23/2016: Added more description to Attribute/Probability, allowing text values like 'low' and 'high' in addition to numerical values between 0 and 1.
  • 3/31/2016: Added constant and variable delays under Participant/Effect/Cognitive/Cue.
  • 3/10/2016:
    • Added Non-informative under Cognitive/Feedback
    • Added Cue under Cognitive
  • 2/25/2016:
    • Removed /Participant/Effect/Cognitive/Oddball/Three stimuli/Target
    • Moved Non-target from under Expected to the same level as Target, because in the Three stimuli oddball paradigm the Non-targets are oddballs and hence not expected.
    • Note: users are encouraged to add Expected and Oddball tags when tagging targets and non-targets.
    • Added Presentation fraction
Syntax

HED tags must be separated by commas. Use semicolons (;) instead of commas inside event descriptions, since extra commas would otherwise confuse HED parsers. From version 2.2 onward, HED adheres to http://semver.org/ versioning.
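
For illustration only (not taken from any particular dataset), a single event annotated with tags from this schema might be written as follows, with commas separating tags and a semicolon inside the description value:

Event/Category/Experimental stimulus, Event/Label/Target image, Event/Description/Red circle at screen center; press the button when it appears, Sensory presentation/Visual/Rendering type/Screen/2D, Attribute/Visual/Color/Red, Attribute/Location/Screen/Center, Attribute/Onset

The semicolon keeps the description from introducing an extra comma that a parser would otherwise read as a tag separator.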

!# start hed

Event

  • Category {required, requireChild, predicateType=passThrough, position=1} [This is meant to designate the reason this event was recorded]
    • Initial context [The purpose is to set the starting context for the experiment --- if there is no initial context event, this information would be stored as a dataset tag]
    • Participant response [The purpose of this event was to record the state or response of a participant. Note: participant actions may occur in other kinds of events, such as when the experimenter records a terrain change as an event and the participant happens to be walking. In this case the event category would be Environmental. In contrast, if the participant started walking in response to an instruction or to some other stimulus, the event would be recorded as Participant response]
    • Technical error [Experimenters forgot to turn on something or the cord snagged and something may be wrong with the data]
      • # {takesValue} [Description as string]
    • Participant failure [Situation in which participant acts outside of the constraints of the experiment -- such as driving outside the boundary of a simulation experiment or using equipment incorrectly]
      • # {takesValue} [Description as string]
    • Environmental [Change in experimental context such as walking on dirt versus sidewalk]
      • # {takesValue} [Description as a string]
    • Experimental stimulus
      • Instruction
    • Experimental procedure [For example doing a saliva swab on the person]
      • # {takesValue}[Description as string]
    • Incidental [Not a part of the task as perceived by/instructed to the participant --- for example an airplane flew by and made noise or a random person showed up on the street]
      • # {takesValue}[Description as string]
    • Miscellaneous [Events that only have informational value and cannot be put in other event categories]
      • # {takesValue}[Description as a string]
    • Experiment control [Information about states and events of the software program that controls the experiment]
      • Sequence{predicateType=propertyOf}
        • Permutation ID {requireChild}
          • # {takesValue} [Permutation number/code used for permuted experiment parts]
        • Experiment [Use Attribute/Onset and Attribute/Offset to indicate start and end of the experiment]
        • Block [Each block has the same general context and contains several trials -- use Attribute/Onset and Attribute/Offset to specify start and end]
          • # {takesValue} [Block number or identifier]
        • Trial [Use Attribute/Onset and Attribute/Offset to specify start and end]
          • # {takesValue} [Trial number or identifier]
        • Pause [Use Attribute/Onset and Attribute/Offset to specify start and end]
      • Task
        • # {takesValue} [Label here]
      • Activity [Experiment-specific actions such as moving a piece in a chess game]
        • Participant action
      • Synchronization [An event used for synchronizing data streams]
        • Display refresh
        • Trigger
        • Tag {predicateType=propertyOf}
          • # {takesValue} [Actual tag: string or integer]
      • Status
        • Waiting for input
        • Loading
        • Error
      • Setup
        • Parameters {predicateType=propertyOf}
          • # {takesValue} [Experiment parameters as a string. Do not use quotes.]
  • ID {predicateType=propertyOf} [A number or string label that uniquely identifies an event instance from all others in the recording (a UUID is strongly preferred).]
    • # {takesValue} [ID of the event]
  • Group ID {predicateType=propertyOf} [A number or string label that uniquely identifies a group of events associated with each other.]
    • # {takesValue} [ID of the group]
  • Duration {requireChild, predicateType=propertyOf} [An offset that is implicit after duration time passed from the onset]
    • # {takesValue, isNumeric, unitClass=time}
  • Description {requireChild, required, unique, predicateType=propertyOf, position=3}[Same as HED 1.0 description for human-readable text]
    • # {takesValue}
  • Label {requireChild, required, unique, predicateType=propertyOf, position=0} [A label for the event that is less than 20 characters. For example /Label/Accept button. Please note that the information under this tag is not primarily for use in analysis; it is provided for convenience in referring to events in the context of a single study. Please use the Custom tag to define custom event hierarchies. Please do not mention the words Onset or Offset in the label. These should only be placed in Attribute/Onset and Attribute/Offset. Software automatically generates a final label with (onset) or (offset) in parentheses added to the original label. This makes it easier to automatically find onsets and offsets for the same event.]
    • # {takesValue}
  • Long name {requireChild, unique, predicateType=propertyOf, position=2} [A long name for the event that could be over 100 characters and could contain characters like vertical bars as separators. Long names are used for cases when one wants to encode a lot of information in a single string such as Scenario | VehiclePassing | TravelLaneBlocked | Onset].
    • # {takesValue}
Item
  • ID {requireChild, predicateType=propertyOf} [Optional]
    • # {takesValue}
    • Local [For IDs with local scope --- that is IDs only defined in the scope of a single event. The local ID 5 in events 1 and 2 may refer to two different objects. The global IDs directly under ID/ tag refer to the same object through the whole experiment]
      • # {takesValue}
  • Group ID {requireChild, predicateType=propertyOf} [Optional]
    • # {takesValue}
  • Object {extensionAllowed} [Visually discernable objects. This item excludes sounds that are Items but not objects]
    • Vehicle
      • Bicycle
      • Car
      • Truck
      • Cart
      • Boat
      • Tractor
      • Train
      • Aircraft
        • Airplane
        • Helicopter
    • Person
      • Pedestrian
      • Cyclist
      • Mother-child
      • Experimenter
    • Animal
    • Plant
      • Flower
      • Tree
        • Branch
        • Root
    • Building
    • Food
      • Water
    • Clothing
      • Personal [clothing]
    • Road sign
    • Barrel
    • Cone
    • Speedometer
    • Construction zone
    • 3D shape
      • Sphere
      • Box
        • Cube
  • 2D shape [Geometric shapes]
    • Ellipse
      • Circle
    • Rectangle
      • Square
    • Star
    • Triangle
    • Gabor patch
    • Cross [By default a vertical-horizontal cross. For a rotated cross add Attribute/Object orientation/Rotated/ tag]
    • Single point
    • Clock face [Used to study things like hemispheric neglect. The tag is related to the clock-drawing-test]
      • # {takesValue, unitClass=time} [Hour:min]
  • Pattern
    • Checkerboard
    • Abstract
    • Fractal
    • LED
    • Dots
      • Random dot
    • Complex
  • Face
    • Whole face with hair
    • Whole face without hair
    • Cut-out
    • Parts only
      • Nose
      • Lips
      • Chin
      • Eyes
        • Left only
        • Right only
  • Symbolic [Something that has a meaning, linguistic or not, such as a stop sign.]
    • Braille character
    • Sign [Like the icon on a stop sign. This should not be confused with the actual object itself.]
      • Traffic
        • Speed limit
          • # {takesValue, isNumeric, unitClass=speed} [Always give units e.g. mph or kph]
    • Character
      • Digit
      • Pseudo-character [Alphabet-like but not really]
      • Letter [Valid letters and numbers such as A or 5]
        • # {takesValue}
    • Composite [Has multiple of the above on it]
  • Natural scene
    • Aerial
      • Satellite
  • Drawing [Cartoon or sketch]
    • Line drawing
  • Film clip
    • Commercial TV
    • Animation
  • IAPS [International Affective Picture System]
  • IADS [International Affective Digital Sounds]
  • SAM [The Self-Assessment Manikin]
Sensory presentation {extensionAllowed} [Object manifestation]
  • Auditory [Sound]
    • Nameable
    • Cash register
    • Ding [Often associated with positive valence]
    • Buzz [Often associated with negative valence]
    • Fire alarm
    • Click
      • ABR [Auditory Brainstem Response]
    • Tone
    • Siren
    • Music
      • Chord sequence
      • Vocal
      • Instrumental
      • Tone
      • Genre
    • Noise
      • White
      • Colored [Not white --- for example a 1/f spectrum]
    • Human voice
    • Animal voice {extensionAllowed}
      • Bird
      • Dog
      • Insect
      • Squirrel
    • Real world [For example people walking or machines operating]
      • Pedestrian
      • Footsteps
        • Walking
        • Running
      • Noisemaker
      • Construction noise
      • Machine
      • Vehicle
        • Horn
        • Aircraft
          • Airplane
          • Helicopter
        • Train
        • Cart
        • Car alarm
        • Car
        • Bicycle
    • Nonverbal vocal
      • Emotional
        • Crying
        • Sighing
      • Gulp
      • Gurgle
      • Sneeze
      • Cough
      • Yawn
    • Nonvocal [A car engine or gears grinding --- anything that is not made by a human or an animal]
      • Engine
  • Olfactory [Odor]
  • Taste
  • Tactile [Pressure]
  • Visual
    • Rendering type {requireChild, predicateType=passThrough}
      • Screen
        • Head-mounted display
        • View port [Two or more views on the same object --- for example one from the top and one from street view]
          • ID {predicateType=passThrough}
            • # {takesValue} [A descriptive label for the viewport]
        • 2D
        • 3D
        • Movie
          • Video-tape
          • Motion-capture [Stick figure of motion capture of someone else]
            • Point light
            • Stick figure
            • Outline
          • Flickering
          • Steady state
      • Real-world
      • LED [Stimulus is turning on/off one or a few LEDs]
Attribute {requireChild, extensionAllowed}
  • Onset [Default]
  • Offset
  • Imagined [This is used to identify that the (sub)event only happened in the participant's imagination, e.g. imagined movements in motor imagery paradigms.]
  • State ID {requireChild} [This is used to identify a group of events that change the state of a variable, where the onset of one implies the offset of any other and a change in the state]
    • # {takesValue} [ID which could be a number or any string]
  • Repetition {requireChild} [When the same type of event such as a fixation on the exact same object happens multiple times and it might be necessary to distinguish the first look vs. others]
    • # {takesValue, isNumeric} [Number starting from 1 where 1 indicates the first occurrence and 2 indicates the second occurrence]
  • Temporal rate {predicateType=propertyOf}
    • # {takesValue, isNumeric, unitClass=frequency} [In Hz]
  • Condition {requireChild} [Specifies the value of an independent variable (number of letters, N, in an N-back task) or function of independent variables that is varied or controlled for in the experiment. This attribute is often specified at the task level and can be associated with specification of an experimental stimulus or an experiment context (e.g. 1-back, 2-back conditions in an N-back task, or changing the target type from faces to houses in a Rapid Serial Visual Presentation, or RSVP, task.)]
    • # {takesValue} [the condition]
  • Action judgment [External judgment (assumed to be ground truth, e.g. from an experiment control software or an annotator) about participant actions such as answering a question, failing to answer in time, etc.]
    • Correct
    • Incorrect [Wrong choice but not time out]
    • Indeterminate [It cannot be determined that the action was correct or incorrect.]
    • Time out
      • Missed [Participant failed or could not have perceived the instruction due to their eyes being off-screen or a similar reason. Not easy to deduce this but it is possible]
    • Inappropriate [A choice that is not allowed such as moving a chess piece to a location it should not go based on game rules]
  • Response start delay [The time interval between this (stimulus) event and the start of the response event specified, usually by grouping with the event ID of the response start event.]
    • # {takesValue, unitClass=time}
  • Response end delay [The time interval between this (stimulus) event and the end of the response event specified, usually by grouping with the event ID of the response end event.]
    • # {takesValue, unitClass=time}
  • Social [Involving interactions among multiple agents such as humans or dogs or robots]
  • Peak [Peak velocity or acceleration or jerk]
  • Object side {requireChild} [Could be the left, right, or both sides of a person or a vehicle]
    • Reference object ID {requireChild,predicateType=propertyOf}[Place object ID after this]
      • # {takesValue}
    • Right
    • Left
    • Front
    • Back
    • Top
    • Bottom
    • Starboard
    • Port
    • Passenger side [Side of a car]
    • Driver side [Side of a car]
    • Bow [Front of a ship]
    • Stern [Back of the ship]
  • Direction {requireChild} [Coordinate system is inferred from Attribute/Location. To specify a vector combine subnodes with numbers --- for example Attribute/Direction/Top/10, Attribute/Direction/Left/5 to create a vector with coordinates 10 and 5]
    • Top [Combine Attribute/Direction/Top and Attribute/Direction/Left to mean the upper left]
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • Bottom
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • Left
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • Right
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • Angle [Clockwise angle in degrees from vertical]
      • # {takesValue, isNumeric, unitClass=angle} [Clockwise angle in degrees from vertical]
    • North
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • South
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • East
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • West
      • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
    • Forward [Like a car moving forward]
    • Backward [Like a car moving backward]
  • Location {requireChild} [Spot or center of an area. Use Area where you are referring to something with significant extent and emphasizing its boundaries, like a city]
    • # {takesValue} [location label]
    • Screen [Specify displacements from each subnode in pixels or degrees or meters. Specify units such as Attribute/Location/Screen/Top/12 px]
      • Center
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
      • Top [You can combine Attribute/Location/Top and Attribute/Location/Left to designate UpperLeft and so on]
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
      • Bottom
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
      • Left
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
      • Right
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength, unitClass=pixels}
      • Angle
        • # {takesValue, isNumeric, unitClass=angle} [Clockwise angle in degrees from vertical]
      • Center displacement
        • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength} [displacement from screen center, in any direction, in degrees, cm, or other lengths]
        • Horizontal
          • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength} [Displacement from screen center in any direction]
        • Vertical
          • # {takesValue, isNumeric, unitClass=angle, unitClass=physicalLength}[Displacement from screen center in any direction]
    • Lane [For example a car lane]
      • Rightmost
      • Leftmost
      • Right of expected
      • Left of expected
      • Cruising
      • Passing [The lane that cars use to take over other cars]
      • Oncoming
    • Real-world coordinates {requireChild}
      • Room
        • xyz {requireChild} [have a subnode, e.g. Attribute/Location/Real-world coordinates/Room/xyz/[10 50 30]]
          • # {takesValue}
    • Reference frame {requireChild}
      • Specified absolute reference
      • Relative to participant
        • Participant ID {requireChild}
          • # {takesValue}
        • Left
        • Front
        • Right
        • Back
        • Distance {requireChild}
          • # {takesValue, isNumeric, unitClass=physicalLength} [Distance is in meters by default]
          • Near
          • Moderate
          • Far
        • Azimuth {requireChild}
          • # {takesValue, isNumeric, unitClass=angle} [Clockwise with units preferably in degrees]
        • Elevation {requireChild}
          • # {takesValue, isNumeric, unitClass=angle} [Preferably in degrees]
  • Object orientation {requireChild}
    • Rotated
      • Degrees {requireChild}
        • #{takesValue, isNumeric, unitClass=angle}[Preferably in degrees]
  • Size {requireChild}
    • Length{requireChild}
      • # {takesValue, isNumeric, unitClass=physicalLength} [In meters or other units of length ]
    • Width
      • # {takesValue, isNumeric, unitClass=physicalLength} [in meters]
    • Height
      • # {takesValue, isNumeric, unitClass=physicalLength} [Default units are meters]
    • Area {requireChild}
      • # {takesValue, isNumeric, unitClass=area}
    • Volume {requireChild}
      • # {takesValue, isNumeric, unitClass=volume} [In cubic-meters or other units of volume]
    • Angle{requireChild}
      • # {takesValue, isNumeric, unitClass=angle} [In degrees or other units of angle]
  • Item count {requireChild} [Number of items for example when there are 3 cars and they are identified as a single item]
    • # {takesValue, isNumeric} [Numeric value of number of items #]
    • <=# {takesValue, isNumeric} [Number of items less than or equal to #]
    • >=# {takesValue, isNumeric} [Number of items more than or equal to #]
  • Auditory
    • Frequency {requireChild}
      • # {takesValue, isNumeric, unitClass=frequency} [In Hz]
    • Loudness {requireChild}
      • # {takesValue, isNumeric, unitClass=intensity} [in dB]
    • Ramp up [Increasing in amplitude]
    • Ramp down [Decreasing in amplitude]
  • Blink
    • Time shut {requireChild} [The amount of time the eyelid remains closed (typically measured as 90% of the blink amplitude), in seconds.]
      • # {takesValue, isNumeric, unitClass=time, default=s}
    • Duration {requireChild} [Duration of blink, usually the half-height blink duration in seconds taken either from base or zero of EEG signal. For eye-trackers, usually denotes interval when pupil covered by eyelid.]
      • # {takesValue, isNumeric, unitClass=time, default=s}
    • PAVR {requireChild} [Positive Amplitude-Velocity ratio, in centiseconds]
      • # {takesValue, isNumeric, unitClass=time, default=centiseconds}
    • NAVR {requireChild} [Negative Amplitude-Velocity ratio, in centiseconds]
      • # {takesValue, isNumeric, unitClass=time, default=centiseconds}
  • Visual
    • Bistable
    • Background
    • Foreground
    • Up-down separated [Stimuli presented both at the top and the bottom of the fovea]
      • # {takesValue, isNumeric, unitClass=angle} [Angle of separation in degrees by default]
    • Bilateral [For bilateral visual field stimulus presentations]
      • # {takesValue, isNumeric, unitClass=angle} [Angle of separation in degrees by default]
    • Motion
      • Down
        • # {takesValue, isNumeric, unitClass=speed} [e.g. 3 degrees-per-second]
      • Up
        • # {takesValue, isNumeric, unitClass=speed} [e.g. 3 degrees-per-second]
      • Horizontal
        • Right
          • # {takesValue, isNumeric, unitClass=speed} [e.g. 3 degrees-per-second]
        • Left
          • # {takesValue, isNumeric, unitClass=speed} [e.g. 3 degrees-per-second]
      • Oblique
        • Clock face
          • # {takesValue, unitClass=time} [For example 4:30]
    • Fixation point
    • Luminance {requireChild}
      • # {takesValue, isNumeric, unitClass=luminousIntensity}[In candelas by default]
    • Color {requireChild}
      • Dark
      • Light
      • Aqua [These are CSS 3 basic color names]
      • Black
      • Fuchsia
      • Gray
      • Lime
      • Maroon
      • Navy
      • Olive
      • Purple
      • Silver
      • Teal
      • White
      • Yellow
      • Red
        • # {takesValue, isNumeric} [R value of RGB between 0 and 1]
      • Blue
        • # {takesValue, isNumeric} [B value of RGB between 0 and 1]
      • Green
        • # {takesValue, isNumeric} [G value of RGB between 0 and 1]
      • Hue {requireChild}
        • # {takesValue, isNumeric} [H value of HSV between 0 and 1]
      • Saturation {requireChild}
        • # {takesValue, isNumeric} [S value of HSV between 0 and 1]
      • Value {requireChild}
        • # {takesValue, isNumeric} [V value of HSV between 0 and 1]
      • Achromatic [Indicates gray scale]
        • # {takesValue, isNumeric} [White intensity between 0 and 1]
  • Nonlinguistic [Something that conveys meaning without using words such as the iconic pictures of a man or a woman on the doors of restrooms. Another example is a deer crossing sign with just a picture of jumping deer.]
  • Semantic [Like in priming or in congruence]
  • Language
    • Unit {predicateType=passThrough}
      • Phoneme
      • Syllable
      • Word
        • Noun
          • Proper [A proper noun that refers to a unique entity such as London or Jupiter]
          • Common [A noun that refers to a class of entities such as cities, planets, or corporations --- for example a Dog or a Skyscraper]
        • Verb
        • Adjective
        • Pseudoword
        • # {takesValue} [Actual word]
      • Sentence
        • Full
        • Partial
        • # {takesValue} [Actual sentence]
      • Paragraph
        • # {takesValue} [Actual paragraph]
      • Story [Multiple paragraphs making a detailed account]
    • Family {predicateType=passThrough}
      • Asian
        • Chinese
        • Japanese
      • Latin
        • English
        • German
        • French
  • Induced [Such as inducing emotions or keeping someone awake or in a coma with an external intervention]
  • Emotional
    • Arousal [Only in the context of 2D emotion representation]
      • # {takesValue, isNumeric} [A value between -1 and 1]
    • Positive valence [Valence by itself can be the name of an emotion such as sadness so this tag distinguishes the type of emotion]
      • # {takesValue, isNumeric} [Ranges from 0 to 1]
    • Negative valence
      • # {takesValue, isNumeric}[Ranges from 0 to 1]
  • Priming
    • Motoric
    • Emotional
    • Perceptual
  • Subliminal
    • Unmasked
    • Masked
      • Forward
      • Backward
  • Supraliminal [By default this is assumed about each stimulus]
  • Liminal [At the 75%-25% perception threshold]
  • Probability [Use to specify the level of certainty about the occurrence of the event. Use either numerical values as the child node or 'low', 'high', etc.]
  • Temporal uncertainty {requireChild} [Use to specify the amount of uncertainty in the timing of the event. Please notice that this is different from the Attribute/Probability tag, which relates to the occurrence of the event and can be interpreted as the integral of probability density across a distribution whose shape (temporal extent) is specified by Attribute/Temporal uncertainty].
    • # {takesValue, isNumeric, unitClass=time} [implies that the temporal uncertainty is a uniform distribution with the amount of time provided, extending on both sides. E.g. "0.5 s" specifies a 1-second range centered at the event latency.]
    • Standard deviation {requireChild} [implies that the distribution of temporal uncertainty is Gaussian with the provided standard deviation (in seconds).]
      • # {takesValue, isNumeric, unitClass=time}
  • Presentation {requireChild} [Attributes associated with visual, auditory, tactile, etc. presentation of a stimulus]
    • Fraction {requireChild} [the fraction of presentations of an Oddball or Expected stimulus relative to the total number of same-class presentations, e.g. 10% of images in an RSVP task being targets]
      • # {takesValue, isNumeric}
    • Cued [what is presented is cued to the presentation of something else]
    • Background [presented in the background such as background music, background image, etc. The main factor here is that background presentations are to be ignored, e.g. ignore math question auditory stimuli.]
  • Intended effect [This tag is to be grouped with Participant/Effect/Cognitive to specify the intended cognitive effect (of the experimenter). This is to differentiate the resulting group from Participant/Effect, which specifies the actual effect on the participant. For example, in an RSVP experiment if an image is intended to be perceived as a target, the (Participant/Effect/Cognitive/Target, Attribute/Intended effect) group is added. If the image was perceived by the subject as a target (e.g. they pressed a button to indicate so), then the tag Participant/Effect/Cognitive/Target is also added: Participant/Effect/Cognitive/Target, (Participant/Effect/Cognitive/Target, Attribute/Intended effect). Otherwise the Participant/Effect/Cognitive/Target tag is not included outside of the group.]
  • Instruction [This tag is placed in events of type Event/Category/Experimental stimulus/Instruction, grouped with one or more Action/ tags, to replace the detailed specification XXX in Event/Category/Experimental stimulus/Instruction/XXX in previous versions. Usage example: Event/Category/Experimental stimulus/Instruction, (Action/Fixate, Attribute/Instruction)]
  • Participant indication [This tag is placed in events of type Event/Category/Participant response and grouped with Participant/Effect/Cognitive/.. tags to specify the type of cognitive effect the participant has experienced. For example, in an RSVP paradigm, the subject can indicate the detection of a target with a button press. The HED string associated with this button press must include (Attribute/Participant indication, Participant/Effect/Cognitive/Target,...)]
  • Path {requireChild}
    • Velocity [Use Attribute/Onset or Attribute/Offset to specify onset or offset]
      • # {takesValue, isNumeric, unitClass=speed} [Numeric value with default units of m-per-s]
    • Acceleration
      • # {takesValue, isNumeric, unitClass=acceleration} [Numeric value with default units of m-per-s2]
    • Jerk
      • # {takesValue, isNumeric, unitClass=jerk} [Numeric value with default units of m-per-s3]
    • Constrained [For example a path cannot cross some region]
  • File {requireChild} [File attributes]
    • Name
    • Size
      • # {takesValue, isNumeric, unitClass=memorySize} [Numeric value with default units of mb]
    • # {takesValue, isNumeric}[Number of files]
  • Object control {requireChild} [Specifies control such as for a vehicle]
    • Perturb
    • Collide
    • Near miss [Almost having an accident resulting in negative consequences]
    • Correct position [After a lane deviation or side of the walkway]
    • Halt [Time at which speed becomes exactly zero]
    • Brake
    • Shift lane
    • Cross [Crossing in front of another object such as a vehicle]
    • Pass by [Passing by another object or the participant]
    • Accelerate
    • Decelerate
  • Association {requireChild}
    • Another person [Item such as a cup belonging to another person]
    • Same person [Item such as a cup belonging to the participant]
  • Extraneous [Button presses that are not meaningful for example due to intrinsic mechanical causes after a meaningful press]
  • Role {requireChild} [The role of the agent (participant, character, AI..)]
    • Leader
    • Follower
    • # {takesValue}
Action [May or may not be associated with a prior stimulus and can be extended]
  • Involuntary [Like sneezing or tripping on something or hiccuping]
    • Hiccup
    • Cough
    • Sneeze
    • Stumble [Temporary and involuntary loss of balance]
    • Fall
    • Tether Jerk [When a tether attached to the subject is stuck/snagged and forces the participant to involuntarily accommodate/react to it]
    • Clear Throat
    • Yawn
    • Sniffle
    • Burp
    • Drop [For example something drops from subject’s hand]
  • Make fist
    • Open and close [Continue to open and close the fist, for example in motor imagery paradigms.]
  • Curl toes
    • Open and close [Continue to curl and uncurl toes, for example in motor imagery paradigms.]
  • Button press {extensionAllowed}
    • Touch screen
    • Keyboard
    • Mouse
    • Joystick
  • Button hold [Press a button and keep it pressed]
  • Button release
  • Cross boundary
    • Arrive
    • Depart
  • Speech
  • Hum
  • Eye saccade [Use Attribute/Peak for the middle of saccade and Attribute/Onset for the start of a saccade]
  • Eye fixation
  • Eye blink
    • Left base [The time of the first detectable eyelid movement on closing.]
    • Left zero [The last time at which the EEG/EOG signal crosses zero during eyelid closing.]
    • Left half height [The time at which the EEG/EOG signal reaches half maximum height during eyelid closing.]
    • Max [The time at which the eyelid is the most closed.]
    • Right half height [The time at which the EEG/EOG signal reaches half maximum height during eyelid opening.]
    • Right zero [The first time at which the EEG/EOG signal crosses zero during eyelid opening.]
    • Right base [The time of the last detectable eyelid movement on opening.]
  • Eye close [Close eyes and keep closed for more than approximately 0.1 s]
    • Keep [Keep the eye closed. If a value is provided it indicates the duration for this.]
      • # {takesValue, isNumeric, unitClass=time} [the duration (by default in seconds) that they keep their eye closed.]
  • Eye open [Open eyes and keep open for more than approximately 0.1 s]
    • Keep [Keep the eye open. If a value is provided it indicates the duration for this.]
      • # {takesValue, isNumeric, unitClass=time} [the duration (by default in seconds) that they keep their eye open.]
      • With blinking [Default. Allow blinking during the eye-open period.]
      • Without blinking [Without blinking during the eye-open period.]
  • Turn [Change in direction of movement or orientation. This includes both turn during movement on a path and also rotations such as head turns]
  • Point
  • Push
  • Grab
  • Tap [When there is nothing to be pressed for example like tapping a finger on a chair surface to follow a rhythm]
  • Lift
  • Reach [Requires a goal such as reaching to touch a button or to grab something. Stretching your body does not count as reach.]
    • To Grab
    • To Touch
  • Course correction [Change the direction of a reach in the middle to adjust for a moving target.]
  • Interact
    • With human
  • Take survey
  • Stretch [Stretch your body such as when you wake up]
  • Bend
  • Deep breath
  • Laugh
  • Sigh
  • Groan
  • Scratch
  • Switch attention
    • Intramodal [In the same modality but with a change in details, such as changing from paying attention to red dots to paying attention to blue dots instead]
      • Visual
      • Auditory
      • Tactile
      • Taste
      • Smell
    • Intermodal [Between modalities such as changing from audio to visual]
      • From modality
        • Visual
        • Auditory
        • Tactile
        • Taste
        • Smell
      • To modality
        • Visual
        • Auditory
        • Tactile
        • Taste
        • Smell
  • Walk
    • Stride [Use onset and offset attributes to indicate different walking stride stages]
    • Faster [increasing the speed of walking]
    • Slower [decreasing the speed of walking]
  • Control vehicle [Controlling an object that you are aboard]
    • Drive [Driving a vehicle such as a car]
      • Correct [Correct for a perturbation]
      • Near miss
      • Collide
    • Stop [Brake a car]
    • Pilot [Pilot a vehicle such as an airplane]
  • Teleoperate [Control an object that you are not aboard]
  • Allow [Allow access to something such as allowing a car to pass]
  • Deny [Deny access to something such as preventing someone from passing]
  • Step around
  • Step over
  • Step on
  • Swallow
  • Flex
  • Evade
  • Shrug
  • Dance
  • Open mouth
  • Whistle
  • Read
  • Attend
  • Recall
  • Generate
  • Repeat
  • Hold breath
  • Breathe
  • Rest
  • Count
  • Move
    • Upper torso
    • Lower torso
    • Whole body
  • Speak
  • Sing
  • Detect
  • Name
  • Smile
  • Discriminate
  • Track
  • Encode
  • Eye-blink inhibit
Participant {predicateType=passThrough}
  • ID {requireChild, predicateType=propertyOf} [If not given assume 1]
    • # {takesValue, isNumeric} [Numeric value of an ID]
  • Effect [How the stimulus affects the participant]
    • Cognitive
      • Meaningful
      • Not meaningful
      • Newly learned meaning
      • Reward
        • Low
        • Medium
        • High
        • # {takesValue, isNumeric, unitClass=currency} [Monetary values in some currency such as $10, or the ratio of the reward to the maximum possible (3 of max 10 becomes 0.3), or number of points]
      • Penalty
        • Low
        • Medium
        • High
        • # {takesValue, isNumeric, unitClass=currency} [Absolute monetary values in some currency, for example $1, or the ratio of the penalty to the maximum possible (3 of max 10 becomes 0.3), or number of Points]
      • Error
        • Self originated
        • Other originated
          • Human
          • Non-human
        • Expected
        • Unexpected
        • Planned [The error feedback was given regardless of the validity of subject response as in a yoked design]
      • Threat
        • To self
        • To others
          • Close
      • Warning [As in a warning message that you are getting too close to the shoulder in a driving task]
      • Oddball [Unexpected or infrequent]
        • One stimulus [Only oddballs are present but no frequent stimuli exist. See http://dx.doi.org/10.1016/0167-8760(96)00030-X]
        • Two stimuli [There are non-targets and targets. See http://dx.doi.org/10.1016/0167-8760(96)00030-X]
        • Three stimuli [There are regular non-targets and targets and infrequent non-targets, see http://dx.doi.org/10.1016/0167-8760(96)00030-X]
        • Silent counting
        • Button pressing for target
        • Button pressing for all
      • Target [Something the subject is looking for]
      • Non-target [Make sure to tag Expected if the Non-target is frequent]
      • Novel [Genuinely novel such as an event occurring once or so per experiment]
      • Expected [Of low information value, for example frequent Non-targets in an RSVP paradigm]
        • Standard
        • Distractor
      • Valid [Something that is understood to be valid such as an ID matches the person being displayed and it has all the correct information]
      • Invalid [Something that is understood to not be valid, such as an ID with an impossible date-of-birth, or a photo not matching the person presenting it]
      • Congruence
        • Congruent [Like in Stroop paradigm when blue colored text displays the word blue]
        • Incongruent [Like in the Stroop paradigm when blue colored text displays the word red]
        • Temporal synchrony
          • Synchronous [When a mouse click sound happens right after clicking it]
          • Asynchronous [When a mouse click sound happens with significant delay, which could give the person a strange feeling. Or when in a movie the sound of an explosion is heard before it appears visually.]
      • Feedback
        • Correct [Confirm something went well and last action was correct]
        • Incorrect [Confirm something went wrong and last action was incorrect]
        • Non-informative [Feedback that provides no information in regards to correct, incorrect, etc.]
        • Expected [Feedback was expected as in a positive feedback after a response that was expected to be correct.]
        • Unexpected [Feedback was unexpected as when positive feedback was received when response was expected to be incorrect.]
        • On accuracy [Feedback was provided by evaluating response accuracy]
        • On reaction time [Feedback was provided by evaluating subject reaction time]
        • To self [Default]
        • To other [Observed feedback to another person such as in a social paradigm]
        • Deterministic [Feedback has a fixed relationship to what happened before]
        • Stochastic [Feedback is non-deterministic and does not have fixed relationship with what has happened before in the experiment]
        • False feedback [Feedback that was not honest for example as in feedback of correct on an incorrect response or vice versa]
          • Negative [Negative feedback was provided when it was not deserved]
          • Positive [Positive feedback was provided when it was not deserved]
      • Cue [An indicator of a future event, e.g. a sound cue that in 2-5 seconds a perturbation in driving will occur. Use (... Participant/Effect/Cognitive/Cue ~ [hed tags for the event to follow, or main aspect of the event to follow]) syntax.]
        • Constant delay [The cue is for an event that will happen after a constant delay.]
          • # {takesValue, isNumeric, unitClass=time} [The delay, e.g. in seconds.]
        • Variable delay [The cue is for an event that will happen after a variable delay.]
          • # {takesValue} [The interval, e.g. between 2-5 seconds, of the variable delay.]
    • Visual
      • Foveal
      • Peripheral
      • Perturbation [Sudden movement or perturbation of the virtual environment in a car driving or other scenario]
    • Auditory
      • Stereo
      • Mono
        • Left
        • Right
    • TMS
      • With SPGS [SPGS stands for spatial position guiding system]
      • Without SPGS [SPGS stands for spatial position guiding system]
    • Tactile
      • Vibration
      • Acupuncture
      • Eye puff
      • Swab [Mouth swab]
    • Vestibular
      • Shaking [Being shaken]
    • Pain
      • Heat
      • Cold
      • Pressure
      • Electric shock
      • Laser-evoked
    • Taste
    • Smell
    • Body part
      • Whole Body
      • Eye
      • Arm
        • Hand
          • Finger
            • Index
            • Thumb
            • Ring
            • Middle
            • Small [Pinkie or little finger]
      • Leg
        • Feet
          • Toes
      • Head
        • Face
          • Eyebrow
          • Lip
          • Forehead
          • Mouth
          • Nose
          • Chin
          • Cheek
      • Torso
  • State {requireChild}
    • Level of consciousness {requireChild}
      • Awake
      • Drowsy
      • Sleep
        • Stage {requireChild}
          • # {takesValue} [a number between 1 and 4, or 'REM']
      • Drunk
      • Anesthesia
      • Locked-in
      • Coma
      • Vegetative
      • Brain-dead
    • Emotion {requireChild}
      • Awe
      • Frustration
      • Joy
      • Anger
      • Happiness
      • Sadness
      • Love
      • Fear
      • Compassion
      • Jealousy
      • Contentment
      • Grief
      • Relief
      • Excitement
      • Disgust
      • Neutral [None of the above]
    • Sense of community [Primed to have an emotion such as patriotism]
      • # {takesValue, isNumeric} [SCI stands for Sense of Community Index]
    • Sense of social justice
      • Distributive
      • Poverty
      • Inequality
      • Procedural
      • Interpersonal
      • Informational
    • Stress level {requireChild}
      • # {takesValue, isNumeric} [A number between 0 and 1]
    • Task load {requireChild}
      • # {takesValue, isNumeric} [A number between 0 and 1]
    • Under time pressure
      • Response window
        • # {takesValue, isNumeric, unitClass=time} [Default time is seconds]
      • Competitive [Subject is competing against an opponent as for example when the faster respondent wins]
    • Social interaction [Social]
      • Pseudo [Instructed as social interaction but actually not, as when the other person may not exist in the real world, for example a computer program agent]
    • Passive [There is a stimulus presentation but no behavioral measurements are collected from the subject. Subject is instructed not to make any behavioral outputs for example when told to carefully watch/listen/sense. The resting state is not considered passive.]
    • Resting [State when there is no stimulus presentation and no behavioral outputs]
    • Attention
      • Top-down [Instructed to pay attention to something explicitly]
      • Bottom-up [something captures your attention, like a big bang or your name]
        • Orienting [The lower state of the bottom-up or the pre-bottom up state]
      • Covert [Implicit]
      • Overt [Explicit]
      • Selective [If you have two circles but asked to pay attention to only one of them]
        • Divided [Attending to more than one object or location]
      • Focused [Paying a lot of attention]
      • Sustained [Paying attention for a continuous time]
      • Auditory
      • Visual
      • Tactile
      • Taste
      • Smell
      • To a location [Spatial -- use the location attribute to specify to where the attention is directed]
      • Arousal {requireChild}
      • Alerting [Keeping the arousal up in order to respond quickly]
      • Drowsy
      • Excited
      • Neutral
Experiment context {requireChild, extensionAllowed} [Describes the context of the whole experiment or large portions of it and also includes tags that are common across all events]
  • # {takesValue} [Add tags common across all stimuli and/or responses here. For example, if all experimental events share /State/Drowsy, you can place it here instead of tagging each event individually]
  • With chin rest
  • Sitting
  • Standing
  • Prone [As in on a bed]
  • Running
    • Treadmill
  • Walking
    • Treadmill
  • Indoors [Default]
    • Clinic [Recording in a clinical setting such as in a hospital or doctor’s office]
    • Dim Room
  • Outdoors
    • Terrain
      • Grass
      • Uneven
      • Boardwalk
      • Dirt
      • Leaves
      • Mud
      • Woodchip
      • Rocky
      • Gravel
      • Downhill
      • Uphill
  • Motion platform [Subject is on a motion platform such as one that produces simulated car movements]
  • Fixed screen
    • Distance [Assuming static subject]
      • # {takesValue, isNumeric, unitClass=physicalLength} [Distance from subject eyes to the presentation screen, for example 30 cm from subject eyes to the monitor]
    • Width resolution
      • # {takesValue, isNumeric, unitClass=pixels} [Default units are pixels]
    • Height resolution
      • # {takesValue, isNumeric, unitClass=pixels} [Default units are pixels ]
  • Real world
  • Virtual world
Custom {requireChild, extensionAllowed} [This node can be used to organize events in an alternative (parallel) hierarchy. You can define your custom tags and hierarchies without any restriction under this node. These tags will still be matched to each other; for example /Custom/Dance/Waltz is considered a subtype of /Custom/Dance.]

HED {requireChild} [Hierarchical Event Descriptor]

  • # {takesValue, isNumeric} [HED specification version number: normally there is no need to specify the version number in the HED string since it will be matched by default to the most recent compliant version, but this tag can be used to specify the exact HED version the HED string was based on.]
Paradigm {requireChild, extensionAllowed}[See Tasks in http://www.cognitiveatlas.org/tasks and CogPo definitions of paradigms]
  • Action imitation task
  • Action observation task
  • Acupuncture task
  • Adult attachment interview
  • Alternating runs paradigm
  • Animal naming task
  • Antisaccade-prosaccade task
  • Attention networks test
  • Attentional blink task
  • Audio-visual target-detection task
  • Autism diagnostic observation schedule
  • Ax-cpt task
  • Backward digit span task
  • Backward masking
  • Balloon analogue risk task - BART
  • Behavioral investment allocation strategy - BIAS
  • Behavioral rating inventory of executive function
  • Benton facial recognition test
  • Birmingham object recognition battery
  • Block design test
  • Block tapping test
  • Boston naming test
  • Braille reading task
  • Breath-holding
  • Breathhold paradigm
  • Brixton spatial anticipation test
  • California verbal learning test
  • California verbal learning test-ii
  • Cambridge face memory test
  • Cambridge gambling task
  • Cambridge neuropsychological test automated battery
  • Catbat task
  • Category fluency test
  • Cattell culture fair intelligence test
  • Chewing-swallowing
  • Chimeric animal stroop task
  • Choice reaction time task
  • Choice task between risky and non-risky options
  • Classical conditioning
  • Clinical evaluation of language fundamentals-3
  • Color trails test
  • Color-discrimination task
  • Color-word stroop task
  • Complex span test
  • Conditional stop signal task
  • Conditioning paradigm
    • Behavioral conditioning paradigm
    • Classical conditioning paradigm
  • Continuous performance task
  • Continuous recognition paradigm
  • Counting stroop task
  • Counting-calculation
  • Cued explicit recognition
  • Cups task
  • Deception task
  • Deductive reasoning paradigm
  • Deductive reasoning task
  • Delayed discounting task
  • Delayed match to sample task
  • Delayed nonmatch to sample task
  • Delayed recall test
  • Delayed response task
    • Delayed matching to sample paradigm
      • Sternberg paradigm
  • Devils task
  • Dichotic listening task
  • Digit cancellation task
  • Digit span task
  • Digit-symbol coding test
  • Directed forgetting task
  • Divided auditory attention
  • Divided auditory attention paradigm
  • Doors and people test
  • Dot pattern expectancy task
  • Drawing
  • Drawing paradigm
  • Dual-task paradigm
  • Early social communications scales
  • Eating paradigm
  • Eating-drinking
  • Embedded figures test
  • Emotional regulation task
  • Encoding paradigm
  • Encoding task
  • Episodic recall
  • Episodic recall paradigm
  • Eriksen flanker task
  • Extradimensional shift task
  • Eye Saccade paradigm
    • Anti saccade paradigm
    • Simple saccade paradigm
  • Face monitor-discrimination
  • Face n-back task
  • Fagerstrom test for nicotine dependence
  • Film viewing
  • Finger tapping task
  • Fixation task
  • Flashing checkerboard
  • Flexion-extension
  • Forward digit span task
  • Free word list recall
  • Glasgow coma scale
  • Go-no-go task
  • Grasping task
  • Gray oral reading test - 4
  • Haptic illusion task
  • Hayling sentence completion test
  • Heat sensitization-adaptation
  • Heat stimulation
  • Hooper visual organization test
  • ID screening [Visual examination of multiple fields of an ID or document to detect invalid or suspicious fields. For example at a security checkpoint.]
  • Imagined emotion
  • Imagined movement
  • Imagined objects-scenes
  • Instructed movement
  • Immediate recall test
  • Inductive reasoning aptitude
  • International affective picture system
  • Intradimensional shift task
  • Ishihara plates for color blindness
  • Isometric force
  • Item recognition paradigm
    • Serial item recognition paradigm
  • Item recognition task
  • Kanizsa figures
  • Keep-track task
  • Letter comparison
  • Letter fluency test
  • Letter naming task
  • Letter number sequencing
  • Lexical decision task
  • Listening span task
  • Macauthur communicative development inventory
  • Machine failure detection task
  • Matching familiar figures test
  • Matching pennies game
  • Maudsley obsessive compulsive inventory
  • Mechanical stimulation
  • Memory span test
  • Mental rotation task
  • Micturition task
  • Mini mental state examination
  • Mirror tracing test
  • Mismatch negativity paradigm
  • Mixed gambles task
  • Modified erikson scale of communication attitudes
  • Morris water maze
  • Motor sequencing task
  • Music comprehension-production
  • N-back task
    • Letter n-back task
  • Naming
    • Covert
    • Overt
  • Nine-hole peg test
  • Non-choice task to study expected value and uncertainty
  • Non-painful electrical stimulation
  • Non-painful thermal stimulation
  • Nonword repetition task
  • Object alternation task
  • Object-discrimination task
  • Oculomotor delayed response
  • Oddball discrimination paradigm
    • Auditory oddball paradigm
    • Visual oddball paradigm
      • Rapid serial visual presentation
  • Oddball task
  • Olfactory monitor-discrimination
  • Operation span task
  • Orthographic discrimination
  • Paced auditory serial addition test
  • Pain monitor-discrimination task
  • Paired associate learning
  • Paired associate recall
  • Pantomime task
  • Parrott scale
  • Passive listening
  • Passive viewing
  • Pattern comparison
  • Perturbed driving
  • Phonological discrimination
  • Picture naming task
  • Picture set test
  • Picture-word stroop task
  • Pitch monitor-discrimination
  • Pointing
  • Porteus maze test
  • Positive and negative affect scale
  • Posner cueing task
  • Probabilistic classification task
  • Probabilistic gambling task
  • Probabilistic reversal learning
  • Pseudoword naming task
  • Psychomotor vigilance task
  • Pursuit rotor task
  • Pyramids and palm trees task
  • Rapid automatized naming test
  • Rapid serial object transformation
  • Reading - Covert
  • Reading - Overt
  • Reading paradigm
    • Covert braille reading paradigm
    • Covert visual reading paradigm
  • Reading span task
  • Recitation-repetition - Covert
  • Recitation-repetition - Overt
  • Remember-know task
  • Response mapping task
  • Rest
    • Rest eyes open
    • Rest eyes closed
  • Retrieval-induced forgetting task
  • Reversal learning task
  • Reward task
  • Rey auditory verbal learning task
  • Rey-ostereith complex figure test
  • Reynell developmental language scales
  • Rhyme verification task
  • Risky gains task
  • Rivermead behavioural memory test
  • [extend here]
Unit classes
  • acceleration {default=cm-per-s2}
    • m-per-s2
    • cm-per-s2
  • currency {default=$}
    • dollars
    • $
    • points
    • fraction
  • angle {default=radians}
    • degrees
    • degree
    • radian
    • radians
  • frequency {default=Hz}
    • Hz
    • mHz
    • Hertz
    • kHz
  • intensity {default=dB}
    • dB
  • jerk {default=cm-per-s3}
    • m-per-s3
    • cm-per-s3
  • luminousIntensity {default=cd}
    • candela
    • cd
  • memorySize {default=mb}
    • mb
    • kb
    • gb
    • tb
  • physicalLength {default=cm}
    • m
    • cm
    • km
    • mm
    • feet
    • foot
    • meter
    • meters
    • mile
    • miles
  • pixels {default=px}
    • pixels
    • px
    • pixel
  • speed {default=cm-per-s}
    • m-per-s
    • mph
    • kph
    • cm-per-s
  • time {default=s}
    • s
    • second
    • seconds
    • centiseconds
    • centisecond
    • cs
    • hour:min
    • day
    • days
    • ms
    • milliseconds
    • millisecond
    • minute
    • minutes
    • hour
    • hours
  • area {default=cm2}
    • m2
    • cm2
    • km2
    • pixels2
    • px2
    • pixel2
    • mm2
  • volume {default=cm3}
    • m3
    • cm3
    • mm3
    • km3
!# end hed
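
When a # placeholder carries a unitClass, a bare numeric value is read in the default unit of that class, and an explicit unit overrides the default. Illustrative values only:

  • Event/Duration/3 is read as 3 s, since the time class defaults to s.
  • Attribute/Direction/Angle/90 degrees gives the unit explicitly; a bare 90 would default to radians.
  • Attribute/Location/Screen/Top/12 px specifies pixels explicitly (px is also the class default).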

Attribute Definitions:

  • extensionAllowed [users can add (unlimited levels of) child nodes under this tag.]
  • requireChild [One of its descendants must be chosen to tag an event.]
  • takesValue [The tag name is the "#" character, and this character will be replaced with the user's input. The input is a string.]
  • isNumeric [The tag name is the "#" character, and this character will be replaced with the user's input. The input is numerical.]
  • required [Shown at the top of the events view in the order given by the position attribute. Checked for at the end if the user chooses "done".]
  • recommended [Shown below the required tags in the events view in the order given by the position attribute.]
  • position [Used to specify the display order of the required and recommended tags, ordered separately from each other. It expects an integer; the order can start at 0 or 1, since only the relative order matters. Required or recommended tags without this attribute or with a negative position will be shown after the others.]
  • unique [Only one of this tag or its descendants can be used within a single tag group or event.]
  • predicateType [One of propertyOf, subclassOf, passThrough -- used to facilitate mapping to OWL or RDF.]
  • default [Default value -- mostly used for unitClasses.]
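
As a minimal sketch of how a tool might interpret these attributes when checking a single tag (this is not the official HED tools; the schema subset, tag strings, and function name below are hypothetical simplifications of the hierarchy listed on this page):

    # Minimal sketch, in Python, of interpreting takesValue/isNumeric/unitClass.
    UNIT_CLASS_DEFAULTS = {"time": "s", "frequency": "Hz"}

    # Hand-coded slice of the hierarchy: value-taking (#) nodes and their attributes.
    SCHEMA_SUBSET = {
        "Event/Duration/#": {"takesValue": True, "isNumeric": True, "unitClass": "time"},
        "Attribute/Temporal rate/#": {"takesValue": True, "isNumeric": True, "unitClass": "frequency"},
    }

    def check_tag(tag):
        """Match a tag such as 'Event/Duration/3' against a '#' placeholder node."""
        parent, _, value = tag.rpartition("/")
        node = SCHEMA_SUBSET.get(parent + "/#")
        if node is None:
            return tag + ": no value-taking node in this subset"
        if node.get("isNumeric"):
            number, _, unit = value.partition(" ")   # '0.5 s' -> '0.5' and 's'
            float(number)                            # isNumeric: must parse as a number
            unit = unit or UNIT_CLASS_DEFAULTS[node["unitClass"]]
            return tag + ": numeric value " + number + " in " + unit
        return tag + ": string value '" + value + "'"

    print(check_tag("Event/Duration/3"))            # unit defaults to 's'
    print(check_tag("Event/Duration/0.5 s"))        # explicit unit is kept
    print(check_tag("Attribute/Temporal rate/10"))  # unit defaults to 'Hz'

A real validator would of course traverse the full hierarchy and honor requireChild, required, unique, and tag groups; this only illustrates the meaning of the attributes defined above.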