The Effects of Repeated Testing, Simulated Malingering, and Traumatic Brain Injury on High-Precision Measures of Simple Visual Reaction Time

David L Woods, John M Wyma, E William Yund, Timothy J Herron

Abstract

Simple reaction time (SRT), the latency to respond to a stimulus, has been widely used as a basic measure of processing speed. In the current experiments, we examined clinically relevant properties of a new SRT test that presents visual stimuli to the left or right hemifield at varying stimulus onset asynchronies (SOAs). Experiment 1 examined test-retest reliability in 48 participants who underwent three test sessions at weekly intervals. In the first test, log-transformed SRT (log-SRT) z-scores, corrected for the influences of age and computer use, were well predicted by regression functions derived from a normative population of 189 control participants. Test-retest reliability of log-SRT z-scores was high (intraclass correlation coefficient, ICC = 0.83) and equaled or exceeded that of other SRT tests and of other widely used, manually administered tests of processing speed. No significant learning effects were observed across test sessions. Experiment 2 investigated the same participants when they were instructed to malinger during a fourth testing session: 94% showed abnormal log-SRT z-scores, and 83% produced log-SRT z-scores exceeding a cutoff of 3.0, a degree of abnormality never seen under full-effort conditions. Thus, a log-SRT z-score cutoff of 3.0 had a sensitivity (83%) and specificity (100%) that equaled or exceeded those of existing symptom validity tests. We argue that even expert malingerers, fully informed of the malingering-detection metric, would be unable to feign impairment successfully on the SRT test because of the precise control of SRT latencies that this would require. Experiment 3 investigated 26 patients with traumatic brain injury (TBI) tested more than 1 year post-injury. The 22 patients with mild TBI showed SRTs that were slightly, but not significantly, faster than those of controls, whereas the four patients with severe TBI showed slowed SRTs. Simple visual reaction time is thus a reliable measure of processing speed that is sensitive to the effects of malingering and TBI.
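The abstract describes two computations that are easy to misread: the age- and computer-use-corrected log-SRT z-score, and the z-score cutoff of 3.0 used to flag suspect effort. The sketch below illustrates the general logic in Python; the regression coefficients, residual standard deviation, and example values are hypothetical placeholders, not the published normative parameters.

```python
import numpy as np

# Hypothetical normative regression predicting log-SRT from age (years) and
# weekly computer use (hours). All constants are illustrative placeholders,
# NOT the published normative values.
INTERCEPT = 5.55     # predicted log(ms) at age 0 with zero computer use (placeholder)
AGE_SLOPE = 0.0020   # increase in log-SRT per year of age (placeholder)
USE_SLOPE = -0.0010  # decrease in log-SRT per weekly hour of computer use (placeholder)
RESID_SD = 0.10      # residual SD of log-SRT in the normative sample (placeholder)

def log_srt_zscore(mean_srt_ms, age_years, computer_use_hours):
    """Age- and computer-use-corrected z-score for a participant's mean SRT."""
    observed = np.log(mean_srt_ms)
    predicted = INTERCEPT + AGE_SLOPE * age_years + USE_SLOPE * computer_use_hours
    return (observed - predicted) / RESID_SD

def suspect_effort(z, cutoff=3.0):
    """Apply the abstract's z-score cutoff of 3.0 for abnormally slow SRTs."""
    return z > cutoff

# Example: a 45-year-old with 10 h/week of computer use and a 330 ms mean SRT.
z = log_srt_zscore(330.0, age_years=45, computer_use_hours=10)
print(f"log-SRT z = {z:.2f}; suspect effort: {suspect_effort(z)}")
```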

Keywords: aging; computer; effort; feigning; head injury; motor; reliability; timing errors.

Figures

Figure 1
The SRT paradigm. Stimuli were high-contrast bull's-eyes presented to the left or right hemifield for a duration of 200 ms at randomized stimulus onset asynchronies (SOAs) ranging from 1000 to 2000 ms in five 250 ms steps. Stimuli could occur in the visual hemifield ipsilateral (shown) or contralateral to the responding hand.
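As a rough illustration of the trial structure described in this caption, the sketch below builds a randomized sequence of trials with SOAs drawn from the five 250 ms steps between 1000 and 2000 ms and stimuli assigned to the left or right hemifield. The trial count and the uniform random assignment are assumptions for illustration, not details taken from the published method.

```python
import random

# Parameters taken from the Figure 1 caption.
SOAS_MS = [1000, 1250, 1500, 1750, 2000]  # stimulus onset asynchronies
HEMIFIELDS = ["left", "right"]
STIMULUS_DURATION_MS = 200

def build_trial_sequence(n_trials=100, seed=None):
    """Return a randomized list of (soa_ms, hemifield) trials.

    The number of trials and the independent random draws are illustrative
    assumptions; the published paradigm may balance conditions differently.
    """
    rng = random.Random(seed)
    return [(rng.choice(SOAS_MS), rng.choice(HEMIFIELDS)) for _ in range(n_trials)]

if __name__ == "__main__":
    for soa, side in build_trial_sequence(n_trials=5, seed=1):
        print(f"wait {soa} ms, then show the bull's-eye for "
              f"{STIMULUS_DURATION_MS} ms in the {side} hemifield")
```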
Figure 2
Mean SRT latencies as a function of age. SRT latencies are shown for individual participants in the normative data (norm, blue diamonds), Experiment 1a (open red squares), Experiment 2 (simulated malingering, green triangles), and Experiment 3 (patients with mTBI, red circles; patients with sTBI, striped red circles). The normative age-regression slope is shown. Simulated malingerers with SRT latencies >600 ms are not included.
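The normative age-regression slope referenced in the caption can be estimated with an ordinary least-squares fit of mean SRT on age. The sketch below does this with fabricated data points, so the resulting slope and intercept are illustrative only and do not reproduce the published norms.

```python
import numpy as np

# Fabricated (age, mean SRT) pairs standing in for the normative sample.
ages_years = np.array([22, 28, 35, 41, 50, 58, 63, 70], dtype=float)
mean_srt_ms = np.array([284, 290, 297, 305, 318, 326, 333, 345], dtype=float)

# Ordinary least-squares fit: mean SRT = slope * age + intercept.
slope, intercept = np.polyfit(ages_years, mean_srt_ms, deg=1)
print(f"age-regression slope ~ {slope:.2f} ms/year, intercept ~ {intercept:.1f} ms")
```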
Figure 3
Mean stimulus detection times (SDTs) as a function of age. SDTs were derived by subtracting movement initiation time (measured in a finger-tapping experiment performed in the same test session) from SRTs. SDTs are shown for the normative data (norm, blue diamonds), Experiment 1a (open red squares), Experiment 2 (simulated malingering, green triangles), and Experiment 3 (patients with mTBI, red circles; patients with sTBI, striped red circles). The normative age-regression slope is shown.
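In equation form, the derivation described in this caption is a simple subtraction; treating it as applying to per-participant mean latencies is an assumption here:

\[ \mathrm{SDT} = \mathrm{SRT} - \mathrm{MIT} \]

where MIT is the movement initiation time measured in the companion finger-tapping experiment.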
Figure 4
Log-SRT z-scores and SDT z-scores for the normative group and the three experiments. Data from two simulated malingerers with SDT z-scores greater than 12.0 and two simulated malingerers with SDT z-scores less than −4.0 are not shown. The red lines show p < 0.05 thresholds for normative log-SRT and SDT z-scores.
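The p < 0.05 thresholds marked in Figure 4 correspond to a fixed z-score cutoff under a normal model. The snippet below computes the one-tailed value; the choice of a one-tailed criterion here is an assumption, as the caption does not state the convention.

```python
from scipy.stats import norm

# z-score corresponding to a one-tailed p < 0.05 criterion (assumed convention).
z_cutoff = norm.ppf(1 - 0.05)
print(f"one-tailed p < 0.05 corresponds to z > {z_cutoff:.3f}")  # ~1.645
```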
Figure 5
SRT latencies of individual participants in the three replications of Experiment 1. The ordinate shows the SRT latencies from the earlier session and the abscissa shows the SRT latencies from the later session. Pearson correlations were r = 0.59 (Session 1 vs. Session 2), r = 0.80 (Session 2 vs. Session 3), and r = 0.53 (Session 1 vs. Session 3).
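For readers who want to reproduce the session-to-session comparisons in Figure 5 on their own data, the sketch below computes the pairwise Pearson correlations across three test sessions. The latency values are fabricated solely to make the snippet runnable and do not correspond to the study data.

```python
import numpy as np

# Rows = participants; columns = mean SRT latencies (ms) in Sessions 1-3.
# Fabricated example values.
srt_ms = np.array([
    [295.0, 301.0, 298.0],
    [330.0, 322.0, 327.0],
    [288.0, 290.0, 285.0],
    [352.0, 345.0, 349.0],
    [310.0, 315.0, 312.0],
])

for a, b in [(0, 1), (1, 2), (0, 2)]:
    r = np.corrcoef(srt_ms[:, a], srt_ms[:, b])[0, 1]
    print(f"Session {a + 1} vs. Session {b + 1}: r = {r:.2f}")
```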


Source: PubMed
