The Effects of Repeated Testing, Simulated Malingering, and Traumatic Brain Injury on Visual Choice Reaction Time

David L Woods, John M Wyma, E W Yund, Timothy J Herron

Abstract

Choice reaction time (CRT), the time required to discriminate and respond appropriately to different stimuli, is a basic measure of attention and processing speed. Here, we describe the reliability and clinical sensitivity of a new CRT test that presents lateralized visual stimuli and adaptively adjusts stimulus onset asynchronies using a staircase procedure. Experiment 1 investigated the test-retest reliability in three test sessions performed at weekly intervals. Performance in the first test session was accurately predicted from age and computer-use regression functions obtained in a previously studied normative cohort. Central processing time (CentPT), the difference between the CRTs and simple reaction time latencies measured in a separate experiment, accounted for 55% of CRT latency and more than 85% of CRT latency variance. Performance improved significantly across the three test sessions. High intraclass correlation coefficients were seen for CRTs (0.90), CentPTs (0.87), and an omnibus performance measure (0.81) that combined CRT and minimal SOA z-scores. Experiment 2 investigated performance in the same participants when instructed to feign symptoms of traumatic brain injury (TBI): 87% produced abnormal omnibus z-scores. Simulated malingerers showed greater elevations in simple reaction times than CRTs, and hence reduced CentPTs. Latency-consistency z-scores, based on the difference between the CRTs obtained and those predicted based on CentPT latencies, discriminated malingering participants from controls with high sensitivity and specificity. Experiment 3 investigated CRT test performance in military veterans who had suffered combat-related TBI and symptoms of post-traumatic stress disorder, and revealed small but significant deficits in performance in the TBI population. 
The results indicate that the new CRT test shows high test-retest reliability, can assist in detecting participants performing with suboptimal effort, and is sensitive to the effects of TBI on the speed and accuracy of visual processing.

Keywords: aging; concussion; effort; feigning; head injury; reliability; response selection; timing precision.

Figures

FIGURE 1
The visual feature conjunction task. Subjects performed a visual feature conjunction task with colored letters (blue P, blue F, orange P, or orange F) subtending 0.5° of visual angle randomly presented to the left or right hemifield, 1.6° from the fixation cross. Stimulus durations were 200 ms. Right-handed subjects pressed the left mouse button for targets (blue P’s, probability 40%) and pressed the right mouse button for non-targets, i.e., letters which resembled the target in color, shape, or neither feature (probability 20% each). The response button could be spatially compatible (trials 1 and 2) or spatially incompatible (trial 3) with the stimulus visual field. Stimulus onset asynchronies (SOAs) began at 2500 ms and were either reduced by 3% following each pair of successive hits or increased by 3% following each miss. From Woods et al. (2015c).
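The staircase rule in the caption (SOAs start at 2500 ms, shrink by 3% after each pair of successive hits, grow by 3% after each miss) can be sketched as follows. This is an illustrative reconstruction; the function and variable names are not from the authors' implementation.

```python
def update_soa(soa_ms, outcomes):
    """Apply the 2-hits-down / 1-miss-up staircase rule to a trial history.

    soa_ms: starting stimulus onset asynchrony in milliseconds.
    outcomes: sequence of booleans (True = hit, False = miss).
    Returns the SOA after processing all trials.
    """
    consecutive_hits = 0
    for hit in outcomes:
        if hit:
            consecutive_hits += 1
            if consecutive_hits == 2:   # each pair of successive hits
                soa_ms *= 0.97          # ...reduces the SOA by 3%
                consecutive_hits = 0
        else:
            soa_ms *= 1.03              # each miss increases the SOA by 3%
            consecutive_hits = 0
    return soa_ms
```

For example, two consecutive hits take the initial 2500 ms SOA to 2425 ms, while a single miss raises it to 2575 ms; the SOA converges toward the level at which the subject sustains the target hit rate.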
FIGURE 2
Mean choice reaction times (CRTs) as a function of age. CRTs were averaged over stimulus types for subjects of different ages from normative data (Norm), Experiment 1A, Experiment 2 (malinger), and Experiment 3 (TBI). The age-regression slope for the normative data is shown. CRTs for patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes) are shown separately.
FIGURE 3
Minimal stimulus onset asynchronies (mSOA) for subjects as a function of age. Data are shown from the normative study (Norm), Experiment 1A, Experiment 2 (Malinger), and Experiment 3 (TBI). mSOAs for patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes) are shown separately.
FIGURE 4
Mean central processing times (CentPT) as a function of age. CentPTs were averaged over stimulus types from control subjects in the normative study (Norm) and Experiment 1A, simulated malingerers (Experiment 2), and TBI patients (Experiment 3). The age-regression slope for the normative data is shown. The results from patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes) are shown separately.
FIGURE 5
Choice reaction time and Omnibus z-scores. Data from control subjects in the normative data (Norm) and Experiment 1A, simulated malingerers (Experiment 2), and TBI patients (Experiment 3). Z-scores were calculated based on age- and computer-use regression slopes from the normative data. Patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes) are shown separately. The data from four simulated malingerers with z-scores outside the range of the figure are not shown.
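The z-scoring described in the captions (an observed latency compared against the value predicted from age and computer use in the normative cohort) amounts to dividing the regression residual by the normative residual standard deviation. A minimal sketch follows; the coefficient values and parameter names are hypothetical, chosen only to illustrate the calculation.

```python
def regression_z(observed_ms, age, computer_use, slopes, intercept, resid_sd):
    """z-score of an observed latency relative to a normative regression.

    observed_ms: the subject's latency (e.g., mean CRT) in milliseconds.
    slopes: dict of regression slopes for 'age' and 'computer_use'
            (hypothetical values, not the published coefficients).
    intercept, resid_sd: intercept and residual SD from the normative fit.
    """
    predicted = (intercept
                 + slopes["age"] * age
                 + slopes["computer_use"] * computer_use)
    return (observed_ms - predicted) / resid_sd
```

With illustrative coefficients (intercept 400 ms, +2 ms/year of age, −5 ms per computer-use unit, residual SD 50 ms), a 40-year-old with computer-use score 3 has a predicted latency of 465 ms, so an observed 515 ms yields z = 1.0. The omnibus measure in the paper combines such z-scores for CRT and minimal SOA.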
FIGURE 6
Test–retest reliability of CRTs. CRTs in Experiment 1A plotted against CRTs in Experiments 1B,C. Pearson correlations across repeated tests were 0.72 (Experiment 1A vs. Experiment 1B), 0.74 (Experiment 1A vs. Experiment 1C), and 0.77 (Experiment 1B vs. Experiment 1C).
FIGURE 7
Z-scores for CentPT (central processing time) and CRT. Data from controls in the normative study (Norm) and Experiment 1A, simulated malingerers, and patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes). Z-scores were calculated using the age- and computer-use regression slopes from the normative data. The solid vertical red lines show limits for abnormally (p < 0.05) short and long CentPTs.
FIGURE 8
Latency-consistency and Omnibus z-scores. Data are shown from control subjects in the normative population (Norm) and Experiment 1A, simulated malingerers, and patients with mild TBI (mTBI, filled red circles) and severe TBI (sTBI, circles with vertical red stripes). The red lines show abnormality cutoffs (p < 0.05) based on the normative population. The data from 13 malingering subjects with latency-consistency z-scores > 12.0 are not shown.

References

    1. Armistead-Jehle P. (2010). Symptom validity test performance in U.S. veterans referred for evaluation of mild TBI. Appl. Neuropsychol. 17 52–59. 10.1080/09084280903526182
    2. Barr W. B. (2003). Neuropsychological testing of high school athletes. Preliminary norms and test-retest indices. Arch. Clin. Neuropsychol. 18 91–101. 10.1093/arclin/18.1.91
    3. Bashore T. R., Ridderinkhof K. R. (2002). Older age, traumatic brain injury, and cognitive slowing: some convergent and divergent findings. Psychol. Bull. 128 151–198. 10.1037/0033-2909.128.1.151
    4. Bender S. D., Rogers R. (2004). Detection of neurocognitive feigning: development of a multi-strategy assessment. Arch. Clin. Neuropsychol. 19 49–60. 10.1016/S0887-6177(02)00165-8
    5. Bryan C., Hernandez A. M. (2012). Magnitudes of decline on Automated Neuropsychological Assessment Metrics subtest scores relative to predeployment baseline performance among service members evaluated for traumatic brain injury in Iraq. J. Head Trauma Rehabil. 27 45–54. 10.1097/HTR.0b013e318238f146
    6. Busse M., Whiteside D. (2012). Detecting suboptimal cognitive effort: classification accuracy of the Conner’s Continuous Performance Test-II, Brief Test Of Attention, and Trail Making Test. Clin. Neuropsychol. 26 675–687. 10.1080/13854046.2012.679623
    7. Carlozzi N. E., Tulsky D. S., Chiaravalloti N. D., Beaumont J. L., Weintraub S., Conway K., et al. (2014). NIH toolbox cognitive battery (NIHTB-CB): the NIHTB pattern comparison processing speed test. J. Int. Neuropsychol. Soc. 20 630–641. 10.1017/S1355617714000319
    8. Clark A. L., Amick M. M., Fortier C., Milberg W. P., McGlinchey R. E. (2014). Poor performance validity predicts clinical characteristics and cognitive test performance of OEF/OIF/OND Veterans in a research setting. Clin. Neuropsychol. 28 802–825. 10.1080/13854046.2014.904928
    9. Collie A., Maruff P., Darby D. G., McStephen M. (2003a). The effects of practice on the cognitive test performance of neurologically normal individuals assessed at brief test-retest intervals. J. Int. Neuropsychol. Soc. 9 419–428. 10.1017/S1355617703930074
    10. Collie A., Maruff P., Makdissi M., McCrory P., McStephen M., Darby D. (2003b). CogSport: reliability and correlation with conventional cognitive tests used in postconcussion medical evaluations. Clin. J. Sport Med. 13 28–32. 10.1097/00042752-200301000-00006
    11. Collins L. F., Long C. J. (1996). Visual reaction time and its relationship to neuropsychological test performance. Arch. Clin. Neuropsychol. 11 613–623. 10.1093/arclin/11.7.613
    12. Egeland J., Langfjaeran T. (2007). Differentiating malingering from genuine cognitive dysfunction using the Trail Making Test-ratio and Stroop Interference scores. Appl. Neuropsychol. 14 113–119. 10.1080/09084280701319953
    13. Falleti M. G., Maruff P., Collie A., Darby D. G. (2006). Practice effects associated with the repeated assessment of cognitive function using the CogState battery at 10-minute, one week and one month test-retest intervals. J. Clin. Exp. Neuropsychol. 28 1095–1112. 10.1080/13803390500205718
    14. Ferraro F. R. (1996). Cognitive slowing in closed-head injury. Brain Cogn. 32 429–440. 10.1006/brcg.1996.0075
    15. Fong K. N., Chan M. K., Ng P. P., Ng S. S. (2009). Measuring processing speed after traumatic brain injury in the outpatient clinic. NeuroRehabilitation 24 165–173. 10.3233/NRE-2009-0465
    16. Gualtieri C. T., Johnson L. G. (2006). Reliability and validity of a computerized neurocognitive test battery, CNS Vital Signs. Arch. Clin. Neuropsychol. 21 623–643. 10.1016/j.acn.2006.05.007
    17. Haines M. E., Norris M. P. (2001). Comparing student and patient simulated malingerers’ performance on standard neuropsychological measures to detect feigned cognitive deficits. Clin. Neuropsychol. 15 171–182. 10.1076/clin.15.2.171.1891
    18. Hetherington C. R., Stuss D. T., Finlayson M. A. (1996). Reaction time and variability 5 and 10 years after traumatic brain injury. Brain Inj. 10 473–486. 10.1080/026990596124197
    19. Hubel K. A., Yund E. W., Herron T. J., Woods D. L. (2013). Computerized measures of finger tapping: reliability, malingering and traumatic brain injury. J. Clin. Exp. Neuropsychol. 35 745–758. 10.1080/13803395.2013.824070
    20. Iverson G. L., Lovell M. R., Collins M. W. (2005). Validity of ImPACT for measuring processing speed following sports-related concussion. J. Clin. Exp. Neuropsychol. 27 683–689. 10.1081/13803390490918435
    21. Iverson G. L. (2001). Interpreting change on the WAIS-III/WMS-III in clinical samples. Arch. Clin. Neuropsychol. 16 183–191. 10.1016/S0887-6177(00)00060-3
    22. Karr J. E., Areshenkoff C. N., Duggan E. C., Garcia-Barrera M. A. (2014). Blast-related mild traumatic brain injury: a Bayesian random-effects meta-analysis on the cognitive outcomes of concussion among military personnel. Neuropsychol. Rev. 24 428–444. 10.1007/s11065-014-9271-8
    23. Kertzman S., Grinspan H., Birger M., Shliapnikov N., Alish Y., Ben Nahum Z., et al. (2006). Simple real-time computerized tasks for detection of malingering among murderers with schizophrenia. Isr. J. Psychiatry Relat. Sci. 43 112–118.
    24. Klein R. M., Ivanoff J. (2011). The components of visual attention and the ubiquitous Simon effect. Acta Psychol. (Amst.) 136 225–234. 10.1016/j.actpsy.2010.08.003
    25. Kontos A. P., Kotwal R. S., Elbin R. J., Lutz R. H., Forsten R. D., Benson P. J., et al. (2013). Residual effects of combat-related mild traumatic brain injury. J. Neurotrauma 30 680–686. 10.1089/neu.2012.2506
    26. Lapshin H., Lanctot K. L., O’Connor P., Feinstein A. (2013). Assessing the validity of a computer-generated cognitive screening instrument for patients with multiple sclerosis. Mult. Scler. 19 1905–1912. 10.1177/1352458513488841
    27. Lee K. S., Jeon M. J., Hwang T. Y., Kim C. Y., Sakong J. (2012). Evaluation of reliability of computerized neurobehavioral tests in Korean children. Neurotoxicology 33 1362–1367. 10.1016/j.neuro.2012.08.013
    28. Lemay S., Bedard M. A., Rouleau I., Tremblay P. L. (2004). Practice effect and test-retest reliability of attentional and executive tests in middle-aged to elderly subjects. Clin. Neuropsychol. 18 284–302. 10.1080/13854040490501718
    29. McNally R. J., Frueh B. C. (2012). Why we should worry about malingering in the VA system: comment on Jackson et al. (2011). J. Trauma. Stress 25 454–456; author reply 457–460. 10.1002/jts.21713
    30. Papapetropoulos S., Katzen H. L., Scanlon B. K., Guevara A., Singer C., Levin B. E. (2010). Objective quantification of neuromotor symptoms in Parkinson’s disease: implementation of a portable, computerized measurement tool. Parkinsons Dis. 2010:760196. 10.4061/2010/760196
    31. Pellizzer G., Stephane M. (2007). Response selection in schizophrenia. Exp. Brain Res. 180 705–714. 10.1007/s00221-007-0892-5
    32. Plant R. R., Quinlan P. T. (2013). Could millisecond timing errors in commonly used equipment be a cause of replication failure in some neuroscience studies? Cogn. Affect. Behav. Neurosci. 13 598–614. 10.3758/s13415-013-0166-6
    33. Ponsford J., Cameron P., Fitzgerald M., Grant M., Mikocka-Walus A., Schonberger M. (2012). Predictors of postconcussive symptoms 3 months after mild traumatic brain injury. Neuropsychology 26 304–313. 10.1037/a0027888
    34. Rassovsky Y., Satz P., Alfano M. S., Light R. K., Zaucha K., McArthur D. L., et al. (2006). Functional outcome in TBI II: verbal memory and information processing speed mediators. J. Clin. Exp. Neuropsychol. 28 581–591. 10.1080/13803390500434474
    35. Reicker L. I. (2008). The ability of reaction time tests to detect simulation: an investigation of contextual effects and criterion scores. Arch. Clin. Neuropsychol. 23 419–431. 10.1016/j.acn.2008.02.003
    36. Resch J., Driscoll A., McCaffrey N., Brown C., Ferrara M. S., Macciocchi S., et al. (2013). ImPact test-retest reliability: reliably unreliable? J. Athl. Train. 48 506–511. 10.4085/1062-6050-48.3.09
    37. Rogers P. J., Heatherley S. V., Mullings E. L., Smith J. E. (2013). Faster but not smarter: effects of caffeine and caffeine withdrawal on alertness and performance. Psychopharmacology (Berl.) 226 229–240. 10.1007/s00213-012-2889-4
    38. Russo A. C. (2012). Symptom validity test performance and consistency of self-reported memory functioning of Operation Enduring Freedom/Operation Iraqi Freedom veterans with positive Veteran Health Administration Comprehensive Traumatic Brain Injury evaluations. Arch. Clin. Neuropsychol. 27 840–848. 10.1093/arclin/acs090
    39. Segalowitz S. J., Mahaney P., Santesso D. L., Macgregor L., Dywan J., Willer B. (2007). Retest reliability in adolescents of a computerized neuropsychological battery used to assess recovery from concussion. NeuroRehabilitation 22 243–251.
    40. Snyder P. J., Cappelleri J. C., Archibald C. J., Fisk J. D. (2001). Improved detection of differential information-processing speed deficits between two disease-course types of multiple sclerosis. Neuropsychology 15 617–625. 10.1037/0894-4105.15.4.617
    41. Straume-Naesheim T. M., Andersen T. E., Bahr R. (2005). Reproducibility of computer based neuropsychological testing among Norwegian elite football players. Br. J. Sports Med. 39(Suppl. 1) i64–i69. 10.1136/bjsm.2005.019620
    42. Stuss D. T., Stethem L. L., Hugenholtz H., Picton T., Pivik J., Richard M. T. (1989a). Reaction time after head injury: fatigue, divided and focused attention, and consistency of performance. J. Neurol. Neurosurg. Psychiatry 52 742–748. 10.1136/jnnp.52.6.742
    43. Stuss D. T., Stethem L. L., Picton T. W., Leech E. E., Pelchat G. (1989b). Traumatic brain injury, aging and reaction time. Can. J. Neurol. Sci. 16 161–167.
    44. Tombaugh T. N., Rees L., Stormer P., Harrison A. G., Smith A. (2007). The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP). Arch. Clin. Neuropsychol. 22 25–36. 10.1016/j.acn.2006.06.013
    45. Van Zomeren A. H., Deelman B. G. (1976). Differential effects of simple and choice reaction after closed head injury. Clin. Neurol. Neurosurg. 79 81–90. 10.1016/0303-8467(76)90001-9
    46. Verfaellie M., Lafleche G., Spiro A., Bousquet K. (2014). Neuropsychological outcomes in OEF/OIF veterans with self-report of blast exposure: associations with mental health, but not MTBI. Neuropsychology 28 337–346. 10.1037/neu0000027
    47. Versavel M., Van Laack D., Evertz C., Unger S., Meier F., Kuhlmann J. (1997). Test-retest reliability and influence of practice effects on performance in a multi-user computerized psychometric test system for use in clinical pharmacological studies. Arzneimittelforschung 47 781–786.
    48. Warden D. L., Bleiberg J., Cameron K. L., Ecklund J., Walter J., Sparling M. B., et al. (2001). Persistent prolongation of simple reaction time in sports concussion. Neurology 57 524–526. 10.1212/WNL.57.3.524
    49. Weintraub S., Dikmen S. S., Heaton R. K., Tulsky D. S., Zelazo P. D., Bauer P. J., et al. (2013). Cognition assessment using the NIH Toolbox. Neurology 80 S54–S64. 10.1212/WNL.0b013e3182872ded
    50. Willison J., Tombaugh T. N. (2006). Detecting simulation of attention deficits using reaction time tests. Arch. Clin. Neuropsychol. 21 41–52. 10.1016/j.acn.2005.07.005
    51. Wogar M. A., Van Den Broek M. D., Bradshaw C. M., Szabadi E. (1998). A new performance-curve method for the detection of simulated cognitive impairment. Br. J. Clin. Psychol. 37(Pt 3) 327–339. 10.1111/j.2044-8260.1998.tb01389.x
    52. Woods D. L., Kishiyama M. M., Yund E. W., Herron T. J., Hink R. F., Reed B. (2011). Computerized analysis of error patterns in digit span recall. J. Clin. Exp. Neuropsychol. 33 721–734. 10.1080/13803395.2010.493149
    53. Woods D. L., Wyma J., Yund E. W., Herron T. J. (2015a). The effects of repeated testing, simulated malingering, and traumatic brain injury on simple visual reaction times. Front. Hum. Neurosci. 9:540. 10.3389/fnhum.2015.00540
    54. Woods D. L., Wyma J. W., Herron T. J., Yund E. W. (2015b). The effects of aging, malingering, and traumatic brain injury on computerized trail-making test performance. PLoS ONE 10:e0124345. 10.1371/journal.pone.0124345
    55. Woods D. L., Wyma J. W., Herron T. J., Yund E. W., Reed B. (2015c). Age-related slowing of response selection and production in a visual choice reaction time task. Front. Hum. Neurosci. 9:193. 10.3389/fnhum.2015.00193
    56. Woods D. L., Yund E. W., Wyma J. M., Ruff R., Herron T. J. (2015d). Measuring executive function in control subjects and TBI patients with question completion time (QCT). Front. Hum. Neurosci. 9:288. 10.3389/fnhum.2015.00288

Source: PubMed
