Intelligent ICU for Autonomous Patient Monitoring Using Pervasive Sensing and Deep Learning

Anis Davoudi, Kumar Rohit Malhotra, Benjamin Shickel, Scott Siegel, Seth Williams, Matthew Ruppert, Emel Bihorac, Tezcan Ozrazgat-Baslanti, Patrick J Tighe, Azra Bihorac, Parisa Rashidi

Abstract

Currently, many critical care indices are not captured automatically at a granular level; rather, they are assessed repetitively by overburdened nurses. In this pilot study, we examined the feasibility of using pervasive sensing technology and artificial intelligence for autonomous and granular monitoring in the Intensive Care Unit (ICU). As an exemplary prevalent condition, we characterized delirious patients and their environment. We used wearable sensors, light and sound sensors, and a camera to collect data on patients and their environment. We analyzed the collected data to detect and recognize patients' faces, postures, facial action units and expressions, head pose variation, extremity movements, sound pressure levels, light intensity levels, and visitation frequency. We found that facial expressions, functional status entailing extremity movement and postures, and environmental factors, including visitation frequency and light and sound pressure levels at night, differed significantly between delirious and non-delirious patients. Our results showed that granular and autonomous monitoring of critically ill patients and their environment is feasible with a noninvasive system, and we demonstrated its potential for characterizing critical care patients and environmental factors.

Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(a) The Intelligent ICU uses pervasive sensing to collect data on patients and their environment. The system includes wearable accelerometer sensors, a video monitoring system, a light sensor, and a sound sensor. (b) Intelligent ICU information complements conventional ICU information. Pervasive information is provided by performing face detection, face recognition, facial action unit detection, head pose detection, facial expression recognition, posture recognition, extremity movement analysis, sound pressure level detection, light level detection, and visitation frequency detection. Face Detection Icon, Face Recognition Icon, Facial AU Detection Icon, Facial Expression Recognition Icon, and Head Pose Detection Icon: © iStock.com/bitontawan.
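As an illustration of the system composition described in (a) and (b), the sketch below pairs each sensing channel with the analyses run on its data stream. This is a hypothetical registry for orientation only; the channel and module names are ours, not the authors'.

```python
from dataclasses import dataclass

# Hypothetical registry of the Intelligent ICU sensing channels (Figure 1a)
# and the analyses performed on each (Figure 1b). Names are illustrative.
@dataclass
class Channel:
    source: str
    analyses: list[str]

PIPELINE = [
    Channel("camera", [
        "face_detection", "face_recognition", "facial_AU_detection",
        "head_pose_detection", "facial_expression_recognition",
        "posture_recognition", "visitation_frequency_detection",
    ]),
    Channel("wearable_accelerometers", ["extremity_movement_analysis"]),
    Channel("light_sensor", ["light_level_detection"]),
    Channel("sound_sensor", ["sound_pressure_level_detection"]),
]

for channel in PIPELINE:
    print(f"{channel.source}: {', '.join(channel.analyses)}")
```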
Figure 2
(a) Distribution of intensity-coded facial Action Units (AUs) among delirious and non-delirious patients, shown as boxplots where the middle line represents the median and the lower and upper end lines represent the 25th and 75th percentiles, respectively. These facial AUs are coded from 0 (absence of the facial AU) to 5 (maximum intensity of the facial AU); (b) Percentage of frames with each binary-coded facial AU present among delirious and non-delirious patients during their enrollment period. Binary-coded facial AUs are coded either 0 (absent) or 1 (present). This bar plot shows how often a certain action unit is observed across all recorded video frames, as percentages with standard error bars. In (a,b), * shows a statistically significant difference between the delirious and non-delirious groups (p-value < 0.001).
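As a minimal sketch of how per-frame AU outputs of the kind behind panel (b) can be summarized, the snippet below computes, for each patient, the percentage of frames in which one binary AU is present and compares the two groups. The file name, column names, and the choice of a Mann-Whitney U test are our assumptions, not the authors' documented procedure.

```python
import pandas as pd
from scipy.stats import mannwhitneyu

# Hypothetical per-frame table: one row per video frame, with a binary
# presence column per action unit (e.g. "AU04_c"), a patient ID, and the
# patient's group label.
frames = pd.read_csv("au_frames.csv")  # columns: patient_id, group, AU04_c, ...
labels = frames.groupby("patient_id")["group"].first()

# Percentage of frames in which AU04 is present, per patient.
pct_present = frames.groupby("patient_id")["AU04_c"].mean() * 100

delirious = pct_present[labels == "delirious"]
control = pct_present[labels == "non-delirious"]

# Nonparametric two-sample comparison of the per-patient percentages.
stat, p = mannwhitneyu(delirious, control, alternative="two-sided")
print(f"AU04 present: {delirious.mean():.1f}% vs {control.mean():.1f}% (p={p:.3g})")
```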
Figure 3
Percentage of frames with each facial expression present among the delirious and non-delirious patients, calculated based on constituent AUs (Supplementary Table S2). This bar plot shows how often a certain expression is observed across all recorded video frames, as percentages with standard error bars. * Shows a statistically significant difference between the delirious and non-delirious groups (p-value < 0.001).
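The authors' mapping from constituent AUs to expressions is given in their Supplementary Table S2, which is not reproduced here. The snippet below shows the general shape of such a rule-based mapping, using placeholder FACS-style combinations rather than the paper's actual table.

```python
# Placeholder FACS-style expression definitions; the paper's actual
# AU-to-expression mapping is in its Supplementary Table S2.
EXPRESSIONS = {
    "happiness": {"AU06", "AU12"},
    "sadness": {"AU01", "AU04", "AU15"},
    "surprise": {"AU01", "AU02", "AU05", "AU26"},
    "anger": {"AU04", "AU05", "AU07", "AU23"},
}

def expressions_in_frame(active_aus: set[str]) -> list[str]:
    """Return every expression whose constituent AUs are all active."""
    return [name for name, aus in EXPRESSIONS.items() if aus <= active_aus]

# Example: a frame in which AU01, AU04, and AU15 fired.
print(expressions_in_frame({"AU01", "AU04", "AU15"}))  # ['sadness']
```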

Figure 4
(a) Distribution of head poses among delirious and non-delirious patients during their enrollment days, shown as boxplots. Pitch, yaw, and roll describe the orientation of the head in its three degrees of freedom. Pitch is rotation around the right-left axis, up and down, as in nodding "Yes". Yaw is rotation around the inferior-superior axis, side to side, as in shaking the head "No". Roll is rotation around the anterior-posterior axis, tilting the head toward a shoulder, as in a noncommittal "Maybe"; (b) Percentage of frames spent in each posture among delirious and non-delirious patients, shown with standard error bars; (c) Percentage of frames with visitors present in the room (disruption) for delirious and non-delirious patients during the day (7AM–7PM) and during the night (7PM–7AM), shown with standard error bars. In (a–c), * shows a statistically significant difference between the delirious and non-delirious groups (p-value < 0.001).
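For readers reconstructing these angles themselves, the sketch below recovers pitch, yaw, and roll from a 3 × 3 head rotation matrix under one common (ZYX) convention, with axes matching the caption: x = right-left (pitch), y = inferior-superior (yaw), z = anterior-posterior (roll). Pose-estimation libraries differ in axis conventions, so treat this as an assumption to verify against your tracker.

```python
import numpy as np

def rotation_to_euler(R: np.ndarray) -> tuple[float, float, float]:
    """Recover (pitch, yaw, roll) in degrees from a 3x3 rotation matrix,
    assuming R = Rz(roll) @ Ry(yaw) @ Rx(pitch) (ZYX decomposition)."""
    pitch = np.degrees(np.arctan2(R[2, 1], R[2, 2]))            # right-left axis
    yaw = np.degrees(np.arcsin(np.clip(-R[2, 0], -1.0, 1.0)))   # vertical axis
    roll = np.degrees(np.arctan2(R[1, 0], R[0, 0]))             # front-back axis
    return pitch, yaw, roll

# Identity rotation -> head facing straight ahead.
print(rotation_to_euler(np.eye(3)))  # (0.0, 0.0, 0.0)
```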

Figure 5
Delirious and non-delirious group comparisons for (a–e) sensor data and (f–j) physiological data. Sensor data included accelerometer data recorded on the wrist, arm, and ankle; light intensity level recorded using an actigraph; and sound pressure level recorded using an iPod mounted on the wall behind the patient's bed. Physiological data included heart rate, systolic and diastolic blood pressure, respiration rate, and oxygen saturation, collected at a resolution of approximately once per hour as part of the patient's care. The graphs show the smoothed average value per group, with the transparent band around each average line showing the 95% confidence interval. The bar at the top of each panel shows night (black) and day (white) time.
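A minimal sketch of how such group-average curves with confidence bands can be computed, assuming a hypothetical long-format hourly table (file and column names are ours). The paper does not specify its exact smoother, so a centered rolling mean with a normal-approximation 95% band stands in here.

```python
import pandas as pd

# Hypothetical long-format table: one row per patient per hour, e.g. the
# sound pressure level, with the patient's group label.
df = pd.read_csv("hourly_sensor.csv")  # columns: group, hour, value

def smoothed_band(g: pd.DataFrame, window: int = 3):
    """Per-hour group mean, lightly smoothed, with a ~95% confidence band."""
    by_hour = g.groupby("hour")["value"]
    mean = by_hour.mean().rolling(window, center=True, min_periods=1).mean()
    sem = by_hour.sem()
    return mean, mean - 1.96 * sem, mean + 1.96 * sem

for group, g in df.groupby("group"):
    mean, low, high = smoothed_band(g)
    print(group, mean.round(1).to_dict())
```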

Figure 6
(a) Pipeline of the patient recognition system begins with alignment of faces in the image using MTCNN (multi-task cascaded convolutional network). The aligned images are then provided as input to the FaceNet network, which extracts features using a pre-trained Inception-ResNet-V1 model, performs L2 normalization on them, and stores them as feature embeddings. These embeddings are used as input to a k-nearest neighbor (KNN) classifier to identify the patient. (b) Pipeline of the posture recognition system includes a two-branch, three-stage CNN and a KNN classifier. At each stage of the CNN, Branch 1 predicts the confidence maps for the different body joints, and Branch 2 predicts the Part Affinity Fields for the limbs. These predictions are combined at the end of each stage and refined over the subsequent stage. After stage 3, the Part Affinity Fields of the limbs are used to extract the lengths and angles of the body limbs. Any missing values are imputed using KNN imputation, and a pre-trained KNN classifier is used to detect posture from the extracted features.
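The final step of pipeline (a) reduces to nearest-neighbor classification in embedding space. The sketch below assumes the MTCNN alignment and FaceNet forward pass have already produced L2-normalized embeddings; the embedding dimension, number of identities, and k are placeholders, and random vectors stand in for real embeddings.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Stand-ins for L2-normalized FaceNet embeddings of enrolled patients'
# face crops (dimension and counts are placeholders).
rng = np.random.default_rng(0)
train = rng.normal(size=(200, 512))
train /= np.linalg.norm(train, axis=1, keepdims=True)
identities = rng.integers(0, 5, size=200)  # 5 enrolled identities

# With unit-norm embeddings, Euclidean distance is monotone in cosine
# similarity, so a plain Euclidean KNN serves as the identity classifier.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(train, identities)

query = rng.normal(size=(1, 512))
query /= np.linalg.norm(query)
print("predicted identity:", knn.predict(query)[0])
```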


Source: PubMed
