The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset

Min S H Aung, Sebastian Kaltwang, Bernardino Romera-Paredes, Brais Martinez, Aneesha Singh, Matteo Cella, Michel Valstar, Hongying Meng, Andrew Kemp, Moshen Shafizadeh, Aaron C Elkins, Natalie Kanakam, Amschel de Rothschild, Nick Tyler, Paul J Watson, Amanda C de C Williams, Maja Pantic, Nadia Bianchi-Berthouze

Abstract

Pain-related emotions are a major barrier to effective self-rehabilitation in chronic pain. Automated coaching systems capable of detecting these emotions are a potential solution. This paper lays the foundation for the development of such systems by making three contributions. First, through literature reviews, an overview of how pain is expressed in chronic pain and the motivation for detecting it in physical rehabilitation is provided. Second, a fully labelled multimodal dataset (named 'EmoPain') containing high-resolution multiple-view face videos, head-mounted and room audio signals, full-body 3D motion capture, and electromyographic signals from back muscles is supplied. Natural, unconstrained pain-related facial expressions and body movement behaviours were elicited from people with chronic pain carrying out physical exercises. Both instructed and non-instructed exercises were considered, reflecting the traditional scenarios of physiotherapist-directed therapy and home-based self-directed therapy. Two sets of labels were assigned: the level of pain from facial expressions, annotated by eight raters, and the occurrence of six pain-related body behaviours, segmented by four experts. Third, through exploratory experiments grounded in the data, the factors and challenges in the automated recognition of such expressions and behaviour are described. The paper concludes by discussing potential avenues in the context of these findings, also highlighting differences between the two exercise scenarios addressed.

Keywords: Chronic low back pain; automatic emotion recognition; body movement; emotion; facial expression; motion capture; multimodal database; pain behaviour; surface electromyography.

Figures

Fig. 1.
Plan view of the configuration of eight high-resolution cameras: five cameras mounted on a common rig to cover the frontal 90 degrees of a circle around the subject, allowing for unconstrained natural movement; two long-range cameras for distance exercises; and a floor camera to capture the face during forward flexion.
Fig. 2.
IMU and EMG sensor attachments: (a) customized motion capture suit (Animazoo IGS-190): 18 inertial measurement units attached with Velcro strapping on all main rigid body segments. The use of minimal attachment material reduces the sense of restrictiveness and encourages naturalistic motion (diagram courtesy of Animazoo/Synertial). (b) Four fully wireless surface electromyographic sensors (BTS FREEEMG 300). Probes 3 and 4 are placed on the upper fibres of the trapezius muscles. Probes 1 and 2 are placed bilaterally on the lumbar paraspinal muscles, approximately at the 4th/5th lumbar vertebrae.
Fig. 3.
Cropped video frames from Camera 4 showing an example grimace (above) with all eight temporally concurrent observers' pain ratings (below); the vertical axis shows the rating and the horizontal axis the time index.
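Combining several observers' concurrent ratings into a single per-frame label is a common preprocessing step for data like that shown in Fig. 3. The helper below is a minimal illustrative sketch, not part of the EmoPain release: it simply averages the eight raters' time-aligned ratings and reports their per-frame spread as a rough agreement measure.

```python
import numpy as np

def fuse_ratings(ratings):
    """Fuse time-aligned observer ratings into a per-frame label.

    ratings: array-like of shape (n_raters, n_frames), one row per observer.
    Returns (mean, std): the per-frame mean rating, usable as a continuous
    ground-truth signal, and the per-frame standard deviation as a simple
    indicator of inter-rater disagreement.
    """
    ratings = np.asarray(ratings, dtype=float)
    return ratings.mean(axis=0), ratings.std(axis=0)
```

More robust alternatives (e.g. median fusion, or weighting raters by their agreement with the consensus) follow the same pattern; the mean is shown only for brevity.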
Fig. 4.
Example of protective behaviour: the top row shows three frames from a CLBP patient performing a reaching-forward exercise. The bottom row shows the concurrent motion-captured avatar (left) and back muscle activity (right), visualized as circles with radii corresponding to the rectified sEMG amplitude. This instance of reaching forward was labelled as guarded by two raters and as hesitant by a third. The participant executes a high knee bend and a backward shift of the pelvis during the reaching phase as a compensation strategy to alleviate a perceived strain on the back.
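The "rectified sEMG amplitude" visualized in Fig. 4 refers to a standard preprocessing of raw surface EMG. The sketch below is a generic, assumed implementation (the paper does not specify its exact pipeline): it removes the DC offset, full-wave rectifies the signal, and smooths it with a moving average to obtain an amplitude envelope.

```python
import numpy as np

def semg_envelope(raw, fs=1000, win_ms=100):
    """Compute a rectified sEMG amplitude envelope.

    raw:    1-D array of raw sEMG samples.
    fs:     sampling rate in Hz (assumed value, not taken from the paper).
    win_ms: moving-average window length in milliseconds.
    """
    rectified = np.abs(raw - np.mean(raw))      # remove DC offset, full-wave rectify
    win = max(1, int(fs * win_ms / 1000))        # window length in samples
    kernel = np.ones(win) / win                  # moving-average kernel
    return np.convolve(rectified, kernel, mode="same")
```

The resulting envelope is non-negative and slowly varying, which is what makes it suitable for mapping directly onto the circle radii in a visualization like Fig. 4's.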
Fig. 5.
(Left) Distribution of frame types over the entire dataset. (Right) Proportions of protective behaviours and facial expressions of pain that directly overlap each other, fall within overlapping or close expressive segments, or occur with no relation to each other.
Fig. 6.
Example of a normalized face image with highlighted regions from which the features were extracted.

Source: PubMed
