Call for papers
There is ever-growing research interest in computer vision and machine learning for the automatic analysis and modeling of human behavior, with the aim of overcoming major limitations in clinical contexts. For instance, the assessment and monitoring of depression severity rely almost entirely on patients’ self-reported symptoms, gathered through clinical interviews and questionnaires such as the Beck Depression Inventory (BDI). Such assessment, while useful, fails to capture behavioral indicators that are powerful indices of depression, including facial expressiveness and demeanor, body movements, gestures, and other signs of psychomotor agitation. To improve clinical assessment and guide treatment, automatic, reliable, and valid measurement of human behavior in clinical contexts is needed.
The workshop aims to advance the state of the art in face and gesture analysis for telemedicine and healthcare. We will discuss the strengths and major challenges of automatic face and gesture detection, analysis, and modeling for clinical research and healthcare applications. We invite scientists working in the related areas of face and gesture analysis, affective computing, machine learning, psychology, and cognitive and behavioral science to share their achievements and expertise in the emerging field of face and gesture analysis for health informatics.
Topics of interest include, but are not limited to:
- Face, head, and body detection, analysis, and modeling for healthcare applications
- Human-Computer Interaction systems for home healthcare and wellness management
- Physiological sensing and processing platforms for healthcare applications (e.g., wearable devices for self-management)
- Clinically relevant corpora recording and annotation
- Clinical protocols and methods for secure collection and use of patient data (e.g., face and gesture de-identification)
- Applications such as telemedicine, pain intensity measurement, depression severity assessment, autism screening, and heart and breathing rate monitoring