CS6130 Affective Computing

 

Course Objective:

Affective Computing is computing that relates to, arises from, or influences emotions. This course overviews the theory of human emotion (how it arises from and influences cognition, the body, and the social environment) and computational techniques for modeling human emotion processes as well as for recognizing and synthesizing emotional behavior. We will discuss how these can be applied to application design. Graduate students will gain a strong background in the theory and practice of affective computing as it relates to numerous applications, including human-machine interaction, games, immersive environments, health interventions and pedagogical applications.

A special goal in this course is to bring together students from different disciplines to work together and learn from each other to apply affective computing knowledge and techniques to their specific areas of interest.

Please note this syllabus will evolve as the course unfolds.

Course Structure

Instructor: Stacy Marsella, email: s.marsella@northeastern.edu

(Put CS6130 in Subject Line)

Office Hours: Friday 1pm over Zoom

TA:  Betül Dincer

Office Hours: 

Meeting time: Monday and Thursday 11:45 am - 1:25 pm
Room: Snell Library 125

Grades: Grades are determined by the following components:

·       Class participation: 10%

·       Project proposal presentation: 15%

·       Homework and in-class assignments: 30%

·       Final project presentation: 15%

·       Final project write-up: 30%

Class participation is expected and is part of the grade. Students are expected to attend class and participate in in-class activities, which may include participatory demonstrations and exercises. A short sketch below shows how the grade components combine.
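Purely as an illustration (not an official grading tool), here is a minimal Python sketch of how the component weights above combine into a final grade; the component scores used in the example are hypothetical.

# Illustrative sketch only: weights are taken from the syllabus; the component
# scores below are hypothetical and used just to show how the weighting works.
WEIGHTS = {
    "participation": 0.10,
    "proposal_presentation": 0.15,
    "homework_and_in_class": 0.30,
    "final_presentation": 0.15,
    "final_writeup": 0.30,
}

def final_grade(scores: dict) -> float:
    """Weighted average of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * score for name, score in scores.items())

# Hypothetical example:
print(round(final_grade({
    "participation": 100,
    "proposal_presentation": 90,
    "homework_and_in_class": 80,
    "final_presentation": 95,
    "final_writeup": 92,
}), 2))  # 89.35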

Projects: Projects are a key focus of the course.  Students are expected to work in 2-3 person teams, to develop, execute and present their research projects, with a preference for transdisciplinary teams if possible. A list of prior projects is available to students to help spur their thinking. Because of the transdisciplinary nature of the class content, the instructor will be quite open to a wide range of project ideas and can help students formulate project ideas and access appropriate tools and software.

Project Report: The final project paper should be in a form suitable for submission to a conference such as Affective Computing and Intelligent Interaction (ACII), HAI (Human-Agent Interaction), Ro-MAN, HRI, Autonomous Agents and Multiagent Systems, Intelligent Virtual Agents, or CHI. Although submission to a conference is not required and will not affect the grade, it is supported. Details on what the Project Report should include will be provided.

Lecture Structure:

·       Lectures with occasional guest lectures

·       Project Presentations

·       Paper presentations

·       Occasional In-Class Experiment or Exercise exploring the use or impact of affect.

Source book: Oxford Handbook of Affective Computing (useful but not required).

Other Sources: ACM Handbook on Socially Interactive Agents, Oxford Handbook of Affective Sciences.

Software:  Students will gain knowledge of, and as part of their projects hands-on experience with, software tools related to affective computing including:

·       Emotion Recognition Techniques

·       Emotion Synthesis Techniques

·       Cognitive and Emotional Modeling

·       Software Agents and Virtual Humans

A list of pre-existing software tools is made available to students.

Late Assignments: Homework/reports are expected to be turned in on time. I deduct 10% if an assignment is late and an additional 10% for every two days it is still not turned in (see the sketch below). If you enroll in the class late (after an assignment is due), there is no penalty, but coordinate with me on new due dates. I will waive penalties if you have a verified emergency or if you inform me in advance of a complication (e.g., a job interview).
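As a rough illustration only, here is a minimal Python sketch of one possible reading of the late-penalty arithmetic, assuming the extra 10% accrues for each additional full two-day period an assignment remains unsubmitted; the function name and rounding behavior are assumptions, not the official policy.

import math

def late_penalty(days_late: float) -> float:
    """One possible reading of the policy: 10% once an assignment is late, plus
    10% for each additional full two-day period it remains unsubmitted (capped at 100%)."""
    if days_late <= 0:
        return 0.0
    extra_periods = math.floor(days_late / 2)  # completed two-day periods
    return min(1.0, 0.10 + 0.10 * extra_periods)

# Hypothetical examples: 1 day late -> 10%, 3 days -> 20%, 5 days -> 30%
for days in (1, 3, 5):
    print(days, late_penalty(days))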

Course Outline

The following list and order of lectures is tentative – it will evolve as the course progresses.

Also, the dates are wrong: they are from a prior offering that met for one long 200-minute lecture per week. Fall 2025 will have two 100-minute lectures per week.

 

Sept 5             1. Course Overview.  Introduction to Affective Computing

·       What is affective computing?

·       What are its functions, and why should computer science care?

·       Applications

·       Homework: Emotion Test

·       Readings

o   Picard retrospective on the field of affective computing

Sept 12            2. Theories of Emotion & Emotion Elicitation

·       Scherer’s characterization of the types of affective phenomena (emotion, mood, attitude/sentiment, personality)

·       Alternative theoretical and functional perspectives on emotion

·       Emotion Elicitation Theories

o   Appraisal Theories, dual process theories, constructivist theories

·       Homework: Scenario Analysis

·       Readings

o   OHAC Chapter 3

o   OHAC Chapter 5

o   Optional

§  Scherer (2010): outlines alternative theories of emotion

§  Viewing: Barrett video interview (first 15 min): outlines alternative theories of emotion

Part I: Emotion Elicitation

Sept 19           3. Video on Emotion and Aesthetics (recorded by Stacy to cover a missed class)

Sept 26           4. Models of Emotion Elicitation

·       Discuss ways to make machines “have” emotions

·       Introduce Computational Appraisal Theory

·       Also Speed Dating on projects

·       Suggested Reading

o   Stacy Marsella and Jonathan Gratch, "Computationally modeling human emotion", Communications of the ACM, vol. 57, Dec. 2014, pp. 56-67.

o   Marsella, Gratch and Petta (2010): reviews modeling research.

o   Moerland et al. (2018): Survey of Emotion in Reinforcement Learning

Oct 3           4. Models of Emotion Elicitation (cont)

·       Discuss ways to make machines “have” emotions

·       Introduce Computational Appraisal Theory

·       Also presentation of  IVA papers

·       Suggested Reading

o   Stacy Marsella and Jonathan Gratch, "Computationally modeling human emotion", Communications of the ACM, vol. 57, Dec. 2014, pp. 56-67.

o   Marsella, Gratch and Petta (2010): reviews modeling research.

o   Moerland et al. (2018): Survey of Emotion in Reinforcement Learning

Part II: Consequences of Emotion

Oct 10            5. Cognitive Consequences of Emotion

·       Rational Choice

·       Contrast between rational models and human decision-making

·       Suggested Reading:

o   Loewenstein and Lerner 2003, pp. 620-633. Figure 1 is critical.

·       Strongly encouraged:

o   Watch NOVA’s “Mind over Money”

·       Other Reading:

o   Lerner video interview: Outlines alternative theories of emotion

o   Mellers et al 1999: A model of how emotions shape decisions

Oct 17             6. Project Proposal Presentations (check)

Oct 24             7. Physical Consequences of Emotion

·       Overview of physiological and brain computing

·       Focus on some affective computing approaches to brain measurement

·       Guest Lecture

·       Suggested Reading:

o   Fairclough 2009 – Fundamentals of physiological computing

·       Optional Reading:

o   ????

Oct 24             8. Experiment Design (may skip depending on class projects)

·       Reading:

o   SparkNotes on Research Methods in Psychology

·       Homework 5 (part 2): Experimental design (due date TBA)

·       Recommended Reading

o   AHSIA, Chapter 2: Introduction to empirical methods for social agents

Oct 31             9. Emotion Coping and Regulation

·       Overview of psychophysiological impacts of emotion

o   Review biopsychosocial model of challenge / threat

o   Review physiological manifestation of coping responses

o   Discuss cardiovascular measures of emotion and coping

·       Reading:

o   Blascovich & Mendes 2010: reviews psychophysiological findings. Only the following sections are required:

o   Neurophysiological systems, advantages & indices (pp. 199-203)

o   Uses [affect, attitudes, emotion] (pp. 210-215)

·       Optional Reading:

o   OHAC, Chap 14: Reviews physiological sensing of emotion

The Emotional Machine

Nov 7              10. Expression of Emotion by Machines

·       How (and why) machines can convey that they are experiencing emotion

·       Segue to social emotions: Distinguish realistic vs. communicative approaches

·       Expression synthesis techniques

·       Homework 6: Facial expression analysis (due date TBA)

·       Suggested Reading:

o   The social function of machine emotional expressions

o   OHAC, Chapter 18, Section 2 only; Digital expression synthesis

o   OR

§  AHSIA, Chapter 7; Gesture

§  OHAC, Chapter 19; Gesture & postures synthesis

Nov 14            11. Recognition of Emotion by Machines

·       Reading:

o   OHAC, Chapter 13; Recognizing affect from text

o   OHAC, Chapter 10; Face expressions

o   Baltrušaitis et al 2018: Survey of Multimodal ML approaches

·       Optional Reading: Barrett, Adolphs, Marsella, Martinez, and Pollak (2019)

Nov 21             12. Social Interaction

·       Contagion

·       Social Goals

·       Reverse Engineered Appraisal

Dec 5               13. Final Presentations

???                  Personality

???                  Aesthetics

???                  Bias and Ethics

 

List of Old Class projects (Northeastern, USC and Glasgow)

·       Augmenting Live Performance for Audience Emotional Synchronicity: A Pilot Study

·       AWE: investigating awe’s effects on creativity and anxiety using virtual reality (VR) to elicit awe-inducing experiences

·       AffectiveDebugger: Augmenting Intelligent Tutoring System Technology with Affect Tracking and Large Language Models

·       Interactive Emotional Gait Modelling for Personalized Robotic Characters

·       Advertising Color Optimization System Based on Consumer Emotion Analysis

·       Hope: An AI Solution for Improving Communication and Alleviating Stress in Parent-Child Interactions

·       SimPatient: Emotionally Realistic Simulated Patients for Counselor Empathy Training

·       Social Contagion in a Twitch Stream Chat

·       Toxic PAL: Can strategically designed judgmental AI effectively encourage increased physical activity?

·       Combining EEG and facial expression signal processing to improve emotion recognition

·       Using machine learning to derive models of human negotiation behavior

·       Video Game Behavior as a Tool for Personality Assessment

o   Deriving a computational model of personality from game data that predicts behavior

·       Quantitative Assessment of Socio-affective Dynamics in Autism Using Interpersonal Physiology

·       Accuracy in detecting emotion expressions from older faces

o   Analysis of automated facial expression recognition software accuracy on young versus old faces

·       Acquiring data to learn a model of facial expression dynamics for more realistic expression synthesis

·       Evaluating the facial feedback hypothesis using EEG signals

·       Game to improve emotion regulation skills

·       Building a Virtual Environment to Study Oppression

o   Study nonverbal influences on feelings of oppression

·       SpeakWatch: Collecting Real World Affective Information via Long Duration Voice Recording

o   Tracking and analyzing user prosody over the course of the day

·       Embodied cognition and the design of game mechanics

·       Modeling Coping within a Decision-Making Theoretic Framework

·       Application of Sentiment Analysis to detect sarcasm in Tweets

·       Emotional Dynamics Through Facial Expression Recognition

·       Inferencing Human Emotions Through Physiological Data Analysis. 

·       Using Virtual Humans to Understand Real Ones 

·       Analysis of Eye Gazes based on Emotions

 

Affect-Related Class Projects from 2024-2025

·       Emotion-Driven Audio-Visual Experience: Enhancing Human-Computer Interaction through Real-Time Multimodal Feedback

·       Thera.py: An Empathetic AI Assistant for Mental Health Support

·       QuoteSeek: A Retrieval Augmented Generation System for Bridging Ancient Stoic Wisdom and Modern Queries

·       Healthcare Agent for Senior Patient Assistance

·       Enhancing Sentiment Analysis through Layer-wise Relevance Propagation

·       StoryGen: Advancing Narrative Generation through Small Language Models and Reinforcement Learning

·       Beyond Guardrails: Assessing GPT-4’s Resilience to Offensive Prompts with a Conversational AI Framework

·       Facial Emotion Recognition

·       Turn The Beat Around: Modulating Music Through Dance 

Available Software Tools

The following software tools may be available for use by students as part of their projects. Some are publicly available for download. Others are more restricted.

General Tools

·       LLMs

·       RIDE and Virtual Human Toolkit: contains a number of sensing, language and synthesis tools along with Virtual Humans  (https://vhtoolkit.ict.usc.edu/)

General Behavior Generation Systems 

·       SmartBody – character animation system (talk to Stacy)

·       Cerebella – behavior generation system (talk to Stacy)

·       Cerebella + SmartBody + Unreal Metahumans (ask Stacy)

·       SoulMachines (very easy to set up)

·       NVBG – Nonverbal Behavior Generation System (available as part of VH toolkit – also talk to Stacy)

General audio annotation 

·       MIRtoolbox for MATLAB – extracts several audio features; designed for music analysis but more generally applicable

·       Link: https://www.jyu.fi/hum/laitokset/musiikki/en/research/coe/materials/mirtoolbox

Affective Sensing 

·       OpenSense (recommended - I know the researcher)

·       Older work:

o   MultiSense – multimodal sensing framework available as part of VH toolkit

o   CERT (Computer Expression Recognition Toolbox) (http://mplab.ucsd.edu/~marni/Projects/CERT.htm)

o   OKAO – smile detector

o   GAVAM – head pose estimation from webcam, available as part of the VH toolkit

o   Open Ear – acoustic signal processing (http://sourceforge.net/projects/openart/)

o   The AAM-FPT (Active Appearance Model-based Facial-point Tracker) can be used to track 40 characteristic

facial points (http://sspnet.eu/2011/03/aam-fpt-facial-point-tracker/)

·       BoRMaN – detects 20 facial points (http://ibug.doc.ic.ac.uk/resources/facial-point-detector-2010/)

Cognitive modeling 

·       PsychSim multi-agent system with Theory of Mind reasoning (ask Stacy)

·       microEMA – a Prolog implementation of a subset of the EMA computational model of emotion

·       FAtiMA – an architecture for constructing appraisal-based agents (http://sourceforge.net/projects/fatima-modular/files/)

·       Adapt an LLM

·       NPC Editor – question-answering system that can generate appropriate natural language utterances in response to questions (part of the Virtual Human Toolkit)

Affective speech generation

·       Check out https://arxiv.org/pdf/2210.03538

·       There are many cloud services to consider

·       Older work:

o   Emofilt is an open-source Java program to simulate emotional arousal in speech. It is highly customizable, with an interface for developing your own rules and even your own modification algorithms. (http://emofilt.syntheticspeech.de/)

o   MARY TTS is an open-source text-to-speech synthesis platform written in Java. A special focus is on exploring the range of options available to control the expressivity of the synthetic voice. (http://mary.dfki.de/)

Affect analysis 

·       OpenSense (recommended)

·       Older Work

o   The Social Signal Interpretation (SSI) framework offers tools to record, analyze and recognize human behavior in real-time, such as gestures, mimics, head nods, and emotional speech. It supports streaming from multiple sensors and includes mechanisms for their synchronization.

Databases

There are numerous databases. Ask, and we will try to track down what you need.

Annotation Tools 

·       ELAN – another video annotation tool (http://tla.mpi.nl/tools/tla-tools/elan/)

·       GTrace (General Trace program) allows users to play a video of a person and create 'traces' which show how the person's emotions appear to be changing over time. https://sites.google.com/site/roddycowie/work-resources

·       CowLog – a video annotation tool (http://cowlog.org/download/)

·       Try a large multimodal model (LMM)

Scales: Various psychological instruments have been developed to measure self-reported affect. I can point you to where to find these.

·       PANAS: measures positive/negative affect

·       PCL-C: measures PTSD symptoms

·       Ways of Coping: measures coping styles

·       Emotion regulation scale

·       SAM (Self-Assessment Manikin): dimensional self-report measure of emotion

·       Social Value Orientation: measure of cooperative/competitive tendencies

Other resources 

HUMAINE Association (http://emotion-research.net/): see the Toolbox and Databases sections. Social Signal Processing Network (http://sspnet.eu/): see especially the RESOURCES tab.