Interaction Analysis Module
Here, we present a description of the Interaction Analysis (IA) module and provide instructions for setting up the IA module for demonstration.
The primary function of the IA module is to update the learner model with an estimate of the user’s valence and arousal during interaction with the EMOTE system. IA receives regular sensor updates from the Perception Module and sends regular affective updates to the Learner Model.
The IA module needs to handle both positive and negative affect. In the first example, a child displays positive affect; in the second, another child displays negative affect. These can be considered extreme examples of the affect expressed by children during interaction with the EMOTE system.
The implementation of the IA module has been informed by the annotation and analysis of a Wizard-of-Oz experiment. In that analysis we found that the children were considered to be in a neutral affective state for 96.9% of the interactions, with the states we are most interested in, namely Positive Valence – Positive Arousal (2.4%), Negative Valence – Positive Arousal (0.7%) and Negative Valence – Negative Arousal (0.7%), together accounting for only 3.8% of the interaction. Therefore, we needed to design a detector able to detect these infrequently occurring changes in the user's affective state, whilst also being robust to noise.
For valence, we use the information from the OKAO basic emotions to determine whether the user is currently in a positive, neutral or negative emotional state. The six basic emotions are well documented in terms of the valence and arousal dimensions of emotion. We record this information in memory and then calculate which of the basic emotions occurred most frequently over a five-second period; from there we can determine whether the valence of the user's emotion is positive, neutral or negative.
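The valence estimate described above can be sketched as a majority vote over a rolling window of emotion labels. The emotion-to-valence mapping, sample rate and window size below are illustrative assumptions, not the values used by the actual module:

```python
from collections import Counter, deque

# Illustrative mapping from the six basic emotions (plus neutral) to valence;
# the mapping actually used by the IA module is not specified in this document.
EMOTION_VALENCE = {
    "happiness": "positive",
    "surprise": "neutral",
    "neutral": "neutral",
    "sadness": "negative",
    "anger": "negative",
    "fear": "negative",
    "disgust": "negative",
}

class ValenceEstimator:
    """Keeps a rolling window of emotion labels and reports the valence
    of the label that occurred most frequently in that window."""

    def __init__(self, window_size=50):  # e.g. 10 Hz updates over five seconds
        self.window = deque(maxlen=window_size)

    def add_sample(self, emotion):
        self.window.append(emotion)

    def estimate(self):
        if not self.window:
            return "neutral"
        # Most frequent basic emotion over the window decides the valence.
        most_common, _ = Counter(self.window).most_common(1)[0]
        return EMOTION_VALENCE.get(most_common, "neutral")
```

For example, a window dominated by "happiness" samples would yield a positive valence estimate.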
For arousal, we use the standardised skin-conductance information from the Affectiva Q Sensor to determine whether the user's arousal is increasing or decreasing over time. As with valence, we record the skin-conductance readings over a five-second period. Skin conductance is an indication of physiological arousal; therefore, with a threshold and a running average of the skin conductance, the detector is able to classify, using a rule-based architecture, whether the user's arousal is positive, neutral or negative.
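A minimal sketch of this rule-based classification, assuming a simple dead-band threshold around the running average; the threshold value and window size are illustrative, not those of the actual module:

```python
from collections import deque

class ArousalEstimator:
    """Compares the latest standardised skin-conductance reading with a
    running average over the window and applies a dead-band threshold."""

    def __init__(self, window_size=50, threshold=0.1):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold  # illustrative dead-band width

    def add_sample(self, conductance):
        self.window.append(conductance)

    def estimate(self):
        if len(self.window) < 2:
            return "neutral"
        running_avg = sum(self.window) / len(self.window)
        delta = self.window[-1] - running_avg
        if delta > self.threshold:
            return "positive"   # arousal increasing
        if delta < -self.threshold:
            return "negative"   # arousal decreasing
        return "neutral"        # within the dead-band: no clear change
```

A sudden rise in conductance relative to the running average would thus be classified as positive (increasing) arousal.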
It is essential that the latest information is provided to the learner model as soon as it becomes available, and the multithreaded design of the Interaction Analysis module enables it to respond almost immediately. The Interaction Analysis module can update the affective state of the user in two different ways. In the first, the learner model sends a request for an instant affective update and the IA module responds with its most recent information, tagged with a reference provided by the learner model. The second is an automated state check within the Interaction Analysis module, which computes the affective output and tests whether it differs from the previous state. If the state is different, the learner model is automatically updated via Thalamus. In Figure 1, we show how the IA module fits into the EMOTE system architecture.
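The two update paths can be sketched as follows. The message layout is simplified and the `publish` callback stands in for the real Thalamus messaging call, which is not detailed in this document:

```python
import json

class InteractionAnalysis:
    """Sketch of the two update paths: an on-request instant update and
    an automated state check that publishes only on change."""

    def __init__(self, publish):
        self.publish = publish   # callback standing in for Thalamus publishing
        self.last_state = None

    def current_state(self):
        # Placeholder: in the real module this combines the valence and
        # arousal estimates described earlier.
        return {"valence": "neutral", "arousal": "neutral"}

    # Path 1: the learner model requests an instant update; the response is
    # tagged with the reference the learner model supplied.
    def on_request(self, reference):
        msg = {"reference": reference, **self.current_state()}
        self.publish(json.dumps(msg))

    # Path 2: automated state check; the learner model is updated only when
    # the computed state differs from the previous one.
    def periodic_check(self):
        state = self.current_state()
        if state != self.last_state:
            self.last_state = state
            self.publish(json.dumps(state))
```

Publishing only on change keeps the automated path from flooding the learner model with redundant updates, while the request path still guarantees an immediate answer when one is needed.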
Figure 1 – How IA fits into the EMOTE system architecture
Output from the IA Module
The output from the IA module is a serialised JSON object containing: the state being updated, whether that state is positive, neutral or negative, and the confidence level reported by the underlying sensor technology (see Figure 2 for an example). The output from the IA module always contains the latest estimates for both valence and arousal in the same message.
The learner model is able to de-serialise the message and update its internal records accordingly.
Figure 2 – A simplified example of the serialised JSON object used to update the learner model.
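As an illustration of this round trip, the following sketch serialises and de-serialises a message carrying the fields described above. The field names and values are assumptions for illustration; the exact schema used by EMOTE may differ:

```python
import json

# Hypothetical message layout: an "Affective States" list whose entries name
# the state being updated, its value, and the sensor confidence.
update = {
    "Affective States": [
        {"state": "valence", "value": "positive", "confidence": 0.82},
        {"state": "arousal", "value": "neutral",  "confidence": 0.74},
    ]
}

serialised = json.dumps(update)    # what the IA module sends
received = json.loads(serialised)  # what the learner model de-serialises

for entry in received["Affective States"]:
    print(entry["state"], entry["value"], entry["confidence"])
```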
Download and Setup Instructions
The source code and binaries for the latest version of the Interaction Analysis module can be found here:
In this section, we detail the steps for setting up the IA module, including an explanation of how it will operate when set up correctly. These setup instructions apply to running the module in a basic IA demonstration; you must therefore first fulfil all of the prerequisites for running Thalamus and Perception, as these are the minimal requirements needed to produce an affective output on-screen. If, however, you would like to demonstrate this module as part of the EMOTE system, i.e. whilst also running the activity, you should refer to the full description of the EMOTE system architecture and the various other modules required for scenario one.
Please note: To use Perception, you will need to acquire a licence key from OMRON to enable the use of the OKAO software.
Steps required to set up the IA module:
Step One: Complete the following:
- Run Thalamus
- Run Perception
- Run OKAO
- Run Q Sensor
- Run IA module and connect to Thalamus
- Connect the IA module to Perception
Step Two: Run the IA module and connect to Thalamus
When you run the IA module, the application loads a console window showing the connection status of the IA module with Thalamus (Figure 3). If the display appears to be continuously rolling and you cannot see output similar to Figure 3, first check that Thalamus is running correctly. This includes making sure that the security settings of your Windows machine are as described in the setup information for Thalamus. Proceed if you see a screen similar to Figure 3.
Figure 3 – Screenshot of the Interaction Analysis module upon first running the program.
Step Three: Connect the IA module to Perception
Now that the IA module is connected to Thalamus, we need to contact the Perception module. To do this, open the Perception window, make sure that the user selection box is set to one (highlighted as number 1 in Figure 4), then click the Start Server button (highlighted as number 2 in Figure 4). If it has connected properly, it will display the following message in the text area: “Logging data for participant:”
Figure 4 – Steps for connecting the IA module to Perception
Now, if you bring the IA console window back into focus, within five seconds the console screen will change to show output similar to Figure 5. If it does not change, there could be an issue with Perception or the OKAO module. You should also notice almost immediately that the program calculates the affective state of the user and updates the screen to display the most current affective output, which is then refreshed every five seconds.
You have successfully set up the IA module for demonstration purposes.
Figure 5 – Screenshot of the Interaction Analysis module showing the most up-to-date affective output.