James the Butler

James, the Butler, operates in the context of a full-screen interface representing an online store.

The objective of this demonstrator is to provide and evaluate an autonomous PSA (Personal Sales Assistant) agent with the skills needed to exhibit life-like qualities, manifested in textual, verbal and visual behaviour and action. A PSA could in principle take on different functional roles, such as Sales Assistant agent, Help agent or Recommendation agent; in this demonstrator it is limited to the role of a Personal Sales Assistant.

The main interaction that concerns this demonstrator is the business-to-consumer (B2C) interaction involved in retail, in which loyalty is created by human-human interaction and the personal touch. A central question this demonstrator addresses is therefore how to reproduce this kind of service online. Everyone remembers personal service from a knowledgeable shop assistant, and such intelligence and individuality (one-to-one treatment) suggest that a multimedia interface could benefit from a similar personality and sensitivity. This is realised by implementing agents with a visual, animated, life-like character and a distinct personality, with the intention that the customer remains in control: the agent is a highly intelligent, highly competent character that is essentially a servant. The character adds a personal touch to the interface and, moreover, increases in competence over time, thus increasing its intrinsic value to a single customer (i.e. its owner).

The concept is reinforced by an Interface Agent with the visual representation of a butler, James. James amplifies or modifies the motivational state of an agent and its perceived bodily state. He has the ability to perceive and produce visual (animated expressions), verbal and non-verbal signals, and he regulates the flow of information between the service agents, the interface agent and the user.
These capabilities enable James to engage in complex interactions with customers through natural social communication rather than complex command languages or direct manipulation. The demonstrator is developed using the SAFIRA Toolkit editor. The main components used are the CRS (Central Registry Service), which provides component-interaction APIs and messaging services; the CML (Character Mark-up Language) generation and scripting component, the behaviour-description and animation-scripting language for James's expressive behaviour and animation; the ASM (Affective Speech Module), which generates James's affective spoken utterances; and the Appraisal component, which reasons about and generates the appropriate emotional signals used to influence the planning and generation of James's affective behaviour.
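The flow between these components can be illustrated with a simplified sketch. The component names follow the description above, but the interfaces, message formats and rules shown here are hypothetical stand-ins, not the actual SAFIRA APIs:

```python
# Illustrative sketch of the James pipeline. The SAFIRA component
# interfaces are not given in the text, so every class, method and
# rule below is a hypothetical stand-in for illustration only.

class Appraisal:
    """Maps an incoming user event to an emotional signal."""
    def appraise(self, event: str) -> str:
        # Toy rule base standing in for the real appraisal reasoning.
        rules = {"purchase": "joy", "complaint": "concern", "greeting": "warmth"}
        return rules.get(event, "neutral")

class CML:
    """Generates a behaviour/animation script for the character."""
    def script(self, emotion: str) -> str:
        # The tag names here are invented, not real CML syntax.
        return f"<cml><express emotion='{emotion}'/><gesture type='bow'/></cml>"

class ASM:
    """Produces an affective spoken utterance."""
    def speak(self, text: str, emotion: str) -> str:
        return f"[{emotion}] {text}"

def james_respond(event: str, utterance: str) -> tuple:
    """Route a user event through appraisal, then animation and speech."""
    emotion = Appraisal().appraise(event)
    return CML().script(emotion), ASM().speak(utterance, emotion)

animation, speech = james_respond("greeting", "Welcome back. How may I help?")
```

The point of the sketch is the routing, not the rules: the Appraisal output drives both the CML animation script and the ASM utterance, mirroring how the CRS-mediated components share one emotional signal.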
FantasyA

FantasyA is a computer game in which users play the role of an apprentice wizard who is challenged to find the leader of his or her clan in the land of FantasyA. The game has an introductory phase, in which the four clans (Air, Earth, Fire and Water), the magic stones and the duels are explained. The player is then placed in the land of FantasyA, where he or she engages in exploration, duels and cooperation in order to find the clan's leader. The first prototype of the game, however, only includes the duels between wizards of different clans. The main goals of FantasyA are:
To control the character, players use the SenToy, a tangible interface for affective control of a synthetic character. It allows the user to influence the emotions of the character by performing gestures associated with six emotions: anger, fear, surprise, gloat, sadness and happiness. According to its own and the opponent's emotional state, the character will select and perform an action that either damages the opponent or defends itself. These actions trigger an emotional reaction in both characters. The player can then influence these new emotions so that new actions are taken, until one of the characters is defeated and the combat ends. The game itself was developed using some of the components of the toolkit (namely the affective body expression component and the SenToy), together with other modules that had to be developed, such as the Graphics Engine, the Game Engine and the integration with SenToy.
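The duel loop described above can be sketched as follows. The six emotions are taken from the description; the action-selection rules are invented for illustration and are not FantasyA's actual game logic:

```python
# Sketch of emotion-driven action selection in a FantasyA-style duel.
# The emotion set comes from the text; the selection rules below are
# illustrative assumptions, not the game's real rules.

EMOTIONS = {"anger", "fear", "surprise", "gloat", "sadness", "happiness"}

def select_action(own_emotion: str, opponent_emotion: str) -> str:
    """Pick an offensive or defensive action from both emotional states."""
    assert own_emotion in EMOTIONS and opponent_emotion in EMOTIONS
    if own_emotion in ("anger", "gloat"):
        return "attack"                  # aggressive states strike out
    if own_emotion == "fear" or opponent_emotion in ("anger", "gloat"):
        return "defend"                  # protect against an aggressor
    return "attack" if own_emotion == "happiness" else "defend"

# One duel turn: a SenToy gesture sets the wizard's emotion,
# then the engine resolves the resulting action.
action = select_action("anger", "fear")
```

Each resolved action would then feed back into both characters' emotional states, closing the influence-action-reaction loop the text describes.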
The Influencing Machine

Two people enter a small room. Child-like scribbling appears across a wall: jagged lines, circles, spirals and other shapes build up, overlap, fade away. Scattered throughout the room are postcards with art prints; on a table stands a wooden mailbox. One person picks up a card and tentatively puts it in the box. Unusual and musical sounds begin to play. Drawings change speed, colour, pressure, form. The people begin sorting through cards, dropping them in the box and seeing how the graphics and sound change. Over time, new forms appear. The people play, experiment, discuss: How is this reacting to us? How do you think this works?

The Influencing Machine is an interactive installation which explores issues in the enigmatics of affect. In this installation, users influence the emotions of an (invisible) artificial agent, which expresses its emotions by generating real-time, dynamic, child-like scribblings and through an emotionally evocative soundscape. The underlying technology that processes emotions and generates expressive behaviour stays in the background; the focus is on users having experiences with the system that engage their emotions and their critical and interpretive faculties. Users should ask: Which emotions do I see? Do I agree with the machine? Do I believe these drawings express emotions? What is my relationship to this emotional machine? To support this questioning, the input and output of the system are deliberately left complex, enigmatic and open to interpretation.

Technically, the system works by using the input postcards to influence an internal emotional model. These internal emotions trigger sounds and the selection of drawing behaviours and their dynamic parameters: speed, colour, size, pressure, etc. The Influencing Machine uses the following components of the SAFIRA toolkit: an affective input object, the developmental model and affective graphics.
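The postcard-to-scribble mapping can be sketched in a few lines. The idea that postcards nudge an internal emotional model, which in turn parameterises the drawing behaviour, comes from the description; the postcard tags, the arousal/valence dimensions and the parameter formulas are illustrative assumptions, not the installation's real model:

```python
# Sketch of the Influencing Machine's input-to-output mapping.
# Postcard affect tags and parameter formulas are invented for
# illustration; only the overall flow follows the text.

# Hypothetical affective tags for a few art-print postcards.
POSTCARD_AFFECT = {
    "munch_scream": {"arousal": +0.4, "valence": -0.3},
    "monet_lilies": {"arousal": -0.2, "valence": +0.3},
}

emotion = {"arousal": 0.0, "valence": 0.0}  # internal emotional state

def drop_postcard(card: str) -> None:
    """Nudge the emotional model, clamping each dimension to [-1, 1]."""
    for dim, delta in POSTCARD_AFFECT[card].items():
        emotion[dim] = max(-1.0, min(1.0, emotion[dim] + delta))

def drawing_parameters() -> dict:
    """Derive scribble dynamics from the current emotional state."""
    return {
        "speed": 1.0 + emotion["arousal"],       # excited -> faster strokes
        "pressure": 1.0 + 0.5 * emotion["arousal"],
        "colour_warmth": emotion["valence"],     # negative -> colder hues
    }

drop_postcard("munch_scream")
params = drawing_parameters()
```

Because each card only nudges the state rather than setting it, repeated cards accumulate, which matches the installation's slow, interpretable drift rather than an immediate one-to-one response.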
© SAFIRA Consortium