CORDIS | European Commission



MOBISERV – FP7 248434

An Integrated Intelligent Home Environment for the Provision of Health, Nutrition and Mobility Services to the Elderly

Final Deliverable

D6.1: First Coordination and Communication system prototype

Date of delivery: November 30, 2011 (updated to December 27, 2011)

Contributing Partners: ST, LUT, ROBS

Date: 27-Dec-11 | Version: v1.0

Document Control

|Title: |D6.1: First Coordination and Communication system prototype |

|Project: |MOBISERV (FP7 248434) |

|Nature: |Prototype |Dissemination Level: Public |

|Authors: |ST, LUT, ROBS |

|Origin: |ST |

|Doc ID: |MOBISERV_D6.1_v1.0.doc |

Amendment History

|Version |Date |Author |Description/Comments |

|v0.1 |2011-10-27 |ST |Draft outline |

|v0.2 |2011-11-10 |ST, LUT |Updated outline |

|v0.3 |2011-11-29 |ST, LUT |Input from researchers |

|v0.4 |2011-11-30 |ST |Initial version ready for internal review |

|v0.5 |2011-12-20 |ST, LUT |Final version, ready for peer-review |

|v0.6 |2011-12-23 |ROBS |Peer-reviewed version |

|v1.0 |2011-12-27 |ST |Final peer-reviewed version |

Table of contents

Introduction
Glossary
1 Sequence Diagrams and Description of the Components Involved
1.1 Introduction
1.2 Function for Reminder and Encouragement to drink (Dehydration Prevention) (F02)
1.2.1 Function Requirements Overview
1.2.2 Introduction to major components involved
1.2.3 Sequence Diagrams
1.2.4 Components Description
1.2.4.1 NutritionAgenda
1.2.4.2 NutritionActivityDetection
1.2.4.3 NutritionActivity
1.2.4.4 InteractionManager
1.2.4.5 GUI(Nutrition)
1.2.4.6 SpeechRecognition
1.2.4.7 SpeechSynthesis
1.2.4.8 LocaliseOlderPerson
1.2.4.9 GoNextToOlderPerson
1.3 Function for Reminder and Encouragement to eat (Nutritional Assistance) (F01)
1.3.1 Function Requirements Overview
1.3.2 Introduction to major components involved
1.3.3 Sequence Diagrams
1.3.4 Components Description
1.4 Function for a mobile intercom for enabling front door entry (F14)
1.4.1 Function Requirements Overview
1.4.2 Introduction to major components involved
1.4.3 Sequence Diagrams
1.4.4 Components Description
1.4.4.1 Visitors
1.4.4.2 Door
1.4.4.3 Skype
1.4.4.4 InteractionManager
1.4.4.5 GUI(Visitors)
1.4.4.6 SpeechRecognition
1.4.4.7 SpeechSynthesis
1.4.4.8 GoNextToOlderPerson
1.4.4.9 LocaliseOlderPerson
1.5 Function for Encouragement for exercising (F08)
1.5.1 Function Requirements Overview
1.5.2 Sequence Diagrams & Components Description
1.5.2.1 Function "Exercises" (part of F08)
1.5.2.1.1 Introduction to major components involved
1.5.2.1.2 Sequence Diagrams
1.5.2.1.3 Components Description
1.5.2.2 Function "Exercise Coach" (part of F08)
1.5.2.2.1 Introduction to major components involved
1.5.2.2.2 Sequence Diagrams
1.5.2.2.3 Components Description
1.6 Function for Voice/Video/SMS via robot communication with friends and relatives (F11)
1.6.1 Functional Requirement Overview
1.6.2 Introduction to major components involved
1.6.3 Sequence Diagrams
1.6.4 Components Description
1.6.4.1 SkypeAPI
1.6.4.2 Skype
1.6.4.3 InteractionManager
1.6.4.4 GUI(Skype)
1.6.4.5 SpeechRecognition
1.6.4.6 SpeechSynthesis
2 Information Fusion and Channelling: InteractionManager Component
3 Remotely Accessible Services for Secondary Users
3.1 Introduction
3.2 Overview of Recommendations for Remotely Accessible Secondary User Interface
3.2.1 Recommendations for Data Logs
3.2.1.1 Wireframe designs
3.2.1.2 Comments on the secondary user log screens
3.2.2 Recommendations for Settings Interface, related to Nutrition, Hydration, Exercises and Physiological
3.2.2.1 Nutrition Settings Screen
3.2.2.2 Hydration Settings screens
3.2.2.3 Exercises Settings screen
3.3 Development Environment and Implementation Issues
4 Interaction between the MOBISERV robotic system (PRU) and the smart home assistive infrastructure
5 MOBISERV Communication System
6 Initial Security Analysis of the MOBISERV system
6.1 Potential points of attack
6.1.1 Information sources
6.1.2 Information targets
6.1.3 Information storage
6.1.4 Information transfer
6.2 Analysis of the handled data, threats and risks
6.2.1 Activity video (for eating and drinking reminder and encouragement functionalities)
6.2.2 Door ring
6.2.3 Door video
6.2.4 WebInterface (Nutrition, Hydration, Exercises and Physiological Settings and Logs)
6.2.5 Nutrition, Hydration, Exercises and Physiological-related alert
6.2.6 Video conferencing (Skype)
6.2.7 MOBISERV Application start
6.2.8 Application commands
6.2.9 Robot Audio and video
6.2.10 Smart garment sensor data
7 References

Table of Figures

Figure 1: Process specification for the Dehydration Prevention function
Figure 2: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 1
Figure 3: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 2
Figure 4: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 3
Figure 5: Process specification for the Nutritional Assistance Function
Figure 6: Process specification for mobile intercom
Figure 7: Sequence Diagram for a mobile screen connected to the front door - Part 1
Figure 8: Sequence Diagram for a mobile screen connected to the front door - Part 2
Figure 9: Process specification for encouragement for exercising
Figure 10: Sequence Diagram for Exercises - Part 1
Figure 11: Sequence Diagram for Exercises - Part 2
Figure 12: Sequence Diagram for Exercises - Part 3
Figure 13: Sequence Diagram for Exercise Coach - Part 1
Figure 14: Sequence Diagram for Exercise Coach - Part 2
Figure 15: Process specification for video/voice/SMS Outgoing communication
Figure 16: Process specification for video/voice/SMS Incoming communication
Figure 17: Sequence Diagram for Voice/Video/SMS via robot communication with friends and relatives - Part 1
Figure 18: Sequence Diagram for Voice/Video/SMS via robot communication with friends and relatives - Part 2
Figure 19: InteractionManager - constituents and collaborating components
Figure 20: Design recommendation – secondary user main screen – data logs
Figure 21: Design recommendation – secondary user – Nutrition Logs
Figure 22: Design recommendation – secondary user – Hydration Logs
Figure 23: Design recommendation – secondary user – Exercise Logs (Kcals)
Figure 24: Design recommendation – secondary user – Exercise Logs (Heart and respiration)
Figure 25: Design recommendation - Nutrition Settings screen
Figure 26: Design recommendation - Nutrition Settings – Set-up Meal screen
Figure 27: Design recommendation - Hydration Settings screen
Figure 28: Design recommendation - Hydration Settings – Drink set-up screen
Figure 29: Design recommendation - Exercise Settings screen
Figure 30: Adding new meal (and viewing defined meals) screen
Figure 31: Adding new message (and viewing defined messages) screen
Figure 32: Home-automation components
Figure 33: Communications definitions use process, defining the steps in use of the definition language

List of Tables

Table 1: Sub-Functions for F_2
Table 2: Sub-Functions for F_1
Table 3: Sub-Functions for F_14
Table 4: Sub-Functions for F_8
Table 5: Sub-Functions for F_11

Introduction

This report provides a description of the first prototype of the MOBISERV coordination and communication system (D6.1) that is the main outcome of the "WP6: Development of the Information Coordination and Communication support system" activities, from the sixth project month until the release of the first prototype.

The MOBISERV Information Coordination and Communication support system is responsible for:

• coordinating information across the various sensors/ modalities, which are used within MOBISERV for supporting the implementation scenarios identified;

• supporting services targeted to secondary users, which can be accessed remotely;

• providing the communication mechanism among the MOBISERV system components and sub-systems.

WP6 activities were initially informed by the user requirements elicitation activity (T2.3) and the subsequent framework architecture design and specification (WP3). However, the initial specifications of the Information Coordination and Communication support system - prepared as part of Task 3.5 and reported in D3.1 [1] - have been developed in more detail under WP6 (see Chapter 1).

WP6 activities have also been informed by additional user studies conducted as part of the year 2 activities (especially under WP2), so as to cater for the potential needs of all the "categories of" users - primary, secondary, and tertiary. Based on these, the design of the Information Coordination and Communication support system has been refined.

Overall, a set of sequence diagrams has been prepared for each of the five high-level functionalities selected for evaluation, and the components involved - especially those interacting directly with the MOBISERV Information Fusion and Channelling components, i.e. with the InteractionManager - have been described in more detail (see paragraph 1.2.4).

Based on these diagrams, development of the InteractionManager (see Chapter 2) as well as of the MOBISERV communication system (see Chapter 5) took place. WP6 activities also involved the design and development of remotely accessible services targeted to secondary users, based on feedback received from secondary and tertiary users as a result of the additional studies conducted under WP2 within the second project year (see Chapter 3).

Furthermore, as a broader goal of the MOBISERV project is integration with smart home / home automation environments, this WP involved activities concerning the design and development of a collaborative interface between the MOBISERV robotic system (PRU) and the smart home assistive infrastructure (see Chapter 4). The Smartest House of the Netherlands, Eindhoven, The Netherlands (a facility provided by partner SMH) has been used to facilitate the implementation of this interface. This involved the definition and implementation of a simple protocol for communicating information between the PRU and the smart home assistive infrastructure.

Initial development of the MOBISERV Information Coordination and Communication support system has been informed by integration activities conducted under WP7 and updated accordingly.

Finally, an initial security analysis of the MOBISERV system has been conducted (see Chapter 6).

It should be noted that D6.1 (originally planned for July 2011) was delayed by almost four months. The main reason for this delay was that - following the user-centred design approach adopted in MOBISERV - special attention was paid to catering for the potential needs of all the "categories of" users (primary, secondary, and tertiary), as identified in the user studies conducted during Year 1 and Year 2 activities, and to their subsequent transformation into technical and technological developments. The design of the Information Coordination and Communication Support System had to be refined, and its implementation informed accordingly. These necessary development steps proved more time consuming than originally expected. Another important reason was that the development of the Information Coordination and Communication Support System was both providing feedback to and receiving feedback from integration-related activities (conducted under WP7); this link to integration activities has proven tighter than originally expected (and than presented in the DoW).

Glossary

|Term |Explanation |

|AMX |AMX Corporation |

|API |Application Programming Interface |

|CCR |Concurrency and Coordination Runtime |

|DSS |Decentralized Software Services |

|DSSP |Decentralized Software Services Protocol |

|GUI |Graphical User Interface |

|HMI |Human Machine Interface |

|ILAEXP |Independent Living & Ageing & cross-industrial committee of experts |

|KNX |KNX Association, technology and ISO/IEC 14543-3 certified devices |

|MOBISERV |An Integrated Intelligent Home Environment for the Provision of Health, Nutrition and Mobility Services to the |

| |Elderly |

|MRDS |Microsoft Robotics Developer Studio |

|ORU |Optical Recognition Unit |

|PRU |Physical Robotic Unit |

|SHACU |Smart Home Automation and Communication Unit |

|WHSU |Wearable Health Supporting Unit |

|WP |Work package |

|WPF |Windows Presentation Foundation |

Sequence Diagrams and Description of the Components Involved

1 Introduction

The user requirements gathering and analysis phase resulted in a set of scenarios and high-level functions (high-level functionalities) that help define what might be expected from MOBISERV in a given situation. Of the high-level functionalities identified, nine gained a high ranking in the Independent Living & Ageing & cross-industrial committee of experts (ILAEXP) workshop on prioritising MOBISERV requirements, held on September 21, 2010 in Paris, France. Five of them have been selected for evaluation, taking also into consideration the exploitation potential, with a view to maximizing benefits to the target user groups’ needs addressed by the scenarios. These are:

• Function for Reminder and Encouragement to drink (Dehydration Prevention) (F02);

• Function for Reminder and Encouragement to eat (Nutritional Assistance) (F01);

• Function for a mobile intercom for enabling front door entry (F14);

• Function for Encouragement for exercising (F08);

• Function for Voice/Video/SMS via robot communication with friends and relatives (F11);

and they will be supported by the First MOBISERV System Prototype.

In D3.1 these high-level functionalities were initially analysed and modelled to determine what components are needed for the initial MOBISERV system, the tasks of each of these components, the information they need to fulfil those tasks, and the information they can provide to other components. In WP6 they have been analysed in more detail: for each high-level functionality a set of sequence diagrams has been prepared, and the components involved (especially those interacting directly with the MOBISERV Information Fusion and Channelling components, i.e. with the InteractionManager) have been further described.

This set of sequence diagrams has been prepared because sequence diagrams:

• can map the high-level functionalities in step by step detail to define how components collaborate to achieve an application's goals;

• model the flow of logic within a system in a visual manner, enabling both the documentation and validation of this logic (focusing on identifying the behaviour within a system);

• can show a) the messages communicated among the various components while they interact to complete a certain task, and b) the sequential / interaction logic; both are particularly useful for analysing and designing the MOBISERV Coordination and Communication system prototype.

Sequence diagrams have been prepared taking into account:

• The MOBISERV System Requirements Specification, as defined in D2.3 - Volume II [2];

• The MOBISERV Design specifications, as defined in D3.1 [1];

• The MOBISERV HMI specifications for the Physical Robotic Unit [3]: an internal report presenting recommendations for the graphical user interfaces for primary and secondary users, to be implemented for interaction with MOBISERV-Kompaï and for personalisation of the system features and functionality based on individual persons’ physical and cognitive needs. It was based on research with older people and carers.

Sequence diagrams have been consolidated with the help of all MOBISERV technical partners and informed by component integration activities. Prior to presenting them - along with the components collaborating in each functionality - we provide an overview of the respective high-level functionalities (sub-function requirements and diagrams showing the flow of information among the components), taken from the D2.3-Volume II document to facilitate reading.

2 Function for Reminder and Encouragement to drink (Dehydration Prevention) (F02)

This function will serve to prevent dehydration for people who do not drink enough, or who tend to forget to drink.

1 Function Requirements Overview

Sub-functions requirements

|Sub-Function ID |Sub-Functions |Requirement |

|F_2.1 |Ability to turn function ON or OFF |It should be clear to the user and others whether this function is in the ON or OFF mode |

|F_2.2 |Detect lack of drinking activity | |

|F_2.3 |Issue a reminder to drink, together with an eating reminder | |

|F_2.4 |Detect drinking activity | |

|F_2.5 |Issue a reminder to drink, when detecting an eating activity | |

|F_2.6 |Detect and log the user’s response to the reminder | |

|F_2.7 |Keep a list of drinks, and preferences of the user | |

|F_2.8 |Issue a highly persuasive encouragement to drink after the person has not responded to a reminder |Voice / on screen; persuade for instance by showing favourite drinks |

Table 1: Sub-Functions for F_2

Process Specification


Figure 1: Process specification for the Dehydration Prevention function

2 Introduction to major components involved

The major components involved in this function are introduced below:

• NutritionAgenda: This component retrieves information (stored in a database, configuration file or other source) concerning eating & drinking events (i.e., nutrition-related events) and initiates reminder or encouragement-to-eat/drink interactions on the InteractionManager (based on information stored in the database). It logs relevant information (e.g. missed meals).

It corresponds to the "ReminderAndEncouragement" component identified in D3.1 [1] diagrams.

• NutritionActivityDetection: It corresponds to the EatingDetection and DrinkingDetection components identified in D3.1. These two components have been merged into one, since they both rely on the same infrastructure and share the same computer vision algorithms. These algorithms are programmed in unmanaged C++ code; a managed C++ wrapper (.dll) has been implemented in order to integrate them into Kompai-RobuBOX.

• NutritionActivity: MRDS service providing an interface to the NutritionActivityDetection component.

• InteractionManager: This component is responsible for coordinating information across the various sensors/modalities, which are used within MOBISERV for supporting the implementation scenarios identified.

It is merged with the "ApplicationManager" component identified in D3.1.

• GUI(Nutrition): (formerly TabletPCUI) the Graphical User Interface provided by the robot, specifically the part related to F01 and F02.

• SpeechRecognition: Speech input recognition component, on the robot.

• SpeechSynthesis: Text to speech voice generator, on the robot.

• LocaliseOlderPerson: It detects the position and, if possible, the orientation of the older person in his/her apartment.

• GoNextToOlderPerson: It generates a path for the robot to go dynamically next to the older person.

Note: The SpeechHMI component identified in D3.1's diagrams has now been split into the following two components: a) SpeechRecognition and b) SpeechSynthesis.

3 Sequence Diagrams


Figure 2: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 1


Figure 3: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 2


Figure 4: Sequence Diagram for Reminder & Encouragement to eat or drink - Part 3

4 Components Description

1 NutritionAgenda

Description: The component retrieves information (stored in a database, configuration file or other source) concerning eating & drinking events (i.e., nutrition-related events). Each event may have the following properties:

• Kind: EATING | DRINKING

• Title: e.g. Breakfast | Lunch | Dinner …

• StartDateTime: Start date and/or time of the event, e.g. 08:00 or 28-06-2011 08:00 (for breakfast)

• EndDateTime: End date and/or time of the event, e.g. 10:00 or 28-06-2011 10:00 (for breakfast)

In addition to the above, the component may internally handle the following information for each "Nutrition" event:

• EventActivated: (True | False) It provides an indication whether the event has been activated. For example, when the time reaches 08:00, the event needs to be activated.

• ReminderEnabled: (True | False) It provides an indication whether at least one reminder has been issued concerning the event (see the process specification diagram in 1.2.1).

• ReminderTime: (Next) Time a reminder will be issued for that event.

• ReminderCounter: (integer counter) How many times a reminder has been issued.

• EncouragementEnabled: (True | False) It provides an indication whether an encouragement has been issued concerning the event (see the process specification diagram in 1.2.1)

• EncouragementTime: (Next) Time an encouragement should take place for that event.

• EncouragmentCounter: (integer counter) how many times the user has been encouraged to do what the event says.

Furthermore, the component may handle information (stored in a database or configuration file) about the text, image, video that will be delivered to the older person when a reminder or encouragement is issued.

When a reminder or encouragement is issued, the component sends a message to the InteractionManager. The InteractionManager informs the component about the user's input (active or passive).

The component also logs relevant information (e.g. missed meals, etc).
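The event record and its internal bookkeeping fields described above can be summarised in code. The actual component is a C# MRDS service; the following Python dataclass is only an illustrative sketch, and all names (NutritionEvent, activate_if_due, etc.) are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Optional

class EventKind(Enum):
    EATING = "EATING"
    DRINKING = "DRINKING"

@dataclass
class NutritionEvent:
    # Properties stored in the agenda (set by secondary users).
    kind: EventKind
    title: str                 # e.g. "Breakfast"
    start: datetime            # StartDateTime, e.g. 28-06-2011 08:00
    end: datetime              # EndDateTime, e.g. 28-06-2011 10:00
    # Internal bookkeeping handled by the NutritionAgenda component.
    event_activated: bool = False
    reminder_enabled: bool = False
    reminder_time: Optional[datetime] = None
    reminder_counter: int = 0
    encouragement_enabled: bool = False
    encouragement_time: Optional[datetime] = None
    encouragement_counter: int = 0

    def activate_if_due(self, now: datetime) -> bool:
        """Set EventActivated once the event's start time has been reached."""
        if not self.event_activated and self.start <= now <= self.end:
            self.event_activated = True
        return self.event_activated
```

The activation check corresponds to the EventActivated flag being set when the current time enters the event window (e.g. at 08:00 for breakfast).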

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GetAgendaEvents() |

| |//Requests the list of NutritionEvents. |

| |NutritionEventActivated(Event) |

| |//Notifies that the "Event" has been activated. |

| |//Parameter: Event |

| |EventReached(Event) |

| |//Notifies that the "Event" has been reached. |

| |//Parameter: Event |

| |UpdateEvent(Event) |

| |//Notifies to update the event "Event". |

| |//Parameter: Event |

| |NotifyOlderPersonNotFound() |

| |//Notifies that the older person has not been found in his / her apartment. |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |AgendaEvents |

| |//It sends the list of all the "Nutrition" events. |

| |InitiateNutritionActivity() |

| |//It sends a request to the InteractionManager to initiate the NutritionActivity detection |

| |service. Cameras are on only in specific time slots and not all day, for privacy |

| |reasons. |

| |NutritionEventReminder(Event, Text, Voice, Message, Image, Video) |

| |// It sends a request to the InteractionManager to issue a reminder. |

| |// Parameters: Event, Text, Voice, Message, Image, Video |

| |(The values of the Text, Voice, Message, Image, Video have been set by secondary users.) |

| |Text: True | False |

| |Voice: True | False |

| |Message: String |

| |Image: String (file path) // If Image==NULL no image will be displayed |

| |Video: String (file path) // if Video==NULL no video will be displayed |

| |NutritionEventEncouragement(Event, Text, Voice, Message, Image, Video) |

| |// It sends a request to the InteractionManager to issue an encouragement. |

| |// Parameters: Event, Text, Voice, Message, Image, Video |

| |(The values of the Text, Voice, Message, Image, Video have been set by secondary users.) |

| |TerminateNutritionActivity() |

| |//It sends a request to the InteractionManager to terminate the NutritionActivity detection|

| |service. Cameras will be switched off. |

Settings data (with types):

• NutritionAssistance: True | False // It is set by a secondary user.

• DehydrationPrevention: True | False // It is set by a secondary user.

• List of Nutrition Events: As said before, each event may have the following properties: // As set by a secondary user:

o Kind: EATING | DRINKING

o Title: e.g. Breakfast | Lunch | Dinner …

o StartTime: Start time of the event, e.g. 08:00 (for breakfast)

o EndTime: End time of the event, e.g. 10:00 (for breakfast)

o Recurrence

• ReminderOnScreenText: True | False

• ReminderVoice: True | False

• ReminderImages: True | False

• ReminderVideo: True | False

• ReminderMessages: Array/List of Strings

• ReminderImages: Array/List of strings [file paths to images]

• ReminderVideos: Array/List of strings [file paths to videos]

• EncouragmentOnScreenText: True | False

• EncouragmentVoice: True | False

• EncouragmentImages: True | False

• EncouragmentVideo: True | False

• EncouragmentMessages: Array/List of strings

• EncouragmentImages: Array/List of strings [file paths to images]

• EncouragmentVideos: Array/List of strings [file paths to videos]

Other settings:

• Time from the moment the start time of an event is reached (and cameras are subsequently switched on) until the system checks whether the user has performed the corresponding activity (e.g. has eaten). [For example, if breakfast starts at 08:00, the cameras are switched on at 08:00; after XX minutes the system would check whether the older person has eaten. We need to set this XX period.]

• MaxReminderTime: From the process specification diagram, one understands that a reminder will be issued only once. If the person does not respond, we need to re-issue the reminder. We need to define the maximum number of times a reminder will be issued before going into the "encouragement phase".

• MaxEncouragementTime: Maximum number of times an encouragement will be issued before exiting this phase (and thus reporting / logging that the event has not been reached).
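The reminder/encouragement escalation implied by the settings above can be sketched as a small decision function. This is an illustrative Python sketch, not the project's C# implementation; the function name and the string results are assumptions made for the example.

```python
def next_action(reminder_counter: int, encouragement_counter: int,
                max_reminders: int, max_encouragements: int) -> str:
    """Decide the next step for an activated, not-yet-reached event.

    Mirrors the escalation described above: re-issue reminders up to the
    maximum, then encouragements up to the maximum, then log the event
    as missed.
    """
    if reminder_counter < max_reminders:
        return "REMINDER"        # issue NutritionEventReminder(...)
    if encouragement_counter < max_encouragements:
        return "ENCOURAGEMENT"   # issue NutritionEventEncouragement(...)
    return "LOG_MISSED"          # UpdateEvent(...): event not reached
```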

Logs (with types) and how to display them:

Logged missed nutrition events. The system would display (in a calendar daily view) the nutrition events set for the older person (type, start time, end time, etc), along with a short description or indication whether the older person has done this.

2 NutritionActivityDetection

Description: It determines whether an eating or drinking activity has occurred in a specific time interval. It replies with a yes or no, together with a confidence level.

Development environment: These algorithms are programmed in unmanaged C++ code; a managed C++ wrapper (.dll) has been implemented in order to be integrated in Kompai-RobuBOX.

Resources (devices): Cameras and PC in the user's apartment (web camera of the tablet PC)

Inputs (with types):

Video stream for eating/drinking situation

|Messages Received From |Message |

|Cameras in the user's apartment |Video stream from eating situation |

|NutritionActivity |DetectActivity(EventKind, StartTime) |

| |//EventKind: EATING | DRINKING |

| |//StartTime: Start time to detect eating or drinking activity. |

| |//Actually the NutritionActivity asks the NutritionActivityDetection if an EATING or |

| |DRINKING activity has been detected in the time period [StartTime, now]. |

|NutritionActivity |Init() |

| |//NutritionActivity asks the NutritionActivityDetection module to start monitoring for |

| |EATING or DRINKING activities. This function will start the camera and perform |

| |Eating/Drinking activity detection and recognition. |

|NutritionActivity |Terminate() |

| |//NutritionActivity asks the NutritionActivityDetection module to stop monitoring for |

| |EATING or DRINKING activities. This function will switch-off the camera and stop performing|

| |Eating/Drinking activity detection and recognition. |

Outputs (with types):

Video stream from eating situation

|Messages Sent To |Message |

|NutritionActivity |ActivityDetectionResult(EventResult, EventConfidence) |

| |//EventResult: TRUE | FALSE |

| |//EventConfidence: float |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A
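The Init / DetectActivity / Terminate contract described above can be sketched as follows. The real module consists of unmanaged C++ computer-vision algorithms behind a managed wrapper; this Python class only illustrates the message contract, with detection stubbed out, and its names are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ActivityDetectionResult:
    event_result: bool       # TRUE | FALSE
    event_confidence: float  # confidence level

class NutritionActivityDetection:
    """Sketch of the detection module's interface (hypothetical names)."""

    def __init__(self):
        self._running = False
        self._observations = []  # (timestamp, kind) pairs from the stub

    def init(self):
        # Init(): start the camera and begin monitoring.
        self._running = True

    def detect_activity(self, event_kind: str, start_time: datetime,
                        now: datetime) -> ActivityDetectionResult:
        # DetectActivity(EventKind, StartTime): was an EATING/DRINKING
        # activity seen in the interval [StartTime, now]?
        hits = [t for (t, k) in self._observations
                if k == event_kind and start_time <= t <= now]
        return ActivityDetectionResult(bool(hits), 1.0 if hits else 0.0)

    def terminate(self):
        # Terminate(): switch the camera off and stop monitoring.
        self._running = False
```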

3 NutritionActivity

Description: MRDS service providing an interface to the NutritionActivityDetection component.

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |InitiateNutritionActivity() |

| |//Starts the service in run-time, on demand. |

| |DetectActivity(Event) |

| |//Asks if a certain event has been detected. |

| |TerminateNutritionActivity() |

| |//Terminates the service in run-time, on demand. |

|NutritionActivityDetection |ActivityDetectionResult(EventResult, EventConfidence) |

| |//EventResult: TRUE | FALSE |

| |//EventConfidence: float |

Notes:

In the "DetectActivity(Event)" message, the Event parameter is a type, which holds (among others) information about the start time of the event.

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |Acknowledgment whether the service has been started or terminated. |

| |ActivityDetectionResult(result, confidence) |

| |//Result: TRUE | FALSE |

| |//Confidence: float |

| |//Relays the message received from NutritionActivityDetection, whether a certain eating or |

| |drinking activity has been detected and what has been the confidence. |

|NutritionActivityDetection |Init() |

| |//NutritionActivity asks the NutritionActivityDetection module to start monitoring for |

| |EATING or DRINKING activities. This function will start the camera and perform |

| |Eating/Drinking activity detection and recognition. |

|NutritionActivityDetection |Terminate() |

| |//NutritionActivity asks the NutritionActivityDetection module to stop monitoring for |

| |EATING or DRINKING activities. This function will switch-off the camera and stop performing|

| |Eating/Drinking activity detection and recognition. |

|NutritionActivityDetection |DetectActivity(EventKind, StartTime) |

| |//EventKind: EATING | DRINKING |

| |//StartTime: Start time to detect eating or drinking activity |

| |//Actually the NutritionActivity asks the NutritionActivityDetection if an EATING or |

| |DRINKING activity has been detected in the time period [StartTime, now]. |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A
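Since the NutritionActivity service adds no detection logic of its own, it can be sketched as a thin relay between the InteractionManager and the detection module. Again, this is an illustrative Python sketch (the real service is a C#/MRDS service) and all names here are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Callable

@dataclass
class DetectionResult:
    result: bool
    confidence: float

class NutritionActivityRelay:
    """Sketch of the NutritionActivity relay service.

    Requests from the InteractionManager are forwarded to the detection
    module, and ActivityDetectionResult messages are relayed back unchanged.
    """

    def __init__(self, detect: Callable[..., DetectionResult],
                 send_to_im: Callable[[str, Any], None]):
        self._detect = detect
        self._send = send_to_im

    def initiate_nutrition_activity(self):
        self._send("ack", "started")

    def detect_activity(self, event_kind: str, start: datetime, now: datetime):
        r = self._detect(event_kind, start, now)
        # Relay ActivityDetectionResult(result, confidence) upward.
        self._send("ActivityDetectionResult", (r.result, r.confidence))

    def terminate_nutrition_activity(self):
        self._send("ack", "terminated")
```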

4 InteractionManager

Description: The InteractionManager handles all events that the other components generate. The resulting information is then used for accessing and updating the appropriate application data, and is channelled to the appropriate output modalities.

Development environment: C# (MRDS service) (extended version of the "Control" service in Kompai-RobuBOX)

Resources (devices): TabletPC (with Kompai-RobuBOX)

Inputs (with types):

[Active input]

• Voice commands issued by the older person (e.g. an answer to the question whether the older person has eaten);

• Tactile commands issued by the older person through the robot's touch screen (GUI) (e.g. when the older person presses the "Yes" button in the GUI).

[Passive input]

• Events/Data coming from the NutritionAgenda;

• NutritionActivityDetection results.

In particular:

|Messages Received From |Message |

|NutritionAgenda |AgendaEvents |

| |//List of nutrition events |

| |InitiateNutritionActivity() |

| |//Request to initiate the service "NutritionActivity" |

| |NutritionEventReminder(Event, Text, Voice, Message, Image, Video) |

| |//Event as set by the secondary user |

| |//Text: True | False |

| |//Voice: True | False |

| |//Message: message set by the secondary user. It will be delivered through the GUI and/or |

| |SpeechSynthesis based on the values of "Text" and "Voice" |

| |//Image: path to an image file |

| |//Video: path to a video file |

| |NutritionEventEncouragement(Event, Text, Voice, Message, Image, Video) |

| |// similar to the above |

|LocaliseOlderPerson |FindOlderPersonResult(location) |

| |//acknowledgment that the older person has been found and/or location of the older person |

| |in his/her apartment |

|NutritionActivity |Acknowledgement of initiation or termination of the NutritionActivity component |

| |(MRDS service) |

|NutritionActivity |ActivityDetectionResult(EventResult, EventConfidence) |

| |//EventResult: TRUE | FALSE |

| |//EventConfidence: float |

|GUI |GUIIsNutritionEventReached() |

| |//Event received from the GUI. It means that the older person has been asked whether he/she|

| |has eaten/drunk and he/she answered yes through the GUI (by pressing a button). |

|SpeechRecognition |SpeechIsNutritionEventReached() |

| |//Event received from the SpeechRecognition component. It means that the older person has |

| |been asked whether he/she has eaten/drunk and he/she answered yes through speech. |

Outputs (with types):

• Information channelled to the SpeechSynthesis component;

• Information channelled to the GUI component;

• Commands issued to the LocaliseOlderPerson component;

• Commands issued to the GoNextToOlderPerson component;

• Pass events, access or update applications / application components (e.g. NutritionAgenda and NutritionActivity).

|Messages Sent To |Message |

|NutritionAgenda |GetAgendaEvents() |

| |// Requests the list of the nutrition events. |

| |EventReached(Event) |

| |//Informs the NutritionAgenda that the event has been reached (the user has eaten/drunk). |

| |UpdateEvent(Event) |

| |//Sends a command to the NutritionAgenda to update the event (the event has not been |

| |reached). |

| |NotifyOlderPersonNotFound() |

| |//Notifies the NutritionAgenda that the older person is probably not in his/her apartment. |

|NutritionActivity |InitiateNutritionActivity() |

| |//Starts the respective MRDS service on demand. |

| |DetectActivity(event) |

| |//Requests detection of the specified eating/drinking activity. |

|LocaliseOlderPerson |FindOlderPerson() |

| |//Sends a command to locate an older person in his/her apartment. |

|GoNextToOlderPerson |GoNextToOlderPerson(location) |

| |// Generates a path for the robot to go dynamically next to the older person. |

|GUI |NotificationToGUI(NotificationType) |

| |//NotificationType: NUTRITION_DETECTION_INITIATED |

| |//Notifies the user - through the GUI - that the cameras have been switched on. |

| |NotificationToGUI(Event, Message, Image, Video) |

| |//Event as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered through the GUI. |

| |//Image: path to an image file |

| |//Video: path to a video file |

|SpeechSynthesis |SendToTextToSpeech("NUTRITION_DETECTION_INITIATED") |

| |//Notifies the user - through speech - that the cameras have been switched on. |

| |TextToSpeechNotification(Event, Message) |

| |//Event as set by a secondary user |

| |//Message: message set by a secondary user. It will be delivered by voice. |
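How the InteractionManager might turn an incoming NutritionEventReminder into the GUI and SpeechSynthesis messages listed above can be sketched as follows. This is an illustrative Python sketch; the real component is a C# MRDS service, and the callback parameters are assumptions.

```python
def route_nutrition_reminder(event, text, voice, message, image, video,
                             notification_to_gui, text_to_speech_notification):
    """Deliver the secondary user's reminder through the enabled modalities.

    `text` and `voice` are the TRUE|FALSE flags of NutritionEventReminder;
    the two callables stand in for the GUI and SpeechSynthesis services.
    """
    delivered = []
    if text:
        # Message plus optional media goes to the robot's touch screen.
        notification_to_gui(event, message, image, video)
        delivered.append("GUI")
    if voice:
        # The same message is spoken by the SpeechSynthesis component.
        text_to_speech_notification(event, message)
        delivered.append("Speech")
    return delivered
```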

Settings data (with types):

Information about the background functions enabled or disabled

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about main menu functions enabled or disabled

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]
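The settings listed above could be grouped as simple typed records, for instance as below. This is an illustrative Python sketch; the field names paraphrase the list above, and the storage format is an assumption.

```python
from dataclasses import dataclass

@dataclass
class BackgroundFunctionSettings:
    # TRUE | FALSE flags for the background functions
    nutrition_assistance: bool = True
    dehydration_prevention: bool = True
    medical_reminders: bool = True
    exercising_coach: bool = True
    panic_responder: bool = True

@dataclass
class MainMenuFunctionSettings:
    # TRUE | FALSE flag plus icon file path per main menu function
    audio_video_contact: bool = True
    audio_video_contact_icon: str = ""
    games: bool = True
    games_icon: str = ""
    exercises: bool = True
    exercises_icon: str = ""
    self_check: bool = True
    self_check_icon: str = ""
    home_control: bool = True
    home_control_icon: str = ""
    robot_control: bool = True
    robot_control_icon: str = ""
```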

Logs (with types) and how to display them:

N/A

5 GUI(Nutrition)

Description: It displays information about reminders or encouragements related to nutrition assistance and dehydration prevention.

Development environment: This component is being implemented in C# (as an MRDS service) where all the graphics elements of the Human Machine Interface (HMI) are bundled. It has a single Windows Presentation Foundation (WPF) window where the specific graphics controls of each application can be shown.

Resources (devices): TabletPC on the robot (roboBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |NotificationToGUI(NotificationType) |

| |//NotificationType: NUTRITION_DETECTION_INITIATED |

| |//Notifies the user - through the GUI - that the cameras have been switched on. |

| |NotificationToGUI(Event, Message, Image, Video) |

| |//Event as set by a secondary user |

| |//Message: message set by a secondary user. It will be delivered through the GUI. |

| |//Image: path to an image file |

| |//Video: path to a video file |

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |GUIIsNutritionEventReached() |

| |//Event received from the GUI. It means that the older person has been asked whether he/she|

| |has eaten/drunk and he/she answered yes through the GUI (by pressing a button). |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

6 SpeechRecognition

Description: Speech input recognition component, on the robot.

Development environment: It is a basic component of Microsoft Windows, integrated within MRDS and Kompai-RobuBOX Software. It can be used to recognise any sentence.

Resources (devices): Tablet PC (RobuBOX) and microphone

Inputs (with types):

-

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |SpeechIsNutritionEventReached() |

| |//Event received from the SpeechRecognition component. It means that the older person has |

| |been asked whether he/she has eaten/ drunk and he/she answered yes through speech. |

Settings data (with types):

Dialogue commands (yes, no) to recognise

Logs (with types) and how to display them:

N/A

7 SpeechSynthesis

Description: Text to speech voice generator, on the robot

Development environment: C# (MRDS service)

Resources (devices): TabletPC, robubox and speakers

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |SendToTextToSpeech("NUTRITION_DETECTION_INITIATED") |

| |//Notifies the user - through speech - that the cameras have been switched on. |

| |TextToSpeechNotification(Event, Message) |

| |//Event as set by a secondary user |

| |//Message: message set by a secondary user. It will be delivered by voice. |

Outputs (with types):

-

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

8 LocaliseOlderPerson

Description: This component is responsible for locating the older person in his/her apartment.

Development environment: C# (MRDS Service)

Resources (devices): Motion detection sensors in the smart home, Tablet PC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |FindOlderPerson() |

| |// command to locate an older person in his apartment |

|Sensors on the robot |Odometers, Laser |

|Sensors in the smart home |Motion detection sensors |

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |FindOlderPersonResult(location) |

| |//Acknowledgment that the older person has been found and/or location of the older person |

| |in his/her apartment. |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

9 GoNextToOlderPerson

Description: It generates a path for the robot to go dynamically next to the older person.

Development environment: C# (MRDS Service)

Resources (devices): robot's sensors (odometers, laser), Tablet PC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GoNextToOlderPerson(location) |

| |// Generates a path for the robot to go dynamically next to the older person. |

|Sensors on the robot |Odometers, Laser |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

3 Function for reminder and encouragement to eat (Nutritional Assistance) (F01)

As mentioned in [2], this function serves to prevent malnutrition in people who tend to forget to eat, do not feel like eating, or tend to skip meals.

1 Function Requirements Overview

Sub-functions requirements

|Sub-Function ID |Sub-Functions |Requirement |

|F_1.1 |Ability to turn function ON or OFF |It should be clear to the user and others whether this function is |

| | |in the ON or OFF mode. |

|F_1.2 |Setup meal timeslots and periods of eating for each|Touch screen interface for carer or user |

| |meal | |

|F_1.3 |Detect a missed meal. |Camera monitoring status should be visible to the user. |

|F_1.4 |Locate the person. |Option of actions to take if the person can’t be located within a |

| | |pre-set time limit. |

|F_1.5 |Issue a missed meal reminder to the person. |Voice or melody or screen (Allowed for selection by user or carer) |

|F_1.6 |Detect an acknowledgement of meal reminder by the |If the reminder has not been acknowledged within 10 minutes (for |

| |user. |example), reissue x number of times, after which take a |

| | |pre-determined course of action. |

|F_1.7 |Detect the action by the user in response to the |Request verbal response if no action is identified. |

| |reminder. | |

|F_1.8 |Issue a highly persuasive encouragement to eat to |Voice / on screen, persuade for instance by showing favourite food, |

| |the person, after not responding to a reminder. |or available food. |

|F_1.9 |Log a missed meal. |Log date and time. Take pre-set action when a pre-set number of |

| | |meals are missed. |

Table 2: Sub-Functions for F_1

Process Specification

[pic]

Figure 5: Process specification for the Nutritional Assistance Function

2 Introduction to major components involved

F01 and F02 share the same components. These were introduced in 1.2.2.

3 Sequence Diagrams

Sequence diagrams for F01 are the same as for F02.

4 Components Description

F01 and F02 share the same components. For more information about these components, please see paragraph 1.2.4.

4 Function for a mobile intercom for enabling front door entry (F14)

This function will help people with mobility problems increase control over their safety.

1 Function Requirements Overview

Sub-functions requirements

|Sub-Function ID |Sub-Functions |Requirement |

|F_14.1 |Ability to turn function ON or OFF |It should be clear to the user and others whether|

| | |this function is in the ON or OFF mode |

|F_14.2 |Recognise the door-bell |Either by direct integration with the door-bell |

| | |system or sound recognition. |

|F_14.3 |Locate and move to the user | |

|F_14.4 |Enable the user to select mode of communication, voice or video | |

|F_14.5 |Start link | |

|F_14.6 |Ability for the user to initiate desired action, open an electronic | |

| |lock, call for help | |

Table 3: Sub-Functions for F_14

Process Specification

[pic]

Figure 6: Process specification for mobile intercom

2 Introduction to major components involved

• Visitors: MRDS service providing an interface to the smart home automation infrastructure (particularly to devices / sensors such as Doorbell, camera connected to the front door, microphone, loudspeakers, etc).

• Door: an external set of devices on the door, which can be used remotely.

• Skype: Skype API (integrated into Kompai-RobuBOX). It is used to enable an older person to make a call to a remote party, calling for help.

• InteractionManager: This component is responsible for coordinating information across the various sensors/modalities, which are used within MOBISERV for supporting the implementation scenarios identified. It is merged with the "ApplicationManager" component identified in D3.1.

• GUI(Visitor): (former TabletPCUI) Graphical User Interface provided by the robot, the one related to F14.

• SpeechRecognition: Speech input recognition component, on the robot.

• SpeechSynthesis: Text to speech voice generator, on the robot.

• LocaliseOlderPerson: This component is responsible for locating the older person in his/her apartment.

• GoNextToOlderPerson: It generates a path for the robot to go dynamically next to the older person.

Notes:

• The SpeechHMI component identified in D3.1's diagram has now been split into the following two components: a) SpeechRecognition and b) SpeechSynthesis.

• The TabletHMI component identified in D3.1's diagram has been renamed to GUI(Visitors).

• The SpeechMicrophone component identified in D3.1's diagram does not appear in the sequence diagram, though it is still part of the robot (input device).

• The Communicator component identified in D3.1's diagram does not appear in the sequence diagram, though it is still needed to establish the video and audio communication link between the Door and the robot.

3 Sequence Diagrams

[pic]

Figure 7: Sequence Diagram for a mobile screen connected to the front door - Part1

[pic]

Figure 8: Sequence Diagram for a mobile screen connected to the front door - Part2

A visitor presses the door bell => Door bell is ringing (Handling Incoming Door Calls)

1. The Visitors service, which provides an interface to the smart home automation (particularly to devices / sensors such as Doorbell, camera connected to the front door, microphone, loudspeakers, etc), sends a notification to the InteractionManager:

IncomingDoorCallNotification(notification, link)

//link: Link to be established between the robot and the Door for displaying video/ audio.

2. The InteractionManager sends a message to the LocaliseOlderPerson service to locate the older person in his/her apartment:

FindOlderPerson()

3. The LocaliseOlderPerson service replies to the InteractionManager:

FindOlderPersonResult(location)

This reply can be either the location of the older person or null.

Obviously, the InteractionManager shouldn't wait indefinitely for the LocaliseOlderPerson to respond. If the LocaliseOlderPerson service cannot locate the older person within a pre-specified time interval (set as part of the robot's configuration, e.g. 5 minutes), its response would be FindOlderPersonResult(location) with location = null.
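The bounded wait described above can be sketched like this. This is an illustrative Python sketch; the actual services are C# MRDS components, and the queue-based hand-off is an assumption.

```python
import queue

def wait_for_find_older_person_result(result_queue, timeout_s=300.0):
    """Wait for FindOlderPersonResult for at most `timeout_s` seconds
    (e.g. 300 s for the 5-minute example); a timeout is treated exactly
    like an explicit FindOlderPersonResult with location = null."""
    try:
        return result_queue.get(timeout=timeout_s)
    except queue.Empty:
        return None  # older person not located within the time limit
```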

If (location == null)

The InteractionManager notifies the Visitors service that the person was not found.

NotifyOlderPersonNotFound()

The process is finished.

If (location !=null)

The InteractionManager commands the GoNextToOlderPerson to move the robot to the specific location.

GoNextToOlderPerson(location)

[For the rest of this part of the sequence diagram, we assume that the older person has been located]

4. As soon as the InteractionManager receives a notification that the robot has reached the older person, it sends a notification to the GUI and SpeechSynthesis services:

NotificationToGUI (NotificationToGUIType)

[Where: NotificationToGUIType = INCOMING_DOOR_CALL]

TextToSpeechNotification(Notification)

This means that a) a specific GUI will be presented to the older person, through which he/she may select the desired mode of communication and start a link (e.g. buttons: Answer, Open the Door, Call for Help, etc), and b) the robot will tell the older person that there is a visitor at the front door.

Also, video from the door's camera is displayed in the robot's GUI (the robot needs to establish a link to the "door communication system" or to the gateway (in the smart home) that handles this).

The older person selects to talk to a visitor (Answering to the Door Call)

1. When the older person selects to answer the call (and talk to the visitor), she/he needs either to press a button in the GUI or issue a spoken command. Then, the respective service (GUI or SpeechRecognition) sends a message to the InteractionManager.

VisitorsRequest(RequestType) //RequestType= ANSWER_DOOR_CALL

5. The InteractionManager sends a message to the Visitors service to answer the door call.

AnswerDoorCall()

6. The Visitors service relays the message to the Door.

AnswerDoorCall()

7. When the Visitors service responds to the InteractionManager, a notification is sent to the GUI that the door call has been answered.

NotificationToGUI (NotificationToGUIType)

//NotificationToGUIType = DOOR_CALL_ANSWERED

Also, video as well as audio from the front door is available through the robot.

The older person selects to hang-up the call (stop audio/video communication with the visitor)

1. When the older person selects to Hang-Up the call (and stop talking to the visitor), she/he needs to press a button in the GUI. Then, the GUI sends a message to the InteractionManager.

VisitorsRequest(RequestType) //RequestType= HANG_UP_DOOR_CALL

8. The InteractionManager sends a message to the Visitors service to hang up the door call.

HangUpDoorCall()

9. The Visitors service relays the message to the Door.

HangUpDoorCall()

10. When the Visitors service responds to the InteractionManager, a notification is sent to the GUI that the door call has been hung up.

NotificationToGUI (NotificationToGUIType)

// NotificationToGUIType= DOOR_CALL_HANGEDUP

The established link with the front door is stopped.

The older person selects to open the door (Opening the door)

1. When the older person selects to open the door (by pressing the respective button in GUI), the GUI service sends a message to the InteractionManager.

VisitorsRequest(RequestType) // RequestType= OPEN_DOOR

11. The InteractionManager relays the message to the Visitors service (waiting for a response).

OpenDoorNotification()

12. When a response is received, the InteractionManager sends a notification to the GUI (to dismiss the "Visitor dialogue window") and to the SpeechSynthesis service (to inform the user that the door has opened).

NotificationToGUI (NotificationToGUIType)

// NotificationToGUIType=DOOR_OPENED

SendToTextToSpeech("DOOR_OPENED")

13. If the older person has previously answered the call, the InteractionManager sends a request to the Visitors service to hang up the door call.

HangUpDoorCall()
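Steps 11-13 above can be sketched as a single handler. This is an illustrative Python sketch; the service proxies and method names are assumptions standing in for the C# MRDS messages.

```python
def handle_open_door_request(visitors, gui, speech, call_answered):
    """Handle the OPEN_DOOR request end to end (steps 11-13)."""
    actions = []
    visitors.open_door_notification()        # 11: relay to the Visitors service
    actions.append("OpenDoorNotification")
    gui.notify("DOOR_OPENED")                # 12: update the GUI
    speech.say("DOOR_OPENED")                #     and inform the user by voice
    actions += ["NotificationToGUI", "SendToTextToSpeech"]
    if call_answered:                        # 13: release the audio/video link
        visitors.hang_up_door_call()
        actions.append("HangUpDoorCall")
    return actions
```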

The older person calls for help (=> A Skype CALL to a pre-specified person is initiated)

1. When the older person selects to call for help (by pressing the respective button in the GUI), the GUI service sends a message to the InteractionManager.

VisitorsRequest(RequestType) // RequestType= CALL_FOR_HELP

14. If the older person has previously answered the call, the InteractionManager sends a request to the Visitors service to hang up the door call.

HangUpDoorCall()

15. The InteractionManager sends a message to the Skype service to initiate a call to a pre-specified person. In parallel, it sends a "START_SKYPE_VIDEOCONFERENCE" notification to the GUI (=> the respective Skype GUI is displayed).

PlaceCall(ContactID)

NotificationToGUI (NotificationToGUIType, PartnerName)

// NotificationToGUIType=START_SKYPE_VIDEOCONFERENCE

Note: From this point forward, please see paragraph "1.6 Function for Voice/Video/SMS via robot communication with friends and relatives (F11)".

The older person selects to see the missing door calls (those not answered at all)

1. When the older person selects to view the missing calls (e.g. through a selection in the main menu or by issuing a spoken command), the GUI or SpeechRecognition service, respectively, sends a message to the InteractionManager.

GetMissingDoorCalls()

16. Subsequently, the InteractionManager sends a request to the Visitor service:

GetMissingDoorCalls()

17. As soon as the InteractionManager receives information about the missing calls (MissingDoorCallsInfo), it forwards this to the GUI:

MissingDoorCallsInfo

The GUI service will be then responsible to present the information to the older person.

Note: The MissingDoorCallsInfo might be a list holding, for each call:

o Call No

o Calling Time

o Video / Photo (link)
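One possible shape for a MissingDoorCallsInfo entry is sketched below in Python; the field names and types are assumptions, since the document leaves them open.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MissingDoorCall:
    call_no: int                      # Call No
    calling_time: str                 # Calling Time, e.g. an ISO 8601 timestamp
    media_link: Optional[str] = None  # link to the video/photo taken at the door
```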

The older person selects to clear the list of the missing calls

1. The older person may select to clear the list of the missing door calls (e.g. through a button in the GUI or by issuing a spoken command). The GUI or SpeechRecognition service, respectively, sends a message to the InteractionManager.

VisitorsRequest(RequestType) // RequestType= CLEAR_MISSING_DOOR_CALLS

18. Subsequently, the InteractionManager sends a request to the GUI and the SpeechSynthesis services, asking the user to confirm:

NotificationToGUI (NotificationToGUIType)

// NotificationToGUIType= SHOW_CONFIRMATION_WINDOW

SendToTextToSpeech("CONFIRM_CLEAR_MISSINGCALLS_LIST")

19. As soon as the InteractionManager receives a positive confirmation, it sends a message to the Visitors service to clear the list. In parallel, it sends notifications to the GUI and SpeechSynthesis services to inform the older person accordingly:

ClearMissingCallsList()

NotificationToGUI (NotificationToGUIType)

// NotificationToGUIType= UPDATE_MISSINGCALLS_LIST

SendToTextToSpeech("MISSINGCALLS_LIST_CLEARED")

4 Components Description

1 Visitors

Description: MRDS service providing an interface to the smart home automation infrastructure (particularly to devices / sensors such as Doorbell, camera connected to the front door, microphone, loudspeakers, etc)

Development environment: C# (MRDS Service)

Resources (devices): TabletPC (with MRDS installed) or SHACU (deployment decision not yet taken)

Inputs (with types):

|Messages Received From |Message |

|Door |IncomingDoorCallNotification() |

| |//Event sent to the Visitors component when a visitor presses the door bell => Door bell |

| |is ringing (Handling Incoming Door Calls) |

|InteractionManager |NotifyOlderPersonNotFound() |

| |//The InteractionManager notifies the Visitors service that the person was not found. |

| |AnswerDoorCall() |

| |//The InteractionManager sends message to the Visitor service to answer the door call. |

| |HangUpDoorCall() |

| |//The InteractionManager sends message to the Visitor service to hang up the door call. |

| |OpenDoorNotification() |

| |//The InteractionManager sends a message to the Visitor to open the door. |

| |GetMissingDoorCalls() |

| |//The InteractionManager sends a message to the Visitors requesting the list of the |

| |missing calls. |

| |ClearMissingCallsList() |

| |//Request sent to the Visitors service to clear the list of the missing calls |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |IncomingDoorCallNotification(notification, link) |

| |//Link: Link to be established between the robot and the Door for displaying video/ audio|

|Door |AnswerDoorCall() |

| |// The Visitor relays the message to the Door to answer the door call. |

| |HangUpDoorCall() |

| |// The Visitor relays the message to the Door to hang up the door call. |

| |OpenDoorNotification() |

| |//The Visitor relays the message to the Door to open the door. |

Settings data (with types):

• Path to store information about the missing calls

Logs (with types) and how to display them:

• List of missing calls holding info about:

o Call No

o Calling Time

o Video / Photo (link)

2 Door

Description: (Smart home infrastructure) An external set of devices on the door that can be remotely used/controlled.

Development environment: C#; communication with the Door takes place through an IP connection to an AMX/KNX gateway. NetLinx Studio has been used for developing the AMX MOBISERV module.

Resources (devices): Camera, microphone, lock, loudspeakers, door bell in the smart home, plus central home control of them

Inputs (with types):

|Messages Received From |Message |

|Visitors |AnswerDoorCall() |

| |// The Visitor sends a message to the Door to answer the door call. |

| |HangUpDoorCall() |

| |// The Visitor relays the message to the Door to hang up the door call. |

| |OpenDoorNotification() |

| |//The Visitor relays the message to the Door to open the door. |

Outputs (with type):

|Messages Sent To |Message |

|Visitors |IncomingDoorCallNotification() |

| |//Event sent to the Visitors component when a visitor presses the door bell => Door bell |

| |is ringing (Handling Incoming Door Calls). |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

3 Skype

Description: Skype API (integrated into Kompai-RobuBOX). In this particular high-level functionality, it is used to enable an older person to make a call to a remote party, calling for help.

Development environment: C# (MRDS Service)

Resources (devices): Internet Connection, TabletPC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |PlaceCall(ContactID) |

| |//Sends a message to the Skype API to make a call to a certain contact. |

Settings data (with types):

Older person's Skype ID and password

Logs (with types) and how to display them:

N/A

Note

Not all "messages" going to/coming from the Skype API are presented here. The full list of messages is presented in "1.6 Function for Voice/Video/SMS via robot communication with friends and relatives (F11)".

4 InteractionManager

Description: The InteractionManager handles all events generated by the other components. The resulting information is then used to access and update the appropriate application data and is channelled to the appropriate output modalities. [This component is merged with the "ApplicationManager" component identified in D3.1.]

Development environment: C# (MRDS Service) (extended version of the "Control" service that is presented in Kompai-RobuBOX)

Resources (devices): TabletPC

Inputs (with types):

[Active input]

• Voice Commands issued by the older persons (e.g. "Open the Door")

• Tactile commands issued by older persons through the robot's touch screen (GUI) (e.g. when the older person presses the "Answer" button in the GUI)

[Passive input]

• Events/Data coming from the Visitors component

In particular:

|Messages Received From |Message |

|Visitors |IncomingDoorCallNotification(notification, link) |

| |// link = the link to be established between the robot and the Door for displaying video/|

| |audio |

|LocaliseOlderPerson |FindOlderPersonResult(location) |

| |//Acknowledgment that the older person has been found and/or location of the older person|

| |in his/her apartment |

|GUI |VisitorsRequest(RequestType) |

| |//The older person sends an event to the InteractionManager by pressing a button in the |

| |respective GUI. |

| |// RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of missing door calls. |

|SpeechRecognition |VisitorsRequest(RequestType) |

| |//The older person sends an event to the InteractionManager by issuing the respective |

| |command. |

| |//RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of missing door calls. |

Outputs (with type):

• Information channelled to the SpeechSynthesis component;

• Information channelled to the GUI component;

• Commands issued to the LocaliseOlderPerson component;

• Commands issued to the GoNextToOlderPerson component;

• Pass events, access or update of applications/application components (e.g. Visitors, Skype).

|Messages Sent To |Message |

|LocaliseOlderPerson |FindOlderPerson() |

| |//Message sent to the LocaliseOlderPerson to locate the older person in his/her apartment |

|Visitors |NotifyOlderPersonNotFound() |

| |//Notification that the older person has not been found |

| |AnswerDoorCall() |

| |// Sends a message to the Visitors component to answer the door call. |

| |HangUpDoorCall() |

| |//Sends a message to the Visitors component to hang-up the door call. |

| |OpenDoorNotification() |

| |//Sends a message to the Visitors component to open the door. |

|GoNextToOlderPerson |GoNextToOlderPerson(location) |

| |//Commands the GoNextToOlderPerson to draw a path and move (the robot) in the specific |

| |location; where location: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |

|GUI |NotificationToGUI(NotificationToGUIType) |

| |// Commands the GUI to display a notification. |

| |//NotificationToGUIType= INCOMING_DOOR_CALL | DOOR_CALL_ANSWERED | DOOR_CALL_HANGEDUP | |

| |DOOR_OPENED | START_SKYPE_VIDEOCONFERENCE | SHOW_CONFIRMATION_WINDOW | UPDATE_MISSINGCALLS_LIST|

| |NotificationToGUI(NotificationToGUIType, PartnerName) |

| |//Commands the GUI to display a notification that the skype conference with the contact name |

| |"PartnerName" has been started. |

| |//NotificationToGUIType= START_SKYPE_VIDEOCONFERENCE |

| |//PartnerName: The name of the person contacted (string) |

|SpeechSynthesis |SendToTextToSpeech("INCOMING_DOOR_CALL") |

| |//Sends a notification to the SpeechSynthesis to say to the user that there is an incoming door|

| |call. |

| |SendToTextToSpeech("DOOR_OPENED") |

| |//Sends a notification to the SpeechSynthesis to say to the user that the door is opened. |

| |SendToTextToSpeech("CONFIRM_CLEAR_MISSINGCALLS_LIST") |

| |//Sends a notification to the SpeechSynthesis, asking the user to confirm that the list of |

| |missing calls will be cleared. |

| |SendToTextToSpeech("MISSINGCALLS_LIST_CLEARED") |

| |//Sends a notification to the SpeechSynthesis to say to the user that the list of missing calls|

| |was cleared. |

|Skype |PlaceCall(ContactID) |

| |//Sends a message to the Skype service to make a call to a certain skype contact. |

Settings data (with types):

Information about the background functions enabled or disabled

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about main menu functions enabled or disabled

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]

Logs (with types) and how to display them:

N/A

5 GUI(Visitors)

Description: Graphical User Interface provided by the robot (the one related to F14)

Development environment: This component is being implemented in C# (as an MRDS service) where all the graphics elements of the HMI are bundled. It has a single WPF window where the specific graphics controls for this high-level functionality can be shown.

Resources (devices): TabletPC on the robot

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |NotificationToGUI(NotificationToGUIType) |

| |//Commands the GUI component to display a notification. |

| |//NotificationToGUIType= INCOMING_DOOR_CALL | DOOR_CALL_ANSWERED | DOOR_CALL_HANGEDUP | |

| |DOOR_OPENED | START_SKYPE_VIDEOCONFERENCE | SHOW_CONFIRMATION_WINDOW | |

| |UPDATE_MISSINGCALLS_LIST |

| |NotificationToGUI(NotificationToGUIType, PartnerName) |

| |//Commands the GUI component to display a notification that the skype conference with the|

| |contact name "PartnerName" has been started. |

| |//NotificationToGUIType= START_SKYPE_VIDEOCONFERENCE |

| |//PartnerName: The name of the person contacted (string) |

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |VisitorsRequest(RequestType) |

| |//The older person sends an event to the InteractionManager by pressing a button in the |

| |respective GUI. |

| |//RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of the missing door calls. |

Settings data (with types):

Information about a pre-specified person's skype contact details to call for help

Logs (with types) and how to display them:

N/A

6 SpeechRecognition

Description: Speech input recognition component, on the robot

Development environment: It is a basic component of Microsoft Windows, integrated within MRDS and Kompai-RobuBOX Software. It can be used to recognise any sentence.

Resources (devices): Tablet PC and microphone

Inputs (with types):

Spoken commands issued by the older person

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |VisitorsRequest(RequestType) |

| |//The SpeechRecognition component sends an event to the InteractionManager when the |

| |older person issues the respective spoken command. |

| |// RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of missing door calls. |

Settings data (with types):

• Dialogue commands (yes, no) to recognise

• Commands: ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | CLEAR_MISSING_DOOR_CALLS to recognise

• Command "GetMissingDoorCalls" to recognise

• Information about a pre-specified person's skype contact details to call for help

Logs (with types) and how to display them:

N/A

7 SpeechSynthesis

Description: Text to speech voice generator, on the robot

Development environment: C# (MRDS Service)

Resources (devices): TabletPC (with Kompai-RobuBOX) and speakers

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |SendToTextToSpeech("INCOMING_DOOR_CALL") |

| |//Sends a notification to the SpeechSynthesis component to say to the user that there is an |

| |incoming door call. |

| |SendToTextToSpeech("DOOR_OPENED") |

| |//Sends a notification to the SpeechSynthesis component to say to the user that the door was |

| |opened. |

| |SendToTextToSpeech("CONFIRM_CLEAR_MISSINGCALLS_LIST") |

| |//Sends a notification to the SpeechSynthesis component to ask the user to confirm that the list|

| |of missing calls will be cleared. |

| |SendToTextToSpeech("MISSINGCALLS_LIST_CLEARED") |

| |//Sends a notification to the SpeechSynthesis component to say to the user that the list of |

| |missing calls was cleared. |

Outputs (with types):

N/A

Settings data (with types):

Text to synthesise for:

• INCOMING_DOOR_CALL

• DOOR_OPENED

• CONFIRM_CLEAR_MISSINGCALLS_LIST

• MISSINGCALLS_LIST_CLEARED
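The SpeechSynthesis settings above map each notification ID to a sentence to synthesise. A minimal sketch of that lookup, assuming placeholder phrasings (the real texts are configured in the settings data, not fixed here):

```python
# Hypothetical text templates for the four notification IDs; the actual
# phrasings come from the SpeechSynthesis settings data.
TTS_TEXTS = {
    "INCOMING_DOOR_CALL": "There is someone at the door.",
    "DOOR_OPENED": "The door has been opened.",
    "CONFIRM_CLEAR_MISSINGCALLS_LIST": "Do you want to clear the list of missing calls?",
    "MISSINGCALLS_LIST_CLEARED": "The list of missing calls has been cleared.",
}

def send_to_text_to_speech(notification_id: str) -> str:
    """Resolve a notification ID to the sentence the synthesiser should speak."""
    try:
        return TTS_TEXTS[notification_id]
    except KeyError:
        raise ValueError(f"No text configured for {notification_id!r}")
```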

Logs (with types) and how to display them:

N/A

8 GoNextToOlderPerson

Description: It generates a path for the robot to go dynamically next to the older person.

Development environment: C# (MRDS Service)

Resources (devices): TabletPC (with Kompai-RobuBOX), PURE, robot sensors (odometers, laser)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GoNextToOlderPerson(location) |

| |//Command sent by the InteractionManager to move the robot to the specified location, |

| |where location is: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |

Outputs (with type):

N/A

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

9 LocaliseOlderPerson

Description: This component is responsible for locating the older person in his/her apartment.

Development environment: C# (MRDS Service)

Resources (devices): Motion detection sensors in the smart home, Tablet PC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |FindOlderPerson() |

| |// command to locate an older person in his/her apartment |

|Sensors on the robot |Odometers, Laser |

|Sensors in the smart home |Motion detection sensors |

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |FindOlderPersonResult(location) |

| |//Acknowledgment that the older person has been found and/or location of the older person in |

| |his/her apartment. |

| |//Where location: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |
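The location payload exchanged between LocaliseOlderPerson and GoNextToOlderPerson is a simple pose (X/Y in metres, optional orientation in radians). A language-neutral sketch of that record (the real components are C#/MRDS services; the `Location` type name is an assumption for illustration):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Location:
    """Pose returned by FindOlderPersonResult and consumed by GoNextToOlderPerson."""
    x: float                              # Position X, metres
    y: float                              # Position Y, metres
    orientation: Optional[float] = None   # radians, only if available

def find_older_person_result(x: float, y: float,
                             orientation: Optional[float] = None) -> Location:
    """Package the localisation result as sent to the InteractionManager."""
    return Location(x, y, orientation)
```

The InteractionManager would then forward this same structure as the argument of GoNextToOlderPerson(location).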

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

5 Function for Encouragement for exercising (F08)

This function will support and encourage physical activity, helping the user to stay healthy, remain in good condition and stay independent for as long as possible.

1 Function Requirements Overview

Sub-functions requirements

|Sub-Function ID |Sub-Functions |Requirement |

|F_8.1 |Ability to turn function ON or OFF |It should be clear to the user and others whether|

| | |this function is in the ON or OFF mode. |

|F_8.2 |Detects and rates physical activity. | |

|F_8.3 |Enables a carer or doctor to insert new exercises. | |

|F_8.4 |Issues persuading messages to do an exercise. |Ensures persuasiveness and credibility. |

|F_8.5 |Detects patterns in the user’s response to the persuading messages. | |

|F_8.6 |Adjusts the schedule of messages to the user’s pattern and his other | |

| |activities. | |

|F_8.7 |Keeps a basic log of the exercises done. | |

Table 4: Sub-Functions for F_8

Process Specification

[pic]

Figure 9: Process specification for encouragement for exercising

Please note that the process specification diagram included in D2.3 - Volume II has been updated to the one depicted in the figure above.

2 Sequence Diagrams & Components Description

At this point it should be pointed out that during the second project year additional studies were conducted under WP2. These studies focused, among other things, on research with older people and carers, both to identify the design and content that would be most usable and acceptable, and to provide the required support for the MOBISERV functionality and requirements. This research resulted in a report [3] presenting recommendations for the graphical user interfaces to be implemented for primary and secondary users interacting with MOBISERV-Kompaï, and for personalisation of the system's features and functionality based on individual persons' physical and cognitive needs. According to these recommendations, two functionalities/user services will be implemented within MOBISERV in relation to the "Encouragement for exercising" high-level functionality:

1. Exercise coach: This will be responsible for issuing reminders/encouragements to older persons to do some exercise or activity.

It will be a "Background function", which implies that it will be initiated by the system (and not by the user), based on information coming from:

o The "Encouragements" schedule, as defined by a secondary user;

o Data coming from the data logger (only if the older person is carrying a functioning data logger);

o Data coming from the smart home (only if such data is available and no data logger is being used).

2. Exercises: This is the functionality called when an older person wants to do some exercise / activity. It is presented in the robot's "Main menu" and can be either initiated by the older person (at any time) or by the system (when the user replies positively to an encouragement issued by the "Exercise coach").

In the following sub-sections we present the sequence diagrams and the components involved in each of the above-mentioned "constituents" of the "Encouragement for exercising" function (F08).

1 Function "Exercises" (part of F08)

1 Introduction to major components involved

The major components involved in the "Exercises" function are introduced below:

• Database: Database where information about the different exercises and activities, as well as the schedule for the encouragements, and the logs, is stored and organised.

• ExercisesAndSchedule: Component that interfaces with the database.

• InteractionManager: This component is responsible for coordinating information across the various sensors/modalities, which are used within MOBISERV for supporting the implementation scenarios identified.

• GUI(Exercises): (former TabletPCUI) Graphical User Interface provided by the robot, the one related to the "Exercises" service.

• SpeechRecognition: Speech input recognition component, on the robot.

• SpeechSynthesis: Text to speech voice generator, on the robot.

Note: The SpeechHMI component identified in D3.1's diagrams has now been split into the following two components: a) SpeechRecognition and b) SpeechSynthesis.

2 Sequence Diagrams

[pic]

Figure 10: Sequence Diagram for Exercises - Part 1

[pic]

Figure 11: Sequence Diagram for Exercises - Part 2

[pic]

Figure 12: Sequence Diagram for Exercises - Part 3

3 Components Description

1 ExercisesAndSchedule

Description: Component that interfaces with the database, where information about the different exercises and activities, as well as the schedule for the encouragements, and the logs, is stored and organised.

In the "Exercises" functionality, this component will retrieve information about the different exercises, activities and each day's schedule for them.

Exercises may have the following properties:

o Id //integer

o Exercise //string

o Type // string

o Image //string (file path) // If image==NULL no image will be displayed.

o Video //string (file path) // if video==NULL no video will be displayed.

Activities may have the following properties:

o Id //integer

o Activity //string

o Type //string

o Image //string (file path) // If image==NULL no image will be displayed.

o Video //string (file path) // if video==NULL no video will be displayed.

Schedule may have the following properties:

o Id //integer

o What //foreign key (id of activity or exercise)

o Repetitions // integer

o Time //time

o Date //date

o Text //True | False

o Voice //True | False

o Message //string

o Image //string (file path) // If Image==NULL no image will be displayed

o Video //string (file path) // if Video==NULL no video will be displayed
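The schedule properties listed above amount to one record per scheduled encouragement. A minimal sketch of that record as a typed structure (the real storage is a database accessed from a C#/MRDS service; this Python dataclass, and the field names `at_time`/`on_date`, are assumptions for illustration):

```python
from dataclasses import dataclass
from datetime import date, time
from typing import Optional

@dataclass
class ScheduleEntry:
    """One row of the encouragement schedule, mirroring the listed properties."""
    id: int
    what: int                     # foreign key: id of an activity or exercise
    repetitions: int
    at_time: time                 # Time property
    on_date: date                 # Date property
    text: bool                    # deliver the message as on-screen text?
    voice: bool                   # deliver the message by voice?
    message: str
    image: Optional[str] = None   # file path; None -> no image displayed
    video: Optional[str] = None   # file path; None -> no video displayed
```

The Exercise and Activity records follow the same pattern with their own Id, name, Type, Image and Video fields.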

Development environment: C# (MRDS Service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |// Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |//Requests to the "ExercisesAndSchedule". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |(List of) Exercises |

| |(List of) Activities |

| |(List) encouragementSchedule |

| |//List of "encouragement events" |

| |Exercise |

| |// In case it has been requested to give the NEXT or PREVIOUS Exercise |

| |Activity |

| |//In case it has been requested to give the NEXT or PREVIOUS Activity |
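The GET_NEXT_/GET_PREVIOUS_ requests above browse through the list of exercises or activities one item at a time. A minimal sketch of that navigation, assuming (as an illustration, not from the source) that the index is clamped at both ends so repeated presses are safe:

```python
class ExerciseBrowser:
    """Sketch of GET_NEXT_/GET_PREVIOUS_ navigation over a list of
    exercises or activities, starting at the first item."""

    def __init__(self, items):
        self.items = list(items)
        self.index = 0

    def current(self):
        return self.items[self.index]

    def next(self):
        # GET_NEXT_EXERCISE / GET_NEXT_ACTIVITY: advance, clamped at the end
        self.index = min(self.index + 1, len(self.items) - 1)
        return self.current()

    def previous(self):
        # GET_PREVIOUS_EXERCISE / GET_PREVIOUS_ACTIVITY: step back, clamped at 0
        self.index = max(self.index - 1, 0)
        return self.current()
```

In the real flow, each returned item is what ExercisesAndSchedule sends back to the InteractionManager as the Exercise or Activity message.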

Settings data (with types):

• List of Exercises: As mentioned before, each exercise may have the following properties (set by a secondary user):

o Exercise //string

o Type // string

o Date //date

o Image //string (file path)

o Video //string (file path)

• List of Activities: As said before, each activity may have the following properties (set by a secondary user):

o Activity //string

o Type //string

o Date //date

o Image //string (file path)

o Video //string (file path)

• EncouragmentSchedule: Schedule of exercises and activities the user should do. This schedule will be used for presenting to the user (upon his/her request) the exercises and activities recommended (by the secondary user) on each day, along with information for repetitions.

o Id //integer

o What //foreign key (id of activity or exercise)

o Repetitions // integer

o Time //time

o Date //date

o Text //True | False

o Voice //True | False

o Message //string

o Image //string (file path) // If Image==NULL no image will be displayed

o Video //string (file path) // if Video==NULL no video will be displayed

Logs (with types) and how to display them:

-

2 InteractionManager

Description: This component is responsible for coordinating information across the various sensors/ modalities, which are used within MOBISERV for supporting the implementation scenarios identified (in particular for supporting the "Exercises" scenario).

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|ExercisesAndSchedule |(List of) Exercises |

| |(List of) Activities |

| |(List) encouragementSchedule |

| |//List of "encouragement events" |

| |Exercise |

| |//In case it has been requested to provide the NEXT or PREVIOUS Exercise |

| |Activity |

| |//In case it has been requested to provide the NEXT or PREVIOUS Activity |

|GUI |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Requests that go through the "InteractionManager" |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_SERVICE |

| |//Service: corresponds to the "Exercises" submenu |

|SpeechRecognition |ExercisesAndScheduleRequest(RequestType) |

| |// Requests that go through the "InteractionManager" |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_EXERCISE_SERVICE |

| |//service: corresponds to the "Exercises" submenu |

Outputs (with type):

|Messages Sent To |Message |

|ExercisesAndSchedule |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Requests to the "ExercisesAndSchedule". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

|GUI |NotificationToGUI(NotificationType, service) |

| |//Notifies the GUI to display a certain service. |

| |//NotificationType: DISPLAY_SERVICE |

| |//Service: corresponds to "Exercises" |

| |NotificationToGUI(NotificationType, Exercise) |

| |// NotificationType: DISPLAY_EXERCISE |

| |NotificationToGUI(NotificationType, Activity) |

| |// NotificationType: DISPLAY_ACTIVITY |

|SpeechSynthesis |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |// EXERCISE_POSITIVE_REPLY would be: |

| |// Great, have a look at my screen for some suggestions. |

| |// Great, here they come. |

| |// Great, here I come. |

| |// Here we go. |

| |// If you like, you can put on the vest for extra measurements. |

Settings data (with types):

Information about the background functions enabled or disabled

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about main menu functions enabled or disabled

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]

Logs (with types) and how to display them:

-

3 GUI(Exercises)

Description: (former TabletPCUI) Graphical User Interface provided by the robot, the one related to the "Exercises" service.

Development environment: This component is being implemented in C# (as an MRDS service) where all the graphics elements of the Human Machine Interface (HMI) are bundled. It has a single WPF window where the specific graphics controls can be shown.

Resources (devices): Tablet PC with touch screen (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |NotificationToGUI(NotificationType, service) |

| |// Notifies the GUI to display a certain service. |

| |// NotificationType: DISPLAY_SERVICE |

| |//Service: corresponds to "Exercises" |

| |NotificationToGUI(NotificationType, Exercise) |

| |// NotificationType: DISPLAY_EXERCISE |

| |NotificationToGUI(NotificationType, Activity) |

| |// NotificationType: DISPLAY_ACTIVITY |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Requests that go through the "InteractionManager". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_SERVICE |

| |//Service: corresponds to the "Exercises" submenu. |

Settings data (with types):

-

Logs (with types) and how to display them:

-

4 SpeechRecognition

Description: Speech input recognition component, on the robot.

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX), microphone

Inputs (with types): Spoken commands to recognise

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |ExercisesAndScheduleRequest(RequestType) |

| |//Requests that go through the "InteractionManager". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_EXERCISE_SERVICE |

| |//Service: corresponds to the "Exercises" submenu. |

Settings data (with types):

Grammar for recognising commands, such as:

• GET_NEXT_EXERCISE

• GET_NEXT_ACTIVITY

• GET_PREVIOUS_EXERCISE

• GET_PREVIOUS_ACTIVITY

• DISPLAY_EXERCISES_SUBMENU (e.g. "Kompai I want to do some exercise")

• DISPLAY_EXERCISES (e.g. "Kompai I want to do some exercises")

• DISPLAY_ACTIVITIES (e.g. "Kompai I want to do an activity")

Logs (with types) and how to display them:

-

5 SpeechSynthesis

Description: Text to speech voice generator, on the robot.

Development environment: C# (MRDS Service)

Resources (devices): Tablet PC (with Kompai-RobuBOX), speakers

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |// EXERCISE_POSITIVE_REPLY would be: |

| |// Great, have a look at my screen for some suggestions. |

| |// Great, here they come. |

| |// Great, here I come. |

| |// Here we go. |

| |//If you like, you can put on the vest for extra measurements. |

Outputs (with type):

Synthesised speech

Settings data (with types):

-

Logs (with types) and how to display them:

-

2 Function "Exercise Coach" (part of F08)

1 Introduction to major components involved

The major components involved in the "Exercise Coach" function are introduced below:

• Database: Database where information about the different exercises and activities, as well as the schedule for the encouragements, and the logs, is stored and organized.

• ExercisesAndSchedule: Component that interfaces with the database.

• HomeAutomation: Component in the smart home controlling the KNX (AMX controller).

• SMHInterface: Component interfacing with the home automation component.

• DataLogger: It records and transmits vital signs (ECG and respiration), activity (3-axis acceleration) and some extracted physiological parameters, such as heart rate (HR), breathing rate (BR) and activity classification.

• ActivityMonitor: It interfaces with the DataLogger. It "asks" the DataLogger about vital signs, activity and physiological extracted parameters and "downloads" and stores them in file/database.

• LocaliseOlderPerson: This component is responsible for locating the older person in his/her apartment;

• GoNextToOlderPerson: It generates a path for the robot to go dynamically next to the older person.

• InteractionManager: This component is responsible for coordinating information across the various sensors/modalities, which are used within MOBISERV for supporting the implementation scenarios identified (in that case, for supporting the "Exercise coach" scenario).

• GUI: (former TabletPCUI) Graphical User Interface provided by the robot.

• SpeechRecognition: Speech input recognition component, on the robot.

• SpeechSynthesis: Text to speech voice generator, on the robot.

Note: The SpeechHMI component identified in D3.1's diagrams has now been split into the following two components: a) SpeechRecognition and b) SpeechSynthesis.

2 Sequence diagrams

[pic]

Figure 13: Sequence Diagram for Exercise Coach - Part 1

[pic]

Figure 14: Sequence Diagram for Exercise Coach - Part 2

3 Components Description

1 ExercisesAndSchedule

Description: Component that interfaces with the database, where information about the different exercises and activities, as well as the schedule for the encouragements, and the logs, is stored and organised.

In the "Exercise Coach" functionality, this component will retrieve information about the Encouragement Schedule and will also log the user's data.

The Encouragement Schedule may have the following properties:

o Id //integer

o What //foreign key (id of activity or exercise)

o Repetitions // integer

o Time //time

o Date //date

o Text //True | False

o Voice //True | False

o Message //string

o Image //string (file path) // If Image==NULL no image will be displayed

o Video //string (file path) // if Video==NULL no video will be displayed

Development environment: C# (MRDS Service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GetEncouragementsSchedule() |

| |// Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ActivityDetected(encouragement) |

| |// Informs that an activity has been detected in the last 45 minutes => No need to issue |

| |an encouragement. |

| |UpdateEncouragment(encouragement) |

| |// Requests to update an encouragement (in the schedule). This encouragement is not yet |

| |considered reached. |

| |ExerciseEncouragementIsReached(encouragement) |

| |// Informs that the "encouragement" has been reached. |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |(List) encouragementSchedule |

| |//List of "encouragement events" |

| |EncouragementTimeReached(encouragement) |

| |// Informs that - according to the schedule - the user should have done an exercise / |

| |activity (i.e. it is time to issue an encouragement, if no activity was detected in the |

| |last 45 minutes). |
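When EncouragementTimeReached fires, the coach only issues an encouragement if no activity was detected in the last 45 minutes, checking the data logger when one is worn and the smart-home motion sensors otherwise (as described in the F08 introduction). A sketch of that decision, where the function name and string return values are assumptions for illustration; the real logic lives in the C#/MRDS InteractionManager:

```python
ACTIVITY_WINDOW_MINUTES = 45  # default value of lastXMinutes

def on_encouragement_time_reached(encouragement, datalogger_active,
                                  is_activity_detected, is_motion_detected):
    """Decide what to do when the schedule says an exercise/activity is due.

    datalogger_active: whether the older person is carrying a functioning
    data logger; is_activity_detected / is_motion_detected: callables that
    query the ActivityMonitor / SMHInterface for the last X minutes.
    """
    if datalogger_active:
        # Prefer the worn data logger when it is available and functioning.
        recently_active = is_activity_detected(ACTIVITY_WINDOW_MINUTES)
    else:
        # Fall back to smart-home motion detection.
        recently_active = is_motion_detected(ACTIVITY_WINDOW_MINUTES)

    if recently_active:
        return "ActivityDetected"    # no need to issue an encouragement
    return "IssueEncouragement"      # deliver the message via GUI and/or voice
```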

Settings data (with types):

• EncouragmentSchedule: Schedule of exercises and activities the user should do. This schedule will be used for presenting to the user (upon his/her request) the exercises and activities recommended (by the secondary user) on each day along with information for repetitions.

o Id //integer

o What //foreign key (id of activity or exercise)

o Repetitions // integer

o Time //time

o Date //date

o Text //True | False

o Voice //True | False

o Message //string

o Image //string (file path) // If Image==NULL no image will be displayed

o Video //string (file path) // if Video==NULL no video will be displayed

Logs (with types) and how to display them: The user's rejections of encouragements are logged, along with timing information.

Secondary users would have access to these through the "Exercise Logs".

2 SMHInterface

Description: MRDS service providing an interface to the smart home automation infrastructure (particularly to sensors such as motion detection sensors)

Development environment: C# (MRDS Service)

Resources (devices): TabletPC (with MRDS and Kompai-RobuBOX installed) or SHACU (a deployment decision not yet taken), Home Automation control unit in the smart home

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |IsMotionDetected(lastXMinutes) |

| |//Asks if motion has been detected in the house in the lastXMinutes. |

| |//Default value of lastXMinutes=45 (minutes) |

| |LogSMHExercisesData() |

| |//Logs data coming from sensors in the smart home. The logs would include information |

| |about the place inside the house or the name of the motion detector sensor that detected |

| |motion, along with timing information. |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |MotionDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

Settings data (with types):

-

Logs (with types) and how to display them:

Data coming from sensors in the smart home would be logged. The logs would include information about the place inside the house or the name of the motion detector sensor that detected motion along with timing information.

It is not required (so far) to display these logs.

3 ActivityMonitor

Description: It interfaces with the DataLogger. It "asks" the datalogger about vital signs, activity and physiological extracted parameters and "downloads" and stores them in a file/database.

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX) & DataLogger & Bluetooth connection (to connect to the data logger)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |IsActivityDetected(lastXMinutes) |

| |//Asks if activity has been detected (by the WHSU/ DataLogger) in the lastXMinutes. |

| |// Default value of lastXMinutes=45 |

| |GetExercisesLogData() |

| |//Asks the measurements concerning vital signs, activity and physiological extracted |

| |parameters. These measurements will be "downloaded". |

|DataLogger |exercisesLogData |

| |//Data coming from the DataLogger (which will be stored in a file / database) |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |ActivityDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

|Database |LogDataLoggerExercisesData() |

| |//The DataLogger's recordings are stored in a file / database |

Settings data (with types):

• Exercise Coach: TRUE | FALSE

• ConnectionToDataloggerSettings

Logs (with types) and how to display them: Vital signs, activity and physiological extracted parameters

It is required - so far - that only secondary users would have access to these logs through the "Exercise Logs".

4 LocaliseOlderPerson

Description: This component is responsible for locating the older person in his/her apartment.

Development environment: C# (MRDS Service)

Resources (devices): Motion detection sensors in the smart home, Tablet PC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |FindOlderPerson() |

| |// command to locate an older person in his/her apartment |

|Sensors on the robot |Odometers, Laser |

|Sensors in the smart home |Motion detection sensors |

Outputs (with types):

|Messages Sent To |Message |

|InteractionManager |FindOlderPersonResult(result) |

| |//Acknowledgment that the older person has been found and/or location of the older person in |

| |his/her apartment (result = location) |

| |//Where location: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

5 GoNextToOlderPerson

Description: It generates a path for the robot to go dynamically next to the older person.

Development environment: C# (MRDS Service)

Resources (devices): TabletPC (with Kompai-RobuBOX), PURE, robot sensors (odometers, laser)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GoNextToOlderPerson(location) |

| |//Command sent by the InteractionManager to move the robot to the specified location, |

| |where location is: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |

Outputs (with type):

N/A

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

6 InteractionManager

Description: This component is responsible for coordinating information across the various sensors/ modalities, which are used within MOBISERV for supporting the implementation scenarios identified (in that case, for supporting the "Exercise coach" scenario).

Development environment: C# (MRDS Service)

Resources (devices): Tablet PC (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|ExercisesAndSchedule |(List) encouragementSchedule |

| |//List of "encouragement events" |

| |EncouragementTimeReached(encouragement) |

| |//Informs that - according to the schedule - the user should have done an exercise / |

| |activity (i.e. it is time to issue an encouragement, if no activity was detected in the |

| |last 45 minutes). |

|SMHInterface |MotionDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

|ActivityMonitor |ActivityDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

|LocaliseOlderPerson |FindOlderPersonResult(result) |

| |//Acknowledgment that the older person has been found and/or location of the older person |

| |in his/her apartment. |

|GUI |EncouragementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouragementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

|SpeechRecognition |EncouragementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouragementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

Outputs (with type):

|Messages Sent To |Message |

|ExercisesAndSchedule |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do |

| |each day. |

| |ActivityDetected(encouragement) |

| |//Informs that an activity has been detected in the last 45 minutes => No need to issue an|

| |encouragement. |

| |UpdateEncouragment(encouragement) |

| |//Requests to update an encouragement (in the schedule). This encouragement is not yet |

| |considered reached. |

| |ExerciseEncouragementIsReached(encouragement) |

| |//Informs that the "encouragement" has been reached. |

|SMHInterface |IsMotionDetected(lastXMinutes) |

| |//Asks if motion has been detected in the house in the lastXMinutes. |

| |// Default value of lastXMinutes=45 |

| |LogSMHExercisesData() |

| |//Logs data coming from sensors in the smart home. The logs would include information |

| |about the place inside the house or the name of the motion detector sensor that detected |

| |motion, along with timing information. |

|ActivityMonitor |IsActivityDetected(lastXMinutes) |

| |//Asks if activity has been detected (by the WHSU/ DataLogger) in the lastXMinutes. |

| |//Default value of lastXMinutes=45 |

| |GetExercicesLogData() |

| |//Requests the measurements concerning vital signs, activity and extracted physiological |

| |parameters. These measurements will be "downloaded". |

|FindOlderPerson |FindOlderPerson() |

| |//Command to locate an older person in his apartment. |

|GUI |NotificationToGUI(encouragement, message, image, video) |

| |//Encouragement as set by the secondary user; |

| |//Message: message set by the secondary user. It will be delivered through the GUI. |

| |//Image: path to an image file |

| |//Video: path to a video file |

| |NotificationToGUI(NotificationType, Exercise) |

| |//NotificationType: DISPLAY_EXERCISE |

| |//Exercise: Exercise to display |

| |NotificationToGUI(NotificationType, Activity) |

| |//NotificationType: DISPLAY_ACTIVITY |

| |// Activity: Activity to display |

|SpeechSynthesis |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |//Notifies the user - through speech - that the exercise/activity is delivered/presented. |

| |TextToSpeechNotification(encouragement, message) |

| |//Encouragement as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered by voice. |
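Taken together, the messages above define a simple decision flow: when the schedule says it is time for an encouragement, the component first asks the SMHInterface and the ActivityMonitor whether motion or activity was seen in the last 45 minutes, and only issues the encouragement (via GUI and SpeechSynthesis) if nothing was detected. The sketch below illustrates that flow in Python; the actual component is a C# MRDS service, and the stub objects and method names here are assumptions that merely mirror the message names in the tables.

```python
# Illustrative sketch of the encouragement decision flow (NOT the real
# C# MRDS implementation). The collaborating components are passed in as
# stubs whose method names mirror the spec's messages.

DEFAULT_LAST_X_MINUTES = 45  # default window of IsMotionDetected / IsActivityDetected


class ExerciseCoachFlow:
    def __init__(self, smh, activity_monitor, gui, speech, schedule):
        self.smh = smh                            # SMHInterface stub
        self.activity_monitor = activity_monitor  # ActivityMonitor stub
        self.gui = gui                            # GUI stub
        self.speech = speech                      # SpeechSynthesis stub
        self.schedule = schedule                  # ExercisesAndSchedule stub

    def on_encouragement_time_reached(self, encouragement):
        """Handle EncouragementTimeReached(encouragement) from ExercisesAndSchedule."""
        motion, _confidence = self.smh.is_motion_detected(DEFAULT_LAST_X_MINUTES)
        activity, _confidence = self.activity_monitor.is_activity_detected(DEFAULT_LAST_X_MINUTES)
        if motion or activity:
            # Activity seen in the last 45 minutes => no encouragement needed;
            # report ActivityDetected(encouragement) back to the schedule.
            self.schedule.activity_detected(encouragement)
            return "suppressed"
        # No recent activity: deliver the encouragement through both modalities.
        self.gui.notification_to_gui(encouragement, encouragement.get("message"),
                                     encouragement.get("image"), encouragement.get("video"))
        self.speech.text_to_speech_notification(encouragement, encouragement.get("message"))
        return "issued"
```

The return value ("suppressed" or "issued") is only for illustration; in the deliverable the outcome is expressed purely through the outgoing messages.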

Settings data (with types):

Information about the background functions enabled or disabled:

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about main menu functions enabled or disabled:

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]

Logs (with types) and how to display them:

-

7 GUI

Description: (former TabletPCUI) Graphical User Interface provided by the robot, specifically the part related to the "Exercise Coach" service.

Development environment: This component is being implemented in C# (as an MRDS service) where all the graphics elements of the Human Machine Interface (HMI) are bundled. It has a single WPF window where the specific graphics controls can be shown.

Resources (devices): Tablet PC with touch screen (with Kompai-RobuBOX)

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |NotificationToGUI(encouragement, message, image, video) |

| |//Encouragement as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered through the GUI |

| |//Image: path to an image file |

| |//Video: path to a video file |

| |NotificationToGUI(NotificationType, Exercise) |

| |//NotificationType: DISPLAY_EXERCISE |

| |//Exercise: Exercise to display |

| |NotificationToGUI(NotificationType, Activity) |

| |//NotificationType: DISPLAY_ACTIVITY |

| |// Activity: Activity to display |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |EncouregementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouregementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

Settings data (with types):

-

Logs (with types) and how to display them:

-

8 SpeechRecognition

Description: Speech input recognition component, on the robot. It is a basic component of Microsoft Windows, integrated within MRDS and Kompai-RobuBOX Software. It can be used to recognise any sentence.

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX), microphone

Inputs (with types):

Spoken commands

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |EncouregementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouregementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

Settings data (with types):

Dialogue commands (yes, no) to recognise

Logs (with types) and how to display them:

-
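The settings above restrict the dialogue grammar to the commands "yes" and "no", which map directly onto the two messages this component sends to the InteractionManager. The mapping could be sketched as below (an illustrative Python sketch, not the real C#/Windows speech component; the `send` callback stands in for the DSSP message channel and is an assumption):

```python
# Illustrative mapping of recognised dialogue commands to the component's
# output messages. Message names are spelled exactly as in the spec.

def map_dialogue_command(command, encouragement, send):
    """Translate a recognised spoken command into an output message.

    `send(message_name, payload)` is a hypothetical stand-in for the
    DSSP-based message channel to the InteractionManager.
    """
    normalised = command.strip().lower()
    if normalised == "yes":
        send("EncouregementIsReached", encouragement)
        return "EncouregementIsReached"
    if normalised == "no":
        send("EncouregementNotReached", encouragement)
        return "EncouregementNotReached"
    # Utterances outside the configured yes/no grammar are ignored.
    return None
```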

9 SpeechSynthesis

Description: Text to speech voice generator, on the robot

Development environment: C# (MRDS Service)

Resources (devices): Tablet PC (with Kompai-RobuBOX), speakers

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |//Notifies the user - through speech - that the exercise / activity is delivered / presented. |

| |TextToSpeechNotification(encouragement, message) |

| |//Encouragement as set by the secondary user; |

| |//Message: message set by the secondary user. It will be delivered by voice. |

Outputs (with type):

Synthesised speech

Settings data (with types):

-

Logs (with types) and how to display them:

-

6 Function for Voice/Video/SMS via robot communication with friends and relatives (F11)

This function will support and increase the frequency of social interaction with friends and family.

1 Functional Requirement Overview

Sub-functions requirements

|Sub-Function ID |Sub-Functions |Requirement |

|F_11.1 |Ability to turn function ON or OFF |It should be clear to the user and others whether|

| | |this function is in the ON or OFF mode. |

|F_11.2 |Ability to select the mode of the communication (including |It should be clear to the user which mode of |

| |synchronous and asynchronous) |communication is currently selected and what is |

| | |its status - receiving/transmitting/both. |

|F_11.3 |Configuration of contacts - Ability to easily add / remove contacts |The contacts configuration should be |

| | |straightforward to use. |

| | |Open Source software available for friends and |

| | |family members to install locally. |

|F_11.4 |Check current availability/status of contacts |If the friend or relative is currently |

| | |unavailable it should be possible to contact them|

| | |to request communication in the imminent future via |

| | |an alternative means. |

|F_11.5 |Ability to setup a connection to a remote party |Voice or touch screen command. |

|F_11.6 |Ability to mute the audio and/or video |Voice or touch screen command |

Table 5: Sub-Functions for F_11

Process Specification

[pic]

Figure 15: Process specification for video/voice/SMS Outgoing communication

[pic]

Figure 16: Process specification for video/voice/SMS Incoming communication

2 Introduction to major components involved

The major components involved in this function are introduced below:

• SkypeAPI: Skype C# Application Programming Interface (API);

• Skype: Skype API integrated into Kompai-RobuBOX. It enables an older person to communicate through voice/video/SMS with friends and relatives;

• Interaction Manager: This component is responsible for coordinating information across the various sensors/modalities, which are used within MOBISERV for supporting the implementation scenarios identified (in this case, F11);

• GUI (Skype): (former TabletPCUI) Graphical User Interface provided by the robot, specifically the part related to F11;

• Speech Recognition: Speech input recognition component, on the robot;

• Speech Synthesis: Text to speech voice generator, on the robot.

3 Sequence Diagrams

[pic]

Figure 17: Sequence Diagram for Voice/Video/SMS via robot communication with friends and relatives - Part1

[pic]

Figure 18: Sequence Diagram for Voice/Video/SMS via robot communication with friends and relatives - Part2

4 Components Description

1 SkypeAPI

Description: Skype C# Application Programming Interface (API) (Skype4COM)

Development environment: C#

Resources (devices): TabletPC, microphone, speakers

Inputs (with types):

|Messages Received From |Message |

|Skype |GetContacts() |

| |//Gets the list of contacts. |

| |PlaceCall(ContactID) |

| |//Makes a call to a certain contact; with ID= ContactID. |

| |PlaceMessage(Text, ContactID) |

| |//Sends a message to a certain contact. |

| |//The variable "Text" corresponds to message body. |

| |//ContactID: the ID of the contact to whom the message should be sent. |

| |HangUP() |

| |//Hangs up the current call. |

| |RejectCall() |

| |//Rejects an incoming call. |

| |AnswerCall() |

| |//Answers an incoming call. |

| |PlaceSMS(ContactID) |

| |//Sends an SMS to a certain contact, with ID= ContactID. |

Outputs (with type):

|Messages Sent To |Message |

|Skype |CallStatusChanged(call, status) |

| |//Notification that the call status has been changed |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

2 Skype

Description: Skype API integrated into Kompai-RobuBOX. It enables an older person to communicate through voice/video/SMS with friends and relatives.

Development environment: C# (MRDS Service)

Resources (devices): Internet Connection, TabletPC

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |GetContacts() |

| |//Sends a message to the Skype API to get the list of contacts. |

| |PlaceCall(ContactID) |

| |//Sends a message to the Skype API to make a call to a certain contact. |

| |PlaceMessage(Text,ContactID) |

| |//Sends a message to the Skype API to send a message (with body "Text") to a certain |

| |contact (with ID= ContactID). |

| |HangUP() |

| |//Sends a message to the Skype API to hang up the current call. |

| |RejectCall() |

| |//Sends a message to the Skype API to reject an incoming call. |

| |AnswerCall() |

| |//Sends a message to the Skype API to answer an incoming call. |

| |PlaceSMS(ContactID) |

| |//Sends a message to the Skype API to send an SMS to a certain contact with ID=ContactID. |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |StartCallNotification(notification) |

| |//Notifies that a call has started. |

| |CallingNotification(Notification) |

| |//Notifies about an outgoing call. |

| |IncomingCallNotification (Notification) |

| |//Notifies about an incoming call. |

| |CallProblemsNotification(notification) |

| |//Notifies about a problem in making a call. |

Settings data (with types):

Older person's Skype ID and password.

Logs (with types) and how to display them:

N/A
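The Skype component is essentially a relay: every message it receives from the InteractionManager is forwarded to the corresponding Skype API call listed in section 1 (SkypeAPI). A minimal dispatch sketch is shown below in Python; the real component is a C# MRDS service, and the `skype_api` object with snake_case method names is an assumption standing in for the Skype4COM wrapper:

```python
# Illustrative relay from InteractionManager messages to Skype API calls.
# Method names on `skype_api` mirror the deliverable's message list, not
# necessarily the exact Skype4COM signatures.

def relay_to_skype_api(skype_api, message, **kwargs):
    """Forward an InteractionManager message to the matching API call."""
    handlers = {
        "GetContacts":  lambda: skype_api.get_contacts(),
        "PlaceCall":    lambda: skype_api.place_call(kwargs["contact_id"]),
        "PlaceMessage": lambda: skype_api.place_message(kwargs["text"], kwargs["contact_id"]),
        "HangUP":       lambda: skype_api.hang_up(),
        "RejectCall":   lambda: skype_api.reject_call(),
        "AnswerCall":   lambda: skype_api.answer_call(),
        "PlaceSMS":     lambda: skype_api.place_sms(kwargs["contact_id"]),
    }
    handler = handlers.get(message)
    if handler is None:
        raise ValueError("Unknown Skype message: " + message)
    return handler()
```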

3 InteractionManager

Description: The InteractionManager handles all events that the other components generate. The resulting information is then used for accessing and updating the appropriate application data and is channelled to the appropriate output modalities.

Development environment: C# (MRDS service) (extended version of the "Control" service in Kompai-RobuBOX)

Resources (devices): TabletPC (with Kompai-RobuBOX)

Inputs (with types):

[Active input]

• Tactile commands issued by older persons through the robot's touch screen (GUI) (e.g. when the older person chooses to answer a call by pressing the "Answer" button in the GUI).

[Passive input]

• Events/Data coming from the Skype component

In particular:

|Messages Received From |Message |

|Skype |StartCallNotification(notification) |

| |//Notifies that a call has started. |

| |CallingNotification(Notification) |

| |//Notifies about an outgoing call. |

| |IncomingCallNotification (Notification) |

| |//Notifies about an incoming call. |

| |CallProblemsNotification(notification) |

| |//Notifies about a problem in making a call. |

|GUI(Skype) |GetContacts() |

| |//Asks for the List of Contacts. |

| |SkypeRequest(RequestType, Text, ContactID) |

| |//RequestType: CALL | HANGUP | ANSWER | REJECT | |

| |SEND_MESSAGE | SEND_SMS |

| |//Text: parameter holding the body of the message/ sms to be sent |

| |//ContactID: ID of the calling/called contact and/or where the message will be sent |

Outputs (with type):

• Information channelled to the SpeechSynthesis component;

• Information channelled to the GUI component;

• Pass events, access or update applications / application components (e.g. Skype).

|Messages Sent To |Message |

|Skype |GetContacts() |

| |//Sends a message to the Skype API to get the list of contacts |

| |PlaceCall(ContactID) |

| |//Sends a message to the Skype API to make a call to a certain contact |

| |PlaceMessage(Text,ContactID) |

| |//Sends a message to the Skype API to send a message (with body "Text") to a certain |

| |contact (with ID= ContactID) |

| |HangUP() |

| |//Sends a message to the Skype API to hang up the current call |

| |RejectCall() |

| |//Sends a message to the Skype API to reject an incoming call |

| |AnswerCall() |

| |//Sends a message to the Skype API to answer an incoming call |

| |PlaceSMS(ContactID) |

| |//Sends a message to the Skype API to send an SMS to a certain contact with ID=ContactID |

|GUI (Skype) |NotificationToGUI (NotificationToGUIType, PartnerName) |

| |//NotificationToGUIType = |

| |START_SKYPE_VIDEOCONFERENCE | OUTGOING_SKYPE_CALL | OUTGOING_SKYPE_CALL_FAILED | |

| |INCOMING_SKYPE_CALL |

| |// PartnerName: Name of the calling/ called party. |

| |NotificationToGUI (NotificationToGUIType) |

| |//NotificationToGUIType = DISPLAY_SERVICE |

|SpeechSynthesis |TextToSpeechNotification(Notification) |

| |// Notification= OUTGOING_SKYPE_CALL_FAILED | ASK_ABOUT_LEAVING_A_MESSAGE | |

| |MESSAGE_HAS_BEEN_SENT |

Settings data (with types):

Information about the background functions enabled or disabled

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about main menu functions enabled or disabled

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]

Logs (with types) and how to display them:

N/A

4 GUI(Skype)

Description: Graphical User Interface provided by the robot, specifically the part related to F11.

Development environment: This component is being implemented in C# (as an MRDS service) where all the graphics elements of the HMI are bundled. It has a single WPF window where the specific graphics controls for this high-level functionality can be shown.

Resources (devices): TabletPC on the robot.

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |NotificationToGUI (NotificationToGUIType, PartnerName) |

| |//NotificationToGUIType = |

| |START_SKYPE_VIDEOCONFERENCE | OUTGOING_SKYPE_CALL | OUTGOING_SKYPE_CALL_FAILED | |

| |INCOMING_SKYPE_CALL |

| |// PartnerName: Name of the calling/ called party. |

| |NotificationToGUI (NotificationToGUIType) |

| |//NotificationToGUIType = DISPLAY_SERVICE |

Outputs (with type):

|Messages Sent To |Message |

|InteractionManager |GetContacts() |

| |//Asks for the List of Contacts |

| |SkypeRequest(RequestType, Text, ContactID) |

| |//RequestType: CALL | HANGUP | ANSWER | REJECT | |

| |SEND_MESSAGE | SEND_SMS |

| |//Text: parameter holding the body of the message/ sms to be sent |

| |//ContactID: ID of the calling/called contact and/or where the message will be sent |

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

5 SpeechRecognition

Description: Speech input recognition component, on the robot

Development environment: C# (MRDS service). It is a basic component of Microsoft Windows, integrated within MRDS and Kompai-RobuBOX Software. It can be used to recognise any sentence.

Resources (devices): Tablet PC (RobuBOX) and microphone

Inputs (with types):

-

Outputs (with type):

-

Settings data (with types): Spoken commands to recognise

Logs (with types) and how to display them:

N/A

6 SpeechSynthesis

Description: Text to speech voice generator, on the robot.

Development environment: C# (MRDS service)

Resources (devices): Tablet PC (with Kompai-RobuBOX), speakers

Inputs (with types):

|Messages Received From |Message |

|InteractionManager |TextToSpeechNotification(Notification) |

| |// Notification= OUTGOING_SKYPE_CALL_FAILED | ASK_ABOUT_LEAVING_A_MESSAGE | |

| |MESSAGE_HAS_BEEN_SENT |

Outputs (with type):

-

Settings data (with types):

N/A

Logs (with types) and how to display them:

N/A

Information Fusion and Channelling: InteractionManager Component

MOBISERV has multiple sources of information as well as multiple target applications for the gathered information. Information Fusion and Channelling within MOBISERV is handled by the InteractionManager, a component already identified in D3.1 [1] and Chapter 1. In the following paragraphs we provide more detailed information about this component, its role, communication inputs and outputs, development environment, etc.

Description: The InteractionManager is responsible for coordinating information across the various sensors and modalities (i.e. Modality Components), which are used within MOBISERV for supporting the implementation scenarios identified.

In particular, the InteractionManager handles all events that the other components generate. These events/data are initially interpreted and formatted into DSSP-based communication messages; the resulting information is then used for accessing and updating the appropriate application data and is channelled to the appropriate output modalities by the InteractionManager. If the InteractionManager does not contain an explicit handler for an event, any default behavior that has been established for the event is respected; if there is no default behavior, the event is ignored. A second level of interpretation is performed by the InteractionManager to define the dialogue context, which is then provided to applications requiring this dialogue contextual information.
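The handler-lookup policy described above (explicit handler first, otherwise the event's default behaviour, otherwise the event is ignored) can be sketched as follows. This is an illustrative Python sketch only; the real logic is part of the C# MRDS service, and the registry and method names here are assumptions:

```python
# Illustrative sketch of the InteractionManager's event-handling policy:
# explicit handler > default behaviour > silently ignore.

class EventsManagerSketch:
    def __init__(self):
        self._handlers = {}   # explicit handlers per event name
        self._defaults = {}   # default behaviours per event name

    def register(self, event, handler, default=False):
        """Register an explicit handler, or a default behaviour for the event."""
        (self._defaults if default else self._handlers)[event] = handler

    def dispatch(self, event, payload=None):
        # 1) An explicit handler wins; 2) otherwise fall back to the event's
        # default behaviour; 3) with neither, the event is ignored.
        handler = self._handlers.get(event) or self._defaults.get(event)
        if handler is None:
            return "ignored"
        return handler(payload)
```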

This means that the InteractionManager fulfils multiple functions. It is responsible for the synchronization of data, focus, etc. across the different components, as well as for the higher-level interaction flow that is independent of any specific (modality) component. It also maintains the high-level application data model.

The overall interaction flow may be represented as a state machine, with lower-level state machines nested inside the InteractionManager handling the cross-component synchronization at each phase of the higher-level flow. A state machine defines several states that represent the current situation (interaction state and context of application) for the InteractionManager. Certain events from the other collaborating components (inputs from components the InteractionManager communicates with and changes in the system and environment) can change this state. For example, the InteractionManager may have a "NO_WAIT" state in which it is not waiting to receive any particular input. When a reminder to eat is issued, the state may change from "NO_WAIT" to "WAIT_CONFIRMATION", and the InteractionManager will issue a command to the "GUI" and/or "SpeechSynthesis" component (which will then report whether the user responded positively or negatively to the reminder).
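The NO_WAIT / WAIT_CONFIRMATION example can be sketched as a tiny state machine (illustrative Python; the real flow lives in the C# InteractionFlowManager, and the event names below merely follow the text):

```python
# Minimal sketch of the reminder dialogue state machine described above.

class ReminderFlowSketch:
    def __init__(self):
        self.state = "NO_WAIT"  # idle: not waiting for any particular input

    def on_event(self, event):
        if self.state == "NO_WAIT" and event == "REMINDER_ISSUED":
            # A reminder went out via GUI/SpeechSynthesis: await the answer.
            self.state = "WAIT_CONFIRMATION"
        elif self.state == "WAIT_CONFIRMATION" and event in ("USER_SAID_YES", "USER_SAID_NO"):
            # Either answer closes the dialogue and returns to the idle state.
            self.state = "NO_WAIT"
        # Events that do not match the current state leave it unchanged.
        return self.state
```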

The InteractionManager also manages the MOBISERV functions (background functions and functions presented in the PRU's main menu). In D3.1 this was the role of the ApplicationManager; in the course of the project, the ApplicationManager was merged into the InteractionManager.

Overall the InteractionManager consists of the following parts:

• EventsManager: Handles Events generated from other MOBISERV components;

• ActionManager: Coordinates through actions-scheduling the uni-modal or multimodal information channelling to the MOBISERV output modality components or to other components requesting this information;

• InteractionFlowManager: Manages the states of the discourse flow, updates the current state, checks the conditions that might trigger state transitions and executes state transitions if those conditions are met. Furthermore, it synchronises the flow state in the persistent database in order to provide overall robustness.

The following diagram presents the collaboration of the aforementioned parts/ components of the InteractionManager with the rest of the MOBISERV components.

[pic]

[pic]

Figure 19: InteractionManager - constituents and collaborating components

Development environment: The InteractionManager has been implemented in C# as an MRDS service. The "Control" service of the existing Kompai-RobuBOX has been considered as the starting point to build on. The MRDS SDK complies with the initial requirements set for the InteractionManager, i.e.:

• Seamless interoperation with Decentralized Software Services (DSS) and Concurrency and Coordination Runtime (CCR) libraries that are present in Microsoft Robotics Developer Studio (MRDS), which is the base development platform for PRU.

• Persistent state management and support for internal low-level states.

• Traceability of the current state of the InteractionManager from its run-time engine for monitoring and debugging purposes.

Resources (devices): Tablet PC, with Kompai-RobuBOX (and prerequisites) installed

Inputs (with types):

[Active input]

• Voice Commands issued by the older person (SpeechRecognition modality component);

• Tactile commands issued by older persons through the PRU's touch screen (GUI modality Component).

[Passive input]

• Eating/drinking events and related data coming from the NutritionAgenda component;

• Eating or drinking activity reports (activity detection results and confidence level) originated from the NutritionActivityDetection component and relayed through the NutritionActivity component;

• Events/Data coming from the Visitors component;

• Events/Data coming from the "ExercisesAndSchedule" component concerning the list of recommended exercises, activities along with their schedule;

• Events/Data coming from the "SMHInterface" component;

• Events/Data coming from the "ActivityMonitor" componenent;

• Data from the "LocaliseOlderPerson" component (with information about the older person's location in his/ her apartment);

• Events/Data coming from the Skype component;

The following table summarises information about the InteractionManager's communication inputs, which has been presented in Chapter 1 for each one of the high-level functionalities that will be supported by the First MOBISERV System Prototype. These inputs will have the form of DSSP-based communication messages.

|Messages Received From |Message |

|Active Input |

|GUI (Nutrition) |GUIIsNutritionEventReached() |

| |//Event received from the GUI. It means that the older person has been asked whether he/she|

| |has eaten/ drunk and he/she answered yes through the GUI (by pressing a button). |

|GUI (Visitors) |VisitorsRequest(RequestType) |

| |//The older person sends an event to the InteractionManager by pressing a button in the |

| |respective GUI. |

| |// RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of missing door calls. |

|GUI(Exercises) |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each |

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Requests that go through the "InteractionManager". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | GET_ |

| |PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_SERVICE |

| |//Service: corresponds to the "Exercises" submenu |

|GUI (Exercises Coach) |EncouregementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouregementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

|GUI(Skype) |GetContacts() |

| |//Asks for the List of Contacts. |

| |SkypeRequest(RequestType, Text, ContactID) |

| |//RequestType: CALL | HANGUP | ANSWER | REJECT | |

| |SEND_MESSAGE | SEND_SMS |

| |//Text: parameter holding the body of the message/sms to be sent |

| |//ContactID: Id of the calling/called contact and/or where the message will be sent |

|SpeechRecognition |SpeechIsNutritionEventReached() |

| |//Event received from the SpeechRecognition component. It means that the older person has |

| |been asked whether he/she has eaten/ drunk and he/she answered yes through speech. |

| |VisitorsRequest(RequestType) |

| |//The older person sends an event to the InteractionManager by issuing the respective |

| |command. |

| |//RequestType = ANSWER_DOOR_CALL | HANG_UP_DOOR_CALL | OPEN_DOOR | CALL_FOR_HELP | |

| |CLEAR_MISSING_DOOR_CALLS |

| |GetMissingDoorCalls() |

| |//Sends a command to the InteractionManager to get the list of missing door calls. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Requests that go through the "InteractionManager". |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | GET_ |

| |PREVIOUS_ACTIVITY |

| |GUIRequest(RequestType, service) |

| |// GUI RequestType: DISPLAY_EXERCISE_SERVICE |

| |//service: It corresponds to the "Exercises" submenu. |

| |EncouregementNotReached(encouragement) |

| |//Informs the InteractionManager that the older person responded negatively to the |

| |encouragement issued. |

| |EncouregementIsReached(encouragement) |

| |//Informs the InteractionManager that the older person responded positively to the |

| |encouragement issued. |

|Passive Input |

|NutritionAgenda |AgendaEvents |

| |//List of nutrition events |

| |InitiateNutritionActivity() |

| |//Request to initiate the service "NutritionActivity" |

| |NutritionEventReminder(Event, Text, Voice, Message, Image, Video) |

| |//Event as set by the secondary user |

| |//Text: TRUE | FALSE |

| |//Voice: TRUE | FALSE |

| |//Message: message set by the secondary user. It will be delivered through the GUI and/or |

| |SpeechSynthesis based on the values of "Text" and "Voice". |

| |//Image: path to an image file |

| |//Video: path to a video file |

| |NutritionEventEncouragement(Event, Text, Voice, Message, Image, Video) |

| |// similar to the above |

|NutritionActivity |Acknowledgement of initiation or termination of the NutritionActivity component (MRDS |

| |service) |

| |ActivityDetectionResult(EventResult, EventConfidence) |

| |//EventResult: TRUE | FALSE |

| |//EventConfidence: float |

|Visitors |IncomingCallNotification(notification, link) |

| |// link = the link to be established between the robot and the Door for displaying video/ |

| |audio. |

|ExercisesAndSchedule |(List of) Exercises |

| |(List of) Activities |

| |(List) encouragementSchedule |

| |//List of "encouragement events" |

| |Exercise |

| |//In case it has been requested to provide the NEXT or PREVIOUS Exercise |

| |Activity |

| |//In case it has been requested to provide the NEXT or PREVIOUS Activity |

| |EncouragementTimeReached(encouragement) |

| |//Informs that - according to the schedule - the user should have done an exercise / |

| |activity (i.e. it is time to issue an encouragement, if no activity was detected in the |

| |last 45 minutes). |

|SMHInterface |MotionDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

|ActivityMonitor |ActivityDetectionResult(result, confidence) |

| |//result: TRUE | FALSE |

| |//confidence: float |

|Skype |StartCallNotification(notification) |

| |//Notifies that a call has started. |

| |CallingNotification(Notification) |

| |//Notifies about an outgoing call. |

| |IncomingCallNotification (Notification) |

| |//Notifies about an incoming call. |

| |CallProblemsNotification(notification) |

| |//Notifies about a problem in making a call. |

|LocaliseOlderPerson |FindOlderPersonResult(location) |

| |//Acknowledgment that the older person has been found and/or location of the older person |

| |in his/her apartment. |

Outputs (with type):

• Information channelled to the GUI component;

• Information channelled to the SpeechSynthesis component;

• Pass events, access or update application components:

o Information requested from and commands issued to the NutritionAgenda component;

o Commands issued to the NutritionActivity;

o Requests to the Visitors component;

o Information requested from and commands issued to the ExercisesAndSchedule component;

o Information requested from the SMHInterface;

o Information requested from the ActivityMonitor;

o Information requested from and commands issued to the Skype component;

• Commands issued to the LocaliseOlderPerson component;

• Commands issued to the GoNextToOlderPerson component;

The following table summarises information about the InteractionManager's communication outputs, which has been presented in Chapter 1 for each one of the high-level functionalities that will be supported by the First MOBISERV System Prototype. These outputs will have the form of DSSP-based communication messages.

|Messages Sent To |Message |

|GUI(Nutrition) |NotificationToGUI(NotificationType) |

| |//NotificationType: NUTRITION_DETECTION_INITIATED |

| |//Notifies the user - through the GUI - that the cameras have been switched on. |

| |NotificationToGUI(Event, Message, Image, Video) |

| |//Event as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered through the GUI |

| |//Image: path to an image file |

| |//Video: path to a video file |

|GUI (Visitors) |NotificationToGUI(NotificationToGUIType) |

| |// Commands the GUI to display a notification. |

| |//NotificationToGUIType= INCOMING_DOOR_CALL | DOOR_CALL_ANSWERED | DOOR_CALL_HANGEDUP | |

| |DOOR_OPENED | START_SKYPE_VIDEOCONFERENCE | SHOW_CONFIRMATION_WINDOW | |

| |UPDATE_MISSINGCALLS_LIST |

|GUI (Exercises |NotificationToGUI(NotificationType, service) |

| |//Notifies the GUI to display a certain service. |

| |//NotificationType: DISPLAY_SERVICE |

| |//Service: corresponds to "Exercises" |

| |NotificationToGUI(NotificationType, Exercise) |

| |// NotificationType: DISPLAY_EXERCISE |

| |NotificationToGUI(NotificationType, Activity) |

| |// NotificationType: DISPLAY_ ACTIVITY |

|GUI (Exercise Coach) |NotificationToGUI(encouragement, message, image, video) |

| |//Encouragement as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered through the GUI. |

| |//Image: path to an image file |

| |//Video: path to a video file |

|GUI (Skype) |NotificationToGUI (NotificationToGUIType, PartnerName) |

| |//NotificationToGUIType = |

| |START_SKYPE_VIDEOCONFERENCE | OUTGOING_SKYPE_CALL | OUTGOING_SKYPE_CALL_FAILED | |

| |INCOMING_SKYPE_CALL |

| |// PartnerName: Name of the calling/ called party. |

| |NotificationToGUI (NotificationToGUIType) |

| |//NotificationToGUIType = DISPLAY_SERVICE |

|SpeechSynthesis |SendToTextToSpeech("NUTRITION_DETECTION_INITIATED") |

| |//Notifies the user - through speech - that the cameras have been switched on. |

| |TextToSpeechNotification(Event, Message) |

| |//Event as set by a secondary user |

| |//Message: message set by a secondary user. It will be delivered by voice. |

| |SendToTextToSpeech("INCOMING_DOOR_CALL") |

| |//Sends a notification to the SpeechSynthesis to say to the user that there is an incoming|

| |door call. |

| |SendToTextToSpeech("DOOR_OPENED") |

| |//Sends a notification to the SpeechSynthesis to say to the user that the door is opened. |

| |SendToTextToSpeech("CONFIRM_CLEAR_MISSINGCALLS_LIST") |

| |//Sends a notification to the SpeechSynthesis, asking the user to confirm that the list of|

| |missing calls will be cleared. |

| |SendToTextToSpeech("MISSINGCALLS_LIST_CLEARED") |

| |//Sends a notification to the SpeechSynthesis to say to the user that the list of missing |

| |calls was cleared. |

| |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |// EXERCISE_POSITIVE_REPLY would be: |

| |// Great, have a look at my screen for some suggestions. |

| |// Great, here they come. |

| |// Great, here I come. |

| |// Here we go. |

| |// If you like, you can put on the vest for extra measurements. |

| |SendToTextToSpeech(EXERCISE_POSITIVE_REPLY) |

| |//Notifies the user - through speech - that the exercise / activity is delivered / |

| |presented. |

| |TextToSpeechNotification(encouragement, message) |

| |//Encouragement as set by the secondary user |

| |//Message: message set by the secondary user. It will be delivered by voice. |

| |TextToSpeechNotification(Notification) |

| |// Notification= OUTGOING_SKYPE_CALL_FAILED | ASK_ABOUT_LEAVING_A_MESSAGE | |

| |MESSAGE_HAS_BEEN_SENT |

|NutritionAgenda |GetAgendaEvents() |

| |// Requests the list of the nutrition events. |

| |EventReached(Event) |

| |//Informs the NutritionAgenda that the event has been reached (the user has eaten/drunk). |

| |UpdateEvent(Event) |

| |//Sends a command to the NutritionAgenda to update the event (the event has not been |

| |reached). |

| |NotifyOlderPersonNotFound() |

| |//Notifies the NutritionAgenda that the older person is probably not in his/ her |

| |apartment. |

|NutritionActivity |InitiateNutritionActivity() |

| |//Starts the respective MRDS service on demand. |

| |DetectActivity(event) |

| |//Terminates the respective MRDS service on demand. |

|Visitors |NotifyOlderPersonNotFound() |

| |//Notification that the older person has not been found |

| |AnswerDoorCall() |

| |// Sends a message to the Visitors component to answer the door call. |

| |HangUpDoorCall() |

| |//Sends a message to the Visitors component to hang-up the door call. |

| |OpenDoorNotification() |

| |//Sends a message to the Visitors component to open the door. |

|ExercisesAndSchedule |GetExercises() |

| |// Requests the list of Exercises. |

| |GetActivities() |

| |// Requests the list of Activities. |

| |GetEncouragementsSchedule() |

| |//Requests the schedule for the encouragements. Note that this schedule contains also |

| |information about the recommended activities and exercises the older person should do each|

| |day. |

| |ExercisesAndScheduleRequest(RequestType) |

| |// Sends a request to the "ExercisesAndSchedule" component. |

| |//RequestType: GET_NEXT_EXERCISE | GET_NEXT_ACTIVITY | GET_PREVIOUS_EXERCISE | |

| |GET_PREVIOUS_ACTIVITY |

| |ActivityDetected(encouragement) |

| |//Informs that an activity has been detected in the last 45 minutes => No need to issue an|

| |encouragement. |

| |UpdateEncouragment(encouragement) |

| |//Requests to update an encouragement (in the schedule). This encouragement is not yet |

| |considered reached. |

| |ExerciseEncouragementIsReached(encouragement) |

| |//Informs that the "encouragement" has been reached. |

|SMHInterface |IsMotionDetected(lastXMinutes) |

| |//Asks if motion has been detected in the house in the lastXMinutes; |

| |// Default value of lastXMinutes=45 |

| |LogSMHExercisesData() |

| |//Logs data coming from sensors in the smart home. The logs would include information |

| |about the place inside the house or the name of the motion detector sensor that detected |

| |motion, along with timing information. |

|ActivityMonitor |IsActivityDetected(lastXMinutes) |

| |//Asks if activity has been detected (by the WHSU/ DataLogger) in the lastXMinutes; |

| |//Default value of lastXMinutes=45 |

| |GetExercicesLogData() |

| |//Requests the measurements concerning vital signs, activity and extracted |

| |physiological parameters. These measurements will be "downloaded". |

|Skype |GetContacts() |

| |//Sends a message to the Skype API to get the list of contacts |

| |PlaceCall(ContactID) |

| |//Sends a message to the Skype API to make a call to a certain contact |

| |PlaceMessage(Text,ContactID) |

| |//Sends a message to the Skype API to send a message (with body "Text") to a certain |

| |contact (with ID= ContactID). |

| |HangUP() |

| |//Sends a message to the Skype API to hang up the current call. |

| |RejectCall() |

| |//Sends a message to the Skype API to reject an incoming call. |

| |AnswerCall() |

| |//Sends a message to the Skype API to answer an incoming call. |

| |PlaceSMS(ContactID) |

| |//Sends a message to send an SMS to a certain contact with ID=ContactID. |

|LocaliseOlderPerson |FindOlderPerson() |

| |//Sends a command to locate an older person in his apartment. |

|GoNextToOlderPerson |GoNextToOlderPerson (result) |

| |// Generates a path for the robot to go dynamically next to the older person. |

| |//where Result = location: |

| |//Position X (meter): double |

| |//Position Y (meter) : double |

| |//Orientation (radian) : double (if available) |
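As a concrete illustration of how such DSSP-style notification messages might be dispatched, the following Python sketch models a small subset of the message types listed above. The function and payload names are illustrative stand-ins, not the actual MRDS/DSSP implementation.

```python
from enum import Enum, auto

class NotificationType(Enum):
    """Illustrative subset of the notification types from the table above."""
    NUTRITION_DETECTION_INITIATED = auto()
    INCOMING_DOOR_CALL = auto()
    DISPLAY_SERVICE = auto()

def notification_to_gui(notification_type, payload=None):
    """Hypothetical stand-in for the DSSP NotificationToGUI message."""
    return {"message": "NotificationToGUI",
            "type": notification_type.name,
            "payload": payload or {}}

# Example: notify the GUI that the cameras have been switched on.
msg = notification_to_gui(NotificationType.NUTRITION_DETECTION_INITIATED)
```

In the real system such a message would travel as a DSSP request over HTTP between MRDS services rather than as an in-process dictionary.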

Settings data (with types):

Information about the MOBISERV background functions enabled or disabled:

• NutritionAssistance: TRUE | FALSE

• Dehydrationprevention: TRUE | FALSE

• MedicalReminders: TRUE | FALSE

• ExercisingCoach: TRUE | FALSE

• PanicResponder: TRUE | FALSE

Information about MOBISERV main menu functions enabled or disabled

• AudioVideoContact: TRUE | FALSE

• Games: TRUE | FALSE

• Exercises: TRUE | FALSE

• SelfCheck: TRUE | FALSE

• HomeControl: TRUE | FALSE

• RobotControl: TRUE | FALSE

• AudioVideoContactIcon: string [file path]

• GamesIcon: string [file path]

• ExercisesIcon: string [file path]

• SelfCheckIcon: string [file path]

• HomeControlIcon: string [file path]

• RobotControlIcon: string [file path]
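The settings data above could be held in a typed structure. The following Python sketch is a hypothetical representation of the listed switches and icon file paths, not taken from the MOBISERV code base; the default icon paths are illustrative.

```python
from dataclasses import dataclass

@dataclass
class BackgroundFunctionSettings:
    """Boolean switches for the MOBISERV background functions."""
    nutrition_assistance: bool = True
    dehydration_prevention: bool = True
    medical_reminders: bool = True
    exercising_coach: bool = True
    panic_responder: bool = True

@dataclass
class MainMenuSettings:
    """Main-menu functions plus an icon file path per entry (paths illustrative)."""
    audio_video_contact: bool = True
    games: bool = True
    exercises: bool = True
    self_check: bool = True
    home_control: bool = True
    robot_control: bool = True
    audio_video_contact_icon: str = "icons/av_contact.png"
    games_icon: str = "icons/games.png"

# Example: a profile where the games function is disabled by a carer.
settings = MainMenuSettings(games=False)
```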

Logs (with types) and how to display them:

N/A

Remotely Accessible Services for Secondary Users

1 Introduction

This chapter introduces the remotely-accessible services that allow secondary users to interact with and personalise the MOBISERV system features and functionalities based on the individual older person's physical and cognitive needs.

As mentioned before, during the second project year additional studies targeted at secondary and tertiary users were conducted under WP2. One of the outcomes of these studies was a report presenting recommendations for the graphical user interfaces for primary as well as secondary users. Through these interfaces, secondary users, such as carers or relatives, would be able to:

• monitor and view selected data collected on the primary user, such as medical information or user activity history. These Data Logs (related to Nutrition, Hydration, and Exercises and Physiological) would be accessed either through the PRU or remotely, over the web;

• adjust, activate or deactivate parameters of the primary user interface; these parameters would be adjusted through the PRU or remotely, over the web (at least part of them, such as the settings for Nutrition, Hydration, and Exercises and Physiological).

In the following paragraphs we provide an overview of the design recommendations for the Data Logs and Settings menus related to Nutrition, Hydration, and Exercises and Physiological, as taken from the internal document on MOBISERV HMI specifications for the Physical Robotic Unit [3]. Subsequently, we present the technological solution adopted to implement this WebInterface for secondary users.

2 Overview of Recommendations for Remotely Accessible Secondary User Interface

1 Recommendations for Data Logs

1 Wireframe designs

[pic]

Figure 20: Design recommendation – secondary user main screen – data logs

[pic]

Figure 21: Design recommendation – secondary user – Nutrition Logs

[pic]

Figure 22: Design recommendation – secondary user – Hydration Logs

[pic]

Figure 23: Design recommendation – secondary user – Exercise Logs (Kcals)

[pic]

Figure 24: Design recommendation – secondary user – Exercise Logs (Heart and respiration)

2 Comments on the secondary user log screens

• These screens can be accessed via PRU touch screen tablet PC and/or remotely over the web.

• Selecting a date button opens/closes the full day's data log.

• The columns provide an indication of what prompted the activity, with colour coding to indicate a positive action (green) for ease of visualisation of information.

• Settings will also allow reports to be sent by e-mail.

2 Recommendations for Settings Interface, related to Nutrition, Hydration and Exercises and Physiological

1 Nutrition Settings Screen

Wireframe design

[pic]

Figure 25: Design recommendation - Nutrition Settings screen

[pic]

Figure 26: Design recommendation - Nutrition Settings – Set-up Meal screen

• Adjustment of meal times (breakfast, lunch, dinner, snacks) for the individual;

• Customisation of reminder and encouragement voice/messages/pictures/videos. Secondary users should be able to upload pictures and messages that are displayed to the primary user during a reminder/encouragement message (the Customise Messages GUI is pending).

2 Hydration settings screens

[pic]

Figure 27: Design recommendation - Hydration Settings screen

[pic]

Figure 28: Design recommendation - Hydration Settings – Drink set-up screen

• Customisation of the format of the reminder and encouragement voice/message/picture messages. Secondary users should be able to upload pictures and messages that are displayed to the primary user during a reminder/encouragement message.

3 Exercises setting screen

Wireframe design

[pic]

Figure 29: Design recommendation - Exercise Settings screen

• This sub-menu should allow the secondary user to select different types of exercises and activities or specify their own based on their expert knowledge of the ability and constraints of the user.

• Allow customisation of format of reminder and encouragement voice/messages/pictures based on when a user prefers to do exercises and when not. Secondary users should be able to upload pictures and messages that are displayed to the primary user during a reminder/encouragement message.

• Include an option for remote access to user activity data.

3 Development Environment and Implementation Issues

The remotely-accessible services for secondary users are supported by a Web-based application that provides a WebInterface to them. This web-based application follows a three-tier architecture and was developed in PHP/MySQL.

All data is stored in a MySQL database, which corresponds to the MOBISERV database component presented in the sequence diagrams of Chapter 1. The database is also accessible through some of the MRDS services deployed on the PRU.

The following figure shows the implemented screen through which the remote user (carer, family member, etc.) enters information about the older person's nutrition schedule (adding a new meal). Already defined meals are also presented there.
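The deliverable's WebInterface is implemented in PHP/MySQL; as a language-neutral illustration of the same add-meal/list-meals logic, here is a Python/SQLite sketch. The table and column names are assumptions for illustration, not the actual MOBISERV database schema.

```python
import sqlite3

# Illustrative schema; the real MOBISERV MySQL schema is not specified here.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE meals (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    meal_time TEXT NOT NULL)""")

def add_meal(name, meal_time):
    # Parameterised query; a PHP/MySQL implementation should likewise use
    # prepared statements rather than string concatenation.
    conn.execute("INSERT INTO meals (name, meal_time) VALUES (?, ?)",
                 (name, meal_time))

def list_meals():
    """Return all defined meals, ordered by time of day."""
    return conn.execute(
        "SELECT name, meal_time FROM meals ORDER BY meal_time").fetchall()

add_meal("Breakfast", "08:00")
add_meal("Lunch", "12:30")
```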

[pic]

Figure 30: Adding new meal (and viewing defined meals) screen

The figure below shows the implemented screen through which the remote user views, edits and adds the messages that will be presented to the primary user to remind or encourage him/her to eat.

[pic]

Figure 31: Adding new message (and viewing defined messages) screen

Interaction between the MOBISERV robotic system (PRU) and the smart home assistive infrastructure

The Smartest House of the Netherlands is a prototype intelligent house that incorporates novel living-related technologies for study, development and demonstration. Multiple technologies are installed there on a per-project basis. The integration of technologies and the control of the house are established by AMX and KNX systems.

AMX is a corporation providing products and solutions for building automation and for the control of devices, assets and content. Typical AMX solution areas are commercial buildings such as hotels and conference centres, but AMX products can equally well be applied to home automation systems. The AMX product deployed in the Smartest Home is an AMX NetLinx controller that supports high-level home control functions. High-level here means writing a program that implements a certain functionality logic and adapts to the controlled system, for example controlling a separate multimedia PC. The AMX NetLinx controller also interacts natively with AMX wireless touch panels, which provide the main user interface for the home control functionalities.

KNX is a European-based standard (with ISO standardisation) for the interworking of building control devices at a low functional level. Low-level means that KNX devices have little programming capability and system deployment is very much hardware-based. KNX devices usually control home assets through power on/off control or logic circuit signals. KNX devices interact over a serial bus, using KNX standard-based device addressing. Within the serial bus it is possible to do low-level programming by re-routing control signals to certain device assets, e.g. defining which light switch controls which light bulb. To establish control from high-level programs, an IP (TCP/UDP) gateway can be used.

In the Smartest House of the Netherlands, AMX- and KNX-based systems are deployed together to obtain both high- and low-level programmability and to control the house devices, assets and content. The AMX NetLinx controller runs a Smartest House-specific program in which part of the control functionality connects, through IP networking, to a KNX IP gateway and then to specific KNX control devices. User-to-house interaction is provided by multiple AMX touch panels installed in the house.

The communicating entities involved in the MOBISERV system (not only those related to F14 but also those for other high-level functionalities requiring communication with the smart home infrastructure) and the Smartest House Control System are presented in Figure 32: Home-automation components.

Figure 32: Home-automation components

Component roles and collaboration:

• InteractionManager handles and coordinates the home control events towards the rest of the MOBISERV system's functionalities and components.

• The Visitors component provides application-specific logic for the management of home control assets. The component manages the combining of operations from KNX signals and door-located devices, especially the video camera.

• HomeAutomationAMX is an MRDS service component (implemented in C#) that adapts the DSSP-based communication messages to the AMX/KNX-based system commands and events. This component also handles the structural addressing and session management towards the home-automation system. It works as a simple gateway between the MOBISERV system and the home control system and does not add any functionality.

• HomeAutomationTest is a virtual testing component with simulated home-automation functionality. The AMX/KNX-based system is currently installed only in the Smartest House of the Netherlands, so the virtual replacement component is needed in order to perform system deployment tests.

• IHomeControlReq and IHomeControlInd are communication interfaces defining the message interchanges between components.

• SmH-API and SmH-API-Events are interface definitions to access and control home assets. Communication goes over TCP/IP in a text-messaging style [4].

• AMX NetLinx is the AMX controller device. The AMX controller is a closed-system device with its own programming language, called NetLinx, used to implement the building control logic. The controller can load and run only one program at a time; in this design it is called "TheProgram". "TheProgram" is divided into modules, each implementing a certain application control logic. "SmH-Control" is a NetLinx module providing text-based control commands to the devices installed in the Smartest House. SmH-API defines the available commands.

• KNX-GW is a KNX gateway device providing the interconnection between IP-based communication and the KNX device bus.

• KNX-Actor is a device controlling a home control asset. It might be a power relay, a TTL-logic-based control or another style of device defined in the KNX standard.

• Door is a facade design construct consisting of the door-related sub-components.

• Doorbell is activated by the visitor and signalled to the system.

• Lock is an electronic lock controlled by a KNX actor. It might be opened upon the older person's request.

• Microphone and Loudspeaker: the door installation might include vocal communication devices for discussion between the older person and the visitor.

• Camera is able to monitor the visitor upon the older person's request.

• OtherAsset represents other possible building control assets that the system is able to manage (possible extensibility).
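The SmH-API described above exchanges text messages over TCP/IP. The following Python sketch shows what a minimal client for such a text-command protocol could look like, together with a loopback stand-in for the controller. The command string (OPEN_DOOR), reply format and port are assumptions; the actual command set is defined in the SmH-Control NetLinx module.

```python
import socket
import threading

def send_smh_command(host, port, cmd, timeout=2.0):
    """Send one newline-terminated text command and return the reply line."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(cmd.encode("ascii") + b"\n")
        return sock.recv(1024).decode("ascii").strip()

# Loopback stand-in for the AMX controller: replies OK to OPEN_DOOR.
def _fake_controller(server):
    conn, _ = server.accept()
    with conn:
        line = conn.recv(1024).decode("ascii").strip()
        conn.sendall(b"OK\n" if line == "OPEN_DOOR" else b"ERROR unknown\n")

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=_fake_controller, args=(server,), daemon=True).start()

reply = send_smh_command("127.0.0.1", port, "OPEN_DOOR")
```

In the MOBISERV design, this role is played by the HomeAutomationAMX gateway component sitting between the DSSP services and the controller.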

Communications definition:

Component HomeAutomationAMX interfaces IHomeControlReq and IHomeControlInd:

IHomeControlReq request interface

interface IHomeControlReq [
  doc "Home control functionalities. Request direction";
  stereotypes mrds-service, dssp;
  direction request;
] {
  message begin [
    doc "Begin a communication session to the controller device.";
  ] {
    data host: String [doc "Hostname or IP address to connect to";]
    data port: Uint16 [doc "Port number to connect to";]
  } => group(msg where msg in IHomeControlInd and msg.tags has "opcode");

  message end [
    doc "End communication session to the controller.";
  ] {} => None;

  message raw_cmd [
    doc "Send uninterpreted textual command to the AMX NetLinx controller";
  ] {
    data cmd: String [
      doc "command string";
      precondexpr ch in cmd where ch not in "\n";
    ]
  } => IHomeControlInd.cmd_resp;

  message open_door [
    doc "Open a door lock in a house. This assumes only one controlled door installation.";
  ] {} => group(msg where msg in IHomeControlInd and msg.tags has "opcode");
}

IHomeControlInd indication interface

interface IHomeControlInd [
  doc "Events from home control devices. Indication interface";
  stereotypes mrds-service, dssp;
  direction indication;
] {
  message cmd_ok [
    doc "Given command operation was a success. This assumes synchronisation in communication.";
    tags "opcode";
  ] {} => None;

  message cmd_error [
    doc "Given command operation was an error. This assumes synchronisation in communication.";
    tags "opcode";
  ] {
    data reason: String [doc "Error reason.";]
  } => None;

  message cmd_resp [
    doc "Textual response from given control command.";
  ] {
    data response: String [doc "Response from command.";]
  } => None;
}
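To make the message exchange defined by these interfaces concrete, the following Python sketch models the begin/raw_cmd/end session and its cmd_ok/cmd_error/cmd_resp indications as plain dictionaries. It is an illustrative model written by hand, not code generated from the ComDL definitions, and the host/port values are placeholders.

```python
class HomeControlSession:
    """Illustrative model of the IHomeControlReq/IHomeControlInd exchange."""

    def __init__(self):
        self.open = False

    def begin(self, host, port):
        # In the real system this opens a DSSP session towards the controller.
        self.open = True
        return {"msg": "cmd_ok", "tags": ["opcode"]}

    def raw_cmd(self, cmd):
        if "\n" in cmd:                 # precondition from the raw_cmd message
            return {"msg": "cmd_error", "reason": "newline in command"}
        if not self.open:
            return {"msg": "cmd_error", "reason": "no session"}
        return {"msg": "cmd_resp", "response": "echo:" + cmd}

    def end(self):
        self.open = False               # end => None (no indication)

session = HomeControlSession()
before = session.raw_cmd("LIGHT ON")    # fails: session not yet begun
session.begin("amx.example", 1319)
after = session.raw_cmd("LIGHT ON")     # succeeds with a cmd_resp
```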

MOBISERV Communication System

MOBISERV communication system definition and development is divided into two phases.

Phase I

As the PRU (Robosoft's Kompaï-RobuBOX) is the central platform in the MOBISERV system deployment, the technologies derived from that platform form the existing base for software construction. Microsoft Robotics Developer Studio (MRDS) is the framework and platform where the components (or services) are deployed. Communication on the platform is realised with two-directional DSSP requests, which are based on HTTP requests. Handling of requests is done by the CCR framework, which provides asynchronous operation. The composition of the service set is managed by the DSS system.

Phase II

Openness is a requirement for the MOBISERV platform. The developed components should be usable by arbitrary future systems, and the MOBISERV system itself should be extensible by adding, modifying or replacing deployed components. The target of Phase II in the communications development is to ensure that the communication practices (interfaces, message ordering, structure and parameterisation) remain usable when transferring the deployed technologies to other systems currently in use.

The idea is to define the MOBISERV system's communication-related aspects by means of the language-independent Communications Definition Language (ComDL). The definitions cover communication aspects such as messages, interfaces, type definitions, organisation, entity structures, message ordering and entity parameterisation. The definitions are meant to be re-usable and transformable to different communication frameworks.

[pic]

Figure 33: Communications definitions use process defines steps in use of the definition language

Initial Security Analysis of the MOBISERV system

The MOBISERV system provides important support for older adults and handles information that end-users may consider sensitive. It is therefore important to analyse the security risks of the system and the implications of a compromise of the system or its data. A secure system will protect the confidentiality and integrity of the data as well as its availability.

Confidentiality:

MOBISERV applications will access and use sensitive information about the user, e.g. health status. An actual or perceived risk of such information being available to unauthorised personnel will negatively affect the acceptability of the MOBISERV solution. Protection of user privacy is thus important. The confidentiality of MOBISERV data can be compromised at the data storage, during data transmission, or by gaining access to one of the devices through which MOBISERV provides data output, e.g. a relative's computer.

Integrity:

Many MOBISERV features rely on accurate information. Corrupted data may cause unexpected system behaviour. With fabricated data, a malicious party may try to influence the behaviour of the MOBISERV system. Data can be corrupted during transmission or while stored. An attacker may try to enter fabricated data into the system through the normal MOBISERV input devices, e.g. the touch screen, or via an open communication channel.

Availability:

MOBISERV has several time-critical functions; for example, in an emergency situation every second counts. Doctors and caretakers should also have timely access to the data they need. Finally, the robot functions should be easily available to the end user.

1 Potential point of attacks

The MOBISERV system has several potential points towards or through which an attack may occur. These points can be divided into three groups: manipulation points, storage points and transmission points. Manipulation points are the locations where data is read and written (created, modified and deleted), storage points are the places where data is held, and transmission points are the communication channels through which data is transferred between entities. Identifying these points is important for securing the system and analysing the risks.

1 Information sources

The MOBISERV system may take input from points inside and outside the smart home (the user's home). Inside the smart home, the input points can be considered physically protected: tampering with the system via these points requires entering the house. The data input points inside the house are under the control of the MOBISERV developers, and protecting them is easier. Points outside the smart home are more widely accessible; it is therefore more important to validate the input coming from these locations. Some of these points are not even under the control of the MOBISERV development; for example, MOBISERV cannot affect the hardware of relatives or how they protect their computers.

The following is the list of points from which information can enter MOBISERV. After each point, the type of information/signal is identified.

Outside the smart home

1. Door phone

Provides the signal for ringing the door (door ring), the video image from the door (door image) and the audio for conversation (door audio).

2. Web application providing remotely accessible services to secondary users

Provides information about nutrition, hydration, exercises and physiological settings and logs.

3. Caretaker/relative

Audio and Video → Skype

Robot control signals

Inside the smart home

1. Robot touch screen

Provides signal for starting application (App start).

Application specific commands (app com)

2. Robot microphone

Audio for video conferencing

Provides signal for starting application via multimodality (App start).

Application specific commands via multimodality (app com)

3. Robot Camera

Video conferencing

Provides signal for starting application via multimodality (App start).

Application specific commands via multimodality (app com)

4. Robot sensors

Provides information for robot movement (rob sens)

5. Activity camera(s)

Provides image about older adult activity for analysis (activity camera)

6. Smart garment sensors

Heart rate, ECG, 3-axis accelerometer

7. House sensors

Location

2 Information targets

The MOBISERV system outputs different types of information through different points. The points can be inside or outside the smart home. The points outside the smart home mostly cannot be controlled by the MOBISERV system; for such locations, strong authentication is required when sensitive data is transmitted. Locations inside the smart home are mostly accessed by the older person; there it is more important to keep access simple and avoid an overly complicated authentication method.

Outside smart home

1. Door view:

Audio from robot (av from robot). Signal for opening door (open door signal)

2. Internet

o Caretaker

Skype connection

o Web application

Events/Schedule and access to logs

Inside smart home

1. Robot screen at robot:

All the MOBISERV data are available.

3 Information storage

Information generated by the MOBISERV system can be stored within the system or outside it. Naturally, the security of information stored outside the system cannot be influenced by MOBISERV, so the outside data storage has to be trusted.

Inside MOBISERV system

1. Robot laptop hard drive (in PRU)

All the MOBISERV data.

2. SHACU

Temporary logs of nutrition / hydration related activity

3. Data logger in WHSU

Temporary storage of sensor data

Known storage outside MOBISERV system

1. Caretaker /doctor / relative?

4 Information transfer

Outside the system, data is transferred using standard Internet protocols; MOBISERV must use the protocols supported by the service being used. Inside the MOBISERV system, data transfer is not secured and the communication partners are not authenticated. The security issues will be addressed in the second phase of the communications development. Communication is conducted from the MOBISERV system to the outside information targets and sources, and within the MOBISERV system to the components mentioned earlier.
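One common building block for the message authentication foreseen for the second phase is an HMAC computed over a shared key, so a receiver can verify both the integrity and the origin of an internal message. The sketch below is a generic illustration of that technique, not the mechanism chosen by the project; the key and payload framing are assumptions.

```python
import hashlib
import hmac

SHARED_KEY = b"illustrative-key"  # in practice, provisioned securely per device

def sign_message(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can verify integrity
    and authenticity of an internal message."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + b"|" + tag.hex().encode()

def verify_message(signed: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    payload, _, tag_hex = signed.rpartition(b"|")
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected.encode(), tag_hex)

msg = sign_message(b"OPEN_DOOR")
```

A tampered payload fails verification because the attached tag no longer matches; a full solution would also need replay protection and key management.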

2 Analysis of the handled data, threats and risks

The MOBISERV system produces, handles, transmits and stores many different kinds of data. Some of this data can be very sensitive, while other data needs minimal protection. In order to develop a security solution we need to know where these data are generated, stored and transmitted, and what the effects are if some of the threats against them are realised.

The basic threats against handled data are: disclosure of data (unauthorised read, compromise of confidentiality); denial of service (compromise of data access, deletion of data); corruption of data (compromise of data integrity, unauthorised write); and fabrication of data.

1 Activity video (for eating and drinking reminder and encouragement functionalities)

Source: Camera inside house

Stored at: SHACU or PRU

Storage length: Short. Only for the time it takes to anonymise it.

Transmitted between: Camera and SHACU or PRU.

Destination: MOBISERV system internal use only.

Threat analysis

Disclosure of data: The video feed is used to validate that the end-user eats and drinks regularly. The feed is meant only for the internal use of the MOBISERV system and not for human eyes. Compromise of this data affects end-user privacy, lowers trust in the MOBISERV system and reduces its acceptability.

Denial of service: Lack of access to the activity video feed forces the MOBISERV system to rely on query-based verification of action performance.

Corrupted data: False positives or false negatives on activity validation. May cause unwanted reminders or a missed meal/drink.

Fabricated data: False positives or false negatives on activity validation. May cause unwanted reminders or a missed meal/drink.

2 Door ring

The door ring signal is activated by the press of the door bell. It activates a ringing sound effect in the house.

Source: Door bell

Transmitted between: Door bell → SHACU (Smart Home Infrastructure) → PRU

Stored at: Buffers on SHACU and PRU

Storage length: only for the duration of handling the signal

Output: Ring tone at Smart Home Infrastructure speakers.

Threat analysis

Disclosure of data: Not meaningful. However, a lack of response can be interpreted as an empty house. Insignificant threat.

Denial of service: The older adult will not know someone is at the door. Insignificant threat.

Corrupted data: No ringing. Insignificant threat.

Fabricated data: May be used to cause repeated door bell ringing. Insignificant threat; hard to execute, no real use.

3 Door video

Door video data is activated after the door bell is rung and the older person sends the signal to get the door video.

Source: Door camera

Transmitted between: Door camera → SHACU (Smart Home Infrastructure) → PRU

Stored at: Only at buffers

Storage length: For transmission buffer only

Destination: Robot screen

Threat analysis

Disclosure of data: Enables a view of the front door. Insignificant threat.

Denial of service: The older adult is not able to see who is behind the door. May be used to hide the identity of the person ringing; the older adult may open the door to an unknown person. Medium threat.

Corrupted data: A fake video feed is transmitted from the door. May be used to fool the older person into believing that a trusted person is behind the door. Hard to execute.

Fabricated data: To create the illusion that someone is behind the door. Insignificant threat.

4 WebInterface (Nutrition, Hydration, Exercises and Physiological Settings and Logs)

Nutrition, Hydration, Exercises and Physiological settings are created by the older person or the caretaker and stored in a database. These settings and events can be fetched through the RobotHMI or through the WebInterface.

Nutrition, Hydration, Exercises and Physiological logs are also stored in the database.

Source: Older person, caretaker, MOBISERV application

Transmitted between: PRU → SHACU → Internet

Stored at: PRU (or even SHACU)

Storage length: long

Destination: PRU (or even SHACU)

Threat analysis

Disclosure of data: The severity of a single event/setting disclosure depends on the event. Several disclosures allow an eavesdropper to profile the older person. Medium to high threat towards privacy.

Denial of service: An event gets lost. The reliability and usability of the system are compromised. Small threat.

Corrupted data: Changing an event may be used to affect the older person's behaviour. High threat.

Fabricated data: A fabricated event may be used to affect the older person's behaviour (e.g. fool the older person into leaving the house). High threat.

5 Nutrition, Hydration, Exercises and Physiological-related alert

A Nutrition, Hydration, Exercises and Physiological-related alert is issued according to the relevant event settings.

Source: NutritionAgenda or ExercisesAndSchedule service

Transmitted between: NutritionAgenda or ExercisesAndSchedule service → PRU

Stored at: Database in PRU (or even at SHACU)

Storage length: minimum

Destination: Robot HMI

Threat analysis

Disclosure of data: The severity of a single alert disclosure depends on the event. Several disclosures allow an eavesdropper to profile the older person. Medium to high threat.

Denial of service: A lost alert will cause a missed event. Small threat.

Corrupted data: Changing the alert contents may be used to affect the older person's behaviour. Medium threat.

Fabricated data: A fabricated alert may be used to affect the older person's behaviour (e.g. a fake alarm may be used to fool the older adult into going to a doctor, leaving the house empty). Medium threat.
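The alert issuance described above, comparing the current time against the stored event settings, could be sketched as follows. The event fields, tolerance window, and function names are hypothetical; the actual NutritionAgenda and ExercisesAndSchedule interfaces are specified in the component descriptions elsewhere in this deliverable:

```python
from datetime import datetime, time

# Hypothetical event settings as they might be stored in the PRU database.
events = [
    {"type": "hydration", "at": time(10, 0), "message": "Time to drink a glass of water"},
    {"type": "nutrition", "at": time(12, 30), "message": "Time for lunch"},
]

def due_alerts(now, events, tolerance_minutes=5):
    """Return the alert messages whose scheduled time falls within
    the tolerance window ending at `now`."""
    alerts = []
    for ev in events:
        scheduled = datetime.combine(now.date(), ev["at"])
        if 0 <= (now - scheduled).total_seconds() <= tolerance_minutes * 60:
            alerts.append(ev["message"])
    return alerts

now = datetime(2011, 12, 27, 10, 2)
print(due_alerts(now, events))  # ['Time to drink a glass of water']
```

The tolerance window matters for the denial-of-service threat above: an alert delayed beyond the window is silently dropped rather than delivered late, so a lost alert directly becomes a missed event.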

6 Video conferencing (Skype)

Source: Older adult and target

Transmitted between: PRU → SHACU → Internet

Stored at: Only in buffers

Storage length: minimum

Destination: Relatives, caretakers

Threat analysis

Disclosure of data: Personal communications and personal data may be leaked. Threat to privacy. High threat.

Denial of service: The connection to a caretaker or relative is not working. The reliability and usability of the system are compromised. May cause additional panic in an older adult who is in need of help. High threat.

Corrupted data: Poor quality of video and audio. Small threat.

Fabricated data: Someone pretending to be a doctor may obtain information from the older adult through social engineering.

7 MOBISERV Application start

Source: Robot HMI

Transmitted between: Inside Robot

Stored at: buffers

Storage length: minimum

Destination: robot

Threat analysis

Disclosure of data: Insignificant threat.

Denial of service: The usability of the system is compromised. Small threat.

Corrupted data: Data corruption may be used to start other applications on the robot, such as enabling remote connections. High threat.

Fabricated data: See corrupted data.

8 Application commands

Source: Robot HMI

Transmitted between: User – Robot

Stored at: buffers, possibly application log

Storage length: minimum on buffers, application specific on logs

Destination: Application

Threat analysis

Disclosure of data: Application specific.

Denial of service: Application specific.

Corrupted data: May be used to affect application behaviour.

Fabricated data: May be used to affect application behaviour.

9 Robot Audio and video

Source: Robot camera or microphone

Transmitted between: Robot hardware-Robot software

Stored at: Robot buffers

Storage length: short

Destination: Robot applications

Threat analysis

Disclosure of data: Privacy threat; same as for Skype.

Denial of service: Audio- and video-based systems do not work; see Skype.

Corrupted data: Failures in audio/video-based functionalities; see Skype.

Fabricated data: Failures in audio/video-based functionalities; see Skype.

10 Smart garment sensor data

Source: Sensors in garment

Transmitted between: Smart garment (part of WHSU) → DataLogger (part of WHSU) → PRU → caretaker

Stored at: DataLogger → Database on the PRU (or even SHACU)

Storage length: hours

Destination: Database. Also, caretaker

Threat analysis

Disclosure of data: Allows knowledge of the health status and activity of the user. Major threat to privacy (at least a perceived threat).

Denial of service: Sensor data is not available, so health status monitoring does not work. Small threat.

Corrupted data: A false positive or negative on health status. False alarms may cause unnecessary anxiety in the older adult; a missing alarm could delay contact with caretakers, which can be harmful to health. High threat.

Fabricated data: A false positive or negative on health status. False alarms may cause unnecessary anxiety in the older adult; a missing alarm could delay contact with caretakers, which can be harmful to health. High threat.
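Because corrupted or fabricated garment readings can produce exactly the false positives and negatives described above, a receiving component would typically apply a plausibility check before raising a health alarm: implausible values are logged as sensor or transmission errors rather than alerted on. A minimal sketch, in which the thresholds and function name are purely illustrative:

```python
# Illustrative plausibility bounds for a heart-rate sample (bpm).
HR_MIN, HR_MAX = 30, 220               # outside: treat as sensor/transmission error
HR_ALARM_LOW, HR_ALARM_HIGH = 45, 150  # outside (but plausible): alert the caretaker

def classify_heart_rate(bpm):
    """Classify one garment heart-rate sample: reject implausible values
    before deciding whether to alert the caretaker."""
    if not (HR_MIN <= bpm <= HR_MAX):
        return "invalid"   # likely corrupted/fabricated data: log, do not alarm
    if bpm < HR_ALARM_LOW or bpm > HR_ALARM_HIGH:
        return "alert"     # plausible but abnormal: notify caretaker
    return "ok"

print(classify_heart_rate(72))   # ok
print(classify_heart_rate(300))  # invalid
print(classify_heart_rate(170))  # alert
```

A check like this mitigates crude data corruption, but not a carefully fabricated stream of plausible values, which is why the threat above remains rated high and calls for integrity protection on the WHSU-to-PRU link as well.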


-----------------------

The information contained in this report is subject to change without notice and should not be construed as a commitment by any members of the MOBISERV Consortium. The MOBISERV Consortium assumes no responsibility for the use or inability to use any software or algorithms, which might be described in this report. The information is provided without any warranty of any kind and the MOBISERV Consortium expressly disclaims all implied warranties, including but not limited to the implied warranties of merchantability and fitness for a particular use.
