ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.


ViVACITY is a life monitoring system that runs hidden behind a robot.

Robotics, RealSense™, Internet of Things, Artificial Intelligence

Overview / Usage

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa), which can take the form of an interactive mirror or an avatar. ViVa uses an Intel RealSense camera to learn from people's behavior, analyzing the environment and emotions. ViVACITY's data-mining algorithm can provide managers at stores and workplaces with relevant information about people's behavior. From time to time, ViVACITY will cross all the gathered data, such as emotions, hour of day, day of week, weather, age, gender, and the workplace's schedule, to surface patterns and meaningful information. ViVACITY comes embedded in the Intel RDK.
*temporary name.

=== What roles could ViVa play?

  • Interactive mirror.
  • Host (a robot that receives or entertains guests at home, at the workplace, or in retail stores).

=== What might ViVa record?

- Number of people who have passed by.
- Number of interactions.
- People's average emotion/mood.
- Most frequently asked questions, and the emotions captured before and after the given answer.
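The metrics above can be pictured as a simple per-session log. This is an illustrative sketch, not the project's actual data model; all names (`ViVaLog`, the score range) are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ViVaLog:
    """Hypothetical container for the metrics ViVa might record."""
    passersby: int = 0                                   # people who passed by
    interactions: int = 0                                # people who engaged with the robot
    moods: list = field(default_factory=list)            # assumed emotion scores in [-1, 1]
    questions: Counter = field(default_factory=Counter)  # counts per asked question

    def average_mood(self):
        """Average mood over all detected faces (0.0 if none seen)."""
        return sum(self.moods) / len(self.moods) if self.moods else 0.0

log = ViVaLog()
log.passersby += 3
log.moods.extend([0.8, -0.2, 0.1])
log.questions.update(["Euclid", "Project Alloy", "Euclid"])
print(log.average_mood())            # average mood over detected faces
print(log.questions.most_common(1))  # most frequently asked question
```

A real deployment would persist such records per hour so they can later be crossed against weather, schedule, and demographics.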

=== At the end of a period of time, what data would ViVACITY cross?

- Emotions vs. gender and age
- Emotions vs. weather
- Emotions vs. day of week
- Emotions vs. hour of day
- Emotions vs. food menu (business place)
- Emotions vs. company schedule
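Each of these crossings is, in essence, a group-by over the captured events. A minimal sketch, assuming hypothetical event tuples of (emotion score, weekday, hour, gender):

```python
from collections import defaultdict

# Hypothetical event records: (emotion score, weekday, hour of day, gender).
events = [
    ( 0.6, "Mon",  8, "F"),
    (-0.3, "Mon", 10, "M"),
    ( 0.4, "Tue",  8, "M"),
    (-0.5, "Tue", 10, "F"),
]

def cross(events, key_index):
    """Average emotion score grouped by one dimension of the record."""
    groups = defaultdict(list)
    for record in events:
        groups[record[key_index]].append(record[0])  # record[0] is the score
    return {key: sum(scores) / len(scores) for key, scores in groups.items()}

print(cross(events, 2))  # emotions vs. hour of day
print(cross(events, 1))  # emotions vs. day of week
```

The same function crosses emotions against any recorded dimension simply by changing the grouping index.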

=== For example, what relevant information could ViVACITY offer?

- Ex.: "In this department-store section, men spend more time than women."
- Ex.: "This store display has attracted more men in the 30-40 age range."
- Ex.: "At IDF 2016, people on the first day asked more about 'Euclid', 'Project Alloy', and 'Joule'. They were 55% happy, 10% concerned, 35% neutral."
- Ex.: "4 PM is the most positive hour at this workplace, while 10 AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVa uses an Intel RealSense camera to capture the user's facial expressions. By combining them, we can extract emotions and then classify them into three categories: Positive, Negative, and Neutral.
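One simple way to fold expression intensities into the three categories is a signed score followed by thresholding. This is only a sketch: the expression names, weights, and the 0.2 threshold are illustrative assumptions, not values from the project or the RealSense SDK.

```python
def emotion_score(expressions):
    """Crude signed score from hypothetical expression intensities in [0, 1]."""
    positive = expressions.get("smile", 0.0)
    negative = expressions.get("brow_furrow", 0.0) + expressions.get("frown", 0.0)
    return positive - negative

def classify(score, threshold=0.2):
    """Map a continuous score to Positive / Negative / Neutral.

    The threshold is an illustrative assumption.
    """
    if score > threshold:
        return "Positive"
    if score < -threshold:
        return "Negative"
    return "Neutral"

print(classify(emotion_score({"smile": 0.8})))                 # Positive
print(classify(emotion_score({"brow_furrow": 0.6})))           # Negative
print(classify(emotion_score({"smile": 0.25, "frown": 0.2})))  # Neutral
```

The dead zone around zero keeps weak or ambiguous expressions from being counted as sentiment either way.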

-Engagement:
To extract more emotions/expressions, it is necessary to increase engagement between the robot and the user. Thus, the robot should maximize interaction time by providing a service or an interesting display, or by using verbal or non-verbal communication.

=== Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, an avatar, or a physical robot.

=== What does ViVACITY use?

- Intel RealSense Camera R200, to capture emotions and perform face recognition.
- FreeLing, an open-source language analysis tool suite.
- Weka 3, open-source machine learning software for data mining.
- Protégé, a free, open-source ontology editor and framework for building intelligent systems.
- DBpedia, a community effort to extract structured information from Wikipedia and make it available on the Web.
- An interactive mirror (or an HTML5 avatar).

== ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and their environment. Its core is data mining and machine learning; the services provided are a pretext to stay close to people 24/7. The robot ViVa, which is part of ViVACITY, might be seen as an IoT node, but one that can interact with people and increase engagement like no other node, thereby increasing the quantity and quality of the data acquired.

== Experiments

A conceptual study was performed to demonstrate the practical use of the developed platform and its ability to recognize user emotions, quantify them, and provide data for further analysis. To demonstrate the platform, the robot was set up in a corridor of the department, collecting employees' emotions from 8 AM to 6 PM during the first 10 weekdays of November. Even if a worker did not interact with the robot, his/her expressions were captured and considered whenever the face was detected. The system calculates an emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing at CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.
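Section 3.3's score-based algorithm is not reproduced here, but the hourly aggregation can be sketched under a simple assumption: each captured expression contributes +1 (Positive), -1 (Negative), or 0 (Neutral), and the hourly score is the mean of those contributions. The capture data below is invented for illustration.

```python
from collections import defaultdict

# Hypothetical captures: (hour of day, classified expression), one per detection.
captures = [(8, "Positive"), (8, "Positive"), (8, "Negative"),
            (10, "Negative"), (10, "Neutral")]

VALUE = {"Positive": 1, "Negative": -1, "Neutral": 0}

def hourly_scores(captures):
    """Mean signed value of the expressions captured in each hour."""
    buckets = defaultdict(list)
    for hour, label in captures:
        buckets[hour].append(VALUE[label])
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

print(hourly_scores(captures))  # hour -> mean score, positive above 0, negative below
```

Averaging the hourly scores over a day then yields the per-day sentiment compared across the two weeks of the study.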

== Results

Fig. 8 shows the 10 weekdays and the average score for each hour of the day. For a closer look, Fig. 9 isolates the morning hours and their scores, while Fig. 10 isolates the afternoon hours. The 8 AM line appears as the morning hour at which sentiment is most positive, followed by 11 AM. The 9 AM and 10 AM lines show more negative scores.
Fig. 11 shows the average sentiment of each of the 10 days. The study spans two weeks, starting from a Monday. In the second week, the daily average score was positive every day, unlike the first week, in which only 2 days had a positive average.
