ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Paulo Pinheiro

Campinas, State of São Paulo

ViVACITY is a life monitoring system that runs hidden behind a robot.

Intel RealSense™, Robotics, Artificial Intelligence, Internet of Things

Description

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa), which can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learn from people's behavior, analyzing the environment and their emotions. ViVACITY's data mining algorithm can provide relevant information about people's behavior to managers at stores and workplaces. From time to time, ViVACITY crosses all the gathered data, such as emotions, hour of day, day of week, weather, age, gender, and the workplace's schedule, to present the patterns found and other meaningful information. ViVACITY comes embedded in the Intel RDK. *temporary name.

===What roles could ViVa play?

  • Interactive mirror.
  • Host (a robot that receives or entertains guests at home, in the workplace, or in retail stores).

===What might ViVa record?

- Number of people who have passed by.
- Number of interactions.
- People's average emotion/mood.
- Most frequently asked questions, and the emotions captured before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

- Emotions vs. gender and age
- vs. weather
- vs. day of week
- vs. hour of day
- vs. food menu (business place)
- vs. company's schedule
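The crossings above amount to grouping emotion observations by a context attribute and aggregating. A minimal sketch of that idea in plain Python, with an illustrative record schema (the field names and the +1/0/-1 scoring are assumptions, not ViVACITY's actual data model):

```python
from collections import defaultdict

def cross_tabulate(records, key_field, value_field="emotion_score"):
    """Group emotion scores by a context field (e.g. hour, weekday, weather)
    and return the average score per group."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key_field]].append(rec[value_field])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

# Example: scores of +1 (positive), 0 (neutral), -1 (negative)
records = [
    {"hour": 10, "weather": "rainy", "emotion_score": -1},
    {"hour": 10, "weather": "rainy", "emotion_score": 0},
    {"hour": 16, "weather": "sunny", "emotion_score": 1},
    {"hour": 16, "weather": "sunny", "emotion_score": 1},
]
print(cross_tabulate(records, "hour"))     # {10: -0.5, 16: 1.0}
print(cross_tabulate(records, "weather"))  # {'rainy': -0.5, 'sunny': 1.0}
```

The same function covers every "vs." pairing in the list: only the `key_field` changes.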

=== For example, what relevant information could ViVACITY offer?

- Ex.: "In this department store region, men spend more time than women."
- Ex.: "This store display has attracted more men in the 30-40 age range."
- Ex.: "At IDF 2016, people on the first day asked more about 'Euclid', 'Project Alloy', and 'Joule'. They were 55% happy, 10% concerned, 35% neutral."
- Ex.: "4 PM is the most positive hour at this workplace, while 10 AM is the most negative one."

=== How does ViVa capture emotions?

-Platform: ViVa uses an Intel RealSense camera to capture the user's facial expressions. By combining these expressions, the system extracts emotions and classifies them into three categories: Positive, Negative, and Neutral.
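One simple way to combine expression readings into the three categories is a weighted sum with a neutral band. The sketch below assumes hypothetical expression names and polarity weights; the actual expression set, weights, and threshold would come from the RealSense SDK and ViVACITY's tuning, neither of which is specified here:

```python
# Hypothetical polarity weights per detected expression (intensities in 0..1).
EXPRESSION_POLARITY = {"smile": 1.0, "laugh": 1.0, "frown": -1.0, "brow_furrow": -0.5}

def classify_emotion(expressions, threshold=0.2):
    """Combine facial-expression intensities into Positive, Negative, or Neutral.
    Scores within +/-threshold of zero are treated as Neutral."""
    score = sum(EXPRESSION_POLARITY.get(name, 0.0) * intensity
                for name, intensity in expressions.items())
    if score > threshold:
        return "Positive"
    if score < -threshold:
        return "Negative"
    return "Neutral"

print(classify_emotion({"smile": 0.8}))                # Positive
print(classify_emotion({"frown": 0.6, "smile": 0.1}))  # Negative
print(classify_emotion({}))                            # Neutral
```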

-Engagement: To extract more emotions/expressions, it is necessary to increase engagement between the robot and the user. Thus, the robot should maximize interaction time by providing a service, an interesting display, or verbal and non-verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, an avatar, or a physical robot.

==What does ViVACITY use?

- Intel RealSense Camera R200, to capture emotions and perform face recognition.
- FreeLing, an open-source language analysis tool.
- Weka 3, open-source machine learning software for data mining.
- Protégé, a free, open-source ontology editor and framework for building intelligent systems.
- DBpedia, a community effort to extract structured information from Wikipedia and make it available on the Web.
- Interactive mirror (or HTML5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and their environment. Its core is data mining and machine learning; the services provided are a pretext for staying close to people 24/7. The robot ViVa, which is part of ViVACITY, might be seen as an IoT node, but one that can interact with people and increase engagement like no other node, thus increasing the quantity and quality of the data acquired.

==Experiments

A conceptual study was performed to demonstrate the practical use of the developed platform and its ability to recognize user emotions, quantify them, and provide data for later analysis. The robot was set up in a corridor of the department, collecting employees' emotions from 8 AM to 6 PM during the first 10 weekdays of November. Even if a worker did not interact with the robot, expressions were captured and considered whenever his or her face was detected. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotics and Visual Computing at CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.
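The hourly aggregation described above can be sketched as follows. This is only an illustration of bucketing timestamped category labels by hour; the actual score-based algorithm is the one defined in Section 3.3, and the +1/0/-1 mapping is an assumption:

```python
from collections import defaultdict
from datetime import datetime

CATEGORY_SCORE = {"Positive": 1, "Neutral": 0, "Negative": -1}

def hourly_scores(observations):
    """Average the scores of all captured expressions per hour of day.
    `observations` is a list of (timestamp, category) pairs."""
    buckets = defaultdict(list)
    for timestamp, category in observations:
        buckets[timestamp.hour].append(CATEGORY_SCORE[category])
    return {hour: sum(s) / len(s) for hour, s in sorted(buckets.items())}

obs = [
    (datetime(2016, 11, 1, 8, 5), "Positive"),
    (datetime(2016, 11, 1, 8, 40), "Neutral"),
    (datetime(2016, 11, 1, 9, 15), "Negative"),
]
print(hourly_scores(obs))  # {8: 0.5, 9: -1.0}
```

Averaging the hourly scores across the 10 weekdays then yields the per-hour lines discussed in the Results section.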

==Results

Fig. 8 shows the 10 weekdays and the average score for each hour of the day. For a closer look, Fig. 9 isolates the morning hours and their scores, while Fig. 10 isolates the afternoon hours. The 8 AM line appears as the morning hour at which sentiment is most positive, followed by 11 AM. The 9 AM and 10 AM lines show more negative scores. Fig. 11 shows the average sentiment of each day across the 10 days. The study comprises two weeks, starting on a Monday. In the second week, the daily score average was positive every day, unlike the first week, in which only 2 days were positive on average.

Gallery

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium 02893b76 8c2b 4cfd 9c44 6f3c310a39b5

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, avatar or physical robot.

==What does ViVACITY use?

-Intel RealSense Camera R200 to capture emotions and perform face recognition.
-FreeLing, an open source language analysis.
-Weka 3, data mining with open source machine learning software.
-Protégé, a free, open-source ontology editor and framework for building intelligent systems.
-DPBedia, a community effort to extract structured information from Wikipedia and to make this information available on the Web.
-Interactive mirror (or HTML 5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and environment. Its core is on data mining and machine learning. The services provided are a pretext to be closer to the people 24/7. The robot ViVa which makes part of ViVACITY might be seen as an IoT node, but a node that might interact with people increasing the engagement like no other node, thus increasing the quantity/quality of data acquired.

==Experiments

A conceptual study was performed with the purpose of demonstrating the practical use of the developed platform and its ability to recognize user emotions, to quantify them and to provide data for potential analysis. To demonstrate the platform, the robot was setup on the corridor at the department collecting emotions of the employee from 8 AM to 6 PM during the first 10 weekdays of the month of November. Even if the worker did not interact with the robot, but his/her face was detected, the expressions were captured and considered. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing in CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.

==Results

Fig. 8 shows 10 weekdays and the score average for each hour of the day. For a closer look, Fig. 9 insulates the morning hours and their score, while Fig. 10 insulates the afternoon hours. The 8 AM line appears as the morning time of day which the sentiment is more positive, followed by 11 AM. The 9 AM and 10 AM lines feature more negative score.
Fig. 11 shows the average sentiment of each day along the 10 days. The study comprises two weeks, starting from Monday. On the second week, the daily score average was positive for every day. Unlike the first week, which on average, only 2 days were positive.

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium bab74d26 e2c8 48b6 ad07 b602cd2e04b4

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, avatar or physical robot.

==What does ViVACITY use?

-Intel RealSense Camera R200 to capture emotions and perform face recognition.
-FreeLing, an open source language analysis.
-Weka 3, data mining with open source machine learning software.
-Protégé, a free, open-source ontology editor and framework for building intelligent systems.
-DPBedia, a community effort to extract structured information from Wikipedia and to make this information available on the Web.
-Interactive mirror (or HTML 5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and environment. Its core is on data mining and machine learning. The services provided are a pretext to be closer to the people 24/7. The robot ViVa which makes part of ViVACITY might be seen as an IoT node, but a node that might interact with people increasing the engagement like no other node, thus increasing the quantity/quality of data acquired.

==Experiments

A conceptual study was performed with the purpose of demonstrating the practical use of the developed platform and its ability to recognize user emotions, to quantify them and to provide data for potential analysis. To demonstrate the platform, the robot was setup on the corridor at the department collecting emotions of the employee from 8 AM to 6 PM during the first 10 weekdays of the month of November. Even if the worker did not interact with the robot, but his/her face was detected, the expressions were captured and considered. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing in CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.

==Results

Fig. 8 shows 10 weekdays and the score average for each hour of the day. For a closer look, Fig. 9 insulates the morning hours and their score, while Fig. 10 insulates the afternoon hours. The 8 AM line appears as the morning time of day which the sentiment is more positive, followed by 11 AM. The 9 AM and 10 AM lines feature more negative score.
Fig. 11 shows the average sentiment of each day along the 10 days. The study comprises two weeks, starting from Monday. On the second week, the daily score average was positive for every day. Unlike the first week, which on average, only 2 days were positive.

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium 4cf33be6 2833 47df 8aa9 c87afe7646dd

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, avatar or physical robot.

==What does ViVACITY use?

-Intel RealSense Camera R200 to capture emotions and perform face recognition.
-FreeLing, an open source language analysis.
-Weka 3, data mining with open source machine learning software.
-Protégé, a free, open-source ontology editor and framework for building intelligent systems.
-DPBedia, a community effort to extract structured information from Wikipedia and to make this information available on the Web.
-Interactive mirror (or HTML 5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and environment. Its core is on data mining and machine learning. The services provided are a pretext to be closer to the people 24/7. The robot ViVa which makes part of ViVACITY might be seen as an IoT node, but a node that might interact with people increasing the engagement like no other node, thus increasing the quantity/quality of data acquired.

==Experiments

A conceptual study was performed with the purpose of demonstrating the practical use of the developed platform and its ability to recognize user emotions, to quantify them and to provide data for potential analysis. To demonstrate the platform, the robot was setup on the corridor at the department collecting emotions of the employee from 8 AM to 6 PM during the first 10 weekdays of the month of November. Even if the worker did not interact with the robot, but his/her face was detected, the expressions were captured and considered. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing in CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.

==Results

Fig. 8 shows 10 weekdays and the score average for each hour of the day. For a closer look, Fig. 9 insulates the morning hours and their score, while Fig. 10 insulates the afternoon hours. The 8 AM line appears as the morning time of day which the sentiment is more positive, followed by 11 AM. The 9 AM and 10 AM lines feature more negative score.
Fig. 11 shows the average sentiment of each day along the 10 days. The study comprises two weeks, starting from Monday. On the second week, the daily score average was positive for every day. Unlike the first week, which on average, only 2 days were positive.

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium 2967ec2b bb3c 4728 ab75 54d910884a16

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, avatar or physical robot.

==What does ViVACITY use?

-Intel RealSense Camera R200 to capture emotions and perform face recognition.
-FreeLing, an open source language analysis.
-Weka 3, data mining with open source machine learning software.
-Protégé, a free, open-source ontology editor and framework for building intelligent systems.
-DPBedia, a community effort to extract structured information from Wikipedia and to make this information available on the Web.
-Interactive mirror (or HTML 5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and environment. Its core is on data mining and machine learning. The services provided are a pretext to be closer to the people 24/7. The robot ViVa which makes part of ViVACITY might be seen as an IoT node, but a node that might interact with people increasing the engagement like no other node, thus increasing the quantity/quality of data acquired.

==Experiments

A conceptual study was performed with the purpose of demonstrating the practical use of the developed platform and its ability to recognize user emotions, to quantify them and to provide data for potential analysis. To demonstrate the platform, the robot was setup on the corridor at the department collecting emotions of the employee from 8 AM to 6 PM during the first 10 weekdays of the month of November. Even if the worker did not interact with the robot, but his/her face was detected, the expressions were captured and considered. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing in CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.

==Results

Fig. 8 shows 10 weekdays and the score average for each hour of the day. For a closer look, Fig. 9 insulates the morning hours and their score, while Fig. 10 insulates the afternoon hours. The 8 AM line appears as the morning time of day which the sentiment is more positive, followed by 11 AM. The 9 AM and 10 AM lines feature more negative score.
Fig. 11 shows the average sentiment of each day along the 10 days. The study comprises two weeks, starting from Monday. On the second week, the daily score average was positive for every day. Unlike the first week, which on average, only 2 days were positive.

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium 5f270915 04e2 4f15 a354 0f2496a5d2ae

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa could be an interactive mirror, avatar or physical robot.

==What does ViVACITY use?

-Intel RealSense Camera R200 to capture emotions and perform face recognition.
-FreeLing, an open source language analysis.
-Weka 3, data mining with open source machine learning software.
-Protégé, a free, open-source ontology editor and framework for building intelligent systems.
-DPBedia, a community effort to extract structured information from Wikipedia and to make this information available on the Web.
-Interactive mirror (or HTML 5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting important data from the relationship between people and environment. Its core is on data mining and machine learning. The services provided are a pretext to be closer to the people 24/7. The robot ViVa which makes part of ViVACITY might be seen as an IoT node, but a node that might interact with people increasing the engagement like no other node, thus increasing the quantity/quality of data acquired.

==Experiments

A conceptual study was performed with the purpose of demonstrating the practical use of the developed platform and its ability to recognize user emotions, to quantify them and to provide data for potential analysis. To demonstrate the platform, the robot was setup on the corridor at the department collecting emotions of the employee from 8 AM to 6 PM during the first 10 weekdays of the month of November. Even if the worker did not interact with the robot, but his/her face was detected, the expressions were captured and considered. The system calculates the emotion score for each hour based on the score-based algorithm described in Section 3.3. The study was held at the Division of Robotic and Visual Computing in CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.

==Results

Fig. 8 shows 10 weekdays and the score average for each hour of the day. For a closer look, Fig. 9 insulates the morning hours and their score, while Fig. 10 insulates the afternoon hours. The 8 AM line appears as the morning time of day which the sentiment is more positive, followed by 11 AM. The 9 AM and 10 AM lines feature more negative score.
Fig. 11 shows the average sentiment of each day along the 10 days. The study comprises two weeks, starting from Monday. On the second week, the daily score average was positive for every day. Unlike the first week, which on average, only 2 days were positive.

11080561 764235763645452 7344272183143184984 o

Paulo P. added photos to project ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

Medium afcb42e8 5d81 42ee 840f 99af14b3e71d

ViVACITY - A Life Monitoring System Hidden Behind an AI Robot.

ViVACITY is a life monitoring system that runs hidden behind an AI robot (ViVa) that can be an interactive mirror or an avatar. ViVa is a robot that uses an Intel RealSense camera to learning from people's behaviors, analyzing the environment and emotions. The ViVACITY's data mining algorithm can provide relevant information about people behaviors to the managers at stores and workplaces. From time to time, ViVACITY will cross all gathered data such as: emotions, hour of day, day of week, weather, age, gender, workplace's schedule to present the findings pattern and meaningful information. The ViVACITY comes embedded into the Intel RDK.
*temporary name.

===What roles could ViVa play?

- Interactive mirror.
- Host (a robot which receives or entertains guests at home or workplace or retail stores)

===What might ViVa record?

-Number of people have passed by.
-Number of interactions.
-Average people's emotion/mood.
-Most frequently asked questions and captured emotions before and after the given answer.

=== At the end of a period of time, what data would ViVACITY cross?

-Emotions vs. Gender and Age
-vs. Weather
-vs. Day of week
-vs. Hours of day
-vs. Food menu (business place)
-vs. Company's schedule

=== For example, what relevant information could ViVACITY offer?

-Ex: "At this department store region men spend more time than women".
-Ex: "This display store has attracted more men at the 30-40 years old range."
-Ex: "At IDF 2016, people on the first day have asked more about "Euclid", "Project Alloy", "Joule".
They were 55% happy, 10% concerned, 35% neutral."
-Ex: "4PM is the most positive hour at this workplace, while 10AM is the most negative one."

=== How does ViVa capture emotions?

-Platform:
ViVA uses a Intel RealSense camera to capture facial expressions of the user. Combining them we can extract emotions and then classifying them into three categories: Positive, Negative and Neutral.

-Engagement:
To extract more emotions/expressions is necessary to increase engagement between the robot and user. Thus, the robot should be able to maximize the interaction time providing a service, an interesting display or using non-verbal or verbal communication.

==Is ViVa an avatar or a physical robot?

ViVa can be an interactive mirror, an avatar, or a physical robot.

==What does ViVACITY use?

- Intel RealSense Camera R200, to capture emotions and perform face recognition.
- FreeLing, an open-source language analysis library.
- Weka 3, open-source machine learning software for data mining.
- Protégé, a free, open-source ontology editor and framework for building intelligent systems.
- DBpedia, a community effort to extract structured information from Wikipedia and make it available on the Web.
- An interactive mirror (or an HTML5 avatar).

==ViVACITY vs. Avatar for services

ViVACITY is about extracting meaningful data from the relationship between people and their environment. Its core is data mining and machine learning; the services it provides are a pretext to stay close to people 24/7. The ViVa robot, which is part of ViVACITY, can be seen as an IoT node, but one that interacts with people and drives engagement like no other node, thereby increasing the quantity and quality of the data acquired.

==Experiments

A conceptual study was performed to demonstrate the practical use of the developed platform and its ability to recognize user emotions, quantify them, and provide data for analysis. The robot was set up in a corridor of the department, collecting employees' emotions from 8 AM to 6 PM during the first 10 weekdays of November. Even when a worker did not interact with the robot, if his or her face was detected, the expressions were captured and considered. The system calculates an emotion score for each hour using the score-based algorithm described in Section 3.3. The study was held at the Division of Robotics and Visual Computing at CTI (Centro de Tecnologia da Informação Renato Archer) in Campinas, Brazil.
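The score-based algorithm itself is not reproduced here, but one plausible scheme, weighting each reading and averaging per hour, could look like this (the +1/-1/0 weights are an assumption, not necessarily the weights of Section 3.3):

```python
def hourly_score(readings):
    """Hypothetical hourly score: +1 per Positive, -1 per Negative, 0 per
    Neutral reading, averaged so the result lies in [-1, 1]."""
    weights = {"Positive": 1, "Negative": -1, "Neutral": 0}
    if not readings:
        return 0.0
    return sum(weights[r] for r in readings) / len(readings)

print(hourly_score(["Positive", "Neutral", "Negative", "Positive"]))  # 0.25
```

Under this scheme an hour with mostly smiling faces trends toward +1, and an empty hour scores a neutral 0.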

==Results

Fig. 8 shows the 10 weekdays and the average score for each hour of the day. For a closer look, Fig. 9 isolates the morning hours and their scores, while Fig. 10 isolates the afternoon hours. The 8 AM line appears as the morning hour with the most positive sentiment, followed by 11 AM. The 9 AM and 10 AM lines show more negative scores.
Fig. 11 shows the average sentiment of each of the 10 days. The study spans two weeks, starting on a Monday. In the second week, the daily average score was positive every day; in the first week, by contrast, only 2 days were positive on average.
