U.S. patent application number 16/507621 was filed with the patent office on 2019-07-10 and published on 2019-10-31 for method for performing emotional gestures by a device to interact with a user.
This patent application is currently assigned to Intuition Robotics, Ltd. The applicant listed for this patent is Intuition Robotics, Ltd. Invention is credited to Roy AMIR, Islam HERALLA, Itai MENDELSOHN, and Dor SKULER.
Publication Number | 20190329417
Application Number | 16/507621
Family ID | 62839549
Filed | 2019-07-10
Published | 2019-10-31
United States Patent Application | 20190329417
Kind Code | A1
AMIR; Roy; et al. | October 31, 2019
METHOD FOR PERFORMING EMOTIONAL GESTURES BY A DEVICE TO INTERACT WITH A USER
Abstract
A method for performing an emotional gesture by a device based
on a received electronic message, including receiving an electronic
message by the device, where the received electronic message
includes content and metadata; determining an intent of a sender of
the received electronic message based on the content and the
metadata; determining at least one emotional gesture based on the
determined intent; and performing the determined at least one
emotional gesture.
Inventors: | AMIR; Roy (Mikhmoret, IL); MENDELSOHN; Itai (Tel Aviv-Yafo, IL); SKULER; Dor (Oranit, IL); HERALLA; Islam (Kfar Kasem, IL)
Applicant: | Intuition Robotics, Ltd. (Ramat-Gan, IL)
Assignee: | Intuition Robotics, Ltd. (Ramat-Gan, IL)
Family ID: | 62839549
Appl. No.: | 16/507621
Filed: | July 10, 2019
Related U.S. Patent Documents

Application Number | Filing Date
PCT/US2018/012923 (parent, continued by present application) | Jan 9, 2018
16507621 (present application) | Jul 10, 2019
62444384 (provisional) | Jan 10, 2017
62444386 (provisional) | Jan 10, 2017
Current U.S. Class: | 1/1
Current CPC Class: | B25J 11/001 (20130101); G06F 3/011 (20130101); B25J 13/003 (20130101); G06F 2203/011 (20130101); B25J 13/087 (20130101); B25J 13/081 (20130101); H04L 51/10 (20130101); B25J 13/086 (20130101)
International Class: | B25J 11/00 (20060101); H04L 12/58 (20060101)
Claims
1. A method for performing an emotional gesture by a device based
on a received electronic message, comprising: receiving an
electronic message by the device, wherein the received electronic
message includes content and metadata; determining an intent of a
sender of the received electronic message based on the content and
the metadata; determining at least one emotional gesture based on
the determined intent; and performing the determined at least one
emotional gesture.
2. The method of claim 1, wherein the at least one emotional
gesture comprises an electro-mechanical gesture.
3. The method of claim 1, further comprising: identifying the
sender of the received electronic message, wherein the determining
at least one emotional gesture is further based on the identified
sender.
4. The method of claim 1, further comprising: analyzing user data,
wherein the determining at least one emotional gesture is further
based on the analyzed user data.
5. The method of claim 4, wherein the user data includes at least
one of: real-time data associated with a user, and historical data
associated with a user.
6. The method of claim 4, further comprising: determining an
interaction objective based on the received electronic message and
the user data, wherein the determining at least one emotional
gesture is further based on the interaction objective.
7. The method of claim 6, wherein the interaction objective
includes a desired goal related to a determined emotional state of
a user.
8. The method of claim 1, further comprising: sending a user
response to the sender based on the determined at least one
emotional gesture.
9. A non-transitory computer readable medium having stored thereon
instructions for causing a processing circuitry to perform the
method of claim 1.
10. A device including at least a processing circuitry and
configured to perform the method of claim 1.
11. A method for performing an emotional gesture by a device based
on a recommendation, comprising: analyzing a received electronic
recommendation and data related to a user to determine at least one
action to be performed based on a current emotional state of a
user; performing the at least one action; and determining a user
response to the at least one action based on data collected from at
least one sensor.
12. The method of claim 11, wherein the analyzing a received
electronic recommendation and data related to a user includes
determining a user's emotional state and wherein the at least one
action is further based on the user's emotional state.
13. The method of claim 11, wherein the at least one action
includes at least one of: an emotional gesture, an
electro-mechanical gesture, and playing multimedia content.
14. The method of claim 11, wherein the data related to the user
includes at least one of: real-time data associated with the user,
and historical data associated with the user.
15. The method of claim 11, further comprising: storing the
determined user response in a storage.
16. The method of claim 11, wherein the recommendation is received
from at least one of: a person and a recommendation generator.
17. A non-transitory computer readable medium having stored thereon
instructions for causing a processing circuitry to perform the
method of claim 11.
18. A device including at least a processing circuitry and
configured to perform the method of claim 11.
19. A method for performing an emotional gesture by a device based
on a determined user emotional state, comprising: determining a
current emotional state of a user based on real-time indicators;
selecting at least one emotional gesture based on the determined
current emotional state of the user; and performing the at least
one emotional gesture.
20. The method of claim 19, wherein the real-time indicators are
received from a plurality of sensors.
21. The method of claim 19, wherein the sensors include at least
one of: an environmental sensor, a camera, a microphone, a motion
detector, a proximity sensor, a light sensor, a temperature sensor,
and a touch detector.
22. The method of claim 19, wherein the current emotional state of
the user is determined based on a comparison between the real-time
indicators and a plurality of previously determined user
profiles.
23. The method of claim 22, wherein the previously determined user
profiles include parameters, and wherein the parameters include at
least one of: an amount of time a user has been determined to be
idle, an amount of time between movements of a user, and an amount
of time between conversations involving a user.
24. A non-transitory computer readable medium having stored thereon
instructions for causing a processing circuitry to perform the
method of claim 19.
25. A device including at least a processing circuitry and
configured to perform the method of claim 19.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of International
Application No. PCT/US2018/012923, filed Jan. 9, 2018, which claims
the benefit of U.S. Provisional Patent Application No. 62/444,384
and U.S. Provisional Patent Application No. 62/444,386, both filed
on Jan. 10, 2017, the contents of which are hereby incorporated by
reference.
TECHNICAL FIELD
[0002] The present disclosure relates generally to electronic
devices, and more specifically to electronic devices designed to
perform electro-mechanical gestures that portray particular
emotions.
BACKGROUND
[0003] Electronic devices, including personal electronic devices
such as smartphones, tablet computers, consumer robots, and the
like, have recently been designed with ever-increasing
capabilities. Such capabilities fall within a wide range,
including, for example, automatically cleaning or vacuuming a
floor, playing high definition video clips, identifying a user by a
fingerprint detector, running applications with multiple uses,
accessing the internet from various locations, and the like.
[0004] In recent years, advancements in microelectronics,
computing, and control theory, along with the availability of
electro-mechanical and hydro-mechanical servomechanisms, among
other factors, have been key drivers of robotics evolution, giving
rise to a new generation of automatons known as social robots. Social
robots can conduct what appears to be emotional and cognitive
activities, interacting and communicating with people in a simple
and pleasant manner following a series of behaviors, patterns and
social norms. Advancements in the field of robotics have included
the development of biped robots with human appearances that
facilitate interaction between the robots and humans by introducing
anthropomorphic human traits in the robots. The robots often
include a precise mechanical structure allowing for specific
physical locomotion and handling skill.
[0005] Although social robots have sensory systems to perceive the
surrounding environment and are capable of interacting with human
beings, the self-expressions they are currently programmed to
display remain limited. Current social robots' performances include
simple direct responses to a user's actions. For example, these
responses may include performing a movement or series of movements
based on predetermined paths. Vacuum robots employ such
predetermined paths in order to efficiently maximize coverage of a
floor plan and may run based on a user determined schedule.
Responses may further include predetermined movements when
encountering a known obstacle, which may be employed by biped
robots to maneuver a course. However, these responses are difficult
to employ when the desired application of the robot is to directly
respond to a user's queries or to determine and respond to a user's
mood. Without the ability to provide gestures and movements that
appear as emotional in nature, robots become less appealing to many
users, especially those who are less familiar with robotic
technology.
[0006] It would therefore be advantageous to provide a solution
that would overcome the challenges noted above.
SUMMARY
[0007] A summary of several example embodiments of the disclosure
follows. This summary is provided for the convenience of the reader
to provide a basic understanding of such embodiments and does not
wholly define the breadth of the disclosure. This summary is not an
extensive overview of all contemplated embodiments, and is intended
to neither identify key or critical elements of all embodiments nor
to delineate the scope of any or all aspects. Its sole purpose is
to present some concepts of one or more embodiments in a simplified
form as a prelude to the more detailed description that is
presented later. For convenience, the term "certain embodiments"
may be used herein to refer to a single embodiment or multiple
embodiments of the disclosure.
[0008] Certain embodiments disclosed herein include a device for
performing emotional gestures to interact with a user. The device
includes a base; a controller; a first body portion pivotally
connected to the base, the first body portion having a first
aperture; an electro-mechanical member disposed within the first
body portion and connected to the controller; and a second body
portion connected to the electro-mechanical member, the second body
portion having a second aperture. The electro-mechanical member is
configured to extend from the first body portion through the first
aperture to the second body portion through the second aperture and
the controller is configured to control movements of the
electro-mechanical member and the first body portion, where the
movements include emotional gestures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The subject matter disclosed herein is particularly pointed
out and distinctly claimed in the claims at the conclusion of the
specification. The foregoing and other objects, features, and
advantages of the disclosed embodiments will be apparent from the
following detailed description taken in conjunction with the
accompanying drawings.
[0010] FIG. 1A is a schematic diagram of a device for performing
emotional gestures according to an embodiment.
[0011] FIG. 1B is a schematic diagram of a device for performing
emotional gestures with a user device attached thereto, according
to an embodiment.
[0012] FIG. 2 is an exploded view of the first body portion of the
device illustrated in FIGS. 1A and 1B, according to an
embodiment.
[0013] FIG. 3 is a perspective view of the first body portion and
the second body portion of the device illustrated in FIGS. 1A and
1B, according to an embodiment.
[0014] FIG. 4 is an exploded perspective view of the second body
portion of the device illustrated in FIGS. 1A and 1B according to
an embodiment.
[0015] FIG. 5 is a block diagram of a controller for controlling a
device for performing emotional gestures.
[0016] FIG. 6 is a flowchart of a method for performing an
emotional gesture based on a received electronic message according
to an embodiment.
[0017] FIG. 7 is a flowchart of a method for performing an
emotional gesture based on a recommendation according to an
embodiment.
[0018] FIG. 8 is a flowchart of a method for performing an
emotional gesture based on a determined user emotional state,
according to an embodiment.
DETAILED DESCRIPTION
[0019] It is important to note that the embodiments disclosed
herein are only examples of the many advantageous uses of the
innovative teachings herein. In general, statements made in the
specification of the present application do not necessarily limit
any of the various claimed embodiments. Moreover, some statements
may apply to some inventive features but not to others. In general,
unless otherwise indicated, singular elements may be in plural and
vice versa with no loss of generality. In the drawings, like
numerals refer to like parts throughout the several views.
[0020] The various disclosed embodiments include a device
configured to perform and display gestures that may be interpreted
as emotional gestures by a user. The device includes a base
connected to a first body portion, where the first body portion is
rotatable relative to the base. A second body portion is placed
above the first body portion and is attached thereto via an
electro-mechanical arm.
[0021] FIG. 1A is an example schematic diagram of a device 100 for
performing emotional gestures according to an embodiment. The
device 100 comprises a base 110, which may include therein a
variety of electronic components, hardware components, and the
like. The base 110 may further include a volume control 180, a
speaker 190, and a microphone 195.
[0022] A first body portion 120 may be mounted to the base 110
within a ring 170 designed to accept the first body portion 120
therein. The first body portion 120 may include a hollow hemisphere
mounted above a hollow cylinder, although other appropriate bodies
and shapes may be used, provided the base is configured to fit into
the ring 170. A first aperture 125 crossing through the apex of the
hemisphere of the first body portion 120 provides access into and
out of the hollow interior volume of the first body portion 120.
The first body portion 120 is mounted to the base 110 within the
confinement of the ring 170 such that it may rotate about its
vertical axis of symmetry, i.e., an axis extending perpendicular from
the base. For example, the first body portion 120 rotates clockwise
or counterclockwise relative to the base 110. The rotation of the
first body portion 120 about the base 110 may be achieved by, for
example, a motor (not shown) mounted to the base 110 or a motor
(not shown) mounted within the hollow of the first body portion
120.
[0023] The device 100 further includes a second body portion 140.
The second body portion 140 may additionally include a hollow
hemisphere mounted onto a hollow cylindrical portion, although
other appropriate bodies may be used. A second aperture 145 is
located at the apex of the hemisphere of the second body portion
140. When assembled, the second aperture 145 is positioned to align
with the first aperture 125.
[0024] The second body portion 140 is mounted to the first body
portion 120 by an electro-mechanical member (not shown in FIG. 1)
placed within the hollow of the first body portion 120 and
protruding into the hollow of the second body portion 140 through
the first aperture 125 and the second aperture 145.
[0025] In an embodiment, the electro-mechanical member enables
motion of the second body portion 140 with respect to the first
body portion 120 in a manner that imitates at least one emotional
gesture understandable to a human user. The combined motion of the
second body portion 140 with respect to the first body portion 120
and the first body portion 120 with respect to the base 110 is
configured to correspond to one or more of a plurality of
predetermined emotional gestures capable of being presented by such
movement. A head camera assembly (not shown) may be embedded within
the second body portion 140. The head camera assembly comprises at
least one image capturing sensor that allows capturing images and
videos.
[0026] The base 110 may be further equipped with a stand 160 that
is designed to provide support to a user device, such as a portable
computing device. The stand 160 may include two vertical support
pillars that may include therein electronic elements. Examples of
such elements include wires, sensors, charging cables, wireless
charging components, and the like, which may be configured to
communicatively connect the stand to the user device.
[0027] In an embodiment, a camera assembly 165 is embedded within a
top side of the stand 160. The camera assembly 165 includes at
least one image capturing sensor.
[0028] According to some embodiments, shown in FIG. 1B, a user
device 150 is shown supported by the stand 160. The user device 150
may include a portable electronic device including a smartphone, a
mobile phone, a tablet computer, a wearable device, and the like.
The device 100 is configured to communicate with the user device
150 via a controller (not shown). The user device 150 may further
include at least a display unit used to display content, e.g.,
multimedia. According to an embodiment, the user device 150 may
also include sensors, e.g., a camera, a microphone, a light sensor,
and the like. The input identified by the sensors of the user
device 150 may be relayed to the controller of the device 100 to
determine whether one or more electro-mechanical gestures are to be
performed.
[0029] Returning to FIG. 1A, the device 100 may further include an
audio system, including, e.g., a speaker 190. The speaker 190 in
one embodiment is embedded in the base 110. The audio system may be
utilized to, for example, play music, make alert sounds, play voice
messages, and other audio or audiovisual signals generated by the
device 100. The microphone 195, also part of the audio
system, may be adapted to receive voice instructions from a
user.
[0030] The device 100 may further include an illumination system
(not shown). Such a system may be implemented using, for example,
one or more light emitting diodes (LEDs). The illumination system
may be configured to enable the device 100 to support emotional
gestures and relay information to a user, e.g., by blinking or
displaying a particular color. For example, an incoming message may
be indicated on the device by an LED pulsing green light. The LEDs
of the illumination system may be placed on the base 110, on the
ring 170, or on or within the first or second body portions 120, 140
of the device 100.
[0031] Emotional gestures understood by humans are, for example and
without limitation, gestures such as: slowly tilting a head
downward towards a chest in an expression interpreted as being
sorry or ashamed; tilting the head to the left or right towards the
shoulder as an expression of posing a question; nodding the head
upwards and downwards vigorously as indicating enthusiastic
agreement; shaking a head from side to side as indicating
disagreement, and so on. A profile of a plurality of emotional
gestures may be compiled and used by the device 100.
[0032] In an embodiment, the device 100 is configured to relay
similar emotional gestures by movements of the first body portion
120 and the second body portion 140 relative to each other and to
the base 110. The emotional gestures may be predefined movements
that mimic or are similar to certain gestures of humans. Further,
the device may be configured to direct the gesture toward a
particular individual within a room. For example, for an emotional
gesture of expressing agreement towards a particular user who is
moving from one side of a room to another, the first body portion
120 may perform movements that track the user, such as a rotation
about a vertical axis relative to the base 110, while the second
body portion 140 may move upwards and downwards relative to the
first body portion 120 to mimic a nodding motion.
[0033] FIG. 2 shows an example exploded view of the first body
portion 120 of the device illustrated in FIGS. 1A and 1B, according to an
embodiment. The first body portion 120 includes an
electro-mechanical member 130 used by the device 100 for performing
emotional gestures. The electro-mechanical member 130 is mounted
within the hollow of the first body portion 120 and at least
partially protrudes through the first aperture 125. The
electro-mechanical member 130 may further be connected to the base
110, such that the first body portion 120 may be moved relative
thereto.
[0034] The electro-mechanical member 130 is configured to control
movements of the first body portion 120 and the second body portion
140 and includes an assembly of a plurality of mechanical elements
that in combination enable such motion. The plurality of mechanical
elements may include a variety of combinations, such as, but not
limited to, shafts, axles, pivots, wheels, cogwheels, poles, and
belts. The electro-mechanical member 130 may be connected to one or
more electric motors (not shown) configured to rotate the first
body portion 120 about a vertical axis.
[0035] In an example embodiment, the electric motor of the
electro-mechanical member 130 may be physically connected to the
first body portion 120 by an axis that enables a complete
360-degree spin. In another example embodiment, an arm 121 of the
electro-mechanical member 130 may be connected to the electric
motor and extend through the first aperture 125 to be physically
connected to the second body portion 140 by an axis that enables
movement of the second body portion 140 via the arm 121, e.g.,
moving the second body portion upwards, downwards, forwards, and
backwards. The arm 121 may include a narrow portion configured to
fit within the first aperture 125 and the second aperture 145, such
that the first body portion 120 and second body portion 140 may be
connected through the arm 121. Additional components within the
first body portion 120 may include a connector 123 adapted to
connect the electro-mechanical member 130 to a controller (not
shown) within the device 100. In an embodiment, the electric motor
is connected to a spring system 122 that is configured to allow for
smooth movements of the arm, and, in turn, the second body portion
140, without the use of cogwheels or gears.
[0036] The combined movements of the first body portion 120 and the
second body portion 140 may be configured to perform diverse
emotional gestures. For example, the first body portion 120 may
rotate right while the second body portion 140 performs a tilting
movement, which may be interpreted as posing a question to a user
detected to be positioned to the right of the device 100.
[0037] FIG. 3 is an example perspective view of the first body
portion 120 and the second body portion 140 of the device
illustrated in FIGS. 1A and 1B, according to an embodiment. When
assembled, the arm 121 protrudes from the first aperture 125 and
extends through the second aperture 145. A bearing assembly 142 may
be secured to the top of the arm 121 and configured to hold the arm
121 in place within the hollow of the second body portion 140.
[0038] FIG. 4 is an example exploded perspective view of the second
body portion 140 of the device for performing emotional gestures
according to an embodiment. When fully assembled, the arm 121
extends into the interior volume of the hollow second body portion
140 and is attached thereto. In an embodiment, a head motor
assembly 143 is disposed within the second body portion 140 and
connected to the arm 121. The head motor assembly 143 may be
configured to allow for additional movement of the second body
portion 140 with respect to the first body portion 120. A connector
144 allows for a connection between the head motor assembly 143 and
a controller (not shown). In an embodiment, a motherboard 146 is in
communication with the head motor assembly 143 and the connector
144, and is configured to be controlled via the controller.
[0039] FIG. 5 is an example block diagram of a controller 500 of
the device 100 implemented according to an embodiment. In an
embodiment, the controller 500 is disposed within the base 110 of
the device 100. In another embodiment, the controller 500 is placed
within the hollow of the first body portion 120 or the second body
portion 140 of the device 100. The controller 500 includes a
processing circuitry 510 that is configured to control at least the
motion of the various electro-mechanical segments of the device
100. The processing circuitry 510 may be realized as one or more
hardware logic components and circuits. For example, and without
limitation, illustrative types of hardware logic components that
can be used include field programmable gate arrays (FPGAs),
application-specific integrated circuits (ASICs),
application-specific standard products (ASSPs), system-on-a-chip
systems (SOCs), general-purpose microprocessors, microcontrollers,
digital signal processors (DSPs), and the like, or any other
hardware logic components that can perform calculations or other
manipulations of information.
[0040] The controller 500 further includes a memory 520. The memory
520 may contain therein instructions that, when executed by the
processing circuitry 510, cause the controller 500 to execute
actions such as performing a motion of one or more portions of
the device 100, receiving an input from one or more sensors, displaying
a light pattern, and the like. According to an embodiment, the
memory 520 may store therein user information, e.g., data
associated with a user's behavior pattern. The memory 520 is
further configured to store software. Software shall be construed
broadly to mean any type of instructions, whether referred to as
software, firmware, middleware, microcode, hardware description
language, or otherwise. Instructions may include code (e.g., in
source code format, binary code format, executable code format, or
any other suitable format of code). The instructions cause the
processing circuitry 510 to perform the various processes described
herein. Specifically, the instructions, when executed, cause the
processing circuitry 510 to cause the first body portion 120, the
second body portion 140, the electro-mechanical member 130, and the
arm 121 of the device 100 to perform emotional gestures as
described herein. In a further embodiment, the memory 520 may
further include a memory portion (not shown) including the
instructions.
[0041] The controller 500 further includes a communication
interface 530 which is configured to perform wired 532
communications, wireless 534 communications, or both, with external
components, such as a wired or wireless network, wired or wireless
computing devices, and so on. The communication interface 530 may
be configured to communicate with the user device to receive data
and instructions therefrom.
[0042] The controller 500 may further include an input/output (I/O)
interface 540 that may be utilized to control the various
electronics of the device 100, such as sensors 550, including
sensors on the device 100, sensors on the user device 150, the
electro-mechanical member 130, and more. The sensors 550 may
include, but are not limited to, environmental sensors, a camera, a
microphone, a motion detector, a proximity sensor, a light sensor,
a temperature sensor, and a touch detector, one or more of which may
be configured to sense and identify real-time data associated with
a user. For example, a motion detector may sense movement, and a
proximity sensor may detect that the movement is within a
predetermined distance of the device 100. As a result, instructions
may be sent to light up the illumination system of the device 100
and raise the second body portion 140, mimicking a gesture
indicating attention or interest.
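The motion-and-proximity example above can be sketched as a simple sensor-to-gesture trigger. The Device stub, method names, and distance threshold below are assumptions for illustration, not the patent's actual API.

```python
from dataclasses import dataclass

@dataclass
class Device:
    illumination_on: bool = False
    head_raised: bool = False

    def show_attention(self) -> None:
        self.illumination_on = True   # light up the illumination system
        self.head_raised = True       # raise the second body portion 140

ATTENTION_DISTANCE_M = 1.5  # assumed "predetermined distance"

def on_sensor_event(device: Device, motion_detected: bool, proximity_m: float) -> None:
    # movement detected within the threshold -> gesture indicating attention/interest
    if motion_detected and proximity_m <= ATTENTION_DISTANCE_M:
        device.show_attention()
```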
[0043] According to an embodiment, the real-time data may be saved
and stored within the device 100, e.g., within the memory 520, and
may be used as historical data to assist with identifying behavior
patterns, changes occurring in behavior patterns, and the like.
[0044] As a non-limiting example, the controller 500 may determine,
based on sensory input from a sensor 550, that a certain emotional
gesture is appropriate based on identification of a specific user
behavior. As a result, the controller 500 may cause the first body
portion 120, the electro-mechanical member 130 and the second body
portion 140 to perform one or more movements that may be
interpreted by the user as one or more emotional gestures.
[0045] Methods implemented by the device 100 may be utilized for
several purposes. One example is performing an electro-mechanical
gesture upon receipt of an electronic message, based on information
identified in the electronic message and data collected with
respect to a user's state.
[0046] According to another example method, the device 100 may be
further utilized to perform an electro-mechanical gesture
in response to receipt of an electronic recommendation. The method
may be configured to receive an electronic recommendation, collect
data related to the user, analyze the data and the information
associated with the recommendation to determine a proactive
electro-mechanical gesture associated with at least one emotion.
Then, the at least one electro-mechanical gesture is performed.
According to another exemplary method executed using the device
100, the device 100 may be configured to respond to detected
loneliness of a user.
[0047] For example, the device 100 may be configured to detect
loneliness of a user using predetermined loneliness profiles based
on various parameters including, e.g., identifying the amount of
time a user has been stationary watching television, sleeping,
sitting in a chair without significant movement, and the like. The
controller of the device may be configured to select at least one
electro-mechanical gesture from a predetermined set of
electro-mechanical gestures based on the detected loneliness
profile. The gestures may be adapted for various users. Thus,
different users may experience different electro-mechanical
gestures from the device based on identical parameters. As an
example gesture, where a user is identified as lonely, the
electro-mechanical gesture may include rotating the first body
segment 120 towards the user, moving the second body portion 140
forward and towards the user, and causing the illumination system
to illuminate and the audio system to play music.
[0048] FIG. 6 is an example flowchart of a method 600 for
performing an emotional gesture based on a received electronic
message according to an embodiment. At S610, an electronic message
is received by the device. The electronic message may include, for
example, a short message service (SMS) message, a multimedia
messaging service (MMS) message, an email, a voice mail, a message
sent over a social network, and the like. The electronic message
may be received over a network, such as a local area network (LAN),
a wireless network, the Internet, and the like.
[0049] The received message may include both content and metadata,
where metadata is data describing the content. The content may
include text, such as alphanumeric characters, words, sentences,
queries, an image, a picture, a video, an audio recording, a
combination thereof, and the like. The metadata of the electronic
message can include information about a sender of the message, a
device or phone number associated with a device from which the
electronic message was received, a time the message was initially
sent, a time the message was received, the size of the message, and
the like.
[0050] At S620, the received electronic message is analyzed. In one
embodiment, the analysis includes identifying the content and the
metadata of the electronic message and determining the intent of a
sender of the message based on the content and the metadata. The
intent may include asking a question to an intended recipient,
determining the current state of an intended recipient, e.g., the
current emotional state of the intended recipient, sending a
reminder to an intended recipient, and the like. In an embodiment,
the intended recipient is a user of the device 100, discussed
above.
[0051] In another embodiment, the analysis further includes
identifying the sender of the electronic message. The identity of
the sender may include a name of the sender, the relationship
between the sender and the intended recipient (also referred to as
the user), and the like. For example, a sender may be identified as
a family relative, a friend, a co-worker, a social worker, and so
on, of the user. The identity of the sender may be useful in
determining the sender's intent. For example, based on historical
data associated with the user, including previously received
messages, if the sender is determined to be a child of the user and
the message has been sent at a time that many previous messages
have been sent, it may be determined that the child is checking in
on the wellbeing of the user. The historical data may be retrieved
from a database, as discussed below.
[0052] As an example, the sender of an electronic message may be
identified as the daughter of the intended recipient, where the
electronic message is determined to have been sent through a
messaging service, e.g., Facebook.RTM. Messenger, where the sender
is associated with a user account previously known to be associated
with the daughter of the intended recipient. According to the same
example, the content of the electronic message may be analyzed to
determine that the intent of the daughter is to confirm that the
user and sender are meeting for dinner on the upcoming Sunday at 6
pm at a particular restaurant.
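A deliberately simplified sketch of the intent determination of S620 follows, combining content cues with the sender's relationship to the user. The rules and labels are invented for illustration; a real system would use far richer analysis.

```python
def determine_intent(content: str, relationship: str) -> str:
    """Infer the sender's intent from message content and sender identity."""
    text = content.lower()
    if any(w in text for w in ("remember", "don't forget", "reminder")):
        return "send_reminder"
    if relationship in ("daughter", "son", "child") and "how are you" in text:
        return "check_wellbeing"   # e.g., a child checking in on the user
    if "?" in text:
        return "ask_question"
    return "general_message"

# Example echoing the text above: a daughter confirming a dinner appointment.
print(determine_intent("Are we still on for dinner Sunday at 6 pm?", "daughter"))
```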
[0053] At S630, data of the intended recipient, or the user, is
received. The data may include real-time data or historical data.
Real-time data may be information associated with a user's current
state, and may be accessed and determined via one or more sensors,
e.g., the sensors 550 discussed above in FIG. 5. As mentioned
above, the sensors may be connected to a device and may include an
environmental sensor, a camera, a microphone, a motion detector, a
proximity sensor, a light sensor, a temperature sensor, a touch
detector, a combination thereof, and the like. As an example,
real-time data may include frequency of motion detected from the
user within a room within a predetermined period of time, e.g., the
previous three hours, based on data received via a proximity sensor
and a motion sensor.
[0054] Historical data may include previously recorded real-time
data of the user, attributes associated with the user, and the
like, and be indicative of the user's behavior patterns and
preferences based on previously determined data. For example,
historical data may indicate that the user's sense of hearing is in
a poor condition. The historical data may be stored in, for
example, a memory, e.g., the memory 520 of the device of FIG. 5, a
database, a cloud database, and the like.
[0055] At optional S640, an interaction objective is determined.
The real-time and historical data may be utilized to determine at
least one interaction objective. The interaction objectives are
desired goals related to a determined user state to be achieved by
a proactive interaction based on the received electronic message
and user data. For example, if, based on the analyzed data, it is
determined that a user's emotional state is identified as lonely,
at least one interaction objective may be to improve this user's
state and minimize perceived loneliness by performing at least one
electro-mechanical gesture related to the received electronic
message. The determination of the interaction objectives may be
achieved by collecting and analyzing the data associated with the
user from, for example, a plurality of sensors, social networks
used by the user, previous user data stored on a database, and the
like, in order to identify the current user state. The
electro-mechanical gestures can be adjusted for each individual
user. For example, if it is determined that a user has been without
human interaction for 24 hours, the electro-mechanical gesture that
will be provided in response to a receipt of an electronic message,
e.g., an electronic message used to check in with a user from a
relative, will be different from the electro-mechanical gesture
that will be provided in case the user has been entertaining
company.
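One hedged way to sketch optional step S640 is a mapping from the determined user state to an interaction objective; the state labels and objective names below are illustrative assumptions.

```python
def determine_objective(user_state: str) -> str:
    """Map a determined user state to a desired interaction goal."""
    objectives = {
        "lonely": "minimize_perceived_loneliness",
        "anxious": "calm_user",
        "content": "maintain_engagement",
    }
    # default objective when no specific state is recognized
    return objectives.get(user_state, "acknowledge_message")
```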
[0056] At S650, at least one electro-mechanical gesture based on
the analysis of the electronic message and the analysis of the data
of the user is determined to be performed. The electro-mechanical
gesture may include at least one of the electro-mechanical gestures
determined to achieve the interaction objective. The
electro-mechanical gesture may include emotional gestures that are
predefined movements that mimic or are similar to certain gestures
of humans, as described herein above.
[0057] At S660, the determined at least one electro-mechanical
gesture is performed, e.g., by the device described in FIGS.
1A-5.
[0058] At optional S670, a response to the electronic message is
sent, e.g., to the sender. In an embodiment, the response may
indicate that the electronic message has been received by the user. In
an additional embodiment, the response may include a reaction or
reply from the user intended to be sent to the sender. The response
may be sent, e.g., over a network, and may include text, such as
alphanumeric characters, words, sentences, queries, an image, a
picture, a video, an audio recording, a combination thereof, and
the like.
[0059] At S680, it is determined if additional electronic messages
have been received. If so, execution continues at S610; otherwise
execution terminates.
[0060] FIG. 7 is an example flowchart of a method 700 for
performing an emotional gesture based on a recommendation according
to an embodiment. At S710, a recommendation is received. The
recommendation may include an instruction to perform an
electro-mechanical gesture, to display a multimedia content item,
to provide a link to a multimedia content item, and the like. In an
embodiment, the recommendation is received from a person, e.g.,
from a user device over a network. In a further embodiment, the
recommendation is received from a recommendation generator, which
is configured to generate a recommendation based on data associated
with a user. The recommendation generator may be the controller
discussed in FIG. 5 above.
[0061] At S720, data of the intended recipient, or the user, is
received. The data may include real-time data or historical data.
Real-time data may be information associated with a user's current
state, and may be accessed and determined via one or more sensors,
e.g., the sensors 550 discussed above in FIG. 5. As mentioned
above, the sensors may be connected to a device and may include an
environmental sensor, a camera, a microphone, a motion detector, a
proximity sensor, a light sensor, a temperature sensor, a touch
detector, a combination thereof, and the like. As an example,
real-time data may include frequency of motion detected from the
user within a room within a predetermined period of time, e.g., the
previous three hours, based on data received via a proximity sensor and a
motion sensor.
[0062] Historical data may include previously recorded real-time
data of the user, attributes associated with the user, and the
like, and be indicative of the user's behavior patterns and
preferences based on previously determined data. For example,
historical data may indicate that the user's sense of hearing is in
a poor condition. The historical data may be stored in, for
example, a memory, e.g., the memory 520 of the device of FIG. 5, a
database, a cloud database, and the like.
[0063] At S730, the data and the electronic recommendation are
analyzed to determine an action to be performed based thereon. In
an embodiment, the analysis includes determining a user's emotional
state, and determining an appropriate action based on the
determined emotional state and the received recommendation. In an
embodiment, the action includes an electro-mechanical gesture,
e.g., an emotional gesture, to be performed. For example, if the
recommendation includes a display of a video, and it is determined
that a user is currently in a lonely state, the electro-mechanical
gesture may include causing a device to display a show of interest
toward the user and to display a link to an uplifting video on a
display of the device. In an embodiment, the action does not
include an electro-mechanical gesture.
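The analysis of S730 might be sketched as follows, combining the recommendation with the determined emotional state to produce an action; the types and labels are assumptions for illustration.

```python
from typing import List, NamedTuple, Optional

class Action(NamedTuple):
    gestures: List[str]        # electro-mechanical gestures to perform (may be empty)
    media_url: Optional[str]   # multimedia content to display, if any

def determine_action(recommended_media: Optional[str], emotional_state: str) -> Action:
    if emotional_state == "lonely" and recommended_media:
        # show interest toward the user, then surface the recommended video
        return Action(gestures=["interest"], media_url=recommended_media)
    if recommended_media:
        return Action(gestures=[], media_url=recommended_media)
    return Action(gestures=["interest"], media_url=None)
```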
[0064] At S740, the action is performed. In an embodiment, the
action is performed by a device configured to perform emotional
gestures, e.g., the device discussed above in FIGS. 1A-5. The
electro-mechanical gesture may include moving a first body portion
and a second body portion to mimic human emotional gestures, as
discussed above. Further, the action may include displaying a
multimedia content, e.g., based on the received recommendation, in
tandem with one or more electro-mechanical gestures. In an
embodiment, an electro-mechanical gesture is configured to create
an emotional reaction by the user. For example, an
electro-mechanical gesture may include rotating a first body
segment slowly towards the user's direction and moving the second
body portion slowly forward, imitating a human gesture that
expresses an interest in a companion's feelings.
[0065] At S750, a user response to the action is determined. The
response may be determined based on data collected from a plurality
of sensors. A user response includes at least one of: a response to
the electro-mechanical gesture, and a response to the electronic
recommendation. For example, if the user's response to a
recommended video was expressed by smiling and laughing, the user's
state may be determined to be positive, and the video, the type of
video, the time of the suggestion, and the like, may be stored in a
storage, e.g., in a database, for managing future
recommendations.
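Step S750 might be sketched as scoring the observed reaction and persisting it to steer future recommendations, as below; the reaction labels and storage format are assumptions.

```python
from datetime import datetime
from typing import Dict, List, Set

POSITIVE_REACTIONS: Set[str] = {"smiling", "laughing"}

def record_response(storage: List[Dict], item_id: str, reactions: Set[str]) -> None:
    """Score the observed reaction and persist it for future recommendations."""
    storage.append({
        "item": item_id,
        "positive": bool(reactions & POSITIVE_REACTIONS),
        "timestamp": datetime.now().isoformat(),
    })

history: List[Dict] = []
record_response(history, "uplifting_video_42", {"smiling", "laughing"})
```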
[0066] FIG. 8 is an example flowchart of a method for performing an
emotional gesture based on a determined user emotional state,
according to an embodiment. At S810, real-time indicators that
indicate the current state of a user are received. Real-time
indicators may be received from sensors, i.e., the sensors
discussed above, including an environmental sensor, a camera, a
microphone, a motion detector, a proximity sensor, a light sensor,
a temperature sensor, a touch detector, and combination thereof,
and the like. Examples of real-time indicators include determining
if user motions have been detected over a period of time, e.g., by
a motion sensor, if conversations have taken place over a period of
time, e.g., by a microphone, and the like.
[0067] At S820, the real-time indicators related to a current state
of a user are analyzed. In an embodiment, the analysis includes
comparing the received real-time indicators with previously
determined user profiles. For example, based on detected motions,
it may be determined whether the user is currently awake or asleep,
if there is current user movement, if there has been user movement
over a previously determined period of time, if the user has
watched television without moving for a predetermined period of
time, and the like. The analysis may further include comparing the
determined current emotional state of a user to a plurality of
predetermined profiles. As a non-limiting example, a plurality of
loneliness profiles may be accessed via a database, where each
profile includes a set of parameters that may indicate loneliness.
The parameters may include data associated with real-time
indicators, including the amount of time a user has been idle, the
amount of time between movements, the amount of time between
conversations involving the user, and the like.
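The profile comparison of S820 might be sketched as threshold matching over the parameters listed above; the thresholds and field names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class LonelinessProfile:
    name: str
    min_idle_minutes: float                  # time the user has been idle
    min_minutes_between_movements: float
    min_minutes_between_conversations: float

def matches(p: LonelinessProfile, idle_m: float,
            between_moves_m: float, between_convos_m: float) -> bool:
    # a profile matches when every observed interval exceeds its threshold
    return (idle_m >= p.min_idle_minutes
            and between_moves_m >= p.min_minutes_between_movements
            and between_convos_m >= p.min_minutes_between_conversations)

# Hypothetical profile: long idle stretches and hours without conversation.
LONELY = LonelinessProfile("lonely", 120.0, 60.0, 240.0)
print(matches(LONELY, idle_m=180.0, between_moves_m=90.0, between_convos_m=300.0))
```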
[0068] At S830, a user's current emotional state is determined
based on the analysis, e.g., if the user is determined to be in a
lonely state.
[0069] At S840, at least one electro-mechanical gesture is selected
to be performed based on the determined user's emotional state. The
gesture may be selected from a plurality of electro-mechanical
gestures based on the user's emotional state, where the
electro-mechanical gestures may be associated with predetermined
profiles associated with at least one emotion. In an embodiment,
the plurality of electro-mechanical gestures may be dynamically
updated based on, for example, a user's reactions to the selected
electro-mechanical gesture as identified by sensors. Thus,
different users may experience different electro-mechanical
gestures based on identical circumstances.
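The dynamic updating described above might be sketched as per-user selection weights nudged by observed reactions, so that identical circumstances can yield different gestures for different users; the weighting scheme below is an assumption for illustration.

```python
import random
from collections import defaultdict
from typing import DefaultDict, List, Tuple

# (user_id, gesture) -> selection weight; starts uniform and adapts over time
weights: DefaultDict[Tuple[str, str], float] = defaultdict(lambda: 1.0)

def select_gesture(user_id: str, candidates: List[str]) -> str:
    """Pick a gesture, favoring those this user has reacted well to."""
    ws = [weights[(user_id, g)] for g in candidates]
    return random.choices(candidates, weights=ws, k=1)[0]

def update_from_reaction(user_id: str, gesture: str, positive: bool) -> None:
    """Nudge the weight up or down based on the sensed user reaction."""
    weights[(user_id, gesture)] *= 1.2 if positive else 0.8
```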
[0070] As an example, if a user is identified as currently having a
lonely state, at least one electro-mechanical gesture may be
selected to mimic an emotional gesture, such as, for example,
rotating a first body segment of a device, e.g., the device of
FIGS. 1A-5, towards the user, moving a second body portion forward
and towards the user, and the like, to display a gesture of
interest. According to another embodiment, certain multimedia items
may be displayed or played, e.g., a light turned on, a video played,
music played, and the like.
[0071] At S850, each of the selected electro-mechanical gestures is
performed. At S860, it is checked whether more real-time indicators
have been received, and if so, execution continues with S810;
otherwise, execution terminates.
[0072] The various embodiments disclosed herein can be implemented
as hardware, firmware, software, or any combination thereof.
Moreover, the software is preferably implemented as an application
program tangibly embodied on a program storage unit or computer
readable medium consisting of parts, or of certain devices and/or a
combination of devices. The application program may be uploaded to,
and executed by, a machine comprising any suitable architecture.
Preferably, the machine is implemented on a computer platform
having hardware such as one or more central processing units
("CPUs"), a memory, and input/output interfaces. The computer
platform may also include an operating system and microinstruction
code. The various processes and functions described herein may be
either part of the microinstruction code or part of the application
program, or any combination thereof, which may be executed by a
CPU, whether or not such a computer or processor is explicitly
shown. In addition, various other peripheral units may be connected
to the computer platform such as an additional data storage unit
and a printing unit. Furthermore, a non-transitory computer
readable medium is any computer readable medium except for a
transitory propagating signal.
[0073] As used herein, the phrase "at least one of" followed by a
listing of items means that any of the listed items can be utilized
individually, or any combination of two or more of the listed items
can be utilized. For example, if a system is described as including
"at least one of A, B, and C," the system can include A alone; B
alone; C alone; A and B in combination; B and C in combination; A
and C in combination; or A, B, and C in combination.
[0074] All examples and conditional language recited herein are
intended for pedagogical purposes to aid the reader in
understanding the principles of the disclosed embodiment and the
concepts contributed by the inventor to furthering the art, and are
to be construed as being without limitation to such specifically
recited examples and conditions. Moreover, all statements herein
reciting principles, aspects, and embodiments of the disclosed
embodiments, as well as specific examples thereof, are intended to
encompass both structural and functional equivalents thereof.
Additionally, it is intended that such equivalents include both
currently known equivalents as well as equivalents developed in the
future, i.e., any elements developed that perform the same
function, regardless of structure.
* * * * *