U.S. patent application number 09/985909 was filed with the patent office on 2001-11-06 and published on 2002-06-27 as publication number 20020081937 for an electronic toy.
Invention is credited to Atobe, Hirohiko, Hayakawa, Tetsuya, Igarashi, Kaoru, Saji, Ryotaro, Yamada, Satoshi.
United States Patent Application 20020081937
Kind Code: A1
Yamada, Satoshi; et al.
June 27, 2002
Electronic toy
Abstract
Provided is an electronic toy which automatically activates when
the user is nearby. An electronic toy controlled so as to react to
external information has a movement mechanism structuring the
mechanical movement of the toy (FIG. 5); an input element for
obtaining external information (51-55); a distinction element for
distinguishing whether an object body exists in the periphery
(S152); and a control element for selecting, among a plurality of
control parameters, a control parameter for controlling the
movement mechanism in correspondence with the external information
based on the distinction result, and controlling the movement of
the movement mechanism; the toy activates when a person exists in the
periphery (S162-180).
Inventors: Yamada, Satoshi (Tokyo, JP); Atobe, Hirohiko (Tokyo, JP); Igarashi, Kaoru (Tokyo, JP); Saji, Ryotaro (Tokyo, JP); Hayakawa, Tetsuya (Tokyo, JP)

Correspondence Address:
KEATING & BENNETT LLP
10400 Eaton Place, Suite 312
Fairfax, VA 22030
US

Family ID: 27481758
Appl. No.: 09/985909
Filed: November 6, 2001
Current U.S. Class: 446/175
Current CPC Class: A63H 3/48 (2013.01); A63H 2200/00 (2013.01); A63H 11/00 (2013.01)
Class at Publication: 446/175
International Class: A63H 030/00
Foreign Application Data

Date           Code   Application Number
Nov 7, 2000    JP     2000-339744
Jan 17, 2001   JP     2001-9555
Feb 12, 2001   JP     2001-79425
Jun 5, 2001    JP     2001-170342
Claims
What is claimed is:
1. An electronic toy controlled so as to react to external
information, comprising: a movement mechanism structuring the
mechanical movement of the toy; input means for obtaining external
information; distinction means for distinguishing whether an object
body exists in the periphery; and control means for selecting,
among a plurality of control parameters, a control parameter for
controlling said movement mechanism in correspondence with said
external information based on said distinction result, and
controlling the movement of said movement mechanism.
2. An electronic toy according to claim 1 further comprising
information display means for externally displaying information,
wherein said control means further selects, among a plurality of
control parameters, a control parameter for controlling said
information display means in correspondence with said external
information, and controls the operation of said information display
means.
3. An electronic toy according to claim 1 or claim 2 further
comprising sound generation means for externally outputting sound,
wherein said control means further selects, among a plurality of
control parameters, a control parameter for controlling said sound
generation means in correspondence with said external information,
and controls the operation of said sound generation means.
4. An electronic toy according to claim 1, further comprising:
information display means for externally displaying information;
sound generation means for externally outputting sound; means for
calculating the lifestyle rhythm of a specific person; and event
detection means for detecting the occurrence of an event during
said lifestyle rhythm; wherein said control means further selects
the control parameter of at least said movement mechanism, said
information display means and said sound generation means in
correspondence with said event.
5. An electronic toy according to claim 1, further comprising:
information display means for externally displaying information;
sound generation means for externally outputting sound; clock means
for detecting the present time; and detection means for detecting
the occurrence of an event planned on a time base in advance;
wherein said control means further selects the control parameter of
at least said movement mechanism, said information display means
and said sound generation means in correspondence with said
event.
6. An electronic toy according to claim 1, wherein said distinction
means detects peripheral sound and/or movement.
7. An electronic toy according to claim 1, wherein said distinction
means detects peripheral sound and/or brightness.
8. An electronic toy according to claim 1, wherein said distinction
means comprises a microphone for collecting sound and/or a camera
for photographing the periphery.
9. An electronic toy according to claim 1, wherein said movement
mechanism is structured in the shape of a humanoid robot, and the
movement thereof is controlled so as to express at least one of a
human emotion of "delight", "anger", "sorrow" and "pleasure".
10. An electronic toy according to claim 1, wherein said control
means selects the control parameter for a predetermined solo play
operation when it is judged that a person does not exist in the
periphery.
11. An electronic toy according to claim 2, wherein said electronic
toy is in the shape of a human, and said information display means
is provided to a part corresponding to the face and displays
expressions of the face and symbols such as text.
12. An electronic toy according to claim 1 further comprising
storage means for recording the voice of a person.
13. An electronic toy according to claim 1, wherein said input
means includes at least one of a touch sensor, microphone, light
sensor, camera, ◯× (circle/cross) switch and condition
sensor.
14. An electronic toy according to claim 1 further comprising means
for detecting the output of a battery, which is the source of power
of said movement mechanism; wherein said control means further
generates a warning with the information display means for
externally displaying information or the sound generation means for
externally outputting sound when the output of said battery becomes
weak.
15. An electronic toy controlled so as to react to external
information, comprising: a human-shaped structure; control means
for controlling the movement of said structure in correspondence
with external information; a miniature camera provided to said
structure and for photographing the external situation; and
communication means for externally transmitting the photographed
image.
16. A toy, comprising: a basic frame disposed at the torso portion
of the human-shaped toy; first and second sub-frames respectively
provided at both sides of said basic frame and rotatably mounted on
said basic frame; first and second rotational axes respectively
provided to said first and second sub-frames; a cam mechanism
provided to a third rotational axis driven by a first motor; a link
for connecting said cam mechanism between said first and second
sub-frames and oscillating both sub-frames; a gear mechanism driven
by a second motor; and a transmission mechanism disposed across
said basic frame and between said first and second sub-frames and
for transmitting the output of said gear mechanism to said first
and second rotational axes.
17. A toy according to claim 16, wherein said transmission
mechanism is structured of a gear train formed of a plurality of
gears, and the gears at both ends are respectively disposed
in said first and second sub-frames and respectively engage with
said first and second rotational axes via a bevel gear.
18. An electronic toy in the shape of a human or an animal
comprising a display unit at a face-corresponding portion of the
head capable of displaying text and symbols, and which is
structured such that the information input pursuant to the
operation of the input unit, which is formed of a plurality of
input switches provided on the body, can be visually confirmed with
the display unit provided on said face-corresponding portion.
19. An electronic toy in the shape of a human or an animal having a
head portion and a torso portion and comprising a display unit at a
face-corresponding portion of the head capable of displaying text
and symbols, an input unit formed of a plurality of input switches
is provided to the torso portion, and which is structured such that
the operation results of said input unit can be visually confirmed
with the display unit provided on said face-corresponding
portion.
20. An electronic robot in the shape of a human or an animal
comprising a display unit at a face-corresponding portion of the
head capable of displaying text and symbols, and which is
structured such that the information input by the operator
operating the input unit provided to the body of said robot can be
visually confirmed with the display unit provided on said
face-corresponding portion.
21. An electronic robot in the shape of a human or an animal
comprising a display unit at a face-corresponding portion of the
head capable of displaying text and symbols, and which is
structured such that the information input by the operator
operating the input unit provided to the body of said robot is
displayed on the display unit provided on said face-corresponding
portion so as to form the expression of said robot.
22. An electronic toy according to claim 1, wherein an emotion
parameter is included in said control parameter, and said emotion
parameter is represented as the biorhythm of a specific person or
the biorhythm of the robot.
23. An electronic toy according to claim 22, wherein said emotion
parameter is affected by the occurrence of an event.
24. An electronic toy according to claim 23, wherein the response
to a question inquired by the electronic toy to the user is
included in said event.
25. An electronic toy according to claim 24, wherein defined
beforehand in said question are changes in the emotion parameter
against the anticipated response to the question.
26. An electronic toy according to claim 22, further comprising:
information display means for externally displaying information;
and sound generation means for externally outputting sound; wherein
said control unit further selects at least one of the information
to be externally displayed or the sound to be externally output
based on said emotion parameter, and respectively supplies this to
the corresponding means among said information display means and
said sound generation means.
27. An electronic toy according to claim 24, wherein said control
means further stores the response to said question, and forms a
standard sentence by using data relating to said response.
28. An electronic toy in the shape of a human or an animal,
comprising: a display unit capable of displaying text and symbols
on the face-corresponding portion of the head or
torso-corresponding portion; input means provided on the body for
performing input operations; storage means for storing a plurality
of words; and control means having a function for outputting
emotion parameter values representing self emotions, and which
selects said words based on said emotion parameter value and
displays said words on said display unit.
29. An electronic toy in the shape of a human or an animal,
comprising: vocalization means for outputting sound data as a
voice; input means provided to the body for performing input
operations; storage means for storing a plurality of sound data;
and control means having a function for outputting emotion
parameter values representing self emotions, and which selects said
sound data based on said emotion parameter value and makes said
vocalization means vocalize said sound data.
30. An electronic toy in the shape of a human or an animal,
comprising: a display unit capable of displaying text and symbols
on the face-corresponding portion of the head or
torso-corresponding portion; vocalization means for outputting
sound data as a voice; input means provided on the body for
performing input operations; storage means for storing a plurality
of words and a plurality of sound data; and control means having a
function for outputting emotion parameter values representing self
emotions, and which selects said words and said sound data based on
said emotion parameter value and supplies said words and said sound
data to said display unit and said vocalization means,
respectively.
31. An electronic toy according to any one of claims 28 to 30,
wherein said emotion parameter value temporally changes between the
maximum value and minimum value.
32. An electronic toy according to claim 28, wherein said control
means further inquires questions with said text or sound, and
changes the value of said emotion parameter in accordance with the
input operation in response thereto.
33. An electronic toy according to claim 32, wherein a plurality of
said questions are stored in advance, and changes in said emotion
parameter are defined against the anticipated response to the
respective questions.
34. An electronic toy according to claim 32, wherein a plurality of
said questions are stored in advance, and the degree of intimacy
between the electronic toy and user is defined against the
anticipated response to the respective questions.
35. An electronic toy according to claim 32, wherein said control
means further stores the response to said question, and forms a
standard sentence by using data relating to said response.
36. An electronic toy according to claim 34, wherein said control
unit further accumulates the degree of intimacy obtained pursuant
to the respective questions and, when this exceeds a prescribed
value, supplies data for expressing a specific emotion to said
display unit and/or said vocalization means.
37. An electronic toy according to claim 32, wherein said questions
include questions that will affect and questions that will not
affect said emotion parameter.
38. An electronic toy according to claim 31, wherein a plurality
of zones are defined in advance between the maximum value and minimum
value of said emotion parameter and said words and said voice data
are distributed in the respective zones, and wherein said control
means selects words and sound data of the corresponding zone
depending on to which zone the current emotion parameter value
belongs.
39. An electronic toy according to claim 38, wherein said control
means, in a special zone, further selects the control for
performing a special movement accompanying the mechanical movement
of the components structuring the human shape or animal shape.
40. An electronic toy according to claim 28, wherein said control
means further comprises an exhibition mode for changing said
emotion parameter value between the maximum value and minimum value
thereof in short cycles.
41. An electronic toy according to claim 30 further comprising
connection means for connecting the electronic toy to a network,
wherein at least one of the words and sound data is downloaded to
said storage means from a server device connected to said
network.
42. An electronic toy according to claim 41, wherein said
downloaded words and sound data are current affair terms.
43. An electronic toy according to claim 41, wherein said
downloaded words and sound data are terms corresponding to the
characteristics of the user.
44. An electronic toy according to claim 30 further comprising
connection means for connecting two electronic toys, wherein at
least one of the words and sound data stored in the connected
electronic toy of the opponent is received by said storage
means.
45. An electronic toy according to claim 41 or claim 44, wherein
said connection means includes at least one of a communication
cable, PHS, mobile telephone and personal computer.
46. An electronic toy according to claim 44, wherein text data is
exchanged between the electronic toys, and simulated conversation
is conducted by incorporating the exchanged data in standard
sentences.
47. An electronic toy in the shape of a human or an animal,
comprising: sound detection means for detecting peripheral sound; a
display unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; storage means for storing a plurality of expressions; and
control means having a function for outputting emotion parameter
values representing self emotions, and which selects said
expression based on said emotion parameter value and displays said
expression on said display unit, and sets said emotion parameter to
an unpleasant state when said sound exceeds a prescribed level and
continues beyond a prescribed time.
48. An electronic toy in the shape of a human or an animal,
comprising: a display unit capable of displaying text and symbols
on the face-corresponding portion of the head or
torso-corresponding portion; storage means for storing a plurality
of expressions; input means provided on the body for performing
input operations; and control means having a function for
outputting emotion parameter values representing self emotions, and
which selects said expression based on said emotion parameter value
and displays said expression on said display unit, and selects an
expression corresponding to said emotion parameter when said input
means is continuously operated for a prescribed time or a
prescribed number of times.
49. An electronic toy according to claim 47, wherein an expression
of anger is displayed on said display unit during said unpleasant
state.
50. An electronic toy according to claim 48, wherein the expression
selected in correspondence with said continuous operation is a
painful expression upon being pounded, or a delightful expression
upon being patted.
51. An electronic toy in the shape of a human or an animal,
comprising: a display unit capable of displaying text and symbols
on the face-corresponding portion of the head or
torso-corresponding portion; storage means for storing a plurality
of expressions; a light sensor for detecting the peripheral
brightness; and control means for selecting an expression
corresponding to the self emotion and selecting the expression of
closing the eyes when said light sensor detects a dark state beyond
a prescribed time.
52. An electronic toy according to claim 51, wherein said control
means further moves the mechanical components structuring the human
shape or animal shape so as to express a reluctant expression.
53. An electronic toy according to claim 28, wherein the initial
value of the function for outputting the emotion parameter value
expressing said emotions is set randomly.
54. An electronic toy in the shape of a human or an animal,
comprising: a display unit capable of displaying text and symbols
on the face-corresponding portion of the head or
torso-corresponding portion; movably structured mechanical
components structuring a human shape or an animal shape; and a
control unit for distinguishing the message and control information
from a file attached to an e-mail, displaying said message on said
display unit, and moving said mechanical components in
correspondence with said control information.
55. An electronic toy according to claim 54, wherein said
attachment file is a sound file.
56. An electronic toy according to claim 55, wherein said sound
file is reproduced as a sound signal by a computer, and said sound
signal is supplied to said control unit.
57. An electronic toy according to claim 54, wherein said control
information designates the movement stored beforehand in said
control unit.
58. An electronic toy according to claim 54, wherein said control
information designates to said control unit a series of control
procedures of said mechanical components.
59. An electronic toy according to claim 54, wherein, when said
control information is not attached, said control unit selects
adequate movement of said mechanical components.
60. An electronic toy according to claim 54, wherein said control
information expresses emotions such as delight, anger, sorrow and
pleasure of the robot.
61. A method of exchanging e-mail, comprising: a step of converting
the message to be displayed on the electronic toy of the receiving
side and the movement to be made by said electronic toy into a
sound signal; a step of converting said sound signal into a sound
file and making this an attachment file of an e-mail; and a step of
transmitting the e-mail with said sound attachment file from the
terminal device of the transmitting side to the terminal device of
the receiving side.
62. A method of exchanging e-mail, comprising: a step of receiving
e-mail with a sound attachment file in which said attachment file
is a sound file where a sound signal bears the message to be
displayed on an electronic toy and the movement to be made by said
electronic toy, and obtaining said sound signal by reproducing said
sound file; a step of forwarding said reproduced sound signal from
the terminal device of the receiving side to the electronic toy of
the receiving side; and a step of making said electronic toy
display said message and perform said movement.
63. A method of exchanging e-mail, comprising: a step of converting
the message to be displayed on the electronic toy of the receiving
side and the movement to be made by said electronic toy into a
sound signal; a step of converting said sound signal into a sound
file and making this an attachment file of an e-mail; a step of
transmitting the e-mail with said sound attachment file from the
terminal device of the transmitting side to the terminal device of
the receiving side; a step of forwarding said reproduced sound
signal from the terminal device of said receiving side to said
electronic toy; and a step of making said electronic toy display
said message and perform said movement.
64. An electronic toy in the shape of a human or an animal,
comprising: a leg structure structuring a pair of movable legs in
the shape of a human or an animal; and a control unit for
controlling the movement of said legs in correspondence with sound
to be output.
65. An electronic toy according to claim 64, wherein said control
unit sets the speed of movement of said legs in correspondence with
the volume of said sound and rhythm.
66. An electronic toy according to claim 64, wherein the movement
of said pair of legs is a movement of opening/closing said legs in
the horizontal direction.
67. An electronic toy according to claim 64, wherein slide
prevention means is provided to the sole of one of said legs, and
sliding means is provided to the sole of the other of said
legs.
68. An electronic toy according to claim 64, wherein said leg
structure comprises: a waist portion frame provided with a pair of
hip joint portions rotatable at least in one direction; a pair of
leg portions respectively connected to said pair of hip joint
portions; a pair of drive shafts in which one end portion thereof
is mounted on said leg portion and the other end portion thereof
extends inside said waist portion frame beyond the hip joint
portion of said leg portion; a link member for mutually connecting
the respective other end portions of the respective drive shafts; a
cam mechanism interjacent between said other end portion of at
least one of said drive shafts and said link member, and for
changing the respective one end portions of said drive shafts to
become wide or narrow; and a motor built in one of said leg
portions and for rotatably driving one of said drive shafts.
69. An electronic toy according to claim 68, wherein the other end
portion of said drive shaft and said link member, or said cam and
said link member, are connected via a spherical engagement
member.
70. An electronic toy according to claim 68 further comprising a
sliding means provided on one end portion of the other drive shaft
among said pair of drive shafts so as to slide on the ground
surface or floor surface.
71. An electronic toy according to claim 68, wherein the other of
said leg portions comprises an above-knee portion connected
rotatably in the cross direction to said hip joint portion, a
below-knee portion connected rotatably in the cross direction with
said above-knee portion, and a ground portion connected rotatably
in the horizontal direction with one end portion of the other drive
shaft among said pair of drive shafts; and said electronic toy
having a structure wherein a protrusion is formed at the lower face
of said below-knee portion, an inclined face to which said
protrusion contacts is formed on the upper face of said ground
portion, said protrusion is pushed up pursuant to the
opening/closing movement of said leg portions, and the connection
of said above-knee portion and said below-knee portion bends
thereby.
72. An electronic toy according to claim 67, wherein said sliding
means is a roller.
73. An electronic toy according to claim 69, wherein the degree of
opening/closing of said pair of legs is adjustable pursuant to the
position in which said engagement member is mounted on said cam
mechanism.
74. An electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, wherein the
movement mechanism of one leg comprises: a waist portion frame; an
above-knee portion connected rotatably to said waist portion frame;
a below-knee portion connected rotatably to said above-knee
portion; a ground portion connected rotatably to said below-knee
portion; a rotatably driven cam pulley provided to said waist
portion frame; a first cam provided to said cam pulley; a second
cam provided to said cam pulley; a long member for vertically
oscillating said ground portion with said first cam; and a short
member for oscillating said below-knee portion with said second cam
in the cross direction.
75. An electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, wherein the
movement mechanism of one leg comprises: a waist portion frame; an
above-knee portion connected rotatably to said waist portion frame;
a below-knee portion connected rotatably to said above-knee
portion; a ground portion connected rotatably to said below-knee
portion; a rotatably driven cam provided to said waist portion
frame; a long member for vertically oscillating said ground portion
with said cam; and a short member for oscillating said below-knee
portion with said cam in the cross direction.
76. An electronic toy according to claim 75, wherein said long
member comprises a guide hole to be engaged with a guide member and
a pushdown plate in contact with said cam.
77. An electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, wherein the
movement mechanism of the respective legs comprises: an above-knee
portion connected rotatably to a waist portion frame; a below-knee
portion connected rotatably to said above-knee portion; a ground
portion connected rotatably to said below-knee portion; and
energization means for energizing the tip of said ground portion in
the pushdown direction.
78. An electronic toy according to claim 77, wherein the size of
said electronic toy is approximately 30 cm.
79. An electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, wherein the
movement mechanism of the respective legs comprises: an above-knee
portion connected rotatably to a waist portion frame; a below-knee
portion connected rotatably to said above-knee portion; a ground
portion connected rotatably to said below-knee portion; and oblique
direction drive means provided to said ground portion and for
driving said electronic toy in an oblique direction against the
advancing direction by said bipedal locomotion mechanism.
80. An electronic toy according to claim 79, wherein said oblique
direction drive means is structured by comprising a rotatably
driven drive roller or drive belt.
81. An electronic toy according to claim 80, wherein a plurality of
said drive rollers or drive belts are provided.
82. An electronic toy according to claim 79, wherein said oblique
direction drive means is respectively provided to the respective
ground portions of both legs, and the respective drive directions
of each of said oblique direction drive means exist on the
circumference of an approximate curvature.
83. An electronic toy according to claim 79, wherein said oblique
direction drive means is provided at the toe side of said ground
portion, and a sliding roller is provided at the heel side of said
ground portion.
84. An electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, provided
with an oblique direction drive mechanism for driving said
electronic toy in an oblique direction against the advancing
direction by said bipedal locomotion mechanism.
85. An electronic toy according to claim 84, wherein said oblique
direction drive mechanism is structured by comprising a rotatably
driven drive roller or drive belt.
86. An electronic toy according to claim 85, wherein a plurality of
said drive rollers or drive belts are provided.
87. An electronic toy according to claim 84, wherein said oblique
direction drive mechanism is respectively provided to the
respective sole portions of both feet, and the respective drive
directions of each of said oblique direction drive means exist on
the circumference of an approximate curvature.
88. An electronic toy according to claim 84, wherein said oblique
direction drive mechanism is provided at the toe side of the sole
portion of said feet, and a sliding roller is provided at the heel
side of the sole portion of said feet.
89. An electronic toy according to claim 84 further comprising an
energization means for pushing down the tip side of said feet in
the sole direction.
Description
BACKGROUND OF THE INVENTION
[0001] 1. Field of the Invention
[0002] The present invention relates to an electronic toy
(including adult "toys" and "playthings", and domestic robots) that
performs control so as to move arbitrarily in accordance with
external sound and contact.
[0003] 2. Description of the Related Art
[0004] Stuffed animals of dogs, cats, bears and so on have long been
widely used as animal toys. Further, there are animal
toys structured by housing a motor and speaker inside the body of a
stuffed animal or a synthetic resin body in the shape of an animal,
and, for example, by contacting and pressing the head portion, such
animal toy would perform prescribed movements such as moving its
legs or mouth, as well as generate a prescribed cry.
[0005] With this type of animal toy, since it repeats the same
movement and also repeatedly generates the same cry, the user often
loses interest quickly.
[0006] Conversely, if the movement is selected randomly, the
movement expected by the user will not occur, and, again, the user
may lose interest soon.
[0007] In view of such conventional animal toys, development of
electronic toys equipped with a microcomputer for controlling
various movements so as to keep the interest of users is being
promoted.
[0008] Examples of such electronic toys include those structured
to make a certain movement (e.g., generating words stored
beforehand from the speaker, making a movement of swaying its body)
pursuant to the command of the microcomputer when the user, for
example, pats the head of the electronic toy, lifts it up, or
speaks to it. Moreover, with this type of electronic toy, the
number of times the head was patted, the number of times it was
lifted up, and the number of times it was spoken to are counted,
and, for example, the electronic toy is controlled such that the
words generated from the speaker gradually change to adorable
phrasing pursuant to the increase in the count value.
SUMMARY OF THE INVENTION
[0009] With the aforementioned conventional electronic toy, the user
plays with the toy only after he/she turns on the power, so the toy
does not move by automatically selecting a movement in correspondence
with the existence of the user (person). Further, the electronic toy
itself does not determine its movement by judging the surrounding
circumstances. For an electronic toy intended to seek communication
with the user, it is desirable that the toy automatically makes
movements in correspondence with the user from the moment of its
activation.
[0010] Thus, an object of the present invention is to provide an
electronic toy that is automatically activated when the user exists
nearby.
[0011] Another object of the present invention is to provide an
electronic toy having elements for seeking communication with the
user.
[0012] In order to achieve the foregoing objects, the electronic
toy (i.e., domestic robot) of the present invention is an
electronic toy controlled so as to react to external information,
comprising: a movement mechanism structuring the mechanical
movement of the toy; an input means for obtaining external
information; a distinction means for distinguishing whether an
object body exists in the periphery; and a control means for
selecting, among a plurality of control parameters, a control
parameter for controlling the movement mechanism in correspondence
with the external information based on the distinction result, and
controlling the movement of the movement mechanism.
[0013] According to the aforementioned structure, obtained is an
electronic toy that will move in correspondence with external
information when a person or the like (object body) exists in the
periphery. It is thereby possible for the electronic toy side to
communicate to the user. This is also effective for conserving
batteries (power source).
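
By way of illustration only, the following sketch shows one way such
presence-gated control could be organized (the sensor thresholds,
parameter names and polling interval are assumptions, not part of this
disclosure): peripheral sound, brightness and movement are polled, an
interactive control parameter is selected when an object body is
distinguished, and a predetermined solo play operation is selected
otherwise.

    import random
    import time

    # Hypothetical control-parameter tables; the actual parameters
    # (motor drive patterns, display data, sound data) are not
    # specified here.
    INTERACTIVE_PARAMS = ["greet", "wave_arms", "show_smile"]
    SOLO_PLAY_PARAMS = ["solo_game_on_display"]

    def object_body_nearby(sound_level, brightness, motion_detected):
        # Distinction element (S152): treat sound, brightness or
        # movement above a threshold as evidence that a person is
        # in the periphery.
        return sound_level > 0.3 or brightness > 0.5 or motion_detected

    def control_loop(read_sensors, drive_movement_mechanism):
        while True:
            sound, light, motion = read_sensors()
            if object_body_nearby(sound, light, motion):
                # Activate and react to external information (S162-180).
                param = random.choice(INTERACTIVE_PARAMS)
            else:
                # No user nearby: predetermined solo play (claim 10);
                # sparse polling also conserves the battery.
                param = random.choice(SOLO_PLAY_PARAMS)
            drive_movement_mechanism(param)
            time.sleep(1.0)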
[0014] Preferably, the electronic toy further comprises an
information display means for externally displaying information,
and wherein the control means further selects, among a plurality of
control parameters, a control parameter for controlling the
information display means in correspondence with the external
information, and controls the operation of the information display
means.
[0015] Obtained thereby is an electronic toy capable of reacting in
correspondence with external information pursuant to the mechanism
movement and visual display.
[0016] Preferably, the electronic toy further comprises a sound
generation means for externally outputting sound, and wherein the
control means further selects, among a plurality of control
parameters, a control parameter for controlling the sound
generation means in correspondence with the external information,
and controls the operation of the sound generation means.
[0017] Obtained thereby is an electronic toy capable of reacting in
correspondence with external information pursuant to the mechanism
movement, visual display, and sound output.
[0018] Preferably, the electronic toy further comprises: a means
for calculating the lifestyle rhythm of a specific person; and an
event detection means for detecting the occurrence of an event
during such lifestyle rhythm; wherein the control means further
selects the control parameter of at least the movement mechanism,
the information display means and the sound generation means in
correspondence with the event.
[0019] According to the foregoing structure, obtained is an
electronic toy which makes communication in correspondence with the
lifestyle rhythm (e.g., biorhythm) of the user.
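
The disclosure does not fix a formula for the lifestyle rhythm; a
minimal sketch, under the assumption that the classic biorhythm model
is used (sinusoidal cycles of 23, 28 and 33 days measured from the
user's birth date), would be:

    import math
    from datetime import date

    # Classic biorhythm cycle lengths in days (an assumption; the
    # disclosure only speaks of a lifestyle rhythm such as a biorhythm).
    CYCLES = {"physical": 23, "emotional": 28, "intellectual": 33}

    def biorhythm(birth: date, today: date) -> dict:
        t = (today - birth).days
        # Each rhythm oscillates between -1 and +1 over its cycle.
        return {name: math.sin(2 * math.pi * t / period)
                for name, period in CYCLES.items()}

    # An event such as an emotional trough could then select consoling
    # control parameters for the movement, display and sound means.
    print(biorhythm(date(1990, 5, 1), date(2001, 11, 6)))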
[0020] Preferably, the electronic toy further comprises: a clock
means for detecting the present time; and a detection means for
detecting the occurrence of an event planned on a time base in
advance; wherein the control means further selects the control
parameter of at least the movement mechanism, the information
display means and the sound generation means in correspondence with
the event.
[0021] According to the foregoing structure, obtained is an
electronic toy which makes communication in correspondence with the
temporal lifestyle activity pattern of the user.
[0022] Preferably, the distinction means detects peripheral sound
and/or movement.
[0023] Preferably, the distinction means detects peripheral sound
and/or brightness.
[0024] Preferably, the distinction means comprises a microphone for
collecting sound and/or a camera for photographing the
periphery.
[0025] According to the foregoing structure, it is possible to
sense that the user is near the electronic toy by detecting the
peripheral sound or brightness, existence of a moving body, or the
like.
[0026] Preferably, the movement mechanism is structured in the
shape of a humanoid robot, and the movement thereof is controlled
so as to express at least one of a human emotion of "delight",
"anger", "sorrow" and "pleasure".
[0027] Preferably, the control means selects the control parameter
for a predetermined solo play operation when it is judged that a
person does not exist in the periphery. The operation of a solo
play is activated irrespective of the input made by the user, and,
for example, includes the display of a solo play game on the
display unit of the electronic toy.
[0028] Preferably, the electronic toy is in the shape of a human,
and the information display means is provided to a part
corresponding to the face and displays expressions of the face and
symbols such as text.
[0029] Preferably, the electronic toy further comprises a storage
means for recording the voice of a person. This will enable voice
memos and impersonation (voice imitation).
[0030] Preferably, the input means includes at least one of a touch
sensor, microphone, light sensor, camera, ◯× (circle/cross)
switch and condition sensor.
[0031] Preferably, the electronic toy further comprises a means for
detecting the output of a battery, which is the source of power of
the movement mechanism; and wherein the control means further
generates a warning with the information display means for
externally displaying information or the sound generation means for
externally outputting sound when the output of the battery becomes
weak.
[0032] Further, the electronic toy of the present invention is an
electronic toy controlled so as to react to external information,
comprising: a human-shaped structure; a control means for
controlling the movement of the structure in correspondence with
external information; a miniature camera provided to the structure
and for photographing the external situation; and a communication
means for externally transmitting the photographed image.
[0033] According to the foregoing structure, it is possible to
comprehend the peripheral situation as image data, and grasp the
existence of the user (person) from the movement of the
subject.
[0034] As the communication means, for example, infrared (IR)
communication, PHS, mobile phone, wired communication, general
telephone circuits and so on may be used.
[0035] Moreover, the electronic toy of the present invention
comprises: a basic frame disposed at the torso portion of the
human-shaped toy; first and second sub-frames respectively provided
at both sides of the basic frame and rotatably mounted on the basic
frame; first and second rotational axes respectively provided to
the first and second sub-frames; a cam mechanism provided to a third
rotational axis driven by a first motor; a link for connecting the
cam mechanism between the first and second sub-frames and
oscillating both sub-frames; a gear mechanism driven by a second
motor; and a transmission mechanism disposed across the basic frame
and between the first and second sub-frames and for transmitting
the output of the gear mechanism to the first and second rotational
axes.
[0036] According to the foregoing structure, the shoulder, arm and
wrist become movable, and simulated humanlike movement can be
represented.
[0037] Preferably, the transmission mechanism is structured of a
gear train formed of a plurality of gears, and the gears at both
ends are respectively disposed in the first and second
sub-frames and respectively engage with the first and second
rotational axes via a bevel gear.
[0038] According to the foregoing structure, the arm can be rotated
simultaneously together with the rotation of the shoulder.
[0039] Preferably, a first clutch mechanism for protecting the
member from an overload is provided between the first motor and
third rotational axis.
[0040] Preferably, a second clutch mechanism for protecting the
member from an overload is provided to the gear mechanism.
[0041] Further, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal comprising a
display unit at a face-corresponding portion of the head portion
capable of displaying text and symbols, and which is structured
such that the information input pursuant to the operation of the
input unit, which is formed of a plurality of input switches
provided on the body, can be visually confirmed with the display
unit provided on the face-corresponding portion.
[0042] Moreover, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal having a head
portion and a torso portion and comprising a display unit at a
face-corresponding portion of the head portion capable of
displaying text and symbols, an input unit formed of a plurality of
input switches is provided to the torso portion, and which is
structured such that the operation results of the input unit can be
visually confirmed with the display unit provided on the
face-corresponding portion.
[0043] Further, the electronic robot of the present invention is an
electronic robot in the shape of a human or an animal comprising a
display unit at a face-corresponding portion of the head portion
capable of displaying text and symbols, and which is structured
such that the information input by the operator operating the input
unit provided to the body of the robot can be visually confirmed
with the display unit provided on the face-corresponding
portion.
[0044] Moreover, the electronic robot of the present invention is
an electronic robot in the shape of a human or an animal comprising
a display unit at a face-corresponding portion of the head capable
of displaying text and symbols, and which is structured such that
the information input by the operator operating the input unit
provided to the body of the robot is displayed on the display unit
provided on the face-corresponding portion so as to form the
expression of the robot.
[0045] Preferably, an emotion parameter is included in the control
parameter, and the emotion parameter is represented as the
biorhythm of a specific person or the biorhythm of the robot.
Expressions based on the emotions of the robot itself are thereby
realized.
[0046] Preferably, the emotion parameter is affected by the
occurrence of an event. Emotions thereby change on a case-by-case
basis pursuant to the situation.
[0047] Preferably, the response to a question inquired by the
electronic toy to the user is included in the event. Emotions can
be changed depending on the response to the question.
[0048] Preferably, defined beforehand in the question are changes
in the emotion parameter against the anticipated response to the
question. Influence on the respective responses to the question may
thereby be made to differ.
[0049] Preferably, the control unit further selects the information
to be externally displayed and/or selects the sound to be
externally output based on the emotion parameter. Obtained thereby
is information and sound to be externally output based on
emotions.
[0050] Preferably, the control means further stores the response to
the question, and forms a standard sentence by using data relating
to the response. That is, the response results are used (reflected)
in the control.
[0051] Further, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
display unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; an input means provided on the body for performing input
operations; a storage means for storing a plurality of words; and a
control means having a function for outputting emotion parameter
values representing self emotions, and which selects the words
based on the emotion parameter value and displays the words on the
display unit.
[0052] According to the foregoing structure, it is possible to
output words based on the emotions of the robot.
[0053] Moreover, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
vocalization means for outputting sound data as a voice; an input
means provided to the body for performing input operations; storage
means for storing a plurality of sound data; and a control means
having a function for outputting emotion parameter values
representing self emotions, and which selects the sound data based
on the emotion parameter value and makes the vocalization means
vocalize the sound data.
[0054] According to the foregoing structure, it is possible to
output words based on the emotions of the robot.
[0055] Further, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
display unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; a vocalization means for outputting sound data as a voice;
an input means provided on the body for performing input
operations; a storage means for storing a plurality of words and a
plurality of sound data; and a control means having a function for
outputting emotion parameter values representing self emotions, and
which selects the words and the sound data based on the emotion
parameter value and supplies the words and the sound data to the
display unit and the vocalization means, respectively.
[0056] According to the foregoing structure, it is possible to
output words and sound based on the emotions of the robot.
[0057] Preferably, the emotion parameter value temporally changes
between the maximum value and minimum value.
[0058] Preferably, the control means further inquires questions
with text or sound, and changes the value of the emotion parameter
in accordance with the input operation in response thereto. The
emotion of the electronic toy will thereby change depending on the
user's response to the question.
[0059] Preferably, a plurality of questions are stored in advance,
and changes in the emotion parameter are defined against the
anticipated response to the respective questions. This is amusing
in that the degree of change in the emotion per question will
differ.
[0060] Preferably, a plurality of questions are stored in advance,
and the degree of intimacy between the electronic toy and user is
defined against the anticipated response to the respective
questions.
[0061] Preferably, the control means further stores the response to
the question, and forms a standard sentence by using data relating
to the response.
[0062] Preferably, the control unit further accumulates the degree
of intimacy obtained pursuant to the respective questions and, when
this exceeds a prescribed value, supplies data for expressing a
specific emotion to the display unit and/or the vocalization means.
The electronic toy is thereby able to make affectionate expressions
to its user.
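
A minimal data-driven sketch of this question mechanism (the question
texts, numeric deltas and threshold are illustrative assumptions):
each stored question predefines, per anticipated response, a change in
the emotion parameter and a degree-of-intimacy award, and accumulated
intimacy beyond a prescribed value triggers an affectionate expression.

    # Per anticipated answer: (emotion delta, intimacy points).
    QUESTIONS = [
        {"text": "Do you like me?",
         "answers": {"yes": (+10, 2), "no": (-10, 0)}},
        # A question that does not affect the emotion parameter.
        {"text": "Is today a weekday?",
         "answers": {"yes": (0, 0), "no": (0, 0)}},
    ]

    EMOTION_MIN, EMOTION_MAX = -100, 100
    INTIMACY_THRESHOLD = 10   # prescribed value (assumed)

    class EmotionState:
        def __init__(self):
            self.emotion = 0
            self.intimacy = 0

        def apply_answer(self, question, answer):
            d_emotion, d_intimacy = question["answers"][answer]
            self.emotion = max(EMOTION_MIN,
                               min(EMOTION_MAX, self.emotion + d_emotion))
            self.intimacy += d_intimacy
            # Accumulated intimacy past the prescribed value yields
            # data expressing a specific (affectionate) emotion.
            if self.intimacy > INTIMACY_THRESHOLD:
                return "affectionate_expression"
            return None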
[0063] Preferably, the aforementioned questions include questions
that will affect and questions that will not affect the emotion
parameter.
[0064] Preferably, a plurality of zones are defined in advance between
the maximum value and minimum value of the emotion parameter and
the words and the voice data are distributed in the respective
zones, and wherein the control means selects words and sound data
of the corresponding zone depending on to which zone the current
emotion parameter value belongs.
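
The zone lookup can be pictured as follows (the zone boundaries and
word lists are assumptions; the disclosure only requires that words
and sound data be distributed over zones between the parameter's
extremes, with a special zone that also triggers a mechanical
movement):

    import bisect

    BOUNDARIES = [-50, 0, 50]        # four zones over [-100, 100]
    ZONE_WORDS = [
        ["leave me alone..."],       # emotion <= -50
        ["hmm."],                    # -50 < emotion <= 0
        ["hello!"],                  # 0 < emotion <= 50
        ["I feel great!"],           # emotion > 50: special zone,
    ]                                # may add a dance movement

    def words_for(emotion_value):
        # Pick the words of whichever zone the current value falls in.
        zone = bisect.bisect_left(BOUNDARIES, emotion_value)
        return ZONE_WORDS[zone]

    print(words_for(75))  # -> ['I feel great!']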
[0065] Preferably, the control means, in a special zone, further
selects the control for performing a special movement accompanying
the mechanical movement of the components structuring the human
shape or animal shape. The overall movement will have a large
impact on the user.
[0066] Preferably, the control means further comprises an
exhibition mode for changing the emotion parameter value between
the maximum value and minimum value thereof in short cycles. The
characteristics of the electronic toy in the show window can
thereby be introduced in a short period of time.
[0067] Preferably, the electronic toy further comprises a
connection means for connecting the electronic toy to a network,
and wherein at least one of the words and sound data is downloaded
to the storage means from a server device connected to the network.
Data of words and sound as well as control data can thereby be
updated.
[0068] Preferably, the downloaded words and sound data are current
affair terms. This is amusing in that the electronic toy will be
contemporary.
[0069] Preferably, downloaded words and sound data are terms
corresponding to the characteristics of the user. Words befitting
the user can thereby be selected.
[0070] Preferably, the electronic toy further comprises a
connection means for connecting two electronic toys, and wherein at
least one of the words and sound data stored in the connected
electronic toy of the opponent is received by the storage means.
Data exchange between toys is thereby possible.
[0071] Preferably, the connection means includes at least one of a
communication cable, PHS, mobile telephone and personal
computer.
[0072] Preferably, text data is exchanged between the electronic
toys, and simulated conversation is conducted by incorporating the
exchanged data in standard sentences. It is thereby possible to
make it look like the electronic toys are having a
conversation.
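
One way to picture the simulated conversation, assuming each toy
simply slots the text received from its opponent into a stored
standard sentence (the template texts are illustrative):

    # Standard sentences with a slot for the exchanged text data.
    TEMPLATES = [
        "I heard about {word} recently!",
        "Do you like {word} too?",
    ]

    def conversation_turn(received_word, turn_index):
        # Incorporate the exchanged data into a standard sentence so
        # that the two toys appear to converse.
        template = TEMPLATES[turn_index % len(TEMPLATES)]
        return template.format(word=received_word)

    print(conversation_turn("soccer", 0))  # I heard about soccer recently!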
[0073] Moreover, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
sound detection means for detecting peripheral sound; a display
unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; a storage means for storing a plurality of expressions;
and a control means having a function for outputting emotion
parameter values representing self emotions, and which selects the
expression based on the emotion parameter value and displays the
expression on the display unit, and sets the emotion parameter to
an unpleasant state when the sound exceeds a prescribed level and
continues beyond a prescribed time.
[0074] According to the foregoing structure, reluctant expressions
and gestures are made if a loud sound is continuously provided to
the electronic toy.
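
A minimal sketch of this trigger, with the prescribed level and time
as assumed constants: the control means tracks how long the peripheral
sound stays above the level and switches the emotion parameter to the
unpleasant state once the time is exceeded.

    import time

    SOUND_LEVEL_LIMIT = 0.8   # prescribed level (assumed units)
    DURATION_LIMIT_S = 3.0    # prescribed time (assumed)

    def monitor_noise(read_sound_level, set_emotion_state):
        loud_since = None
        while True:
            if read_sound_level() > SOUND_LEVEL_LIMIT:
                loud_since = loud_since or time.monotonic()
                if time.monotonic() - loud_since > DURATION_LIMIT_S:
                    # Loud sound persisted beyond the prescribed time:
                    # the unpleasant state then selects an expression
                    # of anger for the display unit.
                    set_emotion_state("unpleasant")
            else:
                loud_since = None
            time.sleep(0.1)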
[0075] Further, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
display unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; a storage means for storing a plurality of expressions; an
input means provided on the body for performing input operations;
and a control means having a function for outputting emotion
parameter values representing self emotions, and which selects the
expression based on the emotion parameter value and displays the
expression on the display unit, and selects an expression
corresponding to the emotion parameter when the input means is
continuously operated for a prescribed time or a prescribed number
of times.
[0076] According to the foregoing structure, when pounding or
patting the electronic toy, expressions and movements corresponding
to the emotion at such time can be expected.
[0077] Preferably, an expression of anger is displayed on the
display unit during the unpleasant state.
[0078] Preferably, the expression selected in correspondence with
the continuous operation is a painful expression upon being
pounded, or a delightful expression upon being patted.
[0079] Moreover, the electronic toy of the present invention is an
electronic toy in the shape of a human or an animal, comprising: a
display unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; a storage means for storing a plurality of expressions; a
light sensor for detecting the peripheral brightness; and a control
means for selecting an expression corresponding to the self emotion
and selecting the expression of closing the eyes when the light
sensor detects a dark state beyond a prescribed time.
[0080] According to the foregoing structure, the state of falling
asleep can be represented.
[0081] Preferably, the control means further moves the mechanical
components structuring the human shape or animal shape so as to
express a reluctant expression of going to sleep.
[0082] Preferably, the initial value of the function for outputting
the emotion parameter value expressing the emotions is set
randomly.
[0083] According to the foregoing structure, the state of
commencing movement in the respective electronic toys will differ.
This is amusing in that each electronic toy will be
individualized.
[0084] The electronic toy of the present invention is an electronic
toy in the shape of a human or an animal, comprising: a display
unit capable of displaying text and symbols on the
face-corresponding portion of the head or torso-corresponding
portion; movably structured mechanical components structuring a
human shape or an animal shape; and a control unit for
distinguishing the message and control information from a file
attached to an e-mail, displaying the message on the display unit,
and moving the mechanical components in correspondence with the
control information.
[0085] Preferably, the attachment file is a sound file.
[0086] Preferably, the sound file is reproduced as a sound signal
by a computer, and the sound signal is supplied to the control
unit.
[0087] Preferably, the control information designates the movement
stored beforehand in the control unit.
[0088] Preferably, the control information designates to the
control unit a series of control procedures of the mechanical
components.
[0089] Preferably, when the control information is not attached,
the control unit selects adequate movement of the mechanical
components.
[0090] Preferably, the control information expresses emotions such
as delight, anger, sorrow and pleasure of the robot.
[0091] The e-mail method of the present invention comprises: a step
of converting the input message to be displayed on the electronic
toy of the receiving side and the movement to be made by the
electronic toy into a sound signal; a step of converting the sound
signal into a sound file and making this an attachment file of an
e-mail; a step of transmitting the e-mail with the sound attachment
file from the terminal device of the transmitting side to the
terminal device of the receiving side; a step of forwarding the
reproduced sound signal from the terminal device of the receiving
side to the electronic toy; and a step of making the electronic toy
display the message and perform the movement.
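
The disclosure does not specify how the message and movement are borne
by the sound signal; purely as an illustration, the sketch below packs
them into the PCM samples of a WAV attachment on the transmitting side
and unpacks them on the receiving side before forwarding to the toy (a
practical system would presumably use a genuine audio modulation such
as FSK).

    import wave

    def encode_to_wav(path, message, movement):
        # Serialize "movement|message" and store each byte as one
        # 8-bit PCM sample (illustrative only, not a real modem).
        payload = f"{movement}|{message}".encode("utf-8")
        with wave.open(path, "wb") as w:
            w.setnchannels(1)
            w.setsampwidth(1)      # 8-bit samples
            w.setframerate(8000)
            w.writeframes(payload)

    def decode_from_wav(path):
        with wave.open(path, "rb") as w:
            payload = w.readframes(w.getnframes())
        movement, message = payload.decode("utf-8").split("|", 1)
        # The toy displays the message and performs the movement.
        return movement, message

    encode_to_wav("mail_attachment.wav", "Happy birthday!", "dance")
    print(decode_from_wav("mail_attachment.wav"))  # ('dance', 'Happy birthday!')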
[0092] The electronic toy of the present invention is an electronic
toy in the shape of a human or an animal, comprising: a leg
structure structuring a pair of movable legs in the shape of a
human or an animal; and a control unit for controlling the movement
of the legs in correspondence with sound to be output.
[0093] Preferably, the control unit sets the speed of movement of
the legs in correspondence with the volume of the sound and
rhythm.
[0094] Preferably, the movement of the pair of legs is a movement
of opening/closing the legs in the horizontal direction.
[0095] Preferably, a slide prevention means is provided to the sole
of one of the legs, and sliding means is provided to the sole of
the other of the legs.
[0096] Preferably, the leg structure comprises: a waist portion
frame provided with a pair of hip joint portions rotatable at least
in one direction; a pair of leg portions respectively connected to
the pair of hip joint portions; a pair of drive shafts in which one
end portion thereof is mounted on the leg portion and the other end
portion thereof extends inside the waist portion frame beyond the
hip joint portion of the leg portion; a link member for mutually
connecting the respective other end portions of the respective
drive shafts; a cam mechanism interjacent between the other end
portion of at least one of the drive shafts and the link member,
and for changing the respective one end portions of the drive
shafts to become wide or narrow; and a motor built in one of the
leg portions and for rotatably driving one of the drive shafts.
[0097] Preferably, the other end portion of the drive shaft and the
link member, or the cam and the link member, are connected via a
spherical engagement member.
[0098] Preferably, the electronic toy further comprises a sliding
means provided on one end portion of the other drive shaft among
the pair of drive shafts so as to slide on the ground surface or
floor surface.
[0099] Preferably, the other of the leg portions comprises an
above-knee portion connected rotatably in the cross direction to
the hip joint portion, a below-knee portion connected rotatably in
the cross direction with the above-knee portion, and a ground
portion connected rotatably in the horizontal direction with one
end portion of the other drive shaft among the pair of drive
shafts; and wherein the electronic toy has a structure in which a
protrusion is formed at the lower face of the below-knee portion,
an inclined face to which the protrusion contacts is formed on the
upper face of the ground portion, the protrusion is pushed up
pursuant to the opening/closing movement of the leg portions, and
the connection of the above-knee portion and the below-knee portion
bends thereby.
[0100] Preferably, the sliding means is a roller.
[0101] Preferably, the degree of opening/closing of the pair of
legs is adjustable pursuant to the position in which the engagement
member is mounted on the cam mechanism.
[0102] The electronic toy of the present invention is an electronic
toy comprising a walking mechanism for performing bipedal
locomotion by moving both legs back and forth, wherein the movement
mechanism of one leg comprises: a waist portion frame; an
above-knee portion connected rotatably to the waist portion frame;
a below-knee portion connected rotatably to the above-knee portion;
a ground portion connected rotatably to the below-knee portion; a
rotatably driven cam pulley provided to the waist portion frame; a
first cam provided to the cam pulley; a second cam provided to the
cam pulley; a long member for vertically oscillating the ground
portion with the first cam; and a short member for oscillating the
below-knee portion with the second cam in the cross direction.
[0103] According to the foregoing structure, it is possible to lift
and move the tip (toe) or back end (heel) of the ground portion
(feet) at an appropriate angle upon moving both feet alternately
and advancing forward or retreating backward.
[0104] Further, the electronic toy of the present invention is an
electronic toy comprising a walking mechanism for performing
bipedal locomotion by moving both legs back and forth, wherein the
movement mechanism of one leg comprises: a waist portion frame; an
above-knee portion connected rotatably to the waist portion frame;
a below-knee portion connected rotatably to the above-knee portion;
a ground portion connected rotatably to the below-knee portion; a
rotatably driven cam provided to the waist portion frame; a long
member for vertically oscillating the ground portion with the cam;
and a short member for oscillating the below-knee portion with the
cam in the cross direction.
[0105] According to the foregoing structure, it is possible to lift
and move the tip (toe) or back end (heel) of the ground portion
(feet) at an even larger angle upon moving both feet alternately
and advancing forward or retreating backward.
[0106] Preferably, the long member comprises a guide hole to be
engaged with a guide member and a pushdown plate in contact with
the cam. The pushdown plate pushes down the ground portion and sets
the upper limit of the long member. It is thereby possible to
prevent a larger inclination and excess lifting of the ground
portion.
[0107] Preferably, the electronic toy further comprises an
energization means for energizing the tip of the ground portion in
the pushdown direction.
[0108] Preferably, the size of the electronic toy is approximately
30 cm.
[0109] Preferably, an oblique direction drive means is provided to
the ground portion for driving the electronic toy in an oblique
direction against the advancing direction by the bipedal locomotion
mechanism.
[0110] Preferably, the oblique direction drive means is structured
by comprising a rotatably driven drive roller or drive belt.
[0111] Preferably, a plurality of drive rollers or drive belts are
provided.
[0112] Preferably, the oblique direction drive means is
respectively provided to the respective ground portions of both
legs, and the respective drive directions of each of the oblique
direction drive means lie approximately on a common
circumference.
[0113] Preferably, the oblique direction drive means is provided at
the toe side of the ground portion, and a sliding roller is
provided at the heel side of the ground portion.
[0114] The present invention is an electronic toy comprising a
walking mechanism for performing bipedal locomotion by moving both
legs back and forth, and is provided with an oblique direction
drive mechanism for driving the electronic toy in an oblique
direction against the advancing direction by the bipedal locomotion
mechanism.
[0115] Preferably, the oblique direction drive mechanism is
structured by comprising a rotatably driven drive roller or drive
belt.
[0116] Preferably, a plurality of drive rollers or drive belts are
provided.
[0117] Preferably, the oblique direction drive mechanism is
respectively provided to the respective sole portions of both feet,
and the respective drive directions of each of the oblique
direction drive means lie approximately on a common
circumference.
[0118] Preferably, the oblique direction drive mechanism is
provided at the toe side of the sole portion of the feet, and a
sliding roller is provided at the heel side of the sole portion of
the feet.
BRIEF DESCRIPTION OF THE DRAWINGS
[0119] FIG. 1 is a front view for explaining the robot as the
electronic toy (domestic robot);
[0120] FIG. 2 is a rear view for explaining the robot as the
electronic toy;
[0121] FIG. 3 is a top view for explaining the robot as the
electronic toy;
[0122] FIG. 4 is a side view for explaining the robot as the
electronic toy;
[0123] FIG. 5 is an explanatory diagram for explaining the
mechanism enabling the rotation of the arm, shoulder, neck, and so
on of the robot;
[0124] FIG. 6 is a perspective view for explaining the
aforementioned mechanism;
[0125] FIG. 7 is an explanatory diagram showing the mechanism
enabling the rotation of the neck portion and rotation of the
shoulder portion of the robot;
[0126] FIG. 8 is an explanatory diagram showing the mechanism
enabling the rotation of the arm portion of the robot;
[0127] FIG. 9 is a block diagram for explaining the structure of
the control system;
[0128] FIG. 10 is a block diagram explaining the schematic
structure of the control unit 60;
[0129] FIG. 11 is a flowchart explaining an example of inputting
the "date of birth" into the robot for calculating the
biorhythm;
[0130] FIG. 12 is a flowchart explaining an example of enabling the
determination of the existence of a user by collecting ambient
sound;
[0131] FIG. 13 is a flowchart explaining an example of recognizing
the voice (order, etc.) of the user and the robot making movements
corresponding thereto;
[0132] FIG. 14 is a flowchart explaining an example of detecting
the movement of the object;
[0133] FIG. 15 is a flowchart explaining an example of
distinguishing the existence of the user based on the switch
operation, movement of the object, and existence of sound;
[0134] FIG. 16 is a flowchart explaining an example for
distinguishing the existence of the user based on the switch
operation, peripheral brightness, and existence of sound;
[0135] FIG. 17 is a flowchart explaining an example of the control
movement in consideration of the biorhythm;
[0136] FIG. 18 is an explanatory diagram explaining the
biorhythm;
[0137] FIG. 19 is an explanatory diagram explaining an example of
the facial eye expressions and the text (symbol) scroll displayed
on the display screen;
[0138] FIG. 20 is a flowchart explaining an example of movement
control of the robot pursuant to the lapse in time;
[0139] FIG. 21 is a flowchart explaining the execution of a control
program pursuant to the control unit (CPU);
[0140] FIG. 22 is an explanatory diagram explaining an example of
the posture expressing a feeling of "delight" of the robot;
[0141] FIG. 23 is an explanatory diagram explaining an example of
the posture expressing a feeling of "pleasure" of the robot;
[0142] FIG. 24 is an explanatory diagram explaining an example of
the posture expressing a feeling of "sorrow" of the robot;
[0143] FIG. 25 is an explanatory diagram explaining an example of
the posture expressing a feeling of "affection" of the robot;
[0144] FIG. 26 is a front view explaining an example of another
robot as the electronic toy;
[0145] FIG. 27 is a side view explaining an example of another
robot as the electronic toy;
[0146] FIG. 28(a) to FIG. 28(d) are explanatory diagrams explaining
an example of the various expressions corresponding to the delight,
anger, sorrow and pleasure of the robot;
[0147] FIG. 29(a) and FIG. 29(b) are explanatory diagrams
explaining an example of the various expressions corresponding to
the emotions of the robot;
[0148] FIG. 30 is an explanatory diagram showing another example of
a different biorhythm (biorhythm of robot);
[0149] FIG. 31 is a flowchart explaining the operation of an
example of displaying words on the display screen of the robot
influencing the current feelings;
[0150] FIG. 32 is an explanatory diagram explaining an example of
displaying words (anger mode of biorhythm) on the display screen of
the robot;
[0151] FIG. 33 is an explanatory diagram explaining an example of
displaying words (normal emotion mode) on the display screen of the
robot;
[0152] FIG. 34 is an explanatory diagram explaining an example of
displaying words (haiku style) on the display screen of the
robot;
[0153] FIG. 35 is a flowchart explaining an example of the feeling
of the robot changing pursuant to the response to the question
asked by the robot;
[0154] FIG. 36 is an explanatory diagram showing an example of a
question in which the response thereof will influence the
biorhythm;
[0155] FIG. 37 is an explanatory diagram showing an example of a
question in which the response thereof will not influence the
biorhythm;
[0156] FIG. 38 is an explanatory diagram explaining an example of
questions that will influence the biorhythm (feelings) of the
robot;
[0157] FIG. 39 is an explanatory diagram explaining an example of
questions that will influence the biorhythm (feelings) of the
robot;
[0158] FIG. 40 is an explanatory diagram explaining an example of
the feelings (biorhythm) becoming worse pursuant to the response to
the result of the question;
[0159] FIG. 41 is an explanatory diagram explaining an example of
two robots being connected via a cable for exchanging data in order
to communicate with each other;
[0160] FIG. 42 is an explanatory diagram showing an example where a
robot is connected to a PHS or mobile phone in order to acquire
data by communicating with another robot or a server device so as
to make conversation or movement;
[0161] FIG. 43 is a block diagram explaining an example of
connecting the communication interfaces of robots via a cable in
order to conduct communication;
[0162] FIG. 44 is a block diagram explaining an example of
communicating by using a terminal device connectable to a
communication network such as a PHS or mobile phone;
[0163] FIG. 45 is a block diagram explaining an example of enabling
communication between robots by using the Internet;
[0164] FIG. 46 is an explanatory diagram explaining an example of
downloading data from a server device to the robot;
[0165] FIG. 47 is a communication diagram explaining a procedural
example upon conducting data communication via a connection
cable;
[0166] FIG. 48 is a communication diagram explaining a procedural
example upon conducting data communication with a mobile phone or
PHS;
[0167] FIG. 49 is a communication diagram explaining a procedural
example upon conducting data communication when obtaining data from
the server device;
[0168] FIG. 50 is an explanatory diagram explaining an example of
"current affairs" and user adaptive data provided by the server
device;
[0169] FIG. 51 is a block diagram explaining the operation of an
action mail;
[0170] FIG. 52 is an explanatory diagram explaining the contents
(format) of the action mail;
[0171] FIG. 53 is an explanatory diagram explaining an example of the
robot making a movement of "delight" upon receiving the action
mail;
[0172] FIG. 54 is an explanatory diagram explaining an example of the
robot making a movement of "anger" upon receiving the action
mail;
[0173] FIG. 55 is an explanatory diagram explaining an example of the
robot making a movement of "sorrow" upon receiving the action
mail;
[0174] FIG. 56 is an explanatory diagram explaining an example of the
robot making a movement of "pleasure" upon receiving the action
mail;
[0175] FIG. 57 is a perspective view explaining the first posture
(legs closed) of the dance robot;
[0176] FIG. 58 is a perspective view explaining the second posture
(legs opened) of the dance robot;
[0177] FIG. 59 is a perspective view explaining the open/close
mechanism of the legs (posture of legs closed);
[0178] FIG. 60 is a perspective view explaining the open/close
mechanism of the legs (posture of legs opened);
[0179] FIG. 61 is a perspective view explaining a structural
example of the right leg;
[0180] FIG. 62 is an explanatory diagram explaining the operation
of bending the knee of the right leg;
[0181] FIG. 63 is a perspective view explaining a structural
example of the left leg;
[0182] FIG. 64 is an explanatory diagram explaining the adjustment
of synchronization of the left and right legs by a cam;
[0183] FIG. 65 is a block diagram explaining the control system of
the dance robot;
[0184] FIG. 66 is a block diagram explaining the control system of
another dance robot;
[0185] FIG. 67 is a perspective view explaining the bipedal
robot;
[0186] FIG. 68 is a perspective view explaining the bipedal
robot;
[0187] FIG. 69 is a perspective view explaining the bipedal
robot;
[0188] FIG. 70 is an explanatory diagram explaining the bipedal
robot mechanism;
[0189] FIG. 71 is an explanatory diagram explaining the waist
portion frame;
[0190] FIG. 72 is an explanatory diagram explaining the upper knee
portion;
[0191] FIG. 73 is an explanatory diagram explaining the lower knee
portion;
[0192] FIG. 74 is an explanatory diagram explaining the ground
portion;
[0193] FIG. 75 is an explanatory diagram explaining the cam pulley,
long rod and short rod;
[0194] FIG. 76(1) to FIG. 76(4) are explanatory diagrams explaining
the movement of the bipedal mechanism corresponding to the rotation
of the camshaft;
[0195] FIG. 77(5) to FIG. 77(8) are explanatory diagrams explaining
the movement of the bipedal mechanism corresponding to the rotation
of the camshaft;
[0196] FIG. 78 is an explanatory diagram explaining the movement of
the leg of the robot;
[0197] FIG. 79 is an explanatory diagram explaining another bipedal
mechanism;
[0198] FIG. 80 is an explanatory diagram explaining the cam pulley,
long rod, short rod and spring of an example of another bipedal
mechanism;
[0199] FIG. 81 is an explanatory diagram explaining the turnabout
mechanism of the robot;
[0200] FIG. 82 is an explanatory diagram explaining the turnabout
mechanism of the robot;
[0201] FIG. 83 is an explanatory diagram explaining the turnabout
mechanism of another robot.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0202] Embodiments of the present invention are now described with
reference to the attached drawings.
[0203] FIG. 1 through FIG. 4 show examples of a humanoid robot (pet
robot) as the electronic toy (domestic robot), and the diagrams
respectively illustrate the front view, back view, top view and
side view of the robot.
[0204] The robot 1 is structured by comprising a head portion 10, a
torso portion 20, left and right arm portions 30, and left and
right leg portions 40. The head portion 10 and the torso portion 20
are rotatably connected via a neck joint K6. The torso portion 20
and the arm portion 30 are rotatably connected via a shoulder joint
K1. An elbow joint K2 and a wrist joint K3 are provided to the arm
portion 30 so as to realize the free bending of the arm portion 30.
The torso portion 20 and the leg portion 40 are rotatably connected
via a hip joint K4. Moreover, a knee joint K5 is provided to the
leg portion 40.
[0205] Provided to the head portion 10 are an after-mentioned
microcomputer system for controlling the robot, a window display
unit for enabling communication between the user and robot, a sound
sensor for collecting sound, a light sensor (or camera) for
acquiring peripheral information, a touch sensor, a speaker for
generating the sound of the robot, and so on. Further provided to
the head portion 10 are an after-mentioned waggle mechanism for
rotating the internal frame (not shown) of the head portion 10 and
a nodding mechanism (not shown) for moving the head portion 10 back
and forth. The neck joint K6 corresponds to the above.
[0206] The torso portion 20 comprises a motor as the source of
power, an arm opening/closing mechanism for rotating the left and
right arm portions 30 around the Z axis (vertical direction in FIG.
1) of the shoulder joint K1, an arm rotating mechanism for rotating
the left and right arm portions 30 around the X axis (horizontal
direction in FIG. 1) of the shoulder joint K1, and a neck rotating
mechanism for rotating the head portion 10 around the Z axis.
Moreover, ".smallcircle." and ".times." switches 54 as the
detection switch are provided to the torso portion 20.
[0207] A battery as the power source for activating the
aforementioned motor and microcomputer system or the like is
disposed inside the left and right leg portions 40. The battery may
also be disposed in the torso portion 20 or the arm portion 30.
When disposing the battery in the torso portion 20, bending of the
knee joint K5 becomes possible.
[0208] In addition, the bending of arms and legs can be realized by
disposing an actuator such as an electromagnet or micro motor
inside the respective arm portions or respective leg portions,
thereby enabling more humanlike movements.
[0209] Although the aforementioned electronic robot is in the shape
of a human, this electronic robot may also be in the shape of an
animal. Further, the display unit 71 capable of displaying text and
symbols on the face-corresponding portion of the head portion is
structured such that the information, which is input by the
operator who operates the input unit formed of a plurality of input
switches 51, 54 and so on provided to the body of the
aforementioned robot, can be visually confirmed with the display
unit 71 provided on the face-corresponding portion described
above.
[0210] FIG. 5 through FIG. 8 are diagrams for explaining the
mechanical structure built in the torso portion 20. FIG. 5 is a
front view of the mechanical structure, and FIG. 6 is the
perspective view thereof. FIG. 7 is an explanatory diagram
illustrating the components corresponding to the aforementioned arm
opening/closing mechanism and neck rotating mechanism in the
mechanical structure. FIG. 8 is an explanatory diagram illustrating
the components corresponding to the aforementioned arm rotating
mechanism in the mechanical structure.
[0211] As shown in FIG. 5 and FIG. 6, the mechanical structure 200
is structured by comprising a basic frame 201, a sub frame 202, a
neck (head portion) rotating mechanism 210 (c.f. FIG. 7), an arm
(or shoulder) opening/closing mechanism 220 (c.f. FIG. 7), an arm
rotating mechanism 230 (c.f. FIG. 8), a neck (head portion)
rotational axis (203), an arm rotational axis 204, a first motor
205, a second motor 206, a motor mounting plate 207 for fixing the
respective motors to the frame 201, and so on.
[0212] The sub frame 202 is formed in an approximate horseshoe
shape, and is respectively provided to both left and right sides of
the basic frame 201 so as to be freely rotatable around the Z axis
against the frame 201. A bevel gear mechanism for changing the
transmission direction of the power is provided inside the sub
frame, and power is thereby transmitted to the arm rotational axis
204 even when the sub frame 202 rotates around the Z axis.
[0213] As shown in FIG. 7, the neck rotating mechanism 210 and the
arm opening/closing mechanism 220 are driven with the first motor
205. The rotational axis of the motor 205 is connected to a worm
gear mechanism 211 for changing the power transmission direction
and converting the torque, and rotates the head portion rotational
axis 203 via a spring clutch mechanism 212 as the safety device. A
frame (not shown) of the head portion 10 is connected to the upper
end of the head portion rotational axis 203 and rotates the head
portion 10 around the Z axis. Alternatively, a worm gear mechanism
may be provided at the upper end of the head portion rotational
axis 203 in order to realize a nodding movement of the head portion
back and forth by obtaining rotation around the X axis. An arm
opening/closing mechanism 220 is connected to the lower end of the
head portion rotational axis 203. The spring clutch mechanism 212
prevents the breakage of components by sliding when there is an
overload in the head portion rotational axis 203 or the sub frame
(arm opening/closing mechanism).
[0214] A cam mechanism 221 is provided to the lower end of the head
portion rotational axis 203. The cam mechanism 221 is structured by
comprising a plate 222 fixed to the axis 203, two arm mounting pins
223 provided to the plate 222, pins 224 respectively mounted on the
two sub frames 202, and two links 225 for rotatably connecting one
of the arm mounting pins 223 and the pin 224 of one of the sub
frames, and the other arm mounting pin 223 and the pin 224 of the
other sub frame 202, respectively. Each of the sub frames 202 is
rotatably retained with the basic frame 201 with the pin 226.
[0215] Thus, when the motor 205 rotates, the head portion
rotational axis 203 rotates in the corresponding forward or reverse
direction, thereby rotating the head portion 10. The plate 222
rotates pursuant thereto, thereby moving the links 225 and moving
the sub frames 202 around the Z axis. This enables the
movement of opening and closing the arm portion 30 (e.g., movement
of hugging). The motor 205 is controlled by the microcomputer. The
rotational quantum of the axis 203, and hence the movement posture,
is grasped, for example, by reading the codes of a sensor disk (not
shown) provided to the tip portion of the axis 203, or with the
combination of a cam and switch (not shown) provided to the tip
portion of the axis 203.
[0216] As depicted in FIG. 8, the arm rotating mechanism 230 is
driven with the second motor 206. The pinion gear mounted on the
rotational axis of the motor 206 drives a gear mechanism 231 formed
of a plurality of gears. This gear mechanism 231 further drives the
gear train 232, which propagates the driving force in the
longitudinal (horizontal) direction at the upper part inside the
basic frame 201. A clutch mechanism 233 as the protection mechanism
for preventing the breakage of components due to an overload is
provided between the gear mechanism 231 and the gear train 232. The
clutch mechanism 233, for example, generates a slide movement on
the rubber surface in the case of an overload via the rubber
friction plate (surface) sandwiched between the gears. Further, the
likes of the aforementioned spring system or a flexible
concave/convex plate may also be combined therewith.
[0217] The gear train 232, for example, is structured of 6 gears,
and the gears on both ends are provided within the sub frame 202.
And, these gears on both ends engage with the bevel gear fixed to
one end of the arm rotational axis 204 retained rotatably by the
sub frame 202. An arm portion 30 (not shown) is mounted to the
other end side of the arm rotational axis 204 via a collar 234
fixed to the axis 204. Therefore, the driving force of the motor
206 rotates the arm rotational axis 204 via the gear mechanism 231,
clutch mechanism 233 and gear train 232, and rotates the arm
portion 30 mounted on this rotational axis 204. A sensor is
provided to an appropriate position, to the collar 234 for example,
for detecting the rotational position of this arm and controlling
the motor.
[0218] FIG. 9 is a block diagram for explaining the control system
of a robot as the electronic toy. The robot comprises, as means for
detecting the peripheral situation and inputs, a touch sensor 51,
microphone (sound sensor) 52, light sensor (e.g., CCD camera) 53,
".smallcircle.".multidot.- ".times." switch 54 for generating
output corresponding to the operation of the ".smallcircle." button
and ".times." button, status (posture) sensor 55 and battery
voltage detection sensor 56. The touch sensor 51, for example, is
provided to the upper surface of the head portion 10 of the robot
(c.f. FIG. 3) and is capable of detecting the patting (contact) on
the head by the user. The touch sensor, for instance, is a micro
switch or a capacitance-detecting contact detection switch. The
status sensor 55 detects the posture of the robot. Outputs from
these respective sensors are supplied to the control unit 60. Based
on these inputs, the control unit 60 controls the motors 205 and
206, the window display unit 71 of the head portion, the speaker
72, and the joint actuator group 73. When performing a simpler
posture control in which detailed movements of the arms and legs of
the robot are not made, the joint actuator group 73 may be omitted.
Further, by internally providing a USB terminal or an infrared
interface, it is possible to incorporate a function of forwarding
the image read by the light sensor to a personal computer, PHS or
mobile phone. Moreover, when incorporating a function of storing
the user's name and calling such user's name in the robot, since it
is impossible to store each and every name at the initial stage of
shipment, a function may be provided where additional name data is
prepared in advance in the homepage server or in accordance with
the request from the user, such that the user may connect his/her
PC, PHS or mobile phone and the like to the USB terminal or
infrared interface of the robot in order to download and use the
desired name information from such homepage. The USB terminal or
infrared interface, for example, may be disposed in the back of the
head portion where the CPU is built in.
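Purely as a non-limiting illustration of the FIG. 9 arrangement,
the following Python sketch shows the sense-decide-actuate cycle;
the sensor values, thresholds and command vocabulary are all
assumptions:

```python
def read_sensors():
    # Stand-ins for the touch sensor 51, microphone 52, light sensor 53,
    # status sensor 55 and battery voltage detection sensor 56.
    return {"touch": False, "sound_level": 0.1, "light": 0.6,
            "posture": "standing", "battery_v": 4.8}

def control_step(inputs):
    # Select commands for the motors 205/206, display unit 71 and speaker 72.
    if inputs["battery_v"] < 4.0:  # threshold assumed
        return {"speaker": "I'm going to sleep to save the batteries, okay?"}
    if inputs["touch"]:
        return {"display": "happy eyes", "motors": "open/close arms"}
    if inputs["sound_level"] > 0.8:
        return {"display": "Stop it!"}
    return {}

for _ in range(3):  # the real control loop runs continuously
    print(control_step(read_sensors()))
```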
[0219] As illustrated in FIG. 10, the control unit 60 comprises a
CPU 61 as the central processing unit, a ROM 62 (storage means), a
RAM 63 and a timer (time and calendar function). Stored in the ROM
62 are a movement control program for driving and controlling the
display unit 71, speaker 72, motors 205 and 206, and actuator group
73; posture control data for switching a plurality of movement
postures by controlling the rotational direction and rotational
quantum of the motors 205 and 206 (and actuator group 73) in
accordance with the posture of the robot to be set; sound control
data for generating voices and melodies to be output from the
speaker 72; display control data for making the display unit 71
display information to be displayed on the robot; program data for
calculating the biorhythm of the user; sound/image processing
program for judging the peripheral situation; for example, the
existence of the user, based on the sound input or image input of
the CCD camera; communication program (not shown) for externally
conducting data communication via a PHS and the like.
[0220] The sound/image program includes a sound processing program
for performing the likes of filter processing, discrimination
processing and modulation processing of input sound, and an image
processing program for detecting the peripheral brightness and
detecting the movement of the subject. Further, the movement
control program includes the likes of a movement selection program
for selecting the movement pattern and display pattern
corresponding to the situation among a plurality of movement
patterns based on the judgment result of the peripheral situation
pursuant to sound and/or images, and a posture control program for
performing control so as to move the head portion 10, arm portion
30, joints and the like in the selected movement pattern.
[0221] Stored in the RAM 63 are output data of the microphone 52
and the output data of the light sensor (camera) 53 pursuant to the
DMA operation via the interface of the microcomputer (not
shown).
[0222] The sound signal output by the microphone 52 is A/D
converted with the interface, low-pass filter processed so as to
eliminate noise and only extract the voice range of a person, and
retained as sound data by the RAM 63. Sound data is subject to the
sound processing program. This data is stored for a fixed period of
time and subject to sound recognition processing. The method of
sound recognition may either be recognition of general speakers or
recognition of a specific speaker. As a result of this sound
recognition processing, a command corresponding to the words
communicated by the voice of the user is output. The corresponding
movement control is enabled by this command being communicated to
the movement control program, thereby realizing the robot to make
movements, displays and enunciations corresponding to the
voice.
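As a sketch of the pre-processing described in this paragraph, a
one-pole low-pass filter is shown below; the actual filter used by
the toy is not specified, so the filter type and coefficient are
assumptions:

```python
def low_pass(samples, alpha=0.2):
    # y[n] = y[n-1] + alpha * (x[n] - y[n-1]); larger alpha = higher cutoff.
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

filtered = low_pass([0.0, 1.0, -1.0, 1.0, 0.2, 0.1])  # spikes are smoothed
print(filtered)
```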
[0223] Moreover, in a standby state where the robot is not moving,
lifestyle sounds are collected by temporally observing the average
level of the sound data in order to distinguish whether the user is
near the robot.
[0224] The sound processing program, which includes the storage
processing of storing the sound in the memory 63, may also be used
as a so-called voice memo for storing the voice of the user.
Further, impersonation (voice imitation) is also possible by
performing conversion processing of tone and pitch on the stored
sound data and forwarding this to the speaker 72 for
vocalization.
[0225] The output signal corresponding to one frame output from the
CCD camera as the light sensor is converted into image data with
the interface, and retained in the image storage region of the RAM
63. Image data is subject to the image processing program. For
example, in the standby state, the image is periodically sampled,
and changes in the image (movement of subject) based on the
difference of the image data of the previous frame and the image
data of the present frame are read. The existence of the user is
distinguished (or presumed) with the movement of the subject of the
camera. Moreover, there is no need to compare each and every frame,
and image data in a plurality of sections within the frame may be
compared. The peripheral brightness of the robot can be
distinguished with the average value of the image data (luminance).
Upon distinguishing the peripheral brightness, a CCD camera is not
essential, and a light detection element such as an SPD or
phototransistor may also be used. In such a case, for example, the
existence of the user may be distinguished by recognizing that it
is bright during night hours by combining the time and brightness.
It is also possible to distinguish the existence of the user by
distinguishing the existence of voices (or lifestyle sound) and the
brightness in the room. The existence or non-existence of the user
is displayed in the flag area of the RAM 63.
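The combination of brightness and clock time mentioned above might
look as follows; the luminance threshold and the night window are
illustrative assumptions, as is treating luminance as a 0-1 value:

```python
from datetime import time

def user_presumed_present(mean_luminance, now, lit=0.3,
                          night_start=time(19, 0), night_end=time(6, 0)):
    # A lit room during night hours suggests the user is nearby.
    is_night = now >= night_start or now <= night_end
    return is_night and mean_luminance > lit

print(user_presumed_present(0.5, time(21, 30)))  # lit room at night -> True
```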
[0226] In addition, it is possible to externally forward the image
data read by the CCD camera 53 according to an external request via
a communication interface 74, and, for instance, it would be
possible to transmit the condition of the room, in correspondence
to the access from the mobile phone of the user, to such mobile
phone.
[0227] The respective outputs of the touch sensor 51,
○× switch 54, status sensor 55 and the like are
set as flags in the flag area for the respective switches in the
RAM 63 via the interface. Interruption is generated pursuant to
this setting of flags and event processing is performed
thereby.
[0228] Next, the operation of the control unit 60 is explained. The
robot as the electronic toy of the present invention moves in
conformity with the biorhythm, which is a parameter for
representing the physical condition (behavior) of the user, and
moves so as to exhibit a so-called healing atmosphere.
[0229] FIG. 11 is a flowchart explaining the input processing for
acquiring the birthday necessary for calculating the biorhythm of
the user.
[0230] For example, when the user simultaneously presses both the
".smallcircle." and ".times." buttons 54 provided to the torso
portion 20, it becomes a mode selection state (not shown). In this
state, various modes are sequentially displayed in prescribed time
intervals on the display unit 71. Included in the modes are
"calendar date setting", "clock time setting", "user name input",
"user birthday input", "user gender input", "voice memo input",
"voice sample input", "external (mobile phone) forwarding setting",
"energy saving setting" and so on. When the user presses the
.smallcircle. button when the "user birthday input" is displayed on
the screen, the birthday input program is activated and proceeds to
the present routine.
[0231] The control unit (CPU) 60 displays "Please input your birth
date", "Please input in the order of year, month and day" on the
liquid crystal panel or LED matrix of the display unit 71. When the
letter string does not fit in the size of the screen of the display
unit, the letter string is displayed so as to move (scroll display)
in the horizontal or vertical direction of the screen (S22). After
the display of "Please input your birth date", for example, the
last two digits of the Christian year "40" to "00 (current
Christian year)" corresponding to the range of age of target users
are sequentially displayed on the display unit 71 in prescribed
time intervals (S24). When the year in which the user was born is
displayed, the user presses the ○ button to select such
year. The operation of the ○ button or the × button is
distinguished by the setting of the corresponding flag
within the RAM 63. The control unit 60 distinguishes whether a
selection has been made or not (S26). When a selection is not made
even upon a prescribed time elapsing (S26; No), the displayed year
is repeatedly increased in increments of "1" (S24 and S26). When
selected (S26; Yes), the selected "year" is retained. Moreover,
after having pressed the ○ button, the user may cancel
such input by pressing the × button if it is within a
prescribed time.
[0232] When the "year" is selected, the routine proceeds to the
input of "month". The control unit 60, after having displayed
"Please input the month", sequentially displays "1" to "12" on the
display unit 71 in prescribed time intervals (S28). When the month
in which the user was born is displayed, the user presses the
○ button to select such month. The control unit 60
distinguishes whether a selection has been made or not (S30). When
a selection is not made even upon a prescribed time elapsing (S30;
No), the displayed month is repeatedly increased in increments of
"1" (S28 and S30). When selected (S30; Yes), the selected "month"
is retained.
[0233] When the "month" is selected, the routine proceeds to the
input of "day". The control unit 60, after having displayed "Please
input the day", sequentially displays "1" to "31" on the display
unit 71 in prescribed time intervals (S32). When the day in which
the user was born is displayed, the user presses the ○
button to select such day. The control unit 60 distinguishes
whether a selection has been made or not (S34). When a selection is
not made even upon a prescribed time elapsing (S34; No), the
displayed day is repeatedly increased in increments of "1" (S32 and
S34). When selected (S34; Yes), the selected "day" is retained.
When the input of "year", "month" and "day" is completed, the
control unit 60 writes the user's "year", "month" and "day" in the
user biorhythm data area of the ROM 62. The user's biorhythm
calculation is thereby possible. Moreover, as described later, it
is also possible to set the biorhythm of the robot such that the
robot will exert itself under its own biorhythm.
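The scrolling selection used for the year, month and day entries
can be sketched as below, with the ○-button press simulated by an
index into the display sequence (an assumption made only so the
example runs):

```python
import itertools

def scroll_select(values, pressed_at):
    # Display each value in turn at prescribed intervals, wrapping around,
    # and return the value being shown when the o button is pressed.
    for i, value in enumerate(itertools.cycle(values)):
        if i == pressed_at:
            return value

year = scroll_select(range(1940, 2001), pressed_at=35)  # -> 1975
month = scroll_select(range(1, 13), pressed_at=4)       # -> 5
day = scroll_select(range(1, 32), pressed_at=11)        # -> 12
print(year, month, day)
```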
[0234] Similarly, the user sets the "calendar date", sets the
"clock time" built in the robot, inputs the "user's name", inputs
the "user's gender" and so on.
[0235] FIG. 12 illustrates an example of the aforementioned sound
processing (sound volume detection) of the control unit 60. The
control unit (CPU) 60 performs computing processing equivalent to a
low-pass filter for eliminating a high frequency noise component
from the sound data stored in the RAM 63 (S42). The average value
of the amplitude level within a prescribed time frame of the
processed sound data is then sought (S44). The control unit 60
stores this average value (S46). The control unit 60
further judges the location of the user by distinguishing whether
the sound level increased rapidly by continuously observing the
average value of the sound level (S48). When it is judged that the
user exists (is located) in the room, the (sound) flag representing
the aforementioned location is set (S50).
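A minimal sketch of this FIG. 12 check, assuming one filtered level
reading per iteration and an illustrative window length and jump
factor:

```python
def sound_flag_from_levels(levels, window=20, jump=2.0):
    # Flag the user as located when a level rises well above the running
    # average of recent levels (S48 -> S50).
    history = []
    for level in levels:
        if history and level > jump * (sum(history) / len(history)):
            return True
        history = (history + [level])[-window:]
    return False

print(sound_flag_from_levels([0.1] * 10 + [0.5]))  # a sudden voice -> True
```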
[0236] FIG. 13 illustrates an example of the second sound
processing (sound recognition). The control unit (CPU) 60 performs
normalization processing in order to match the time axis and signal
level of the sound data stored in the RAM 63 with the contrast data
(S62). Characteristic parameters of the sound are extracted form
the normalized data (S64). Vocalization is distinguished based on
the extracted characteristic parameters, and the movement command
of the robot corresponding to the subject matter (meaning) of the
vocalization is output (S66). The flag representing this command is
set in the RAM 63 (S68). The control unit 60 thereby reads the
vocalization control data, display control data and posture control
data corresponding to the command and controls the movement of the
robot as described later.
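The S62-S68 flow can be caricatured as follows; a zero-crossing
count stands in for the real characteristic parameters, and the
command table is hypothetical, so this shows only the shape of the
processing, not an actual recognizer:

```python
def normalize(samples):
    # Match the signal level of the stored sound data (S62).
    peak = max(abs(s) for s in samples) or 1.0
    return [s / peak for s in samples]

def characteristic(samples):
    # A toy characteristic parameter (S64): the zero-crossing count.
    return sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

COMMANDS = {2: "CMD_GREET", 6: "CMD_DANCE"}  # hypothetical contrast data

def recognize(samples):
    # Output the command whose stored parameter is closest (S66); the
    # corresponding flag would then be set in the RAM 63 (S68).
    c = characteristic(normalize(samples))
    return COMMANDS[min(COMMANDS, key=lambda k: abs(k - c))]

print(recognize([0.2, -0.3, 0.4, -0.2, 0.3, -0.1, 0.2]))  # -> CMD_DANCE
```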
[0237] FIG. 14 illustrates an example of the image processing of
the control unit 60. The control unit 60 compares the previously
stored image data from the CCD camera 53 stored in the RAM 63 in a
prescribed sampling frequency (S72) with the currently stored image
data (S74), and distinguishes changes in the image data. For
example, the difference in data of the respective pixels in
positions corresponding to both frames is sought and accumulated.
When the subject is moving, this accumulated value significantly
changes. Further, in order to reduce the operational load, changes
in data in a prescribed position on the screen; for example, the
center and four corners of the screen, may be compared (S76).
Whether the subject moved (or changed) is judged with the CCD
screen (image) based on such difference (S78). When a moving body
exists, a flag representing movement detection (user location) is
set (S80). Further, the brightness inside the room can also be
distinguished with the average value of the luminance of the
image data.
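A sketch of the frame comparison of FIG. 14, sampling only the
center and four corners as in S76; frames are modeled as nested
lists of 0-255 luminance values and the accumulation threshold is
an assumption:

```python
def subject_moved(prev, cur, threshold=30):
    h, w = len(cur), len(cur[0])
    spots = [(h // 2, w // 2), (0, 0), (0, w - 1),
             (h - 1, 0), (h - 1, w - 1)]  # center and four corners (S76)
    return sum(abs(cur[y][x] - prev[y][x]) for y, x in spots) > threshold

a = [[10] * 4 for _ in range(4)]
b = [[10] * 4 for _ in range(3)] + [[90, 10, 10, 10]]  # a subject entered
print(subject_moved(a, b))  # -> True; the movement detection flag is set
```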
[0238] FIG. 15 is a flowchart explaining an example of judging
whether a user is located (or exists) based on the switches, sound,
movement of the subject, and so on.
[0239] In FIG. 15, the control unit 60 repeats this routine in
prescribed cycles during the standby state. Foremost, the control
unit 60 distinguishes whether the user directly operated the
switches with the likes of the touch sensor 51 or
○× switch 54 by checking the relevant flag
(S102). If the switches have been operated (S102; Yes), since this
means nothing less than that the user exists, the flag representing
the location of the user is set (S112), and this routine is
ended.
[0240] When the switches have not been operated (S102; No), it is
distinguished whether both the movement detection flag (S80) based
on the results of the aforementioned image processing and the sound
detection flag (S50) based on the results of the sound processing
have been turned on (S104). When both flags have been turned on
(S104; Yes), since the probability of the user's location is high,
the flag representing the location of user is set (S112), and this
routine is ended.
[0241] When both flags have not been turned on (S104; No), it is
distinguished whether one of the flags has been turned on (S106).
When neither flag has been turned on (S106; No), since the
possibility of the user being in the room is low, the flag
representing the user's location is turned off or reset (S110), and
this routine is ended. When one of the flags has been turned on
(S106; Yes), it is judged whether the present time is within the
movement prohibition time frame preset by the user or the factory
in advance (S108). For example, this enables the prevention of
annoyance caused by movements in the middle of the night as well as
the prevention of wasteful movements during the time of absence.
When it is outside the movement prohibition time frame (S108; No),
the flag representing the existence of the user is turned on
(S112), and this routine is ended. When it is within the movement
prohibition time frame (S108; Yes), the user flag is turned off or
reset (S110), and this routine is ended.
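The FIG. 15 decision reduces to a few lines; the 23:00-07:00
movement-prohibition window below is only an illustrative preset:

```python
def user_location_flag(switch_operated, motion_flag, sound_flag, hour,
                       quiet_from=23, quiet_to=7):
    if switch_operated:                 # S102 -> S112
        return True
    if motion_flag and sound_flag:      # S104 -> S112
        return True
    if motion_flag or sound_flag:       # S106 -> S108
        in_prohibited = hour >= quiet_from or hour < quiet_to
        return not in_prohibited        # S112 or S110
    return False                        # S110

print(user_location_flag(False, True, False, hour=2))  # night-time -> False
```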
[0242] FIG. 16 is a flowchart explaining another example of judging
the location (or existence) of the user based on the switches,
sound, movement of the subject, and so on.
[0243] In this example, the brightness of the room is detected
instead of the detection of movement, and differs from the case
depicted in FIG. 15 in that the user is considered to exist nearby
when the room is bright. In other words, it is distinguished
whether the flag representing that the room is bright, pursuant to
the results of the aforementioned image processing or based on a
phototransistor, and the sound detection flag (S50), pursuant to
the results of the sound processing, have been turned on (S120). The
other routines are the same as the case in FIG. 15 and explanation
thereof is omitted.
[0244] Next, an example of the robot movement control is explained.
The example shown in FIG. 17 illustrates an example where the robot
reacts in correspondence with the biorhythm of the user.
[0245] The control unit 60, for instance, executes this routine
when it is activated in the morning. Foremost, it is judged whether
the user exists in (or near) the room (S132). When the user does
not exist (S132; No), this routine is ended. When the user does
exist (S132; Yes), the built-in calendar is read (S134). The
biorhythm of the user is calculated as depicted in FIG. 18 based on
today's date and the birth date of the user (S136). Event
occurrence dates are set in this biorhythm beforehand. For example,
the event occurrence dates shall be the turnabout points E1 and E4
at which the behavior switches between positive and negative, the
optimum point E2, and the worst point E3. It is then judged whether
today corresponds to
an event occurrence date set in advance (S138). When it is not an
event occurrence date (S138; No), this routine is ended.
[0246] When it is an event occurrence date (S138; Yes), it is
judged whether it is a preset time; for example, a time for the
user to start work (S140). When it becomes the set time (S140;
Yes), processing (robot movement) corresponding to the biorhythm of
the event occurrence date is selected. For example, when it is
event E1, the "happy eyes" as illustrated in FIG. 19(A) and the
text (scroll display) of "You'll start feeling better" as
illustrated in FIG. 19(F) are displayed on the display unit 71.
Moreover, the likes of "good luck" are output from the speaker 72.
When it is event E2, the "heart eyes" as illustrated in FIG. 19(C)
and the text of "It's a perfect day for you" are displayed on the
display unit 21. Moreover, the likes of "Don't get too excited" are
output from the speaker 72. When it is event E3, the "disappointed
eyes" as illustrated in FIG. 19(D) and the text of "Please take
care of your health" are displayed on the display unit 21.
Moreover, the likes of "Don't work too hard" are output from the
speaker 72. When it is event E4, the "round eyes" as illustrated in
FIG. 19(E) and the text of "Please watch out for accidents" are
displayed on the display unit 21. Moreover, the likes of "Today is
not your lucky day" are output from the speaker 72.
[0247] FIG. 20 is a flowchart showing an example of controlling the
movement such that the movement of the robot changes with time.
When entering this movement mode, the control unit (CPU) 60
foremost distinguishes whether the user exists nearby with the
setting of the aforementioned flags (e.g., S126) (S152). When the
user does not exist nearby (S152; No), solo play is implemented
from time to time. Solo play, for example, is represented with a
play state by displaying a one-person game on the display unit 71.
Random numbers are thereby generated (S154) in order to judge
whether the number for solo play has been output (S156). When such
number is not output, this routine is ended (S156; No). When such
number is output, solo play data is extracted from the posture
control data, sound control data and display control data and set
in the movement control program (S158).
[0248] When the user does exist (S152; Yes), the control unit 60
reads the present time from the internal clock (S160), and judges
whether this time is a time to wake up (S162).
[0249] When it is a time to wake up (S162; Yes), the control unit
60 sets a movement control program by extracting (wakeup) data for
waking the robot up from the posture control data, sound control
data and display control data (S164). The robot thereby performs
wakeup operations such as "Good morning", "I'm awake" and the like.
If it is not a time to wake up (S162; No), it is subsequently
distinguished whether it is a time to send off the user (S166).
[0250] When it is a time to send the user off (S166; Yes), the
control unit 60 sets a movement control program by extracting data
for sending the user off from the posture control data, sound
control data and display control data (S168). The robot thereby
performs sendoff operations such as "It's time to go", "Have a good
day" and the like. If it is not a time to send the user off (S166;
No), it is subsequently distinguished whether it is a predetermined
time for the user to come home (S170).
[0251] When it is a time for the user to come home (S170; Yes), the
control unit 60 sets a movement control program by extracting data
for welcoming the user home from the posture control data, sound
control data and display control data (S172). The robot thereby
performs welcome operations such as "Welcome home", "Good to see
you" and the like. If it is not a time to welcome the user home
(S170; No), it is subsequently distinguished whether it is a
predetermined time for the user to go to sleep (S174).
[0252] When it is a time for the user to go to sleep (S174; Yes),
the control unit 60 sets a movement control program by extracting
operational data for sleeping from the posture control data, sound
control data and display control data (S176). The robot thereby
performs goodnight operations such as "Good night", "See you
tomorrow" and the like, and thereafter enters a power saving mode
(sleep mode). If it is not a time to go to sleep (S174; No), it is
subsequently distinguished whether it is a predetermined time of
the user's alarm setting (S178).
[0253] When it is a time of the user's alarm setting (S178; Yes),
the control unit 60 sets a movement control program by extracting
alarm data from the posture control data, sound control data and
display control data (S180). The robot thereby performs operations
for informing the time such as "It's time", "Wake up" and "It's
(present time)". If it is not a time of the alarm setting (S178;
No), this routine is ended.
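The S160-S180 branching is essentially a table lookup on the
present hour; the concrete times below are illustrative stand-ins
for the user-set values:

```python
SCHEDULE = [  # (hour, behavior, sample phrases); times are user-settable
    (7, "wakeup", ["Good morning", "I'm awake"]),
    (8, "sendoff", ["It's time to go", "Have a good day"]),
    (18, "welcome", ["Welcome home", "Good to see you"]),
    (22, "goodnight", ["Good night", "See you tomorrow"]),
]

def scheduled_action(hour, alarm_hour=None):
    for h, name, phrases in SCHEDULE:
        if hour == h:
            return name, phrases
    if alarm_hour is not None and hour == alarm_hour:  # S178 -> S180
        return "alarm", ["It's time", "Wake up"]
    return None  # S178; No -> routine ended

print(scheduled_action(18))  # -> ('welcome', ['Welcome home', ...])
```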
[0254] As shown in FIG. 21, the control unit 60 performs display
control of the display unit 71 pursuant to the display control set
in the control program (S202). The posture of the robot is
controlled by controlling the motors 205 and 206 pursuant to the
posture control data set in the control program (S204). Further,
sound is output from the speaker 72 with the vocalization mechanism
(synthesizer, sound data reproduction) pursuant to the sound
control data set in the control program (S206).
[0255] FIG. 22 illustrates an operational example of the robot when
the operational data of "delight" is set in the control
program.
[0256] FIG. 23 illustrates an operational example of the robot when
the operational data of "pleasure" is set in the control
program.
[0257] FIG. 24 illustrates an operational example of the robot when
the operational data of "sorrow" is set in the control program.
[0258] FIG. 25 illustrates an operational example of the robot when
the operational data of "affection" is set in the control
program.
[0259] As described above, the electronic toy of the present
invention can be connected to the likes of a PHS, mobile phone and
standard circuit, and the user may view the situation inside the
house by forwarding the image obtained by the robot to
himself/herself.
[0260] When the remaining battery level is low as detected by the
battery voltage detection sensor 56, the control unit 60 vocalizes
words such as "I'm going to sleep to save the batteries,
okay?"
[0261] As described above, the robot as the electronic toy depicted
in the present embodiment expresses emotions with its entire body,
and it is thereby possible to yield a so-called healing element in
the toy since operations as though seeking communication with the
user are enabled. Moreover, various conversations are also
realized.
[0262] FIG. 26 and FIG. 27 illustrate examples of another robot as
the electronic toy. The components in FIG. 26 and FIG. 27
corresponding with FIG. 1 have the same reference numerals, and the
explanation of such components is omitted.
[0263] The robot of this example comprises the same structure and
functions as the robot illustrated in FIG. 1, but has a display
unit 71 which covers approximately the entire front part (face) of
the headportion 10. The display unit 71, for example, may employ an
LCD display unit, but is not limited thereto. Moreover, the
○× switch 54 is disposed on the upper surface of
the head portion.
[0264] As previously depicted in FIG. 19, the display unit 71, as
illustrated in FIG. 28 and FIG. 29, expresses various expressions
(emotions) of the robot. The robot is able to decide these
expressions in correspondence with the various modes described
later. FIG. 28(a) is expressing the facial state of "pleasure",
FIG. 28(b) "dizziness", FIG. 28(c) "anger", and FIG. 28(d)
"sentimentality", respectively. Further, FIG. 29(a) is expressing
the state of "sadness" and FIG. 29(b) "sleep". The "sleep" state is
the power saving mode, and is similar to the power saving mode of a
personal computer. In addition, the control unit 60 stores approximately
300 facial display animations for changing the facial expression.
For example, three basic facial patterns are prepared for the
respective modes of "delight", "anger", "sorrow" and "pleasure",
and sound and movement are additionally combined in correspondence
with the respective modes.
[0265] FIG. 30 is a diagram explaining an example wherein the robot
illustrated in FIG. 1 or FIG. 26 has its own biorhythm. The user
biorhythm data of ROM 62 described above can be replaced with the
biorhythm function program of the robot. Moreover, for example, it
would also be possible to integrate a function that changes
sinusoidally and make this the function for representing emotions.
For the personal biorhythm of the robot, random numbers are
generated when the insulation paper is removed from the battery
housing and power is supplied; a random start position (initial
value), as depicted with the plurality of points on the sinusoidal
wave of FIG. 30, is selected based on the results thereof, and the
biorhythm is accordingly made to differ per robot. Incidentally,
the data spread of the switching operation when the switch (not
shown) is pressed by the mechanical movement upon activating the
motor may also be used as the random numbers for setting the
initial value.
[0266] The amplitude value of the function created by the biorhythm
is utilized as one of the emotion parameters of the control
parameter (c.f. FIG. 18). Five operational modes are set in
accordance with the value of the emotion parameter. The first range
containing the center of the amplitude is the "normal mode", and a
"pleasure mode" in which the robot is of a happy feeling is defined
in a prescribed range thereabove, and a "delight mode" in which the
robot is full of delight is defined in a prescribed range further
thereabove. Moreover, a "sorrow mode" in which the robot is sad is
defined in a prescribed range below the "normal mode", and an
"anger" in which the robot is angry is defined in a prescribed
range further therebelow. Although the robot repeats these modes
periodically, the time in which the robot exists in the "delight
mode" and "anger mode" is relatively short.
[0267] Further, a biorhythm with a short cycle may be set for
demonstration exhibitions in front of the store by performing
specific switching operations. For instance, one cycle can be set
to 5 minutes. Expressive changes and gestures accompanying the
emotional changes of the robot can thereby be shown to the audience
in a short time span in order to introduce the capabilities and
characteristics of this robot.
[0268] Control examples of the robot employing the expressions
illustrated in FIG. 28 and FIG. 29 are now explained.
[0269] FIG. 29(a) shows an expression saying, "Stop it!" when a
prank is played on the robot. In order to perform this type of
gesture in a timely manner, it would be amusing if this kind of
expression is displayed on the display unit when a high level of
sound is continuously provided to the robot.
[0270] Thus, in a mode for performing this type of operation,
output of the microphone 52 as the sound detection means is
monitored with the control unit 60 via the low pass filter for
eliminating noise, and it is determined whether a sound signal
exceeding a prescribed level is continued for a prescribed time;
for example, beyond 10 seconds. If such sound signal continues,
since it is "noisy", the control unit selects the expression of the
robot illustrated in FIG. 29(a) from the storage means (62, 63) and
displays this on the display unit. Moreover, since the selective
operation of the expression is conducted pursuant to the value of
the aforementioned emotion parameter, the same results can be
obtained even if the emotion parameter value is changed to an
"unpleasant" level.
[0271] The expression of FIG. 29(b) represents a "sleep" state. It
would be amusing if the robot would express this type of sleeping
expression when a blanket is placed over the robot or when the
periphery becomes dark.
[0272] Thus, in a mode for performing this type of operation, the
light sensor 53 (e.g., CCD, photodiode, phototransistor, etc.) as
the light detection means detects the peripheral light intensity.
The control unit 60 monitors this light intensity and judges
whether a dark state continues beyond a prescribed time; for
example, beyond 10 seconds. If such dark state continues, since the
"periphery is dark", the control unit selects the "sleeping"
expression of the robot illustrated in FIG. 29(b) from the storage
means (62, 63) and displays this on the display unit. Moreover,
since it would be amusing if the robot exhibits a reluctant gesture
when a blanket is placed over the robot, the mechanical components
of the arm 30 or the like may also be made to move for a prescribed
time.
[0273] When the switch 54 of the head portion is continuously or
intermittently (repeated tapping) operated for a prescribed time
(or a prescribed number of times), this can be considered as the
user tapping or patting the head of the robot. It would be amusing
if the robot reacts to this type of operation.
[0274] Thus, the control unit 60 monitors the output of the switch
54 or touch sensor 51, and distinguishes whether the operation is
continued for a prescribed time; for example, 10 seconds. When
operation is being made, the expression, words, sound, etc. in
accordance with the emotion of the robot at such time are output.
For example, if the emotion parameter is in an "unpleasant" state,
the unpleasant expressions such as the "painful" expression shown
in FIG. 29(a) or an "angry" expression is displayed. When the
emotion parameter is in a state of "delight", the "sentimental"
expression shown in FIG. 28(d) is displayed.
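All three reactions above (sustained noise, sustained darkness,
sustained tapping) share one pattern: a condition must hold for
roughly 10 seconds. A sketch, assuming one sample per second:

```python
def held_continuously(samples, hold, predicate):
    # True once predicate(s) holds for 'hold' consecutive samples.
    run = 0
    for s in samples:
        run = run + 1 if predicate(s) else 0
        if run >= hold:
            return True
    return False

levels = [0.2] * 5 + [0.9] * 12  # simulated per-second sound levels
if held_continuously(levels, 10, lambda v: v > 0.7):  # threshold assumed
    print('select the "Stop it!" expression of FIG. 29(a)')
```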
[0275] FIG. 31 is a flowchart for explaining an example of a
"soliloquy mode" of the robot reflecting the aforementioned
biorhythm.
[0276] The control unit (CPU) 60 executes the soliloquy mode when a
soliloquy start condition is satisfied; for example, when the
conditions "user absent" and "generation of prescribed random
numbers" both hold (S270; Yes). Foremost, the emotion parameter
representing the biorhythm amplitude, which is one type of control
parameter, is read (S272). It is then judged, from this value,
which of the foregoing five modes applies (S274). Mode judgment
is conducted by comparison with the threshold values of the
respective modes, and the result thereof is output (S274 to S284).
Display of the expression corresponding to each of the judged modes
and, as necessary, robot control accompanying the movement and
sound is additionally performed (S286).
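The threshold-based mode judgment (S274 to S284) might be sketched as follows; the five mode names and the threshold values below are assumptions chosen for illustration, as the embodiment does not disclose them.

    # Hedged sketch of the mode judgment (S274 to S284): compare the
    # emotion parameter against per-mode thresholds. Mode names and
    # threshold values are illustrative assumptions.
    MODE_THRESHOLDS = [                    # (upper bound, mode), in order
        (-0.6, "anger"),
        (-0.2, "sorrow"),
        (0.2, "normal"),
        (0.6, "pleasure"),
        (1.0, "delight"),
    ]

    def judge_mode(emotion_parameter):
        for upper_bound, mode in MODE_THRESHOLDS:
            if emotion_parameter <= upper_bound:
                return mode
        return "delight"                   # values above the last bound

    assert judge_mode(-0.9) == "anger"
    assert judge_mode(0.0) == "normal"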
[0277] For instance, when it is judged as the "anger mode", as
shown in FIG. 32, "I'm mad", "I quit" and "No more robot" are
sequentially displayed on the display unit 71, which is capable of
displaying 8 characters. This display is repeated for a prescribed
time. In
addition, an angry pose (not shown) of the robot can also be
made.
[0278] Similarly, when it is judged as other modes, a display
corresponding to such operational mode is selected and, as
necessary, the corresponding movement is made. FIG. 33 and FIG. 34
illustrate display examples of words in the normal mode. In the
former example, a sentence is created using the term "IT
(information technology)" stored in advance or input by the user.
"Know what?", "IT is" and "the trend" are sequentially displayed on
the screen. In the latter example, a sentence is created in a
haiku-like format (Japanese unrhymed verse form having three lines
containing 5, 7 and 5 syllables, respectively).
[0279] When the value of the emotion parameter exists in the range
of the delight mode or anger mode, the robot may perform a
"single-action performance" of delight or anger corresponding
thereto. For instance, while playing background music, the robot
can say, "I feel good! I will now impersonate (so and so) !" and do
an impersonation, or rotate its arms or display "question mark "
eyes and so on.
[0280] Next, the text communication mode for the robot to seek
simulated communication with the user through Q&A is
explained.
[0281] FIG. 35 is a flowchart explaining this mode. For example,
when a certain condition such as the user existing nearby is
satisfied by sound, movement, switch operation, light or the like
(S240; Yes), the text communication mode is commenced. The text
communication mode is for seeking communication with the user by
the robot displaying text on the display unit. The control unit 60,
as illustrated in FIG. 38 and FIG. 39, selects a question from
pre-stored question data (S242). The respective questions are made
to be distinguishable in advance; namely, those that change the
emotion of the robot depending on the response as depicted in FIG.
38, and those that do not affect the emotion of the robot
irrespective of the response as depicted in FIG. 39. The control
unit 60 displays the selected question on the display unit (S244).
When the ○ or × button is operated (S245), it is
judged whether the question will affect the emotion (S246). If the
question will not affect the emotion (S246; No), response storage
processing is performed as necessary (S256). This processing is
used, for instance, when the user presses "○" in answer to the
question "Do you like" "cars?"; by storing such a response, the
robot remembers that the user "likes cars" for use in the modes
described later.
[0282] When a question that will affect the emotion is asked
(S246; Yes), and an answer of "○" is given to, for
example, a question shown in FIG. 36 asking, "Are", "we",
"friends?" (S248; Yes), the "○" corresponding
processing is performed. In the case of this example, the robot
expresses its delight, for example, by making the "affectionate"
pose shown in FIG. 25 and displaying the "affectionate" expression,
and raises the emotion parameter in the plus direction (S250).
Meanwhile, when the answer is "×" (S248; No), "×"
corresponding processing is performed. In the case of this example,
the robot expresses its sadness, for example, by making the
"sorrowful" pose shown in FIG. 24 and displaying the "dislike"
expression, and lowers the emotion parameter in the minus direction
(S252). This, as shown in FIG. 40, moves the biorhythm in the state
of sorrow. The robot thereby moves to a mode for expressing a
sorrowful expression.
[0283] Next, favorable image calculation is conducted. The
favorable image is a parameter corresponding to the robot's
feelings toward the user. In the aforementioned question, plus n
points are added when a response is made which makes the robot
happy. Moreover, minus m points are added when a response is made
which makes the robot sad. The values of n and m differ depending
on the respective questions. The favorable image is determined based
on such integrated values (S254).
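A minimal sketch of the response handling of FIG. 35 (S246 to S256) together with the favorable-image update (S254) follows; the question record layout and the per-question weights n and m are assumptions for illustration.

    # Hedged sketch of the Q&A handling (S246 to S256) and the
    # favorable-image update (S254). Record layout is an assumption.
    def handle_answer(question, answered_circle, state):
        """question: dict with 'affects_emotion', 'topic', 'n', 'm';
        state: dict with 'emotion' and 'favorable_image' counters."""
        if not question["affects_emotion"]:
            # S256: remember the response, e.g. that the user "likes cars".
            state.setdefault("memory", {})[question["topic"]] = answered_circle
            return
        if answered_circle:                          # "○" answer (S248; Yes)
            state["emotion"] += question["n"]        # plus direction (S250)
            state["favorable_image"] += question["n"]
        else:                                        # "×" answer (S248; No)
            state["emotion"] -= question["m"]        # minus direction (S252)
            state["favorable_image"] -= question["m"]

    state = {"emotion": 0, "favorable_image": 0}
    handle_answer({"affects_emotion": True, "topic": "friends", "n": 2, "m": 1},
                  True, state)                       # "Are we friends?" -> ○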
[0284] Simulated communication (transmission) between the robots is
now explained with reference to FIG. 41 to FIG. 45.
[0285] FIG. 41 shows an example of conducting data exchange by
connecting the robots by a communication cable 741. The
communication interfaces 74 of the control unit as shown in FIG. 43
are connected via the connector (not shown) provided on the back
face of the robot.
[0286] FIG. 42 shows an example of connecting a PHS or mobile phone
742 to the communication interface 74 of the robot and conducting
data exchange between the PHS or mobile phone 742 in a distant
place and the connected robot via a mobile communication network as
shown in FIG. 44. Moreover, as shown in FIG. 42, a card module of
the PHS or mobile phone may be integrated in the back face of the
robot. The connection example of the communication interface 74 and
the PHS or mobile phone 742, 743, etc. described in this embodiment
of the present invention includes cases where a telephone
communication function itself is installed in the robot.
[0287] FIG. 45 shows an example of connecting the communication
interface 74 of the robot with the personal computer 743 connected
to the Internet 745 as the communication network, and, similarly,
of conducting data communication with other robots connected to the
Internet 745. Further, the depiction of FIG. 45 omits providers and
the like that provide Internet connection services.
[0288] The composition shown in FIG. 46 (as well as in FIG. 49 and
FIG. 50 described later) is a system capable of obtaining robot
data by communication with the server device. Thus, the
communication interface 74 of the robot is connected to the server
device 750 of such robot via a communication means such as the PHS
or mobile phone 743, Internet 745 or telephone communication
network. Supplied from the server device 750 via a communication
network such as the Internet 745 are data such as words and current
affair terms corresponding to the user characteristics or
attributes described later and control data for controlling the
robot gesture.
[0289] FIG. 47 explains an example of data exchange upon connecting
robot A and robot B with a communication cable 741. The robots are
foremost connected with the cable. Next, the mode selection status
is entered by simultaneously operating the "○" and
"×" buttons (switch 54) of the respective robots, for example,
and the "communication" mode is selected thereby. When both robots
enter the communication mode, communication parameters are
exchanged between the robots, communication conditions and so on
are set, and communication is started.
[0290] Robot A transmits the user name of robot A, terms and so
forth which it stores. The user name, for instance, is stored by
the user inputting his/her name through sequential selection of the
corresponding character displayed on the display unit. Stored
terms, for example, can be obtained by storing the response to the
questions asked by the robot as described above (S256). This
includes various terms such as the likes and dislikes of the user,
age, male/female, personality, etc. Data is forwarded from robot A
to robot B and, when robot B confirms such data, an ACK signal
representing the reception of data is transmitted. When data
reception ends in failure, a NACK signal is transmitted. When robot
A receives the NACK signal, robot A retransmits the data. When
robot A receives the ACK signal, it distinguishes the success of
data transmission, enters the standby state, and awaits the signal
from robot B.
[0291] Subsequent to the transmission of the ACK signal, robot B
transmits to robot A data on the user name of robot B and stored
terms which it retains. When robot A confirms the reception of such
data, an ACK signal representing the reception of data is
transmitted. When data reception ends in failure, robot A transmits
a NACK signal, and robot B receiving such signal retransmits the
data.
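The stop-and-wait exchange described above (send a record, await ACK on success, retransmit on NACK) might be sketched as follows; the channel interface, the one-byte acknowledgements, and the retry limit are assumptions for illustration.

    # Hedged sketch of the stop-and-wait exchange of FIG. 47. The channel
    # object, one-byte acknowledgements and retry limit are assumptions.
    ACK, NACK = b"\x06", b"\x15"

    def send_with_retry(channel, payload, max_retries=3):
        """channel: object with send(bytes) and recv(n) -> bytes."""
        for _ in range(max_retries):
            channel.send(payload)
            if channel.recv(1) == ACK:
                return True                # success; sender enters standby
        return False                       # give up after repeated NACKs

    def receive_record(channel, expected_len):
        data = channel.recv(expected_len)
        ok = len(data) == expected_len     # stand-in for a real data check
        channel.send(ACK if ok else NACK)
        return data if ok else None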
[0292] Pursuant to such data exchange procedures, transferred from
robot A to robot B are, for example, ".DELTA..DELTA." (user name),
".quadrature..quadrature..quadrature..quadrature." (word 1),
".times..times..times..times." (word 2), and so on, and transferred
from robot B to robot A are the likes of
".smallcircle..smallcircle." (user name),
".quadrature..quadrature..quadrature..quadrature." (word 1), and so
on.
[0293] These words are applied to a standard sentence selected from
a plurality of standard sentences stored in advance, and output by
at least one of a sound and screen display of text. The output
timing of sound and display and selection of the standard sentence,
for example, may be set in accordance with the initial exchange of
communication parameters.
[0294] For example, as shown in FIG. 47, when robot A vocalizes
"Hi! Is (name of user) taking good care of you?", robot B
thereafter vocalizes "Uh-huh. But (he/she) is mean sometimes, and
makes a (adjective; "scary" for example) face". Then, robot A
vocalizes "(Adjective; "Scary" for example)? Well, (name of user)
looks (adjective; "freaky" for example), too, robot B thereafter
vocalizes "(Adjective; "Freaky" for example)!! It's not easy being
a robot. See you later!", and robot A finally vocalizes "I know
what you mean. Bye-bye!" When the user of the robot hears such
vocalization nearby, he/she will receive the impression as though
the robots are having a conversation.
[0295] Further, instead of the connection cable 741, an infrared
communication interface employed in remote controllers and portable
terminals may also be used.
[0296] FIG. 48 is a communication diagram explaining the procedures
in the case of conducting data communication between robots with a
portable telephone.
[0297] Here, since robot A and robot B are distant from each other,
the vocalization or display of words will be as though a
soliloquy.
[0298] Foremost, the respective users of robot A and robot B
connect their robots to a PHS or mobile phone and make a call to
the telephone of the other party. When the communication line
between the telephones is connected, communication parameters are
exchanged, and, for example, the mutual data communication speed is
set to the slower speed among the telephones. For example, in the
case of a PHS (communication speed of 64 kbit/second) and a mobile
phone (9600 bit/second), data communication is conducted at 9600
bit/second. When the communication parameter is set, data is
transmitted from one of the robots (robot A) to the other robot
(robot B). For example, words such as "Taro" (user name), "nap"
(personal favorite), "MD" (personal favorite), "pachinko" (personal
favorite), "Sazaesan (character of cartoon)" (personal favorite),
"Pochi (name of dog)" (personal favorite), "Thunderbird" (personal
favorite) and so on are transmitted. Robot B transmits the ACK
signal when there are no abnormalities in the received data, and
transmits the NACK signal when there are abnormalities. When robot
A receives the ACK signal representing reception from robot B, it
enters a standby state. Robot A retransmits the data upon receiving
the NACK signal.
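The speed negotiation reduces to taking the slower of the two terminals' rates, as this small sketch illustrates:

    # Hedged sketch: the mutual data rate is the slower of the two
    # terminals, as in the PHS vs. mobile phone case above.
    def negotiate_speed(speed_a_bps, speed_b_bps):
        return min(speed_a_bps, speed_b_bps)

    assert negotiate_speed(64_000, 9_600) == 9_600   # PHS vs. mobile phone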
[0299] Subsequent to the transmission of the ACK signal, robot B
transmits data which it stores to robot A. For example, robot B
transmits words such as "Hanako" (user name), "Chocolate" (personal
favorite), "F-1 racers" (personal favorite), "tea mushroom"
(personal favorite), "Pico" (personal favorite), and the like.
[0300] Robot A and Robot B respectively select a standard sentence
stored in advance, complete the sentence by applying the received
data in the blank space in the standard sentence, and output
communication results by conducting at least one of a vocalization
or text display. Attributes of the word to be filled in such blank
space should be predetermined; for example, the user's name,
personal favorite, personal dislike, age, weather, and so on.
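A minimal sketch of this blank-filling step, assuming each standard sentence marks its blanks with the attribute of the word expected there (the templates themselves are illustrative):

    # Hedged sketch: fill the blanks of a standard sentence with received
    # words, keyed by word attribute. The templates are illustrative.
    STANDARD_SENTENCES = [
        "Yeah, I received data from `{user_name}` who likes `{favorite}`",
        "Hey, are `{favorite}` delicious?",
    ]

    def compose_utterance(template, received):
        """received: dict mapping attribute names to received words."""
        return template.format(**received)

    print(compose_utterance(STANDARD_SENTENCES[0],
                            {"user_name": "Hanako", "favorite": "chocolate"}))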
[0301] For instance, robot A would vocalize "Yeah, I received data
from `Hanako` who likes `chocolate`", "Hey, are `F1 racers`
delicious?", "Hanako taught me `tea mushrooms,` but I don't know
what they are . . . ", "Are `Hanako's` pants cool?", "Maybe
`Hanako` is a "Pico" mania", and so on.
[0302] And, for example, robot B would vocalize "Let's see. I
received data from `Taro` who likes to take `naps`", "Are MDs the
thang with young hipsters?", "I love `pachinko`", "Is `Sazaesan`
really smart?", "`Taro` said this year's `Pochi` is well made.",
"Wouldn't it be scary if they had a `Thunderbird` ramen?", and so
on.
[0303] Since simulated conversation is created by exchanging data
that is mutually retained by the robots as described above, users
in distant places can also enjoy themselves.
[0304] FIG. 49 and FIG. 50 show examples of renewing the data
retained by the robot with the server device 750 illustrated in
FIG. 46.
[0305] It would be amusing if the robot could speak words in view
of the times. It would also be amusing if the robot is able to
speak words corresponding to the individual characteristics of the
user such as age, gender, hobbies, and so on. Nevertheless, it is
difficult cost-wise to realize such functions in an electronic
toy.
[0306] Thus, as shown in FIG. 50, the aforementioned function is
provided inexpensively by suitably supplying, from the server
device, data such as the requisite words, together with data (a
control program) for controlling the operation of the robot upon
speaking such words. The control program may be used for
controlling the series of operations pursuant to such program, or
may be used for designating the operation of one among a plurality
of operational control programs such as "delight", "anger",
"sorrow" and "pleasure" pre-stored in the robot.
[0307] Data exchange procedures in the aforementioned case are now
explained with reference to FIG. 49. Foremost, as shown in FIG. 49,
the communication interface 74 of the robot is connected to a PHS,
mobile phone or personal computer 743, and then connected to the
server device 750 via a communication network, the Internet 745 for
example, in order to establish the circuit for conducting data
transmission. Communication parameters required for establishing
communication such as the communication speed, specification of
electronic toy, ID, password, and so on are transmitted from robot
A to the server device 750. The server device 750 conducts
authentication on whether to permit the connection, and thereby
permits access to robot A. The robot then requests the transmission
of updated data. Here, it is possible to designate the likes of
current-affairs terms and user-adaptable data. The server 750
transmits the requested terms, for example, in only the required
number of words. In the example illustrated in the diagram, "divine
nation comment", "train accident", "New Years", "Christmas" and so
on are transmitted. It is also possible to forward a new standard
sentence suitable for such words. Moreover, as necessary, it is
also possible to provide control program data 1, program data 2,
program data 3, for controlling the robot operation upon vocalizing
standard sentences employing the aforementioned words. Similarly,
it is also possible to select and transmit user-corresponding words
among the word groups prepared beforehand in correspondence with a
plurality of user characteristics. For instance, when the user
characteristic is a businessman, "convertible bonds", "clear note",
"monster" and the like are transmitted. Here, it is also possible
to define the operation of the robot with respect to a specific
word. In such a case, control program data (41, 42) is also
transmitted together with the word data.
[0308] When robot A receives the data, it stores this in the memory
63. The ACK signal is transmitted to the server, the circuit is
opened, and the update is completed. When data reception ends in
failure, the NACK signal is transmitted to the server, and
retransmission of the data is requested. When the server device
receives the ACK signal from robot A, or when the circuit is
opened, it ends the communication with robot A.
[0309] Robot A applies the acquired words in the standard sentences
and conducts at least one of a vocalization or text display
(sentence display). Further, although the robot has a function of
converting text data into sound, sound data of words and standard
sentences may be received from the server device and vocalized by
decoding the same.
[0310] "Action mail" is now explained. Action mail is for
displaying or reading the contents of the e-mail received by the
robot as well as to perform prescribed actions corresponding
thereto; for example, the movement of the arms and legs and
representation of expressions.
[0311] FIG. 51 and FIG. 52 show structural examples in the case of
conducting action mail. The e-mail sender downloads beforehand the
action mail software, which can be downloaded via the Internet, in
his/her personal computer 743a. The personal computer 743a is in an
environment connected to a communication network, such as the
Internet 745, capable of e-mail communication. The sender creates an
e-mail message by operating an input device such as a keyboard
device. The aforementioned software downloaded into the personal
computer includes a message/operation editing program for
conducting text input, message editing, control movement input, and
so on; a data/sound conversion program for converting the message
into sound data; and an e-mail program with a data file attachment
function capable of transmitting sound data as the attachment
file.
[0312] The sender creates an e-mail by utilizing control
information and the message/operation editing program. E-mail, for
example, as shown in FIG. 52, designates the name of sender (4
letters for example), message (44 letters for example), and
operation of the robot. These can be assembled with text data.
Next, the text code is converted into a sound signal, an FM
modulation signal for example, with the data/sound conversion
program. Information on the name, message and robot operation, for
example, may be separated by three-second silent intervals as
shown in FIG. 52. Further, headers and footers (not shown) may also
be suitably added. Such FM sound is converted into sound data; for
example, sound data formats such as WAV, MP3, ram, and so on. The
e-mail program transmits this sound data file, upon attaching it to
the e-mail, to the party on the other side of the line using the
robot.
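The encoding pipeline of FIG. 52 might be sketched as follows, using a simple frequency-shift keying (one form of FM) and the WAV format; the sampling rate, bit duration, and mark/space frequencies below are assumptions, as the embodiment does not fix the modulation parameters.

    # Hedged sketch: pack name, message and operation as text, convert the
    # text to an audio signal by FSK (a form of FM), separate the fields
    # with three-second silences, and write a WAV attachment. All
    # modulation parameters below are assumptions.
    import math, struct, wave

    RATE = 8000                    # samples per second (assumed)
    BIT_S = 0.02                   # seconds per bit (assumed)
    F0, F1 = 1200, 2200            # 0-bit / 1-bit tone frequencies (assumed)
    GAP_S = 3.0                    # three-second silent separator

    def tone(freq, seconds):
        n = int(RATE * seconds)
        return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

    def fsk_encode(text):
        samples = []
        for byte in text.encode("ascii", "replace"):
            for bit in range(8):   # least significant bit first
                samples += tone(F1 if (byte >> bit) & 1 else F0, BIT_S)
        return samples

    def write_action_mail(path, name, message, operation):
        silence = [0.0] * int(RATE * GAP_S)
        samples = (fsk_encode(name) + silence
                   + fsk_encode(message) + silence + fsk_encode(operation))
        with wave.open(path, "wb") as w:
            w.setnchannels(1)
            w.setsampwidth(2)      # 16-bit samples
            w.setframerate(RATE)
            w.writeframes(b"".join(
                struct.pack("<h", int(s * 32767)) for s in samples))

    write_action_mail("action_mail.wav", "Taro", "Hello!", "OP:delight")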
[0313] E-mail is transmitted to the e-mail server device of the
other party via the Internet 745. Moreover, although simplified in
the diagram, various server devices containing a communication
circuit and e-mail server as well as connection service providers
are included in the Internet 745.
[0314] The receiver downloads beforehand into his/her personal
computer the communication software having an action mail reception
function obtainable via the Internet. Decoding of the sound file is
included in the reception function. The receiver connects the robot
to his/her personal computer 743b. The receiver accesses the e-mail
server (not shown) with the personal computer 743b and downloads
the e-mail sent to such receiver. When the e-mail requires the use
of the robot, the attached sound file is reproduced with the
aforementioned communication software in order to demodulate the
sound signal. This sound signal is supplied to the control unit 60
of the robot via the communication interface 74. The control unit
60 demodulates the FM signal and converts this into digital data.
Control information such as the name of sender, message and
movement is distinguished from the data. As described above, this
is distinguishable with the blank space between the data. The
control unit 60 converts the text data into image data and displays
the same on the display unit 71. Here, foremost, the name of the
sender is displayed, and a long message can be scroll displayed
thereafter on a small display unit screen. The text data may also
be read aloud. This is repeated a prescribed number of times.
Needless to say, the entire message may be displayed when employing
a large display unit. Further, the control unit 60 controls the
motors 205 and 206 based on the operational control information and
makes the robot perform operations corresponding to the message.
The control of the action operation may also be performed pursuant
to the control code stored beforehand in the ROM of the robot, or
by the sender designating a control program formed from a series of
control codes. Moreover, the sender may program, to his/her liking,
the series of movements of the robot by assembling control codes
corresponding to the individual operations.
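The name-then-scrolling-message display described above might be sketched as follows; the display width and timing values are illustrative assumptions.

    # Hedged sketch: show the sender name, then scroll a long message
    # across a small display. Width and timing are illustrative.
    import time

    def scroll_message(display_write, name, message, width=8, repeats=3):
        """display_write: callable accepting a string of width characters."""
        display_write(name[:width])
        time.sleep(1.0)
        padded = " " * width + message + " " * width
        for _ in range(repeats):
            for i in range(len(padded) - width + 1):
                display_write(padded[i:i + width])
                time.sleep(0.2)

    scroll_message(print, "Taro", "It snowed a lot yesterday!")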
[0315] The message display and action movement corresponding to the
reception of action e-mail may be performed simultaneously.
Further, the action may be performed first, and the message
displayed thereafter. Or, the message may be displayed first, and
the action movement performed thereafter. These may be repeated or
combined. Moreover, the sender may create a voice message and
transmit such message as an attachment file, and the robot may
reproduce this from the speaker as a voice message.
[0316] When the robot has a built-in telephone function of a PHS
or mobile phone, the control unit 60 downloads the action mail
software via the communication function and may possess an e-mail
receiving function. In such a case, the control unit 60 of the
robot receives the e-mail and may perform the conversion of the
sound file otherwise implemented by the personal computer 743b, and
the personal computer will no longer be necessary. Further, the
structures illustrated in FIG. 44 to FIG. 46 are also able to
conduct action mail.
[0317] The server device may also be made to be the sender of the
action mail. For example, the robot may speak a one-word message,
today's fortune, shopping information, weather forecast, current
affairs, and the like together with action in accordance with the
characteristics and attributes of the user. For instance, if it
snowed the previous day, the server device will transmit "It snowed
a lot yesterday. And boy was it cold!" (message) and a "jerky move"
(movement+facial expression).
[0318] FIG. 53 to FIG. 56 show examples of movements (action
movements) accompanying the facial expressions displayed together
with the message. FIG. 53 is representing "delight" by raising both
arms upward at an angle and displaying hearts on the face. FIG. 54
is representing "anger" by raising the hands near the head and
displaying slant eyes on the face. FIG. 55 is representing "sorrow"
by lowering both hands and displaying tear-filled eyes. FIG. 56 is
representing "pleasure" by placing both arms forward and displaying
a smiling expression on the face.
[0319] Although a structural example of the movement mechanism of
the lower body of the robot was exemplified in FIG. 5 to FIG. 8,
another structural example of the movement mechanism of the lower
body of the robot is now explained.
[0320] FIG. 57 and FIG. 58 are perspective views showing an example
of a robot structured so as to change the movement of its lower
body pursuant to the "volume", "speed", "rhythm" and so on of sound
such as music.
[0321] In this example, the robot makes gestures as though of
dancing by opening and closing its legs left and right in
accordance with the music. Among these movements, FIG. 57 shows the
first state where the robot is standing upright with both feet
together. FIG. 58 shows the second state where the robot is opening
its legs left and right. The robot consecutively moves from the
first state to the second state, and consecutively moves from the
second state to the first state. Upon opening its legs left and
right, as shown in FIG. 58, the robot is made to bend the knees
pursuant to the mechanism described later so as to simulate the
movement of a human.
[0322] FIG. 59 and FIG. 60 are perspective views showing the drive
portions of the leg opening/closing mechanism 300, and FIG. 59
shows the closed state of the legs and FIG. 60 shows the open state
of the legs. A motor 301 is built in the lower part of the left leg
of the robot, and the driving force is increased by the gear
mechanism 302. The driving force passes through the hip joint
portion 305 of the waist portion frame 304 via the drive shaft 303
and rotatively drives the left leg cam mechanism 306 inside the
waist portion frame. One end of the link 308 is rotatably connected to the
cam 307 of this mechanism via a roller bearing. The other end of
the link 308 is rotatably mounted on the roller bearing 311 on the
upper end of the right leg axis 310. Mounted on the bottom end of
the right leg axis 310 is a roller portion 312 for sliding on the
floor surface, and oscillates the right leg axis in the left and
right directions with the hip joint portion 313 mounted on the
waist portion frame 304 as the center. As a result, the left leg
axis 303 and right leg axis 310 are able to move symmetrically in
the left and right directions based on a (virtual) central axis
(line) extending in the upward and downward directions of the
central portion of the torso pursuant to the cam mechanism 306,
link 308, hip joints 305, 313, and so on.
[0323] FIG. 61 is a perspective view showing a structural example
of the right leg 320. A hip joint portion 313 is mounted on the
upper part of the right leg axis 310. A right leg is rotatably
mounted against the waist portion frame 304 with a pin (not shown)
on the upper part of this hip joint portion 313 so as to move such
right leg in the left and right directions. The lower part of the
hip joint portion 313 is mounted, with a pin 322 (not shown), on
the concave portion of the upper part of the above-knee portion 321
of the leg so as to be rotatable in the front and back directions
of the robot. The concave portion of the lower end portion of the
above-knee portion 321 is rotatably mounted in the front and back
directions with the protrusion of the below-knee portion front
cover 323 and the pin 324. The lower central part 323b of the
below-knee portion front cover 323 is opened in a reverse V-shape.
The roller portion 312 of the lower end portion of the right leg
axis 310 is made to contact the ground surface or floor surface (or
the mounting face of the robot) (not shown) by being positioned in
the center penetration hole 325a of the ground portion extending in
the front and back directions of the robot, and mounted in a
rotatable manner with a pair of protrusions 325 in the shape of an
approximate reverse V-shape and pins 326 respectively disposed on
both sides of such penetration hole 325a. The below-knee portion
back cover 327 engages with the below-knee front cover 323 while
sandwiching the right leg axis 310. A U-shaped opening 327a in
which the right leg axis is positioned is provided to the upper face
portion of the below-knee back cover 327. Moreover, a long hole
327b is provided at a position opposite the reverse V-shaped
opening 323b of the below-knee portion front cover of the
below-knee portion back cover 327. Pins 326 that rotatably connect
the roller portion 312 with the ground portion 325 are positioned in
the reverse V-shaped opening 323b and long hole 327b. The
below-knee portion U-shaped opening 327a, reverse V-shaped opening
323b and long hole 327b are structured such that the right leg axis
310 and connection pin 326 do not interfere with (do not contact)
the covers 323, 327 when the knee portion, which is the portion
connecting the above-knee portion and the below-knee portion, is
bent.
[0324] A protrusion 327c (c.f. FIG. 57, FIG. 58) is formed on the
inside bottom portion of the below-knee back cover 327. This
protrusion 327c contacts the inclined face 325b formed on the
ground 325. When the right leg axis 310 opens toward the right side
of the robot as shown in FIG. 60 pursuant to the rotation of the
motor 301, the right ankle slides from the state shown in FIG.
62(a) across the floor surface of the ground portion 325, and, as
shown in FIG. 62(b), relatively rotates the below-knee portion 327
in the clockwise direction with the connection pin 326 of the
roller portion/ground portion in the center. Thereby, the
protrusion 327c of the below-knee portion contacts the upper part
of the inclined face 325b of the ground portion 325, and pushes the
below-knee portion 327 upward. Here, since the position of the hip
joint 313 does not change, the lower part of the above-knee portion
321 and the upper part of the below-knee portion 323 are pushed
forward, thereby bending the knee of the leg.
[0325] FIG. 63 shows the appearance of the left leg 330 having a
built-in motor 301. An eccentric cam 307 is mounted on the upper
end portion of the left leg axis 303, and a spherical engagement
member 309 for engaging with the link 308 is mounted on the cam 307
with a screw (c.f. FIG. 64). The motor is built in the below-knee
portion 331, and the below-knee portion 331 and the ground portion
332 are rotatably connected with a pin. A friction member (not
shown) such as rubber for preventing sliding is adhered to the
bottom of the ground portion 332 of the left foot. Although a
knee-bending mechanism as with the right leg portion 320 is not
provided to the left leg portion 330, a knee-bending mechanism
similar to the right leg may also be provided to the left leg.
[0326] With the aforementioned lower body mechanism 300, the
mechanical portion only occupies the lower part of the torso, and
it is possible to make the bulk of the robot torso internally
empty. This is convenient in that the inside of the torso may be
used for electric circuits or an upper body mechanism. Moreover,
since a relatively heavy motor is disposed in the below-knee
portion of the leg, it is easy to stabilize the robot. Further,
with the aforementioned mechanism, although the knee of the right
leg will bend, the knee of the left leg is fixed, and, by providing
a friction member at the bottom portion, it is possible to prevent
the unstable posture of the robot, movement and rotation of the
robot, and so on.
[0327] FIG. 64 shows an example of the cam 307 of the left leg
axis. As shown in FIG. 64, it is possible to adjust the degree of
opening the legs in the left and right directions by changing the
mounting position of the engagement member 309 of the cam 307.
Since the link 308 and engagement member 309 (and 311) have a
spherical shape, unnecessary force is not exerted between the
link and engagement member (or cam) even when the legs are opened.
Adjustment can be made by providing beforehand a plurality of screw
holes in the cam and mounting the engagement member in an
appropriate screw hole, or by changing the cam.
[0328] FIG. 65 is a block diagram explaining an example of
synchronizing (corresponding) music or sound with the movement of
the robot.
[0329] In this example, in place of the ROM 62 of the control unit,
or in addition to the ROM 62, used is a square chip card (micro IC
card) 621, wherein one side thereof is approximately 2 cm, having
recorded thereon music information and control data. Exchange of
songs is thereby facilitated. Needless to say, music information
and control data may be recorded on the ROM 62. When the user
inserts the chip card 621 in the robot and orders a movement by
operating a switch (not shown), the control unit 60 reads the sound
data (information) from the chip card 621, converts this into a
sound signal with the sound reproduction processing function 601 of
the control unit 60, and supplies this to the speaker 72 at an
appropriate level. A song with a prescribed rhythm is thereby
played from the speaker 72. Further, the control unit 60 reads
control data from the chip card 621 and controls the motor 301 with
the rhythm control function 602 of the control unit 60. The motor
301 is able to control the rotation speed, normal/reverse rotation,
length of step and so on pursuant to PWM control and level control
of the supplied voltage. By previously storing data representing
the rhythm of the song in the control data, movement of the legs
matching the performance of the song is enabled, and it is thereby
possible to make the robot move as though it is dancing to the
song.
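A minimal sketch of this card-driven synchronization, assuming the control data stores the accent (beat) timestamps of the song; the card layout and the motor and speaker interfaces are illustrative assumptions.

    # Hedged sketch of FIG. 65: play the song from the chip card while
    # pulsing the leg motor on pre-stored beat timestamps. The card layout
    # and the motor/speaker interfaces are illustrative assumptions.
    import time

    def dance_from_card(card, play_sound, set_motor_pwm):
        """card: dict with 'sound' (audio data) and 'beats' (accent times
        in seconds from the start of the song)."""
        play_sound(card["sound"])            # sound reproduction 601
        t0 = time.monotonic()
        for beat in card["beats"]:           # rhythm control 602
            while time.monotonic() - t0 < beat:
                time.sleep(0.001)
            set_motor_pwm(0.8)               # step on the accent
            time.sleep(0.1)
            set_motor_pwm(0.0)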
[0330] FIG. 66 is a block diagram explaining an example of making
the robot move in correspondence with music or sound. At least
sound data is previously stored in the chip card 621. When the user
inserts the chip card 621 in the robot and orders a movement by
operating a switch, the control unit 60 reads the sound data from
the chip card 621, converts this into a sound signal with the sound
reproduction processing function 601 of the control unit 60, and
supplies this to the speaker 72 at an appropriate level. A song
with a prescribed rhythm is thereby played from the speaker 72.
Further, the control unit 60 samples the sound signal with its
sampling function 603 and extracts the rhythm (cycle of accents) of
the song from such sound signal with the rhythm extracting function
604. A rotation corresponding to the rhythm of this song is set in
the motor control function 605. The motor control function 605 sets
the rotation speed, normal/reverse rotation, length of step and
so on by performing PWM control and level control of the supplied
voltage. When performing this type of control, movement of the legs
matching the performance of the song is enabled without having to
previously record data representing the rhythm of the song in the
control data, and it is thereby possible to make the robot move as
though it is dancing to the song.
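The rhythm extraction 604 might be sketched, for illustration, as an autocorrelation of the signal envelope; the embodiment does not specify the extraction method, so the search range and coarse stride below are assumptions.

    # Hedged sketch of the rhythm extraction 604: estimate the accent
    # cycle by autocorrelating the envelope of the sampled signal. The
    # search range and coarse stride are illustrative choices.
    def estimate_beat_period(samples, rate, min_s=0.25, max_s=2.0):
        envelope = [abs(s) for s in samples]
        best_lag, best_score = None, 0.0
        for lag in range(int(rate * min_s), int(rate * max_s)):
            score = sum(envelope[i] * envelope[i - lag]
                        for i in range(lag, len(envelope), 16))
            if score > best_score:
                best_lag, best_score = lag, score
        return best_lag / rate if best_lag else None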
[0331] It is also possible to move the legs of the robot in concert
with the sound collected by the microphone 52. For example, when
the user claps his/her hands, speaks or sings near the microphone
52, such sounds are sampled with the sampling function 603 and
rhythm is extracted from the sound signal with the rhythm
extracting function 604. The rotation corresponding to the rhythm
of this sound is set in the motor control function 605. It is also
amusing that the robot moves in correspondence with such sounds.
[0332] Next, an example of providing a bipedal locomotion mechanism
to the lower body of the robot is explained.
[0333] FIG. 67 to FIG. 75 are diagrams explaining the bipedal
locomotion state. FIG. 67 is a perspective view showing the state
where the left leg is positioned backward and the right leg is
positioned forward. FIG. 68 is a perspective view showing the state
where the right leg and left leg are approximately together. FIG.
69 is a perspective view showing the state where the left leg is
positioned forward and the right leg is positioned backward. FIG.
70 is a side view showing the leg mechanism of the left leg. The
operation of the leg mechanism will be described in detail after
the explanation of the respective components thereof. FIG. 71 shows
the waist portion frame 401. The waist portion frame 401 comprises
a stopper 401a for stopping the long rod 410 from being raised
excessively, a drive shaft hole 401b of the cam pulley, a
connection axis 401d for rotatably mounting the above-knee portion
402, and a guide pin 401c for engaging with the long hole 411b of
the short rod 411.
[0334] FIG. 72 shows the above-knee portion 402 rotatably connected
to the waist portion frame 401. A connection portion 402a to be
connected to the connection axis 401d of the waist frame 401 is
provided to the upper part of the above-knee portion 402. A
connection portion 402b to be connected to the below-knee portion
403 is provided to the lower part of the above-knee portion
402.
[0335] FIG. 73 shows the below-knee portion 403. A connection
portion 403a to be connected to the connection portion 402b of the
above-knee portion 402 and a connection portion 403b to be
connected to the short rod 411 are provided to the upper part of
the below-knee portion 403. A connection portion 403c to be
rotatably connected to the connection portion 404a of the ground
portion 404 is provided to the lower part of the below-knee portion
403.
[0336] FIG. 74 shows the ground portion 404. A connection portion
404a to be connected to the connection portion 403c of the
below-knee portion 403 and a connection portion 404b to be
connected to the long rod 410 are provided to the upper part of the
ground portion 404.
[0337] FIG. 75 shows the cam pulley 420, long rod 410 and short rod
411. The cam pulley 420 is connected to the axis to be rotatively
driven by a motor (not shown). A drive pin 420a is provided in an
eccentric position from the drive shaft (not shown) of the pulley
at the outside of the cam pulley 420. A cylindrical cam 420b is
provided in an eccentric position from the drive shaft (not shown)
of the pulley at the inside of the cam pulley 420. A long hole 410a
is provided to the upper part of the "dogleg" shaped rod, and a pin
420a is inserted in this long hole 410a so as to rotatably engage
such rod and long hole. The long hole 410a prevents the toe of the
foot of the robot from lowering excessively. A connection portion
410b to be connected with the connection portion 404b of the ground
portion 404 is provided to the lower part of the "dogleg" shaped
rod. An annular engagement portion 411a for engaging with the cam
420b is provided to the upper part of the short rod 411. A long
hole 411b for engaging with the guide pin 401c of the frame is
provided to the center portion of the short rod 411. A connection
portion 411c to be connected with the connection portion 403b of
the below-knee portion 403 is provided to the lower part of the
short rod 411.
[0338] According to the foregoing structure, as shown in FIG. 70,
the waist portion frame 401 and the above-knee portion 402 are
rotatably connected via the connection portions 401d and 402a, and
the above-knee portion 402 and the below-knee portion 403 are
rotatably connected via the connection portions 402b and 403a.
Further, the below-knee portion 403 is rotatably connected to the
ground portion 404 via the connection portions 403c and 404a. The
short rod connects the cam 420b and the below-knee portion 403 via
the connection portions 403b, 411c. When the cam pulley 420 is
rotatively driven, the eccentric cam 420b swings the below-knee
portion 403 with the short rod 411 and lifts the below-knee portion
403. The above-knee portion 402 also oscillates pursuant thereto.
The long rod 410 connects the drive pin (cam) 420a and the ground
portion 404 via the connection portions 410b, 404b. When the cam
pulley 420 is rotatively driven, the drive pin 420a lifts the
ground portion 404 with the long rod 410. The lifting and lowering
of the toe of the foot upon moving the legs is thereby set.
[0339] This mechanism, as shown in FIG. 67 to FIG. 69, lifts the
toe of the left foot while maintaining the balance with the right
foot on the ground, and advances the legs by moving the left foot
from the backward to the forward position with the heel on the
ground. When the entire left foot is in contact with the ground,
the right foot is similarly moved to the forward position; this is
repeated to realize the walking motion.
[0340] FIG. 76 and FIG. 77 show movements of the left leg pursuant
to the rotation of the drive shaft. In this example, the ground
portion is not in contact with the floor, and shows the movement in
a state where the foot is hanging in the air.
[0341] The respective diagrams of FIG. 76(1) to FIG. 76(4) and FIG.
77(5) to FIG. 77(8) show movements of the respective legs when the
drive shaft of the cam is rotated in increments of 45 degrees. FIG.
76(1) shows the state where the rotational angle of the cam axis is
0 degrees (basic position). In this state, the short rod 411 is
swung forward by the cam 420b with the guide pin 401c as the
fulcrum, and the leg is thereby moved forward. FIG. 76(3) shows a
state where the cam axis rotated 90 degrees. In this state, the
short rod 411 is in the approximate center position of oscillation
pursuant to the cam 420b, and the legs are together. FIG. 77(5)
shows a state where the cam axis rotated 180 degrees. In this
state, the short rod 411 is swung backward by the cam 420b with the
guide pin 401c as the fulcrum, and the leg is thereby moved
backward. FIG. 77(7) shows a state where the cam axis rotated 270
degrees. This state corresponds to the state where the legs are
together. Nevertheless, this state is different from the case of
FIG. 76(3) above in that the upper end of the long rod 410 is not
in contact with the stopper, and the degree of rotational freedom
of the ground portion 404 with the connection portion 404a as the
center is large. As shown in FIG. 76(1) to FIG. 77(5), the lifting
of the long rod 410 is prevented by the upper end of the long rod
410 contacting the stopper 401a, thereby preventing the tip of the
foot (ground portion) from lowering excessively. Further, the frame
weight of the robot when the ground portion is in contact with the
ground is conveyed to the backside of the ground portion 404 in
order to stabilize the posture.
[0342] Pursuant to the series of operations described, as shown in
FIG. 78(a) and FIG. 78(b), the toe is raised, the foot is moved
forward in a state where the heel is in contact with the ground,
and, when the foot moves forward, the entire sole is made to
contact the ground. As described later, the toe side of the sole
has a mechanism built therein for changing the direction of the
overall robot. Moreover, a roller is built in the heel side of the
sole for sliding across the ground. As the roller, for example,
employed may be a metal roller such that it may concurrently act as
a weight, and the stability of the posture of the overall robot can
be sought thereby. The mode of leg movement of the robot as
described above is advantageous for this type of mechanism.
[0343] The robot will move in reverse by counter-rotating the cam
axis.
[0344] FIG. 79 and FIG. 80 show structural examples (leg mechanism
of left leg) of another leg mechanism. Components in these diagrams
corresponding with those illustrated in FIG. 70 are given the same
reference numerals.
[0345] In this example, a pressing plate 410c is integrally mounted
on the long rod 410. The inclination (posture) and lifting/lowering
of the toe (or heel) of the feet of the robot are set by pressing
the connection portion (rear axis) 404b of the ground portion 404
in a set timing as a result of pushing this pressing plate with the
cam. Further, the shape of the cam and hole is adjusted such that
the toe of the foot (ground portion) of the robot can be lifted
higher. Moreover, in this example, the advancing power (or
retreating power) of the robot is increased by increasing the
driving force of the toes by pressing the front end side (toe side)
of the ground portion 404 on the ground, or by actively applying
an energization force toward the ground with a spring.
[0346] In the leg mechanism of the left leg shown in FIG. 79, the
waist portion frame 401 and the above-knee portion 402 are
rotatably connected via the connection portions 401d and 402a, and
the above-knee portion 402 and below-knee portion 403 are rotatably
connected via the connection portions 402b and 403a. Further, the
below-knee portion 403 is rotatably connected to the ground portion
404 via the connection portions 403c and 404a. A spring SP as the
energization means is mounted on a part of the ground portion 404,
for example, between the connection portion 404b and a part of the
case of the below-knee portion 403 so as to apply force to
continuously lift the back part (heel of foot) of the ground
portion 404. This spring SP operates in the direction of
continuously keeping the eccentric cam 420b and the pressing plate
410c in contact. Moreover, the spring SP only needs to operate so as
to lift the heel of the ground portion 404 in the upward direction,
and the mounting position thereof may be selected accordingly.
[0347] The short rod 411 connects the eccentric cam 420b and the
below-knee portion 403 with the connection portion 403b, 411c. When
the cam pulley is rotatively driven, the eccentric cam 420b swings
the below-knee portion 403 back and forth with the short rod 411
and lifts the below-knee portion (leg) 403. The above-knee portion
402 also oscillates so as to bend the knee pursuant thereto. The
long rod 410 is guided by the guide pin 420c, and connects the
eccentric cam 420b and the ground portion 404 via the connection
portions 404b, 410b. When the cam pulley 420 is rotatively driven,
the pressing plate 410c is lifted by the eccentric cam 420b with
the pin 420c as the guide, and further lowers the engagement
portion (rear axis) 404b of the ground portion 404 with the long
rod 410. The posture (inclination) of the robot during the walking
motion, or the lifting and lowering of the toe of the foot upon
moving the legs is thereby set.
[0348] In this type of mechanism, the toe of the stepping foot may
be lifted in a timing of moving the stepping foot forward, and the
toe of the stepping foot may be lowered in a timing of moving the
stepping foot backward. It is thereby possible to improve the
running performance of a robot capable of walking without falling
down.
[0349] FIG. 80 shows the cam pulley 420, long rod 410, short rod 411
and spring SP of another mechanical example described above. The
cam pulley 420 is connected to the drive shaft (not shown) to be
rotatively driven by a motor. A guide pin 420c is provided in a
concentric position to the drive shaft of the pulley at the outside
of the cam pulley 420. A cylindrical cam 420b is provided to the
cam pulley 420 in an eccentric position from the drive shaft of the
pulley. A long hole 410a is provided to the upper part of the
approximate "dogleg" shaped rod, and a pin 420a is inserted in this
long hole 410a so as to rotatably engage such rod and long hole.
The guide pin 420c is movably engaged with the long hole 410a. The
pressing plate 410c is formed at the lower part of the long hole
410a. The upper surface of the pressing plate 410c contacts the
eccentric cam 420b and moves the long rod in the upward and downward
direction in accordance with the movement of the cam 420b. A
connection portion 410b to be connected with the connection portion
404b of the ground portion 404 is provided to the lower part of the
"dogleg" shaped rod 410. An annular engagement portion 411a for
rotatably engaging with the cam 420b is provided to the upper part
of the short rod 411. A long hole 411b for engaging with the guide
pin 401c of the frame is provided to the center portion of the
short rod 411. A connection portion 411c to be connected with the
connection portion 403b of the below-knee portion 403 is provided
to the lower part of the short rod 411.
[0350] Further, the above-knee portion 402, below-knee portion 403
and ground portion 404 in the other embodiment described above are
structured in a similar manner as with the first embodiment.
[0351] FIG. 81 and FIG. 82 show the mechanism for changing the
direction of the robot. FIG. 81 is a side view of the ground
portion, and a drive roller 404c is disposed at the toe side and a
sliding roller 404d is disposed at the heel side thereof. FIG.
82(a) is a diagram of the ground portions 404 of the left and right
legs viewed from the front of the robot, and FIG. 82(b) is a
diagram of the ground portions 404 of the left and right legs
viewed from the bottom.
[0352] As shown in FIG. 82(b), disposed in the front part within
the ground portion 404 are a motor 404e, a gear mechanism 404f for
increasing the rotational force (torque) of this motor, and a drive
roller 404c rotated by this gear mechanism 404f. A plurality of
drive rollers 404c may be provided, and, in this example, two drive
rollers 404c, 404c are provided and connected additionally with a
drive belt 404g. The drive direction pursuant to the drive roller
404c and the drive belt 404g is set obliquely against the front
and back direction of the robot. Although the motor and gear
mechanism are also disposed obliquely in correspondence thereto,
these may be set appropriately. When increasing the number of drive
rollers 404c, the ground plane increases, and the stability of the
robot increases. It is also possible to increase the speed of
turning.
[0353] Preferably, as shown in FIG. 82(b), when both legs are
together, the left and right drive rollers and the drive direction
by the drive belt 404g are in an "inverted V-shape" positioned on
the approximately identical circumference. A freely rotatable
sliding roller 404d is positioned at the back part within the
ground portion 404. By structuring this roller with a relatively
heavy material, metal for example, this will concurrently act as
the weight for adjusting the balance of the robot. Needless to say,
an item corresponding to a weight for maintaining the balance may
also be separately provided to the ground portion 404.
[0354] FIG. 83 shows another example of a mechanism for changing
the direction of the robot. The components shown in FIG. 83 that
correspond to those illustrated in FIG. 82(b) are given the same
reference numerals, and the explanation thereof is omitted. This
example only shows the left leg side of the robot, and the right
leg side (not shown) is structured symmetrical to the example of
the left leg side. The drive roller 404c and the drive belt 404g
depicted in FIG. 82 are replaced with a drive rubber roller 404h.
The drive rubber roller 404h, for example, is structured by
covering the periphery of a plastic pulley with a high-friction
rubber. In a state where both legs are together, the drive
direction by the left and right rubber drive rollers 404h is in an
"inverted V-shape" positioned on the approximately identical
circumference. A freely rotatable sliding roller 404d is positioned
at the back part within the ground portion 404. This structure will
also operate similarly as with the example shown in aforementioned
FIG. 81 and FIG. 82.
[0355] By providing a drive roller or a drive belt to the sole of
the foot of the robot, while performing a bipedal locomotion, it
becomes possible to change (turn) directions, which is technically
difficult in a bipedal locomotion mechanism. Needless to say, the
robot can turn even in a non-walking state. Further, by adopting a
structure of disposing the drive roller or drive belt obliquely
against the front and back direction of the robot, the posture of
the robot is more stable and the directional change of the robot
can be performed in a shorter time span in comparison to driving
the drive roller in a perpendicular direction against the advancing
direction in order to make the turn.
[0356] Moreover, the aforementioned bipedal locomotion mechanism of
the legs walks in a state where the heel is constantly contacting
the ground. Providing the drive roller or the drive belt at the toe
side is suitable for this walking structure. In other words,
assuming that the drive roller is provided to the heel side, the
robot may turn when the toe is lifted, and the posture of the robot
will become unstable. Further, this will be an unnatural movement
as the robot movement simulating a person. With respect to this
point, when providing the drive belt or the like to the toe side,
the posture is stabilized since the turn is made with the foot with
the entire sole thereof in contact with the ground, and the
movement will look natural. The turn during the walk in particular
is stable.
[0357] The control unit 60 is able to change the direction of the
robot in order to avoid obstacles by activating the aforementioned
turnabout mechanism upon detecting an obstacle in the forward
direction of the robot with the likes of a light sensor 53. The
position of the sensor for detecting obstacles may also be at the
tip of the ground portion. In such a case, the sensor may be a
switch, ultrasonic sensor, or the like.
[0358] As described above, in the embodiments of the present
invention, it is possible to reduce the consumption of the battery
since the robot is activated upon previously distinguishing that
the user is nearby.
[0359] Further, the electronic toy is not limited to being battery
powered, and the power supply may also be via an AC power supply or
AC power supply adaptor.
[0360] Moreover, the robot of the embodiments stores a program for
deciding on its own actions, and self-activates various operations
in accordance with the time. The subsequent action is decided with
respect to whether there was a reaction; for example, whether a
sound could be heard or whether a switch was touched. When no one
is watching, wasteful movements are largely avoided, but whether
the user is nearby is periodically (in fixed time intervals)
confirmed. When someone is nearby, the robot takes an even larger
action, so that the user will view this as though the robot is
moving on its own at all times.
[0361] Further, the robot of the embodiments comprehends the
biorhythm of the user in order to presume the health and mood of
the user, and, when it is presumed that the user is not feeling
well, it takes (is programmed to take) humane action of cheering
the user up and so on.
[0362] Moreover, since the robot periodically performs solo play,
this would be amusing for the user when he/she discovers such solo
play.
[0363] The robot of the embodiments possesses a self-emotion
parameter, and vocalizes or displays words corresponding to the
present emotion. This is amusing since it would seem as though the
robot has emotions.
[0364] The robot of the embodiments reacts to pranks such as
continuously talking loudly near the robot, covering the robot with
a cloth, or hitting the robot repeatedly. Thus, this is also
amusing.
[0365] The robot of the embodiments conducts communication by text.
For example, this is amusing in that the robot asks the user
questions or soliloquizes.
[0366] Further, the emotion of the robot is affected by the
response to such questions, and the mood will become good or bad.
As this is humanlike, this is amusing in that such emotion is
displayed on the display unit or represented by movement.
[0367] Moreover, a standard sentence is formed so as to realize
conversation by data exchange upon connecting the robots. This is
amusing in that it would seem as though the robots are having a
conversation by outputting this by sound or on the display
unit.
[0368] Further, the mechanical structure shown in the embodiments
is able to obtain two degrees of freedom of the arm, one degree of
freedom of the neck, and expressions of the face (eyes) with a
minimal structure, thereby realizing emotional movements and
expressions of the robot.
[0369] Moreover, the electronic toy and the electronic robot of the
present invention are also applicable to so-called pet robots,
therapy products (e.g., healing robots), domestic robots comprising
a function of monitoring patients and elderly persons, and so on,
and may be enjoyed by adults and elderly persons without being
limited to use as toys for children. Needless to say, the present
invention may also be applied in adult toys and playthings.
[0370] Further, the walking robot as the electronic toy of the
embodiments is able to lift and move the tip (toe) or the back end
(heel) of the ground portion (foot) at a larger angle upon
advancing or retreating by alternately moving both legs. In
addition, the driving force (or friction) to the toe is increased.
Thus, the running performance in places with relatively bad
foothold is improved, and the falling of the robot is thereby
reduced.
[0371] Moreover, it is also possible to combine the respective
embodiments described above. For example, the mechanism of the
upper body of the robot shown in FIG. 5 can be suitably combined
with the mechanism of the lower body shown in FIG. 59 or FIG. 70.
In addition, it is also possible to combine the various control
modes explained in the embodiments, those in FIG. 11 to FIG. 56 for
example, to the robot structured as described above.
[0372] As described above, the electronic toy of the present
invention is able to communicate with the user from the electronic
toy side since it automatically activates when the user is nearby.
It is also possible to suppress the wasteful consumption of the
power source.
[0373] Further, it is amusing in that the robot behaves so as to
communicate with the user by employing text. In addition, the
amusement is increased since the robot selects words and movements
to be output pursuant to its emotions, which is humanlike.
[0374] Moreover, obtained is an electronic toy (robot toy) with
favorable walking performance.
* * * * *