U.S. patent application number 14/158110 was filed with the patent office on 2014-01-17 and published on 2014-07-24 as publication number 20140206253 for a method and apparatus for interactive play.
The applicants listed for this patent are Casparus Cate and Benjamin Huyck. The invention is credited to Casparus Cate and Benjamin Huyck.
Publication Number | 20140206253
Application Number | 14/158110
Family ID | 51208047
Filed Date | 2014-01-17
Publication Date | 2014-07-24
United States Patent Application | 20140206253
Kind Code | A1
Huyck; Benjamin; et al.
July 24, 2014
METHOD AND APPARATUS FOR INTERACTIVE PLAY
Abstract
An interactive toy object apparatus having a toy body that
includes a plurality of object body portions, an object control
circuit secured to the body that includes an object processor, an
object memory device, and one or more object transceivers, a
plurality of object inputs and object outputs secured to one or
more of the object body portions and in communication with the
object control circuit; and a first control program stored in the
object memory device and operable by the object processor, wherein
the interactive object is capable of communicating with a
controller, via the one or more object transceivers, to receive or
transmit at least one of commands, inputs, and outputs,
therebetween.
Inventors: Huyck; Benjamin (Chicago, IL); Cate; Casparus (Chicago, IL)

Applicant:
Name | City | State | Country | Type
Huyck; Benjamin | Chicago | IL | US |
Cate; Casparus | Chicago | IL | US |
Family ID: 51208047
Appl. No.: 14/158110
Filed: January 17, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61754769 | Jan 21, 2013 |
Current U.S. Class: 446/175
Current CPC Class: A63H 3/28 20130101; A63H 2200/00 20130101; A63H 30/04 20130101; A63H 3/02 20130101
Class at Publication: 446/175
International Class: A63H 30/04 20060101 A63H030/04
Claims
1. An interactive toy object apparatus comprising: a toy body that
includes a plurality of object body portions; an object control
circuit secured to the body that includes an object processor, an
object memory device, and one or more object transceivers; a
plurality of object inputs and object outputs secured to one or
more of the object body portions and in communication with the
object control circuit; and a first control program stored in the
object memory device and operable by the object processor, wherein
the interactive object is capable of communicating with a
controller, via the one or more object transceivers, to receive or
transmit at least one of commands, inputs, and outputs,
therebetween.
2. The apparatus of claim 1, wherein the plurality of object inputs
include a first tactile sensor secured to a first object body
portion and a second tactile sensor secured to a second body
portion, and wherein the plurality of object outputs include a
speaker.
3. The apparatus of claim 2, wherein the toy body is a stuffed
animal comprised at least in part of a cloth material and a
stuffing material.
4. The apparatus of claim 3, wherein the plurality of object inputs
include a microphone for receiving a voice prompt, and wherein the
first control program is configured to respond to the voice prompt
by activating one or more of the plurality of object outputs to
provide an annunciation of at least one of sound, motion, or
light.
5. The apparatus of claim 4, wherein the plurality of object
outputs includes the speaker and one or more lights.
6. The apparatus of claim 5, wherein the plurality of object inputs
includes a gyroscope for sensing a position change of the toy body
from a first position to an inverted second position, wherein the
first control program is configured to activate one or more of the
plurality of object outputs to provide an annunciation of at least
one of sound, motion, or light upon sensing the toy body moving
from the first position to the inverted second position.
7. The apparatus of claim 6, wherein the plurality of object inputs
includes an accelerometer for sensing a position change of the toy
body from a first position to a second position, wherein the first
control program is configured to activate one or more of the
plurality of object outputs to provide an annunciation of at least
one of sound, motion, or light upon sensing the toy body moving
from the first position to the second position.
8. The apparatus of claim 7, wherein the one or more transceivers
are configured to receive wireless programming instructions, and
wherein the programming instructions include the assignment of one
or more of the plurality of object inputs to one or more of the
object outputs.
9. The apparatus of claim 8, wherein the one or more transceivers
are configured to receive wireless instructions, and wherein the
instructions include the assignment of one or more of the plurality
of object inputs to one or more of the object outputs, such that
activation of one or more of the plurality of object inputs
activates associated object outputs to provide at least one of
sound, motion, or light.
10. The apparatus of claim 9, wherein once the instructions are
received, the object outputs respond to the associated object
inputs without further instructions being received by the
transceivers.
11. The apparatus of claim 10, wherein the toy body includes a pair
of arms, a pair of legs, and a chest, wherein the arms and chest
each include an object input and an object output.
12. The apparatus of claim 3, wherein the one or more transceivers
are paired with a controller using a wireless connection, and the
controller includes a display screen that displays an avatar of the
toy body.
13. A method of interactive play with a toy comprising: providing a
toy object that includes a toy body and an object control circuit
secured to the body that includes an object processor, an object
memory device, and at least one communication device; activating one
of a plurality of object inputs secured to one of a plurality of
object body portions forming the toy body; and playing an audio track
stored in the object memory device via one or more object outputs
secured to one or more of the body portions, wherein the audio
track is assignable to be activated by one or more of the object
inputs.
14. The method of claim 13, wherein the toy body is a stuffed
animal comprised at least in part of a cloth material and a
stuffing material.
15. The method of claim 14, wherein the audio track is assignable
via a controller capable of communicating with the communication
device.
16. The method of claim 15, wherein the audio track is chosen from
a plurality of audio tracks that are stored in a controller memory
of the controller.
17. The method of claim 14, further including displaying an avatar
image of the toy body on the controller, wherein the body portions
of the toy body displayed in the avatar image are selectable on the
controller for assignment to one of the audio tracks.
18. The method of claim 17, further including activating one or
more object outputs associated with the body portion during the
assignment of a selected audio track.
19. A method of interactive play with a toy comprising: providing a
controller having a display screen, a controller processor, a
controller memory, and a wireless controller transceiver, wherein
the controller is configured to communicate with a toy object via
the controller transceiver; and displaying on the display screen a
plurality of selections that include one or more of selecting an
audio track from a library of audio tracks in the controller
memory, downloading an audio track for storage in the controller
memory, and recording an audio track for storage in the controller
memory.
20. The method of claim 19, further including selecting one of the
audio tracks stored in the library of the controller memory and
communicating the audio track to the toy object via the controller
transceiver; receiving the communicated audio track at the toy object;
storing the communicated audio track in an object memory of the toy
object; and playing the audio track via a speaker secured to the
toy object upon receipt of an activation of an object input
situated on a body portion of the toy object.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present Application is a non-provisional application that
claims the benefit of U.S. provisional patent application No.
61/754,769, having the same title as the present Application and
filed on Jan. 21, 2013, which the present Application hereby
incorporates by reference in its entirety.
FIELD
[0002] The method and apparatus relate to interactive objects and,
more particularly, to interactive toys.
BACKGROUND
[0003] Various types of toys are available for play with a user,
such as a child. The interaction of these toys with the user has
traditionally been limited to actions performed by the toy in
response to an action of the user. For example, pushing a specific
spot on a stuffed animal can initiate an action such as a sound or
movement. Typically, the toy includes a basic circuit that receives
a hard input from a user and responds in a pre-programmed manner
with a hard output. Such limited interactive capabilities with a
user, particularly a child, can lead to rapid boredom and
subsequent non-use of the toy.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Embodiments of the method and apparatus are disclosed with
reference to the accompanying drawings and are for illustrative
purposes only. The method and apparatus are not limited in their
application to the details of construction or the arrangement of
the components illustrated in the drawings. The method and
apparatus are capable of other embodiments or of being practiced or
carried out in various other ways. In the drawings:
[0005] FIG. 1 illustrates a perspective view of an exemplary
interactive object;
[0006] FIG. 2 illustrates a perspective view of an exemplary
controller;
[0007] FIG. 3A illustrates a schematic view of an exemplary object
control circuit in communication with object inputs and object
outputs;
[0008] FIG. 3B illustrates a schematic view of another exemplary
object control circuit in communication with object inputs and
object outputs;
[0009] FIG. 4 illustrates a schematic view of an exemplary
controller in communication with controller inputs and controller
outputs;
[0010] FIG. 5 is a flow chart that represents an exemplary pairing
process from a user perspective;
[0011] FIG. 6 is a flow chart that represents an exemplary process
related to a user's selection of a "sing song" button;
[0012] FIG. 7 is a continuation of the process shown in FIG. 6;
[0013] FIG. 8 is a flow chart that represents an exemplary process
related to a user's selection of a "read story" button;
[0014] FIG. 9 is a flow chart that represents an exemplary pairing
process from an apparatus perspective;
[0015] FIG. 10 is a flow chart that represents an exemplary process
related to an input from a featured selection button;
[0016] FIG. 11 is a flow chart that represents an exemplary process
related to an input from another featured selection button;
[0017] FIG. 12 is a flow chart that represents an exemplary process
related to an input from an additional featured selection
button;
[0018] FIG. 13 is a flow chart that represents an exemplary process
related to an input from a further featured selection button;
[0019] FIG. 14 is a flow chart that represents an exemplary process
related to an object input;
[0020] FIG. 15 is a flow chart that represents an exemplary process
related to another object input; and
[0021] FIG. 16 is a flow chart that represents an exemplary process
related to another object input.
BRIEF SUMMARY
[0022] In at least some embodiments, the method and apparatus for
interactive play relate to an apparatus that includes a controller
comprising a controller processor, a controller memory device, a
communication device, a controller display screen, one or more
controller transceivers, and a first control program resident in
the controller memory device and operable by the controller
processor. In addition, the apparatus includes an interactive
object comprising an object control circuit, an object processor,
an object memory device, one or more object transceivers, and a
second control program resident in the object memory device and
operable by the object processor, wherein the controller is capable
of communicating with the interactive object, via the transceivers,
to receive and transmit at least one of commands, inputs, and
outputs, therebetween.
[0023] In additional embodiments, the method and apparatus for
interactive play relate to an interactive toy object apparatus
having a toy body that includes a plurality of object body
portions; an object control circuit secured to the body that
includes an object processor, an object memory device, and one or
more object transceivers; a plurality of object inputs and object
outputs secured to one or more of the object body portions and in
communication with the object control circuit; and a first control
program stored in the object memory device and operable by the
object processor, wherein the interactive object is capable of
communicating with a controller, via the one or more object
transceivers, to receive or transmit at least one of commands,
inputs, and outputs, therebetween.
[0024] In other additional embodiments, the method and apparatus
for interactive play relate to a method of interactive play with a
toy that includes providing a toy object that includes a toy body
and an object control circuit secured to the body that includes an
object processor, an object memory device, and at least one
communication device; activating one of a plurality of object inputs
secured to one of a plurality of object body portions forming the
toy body; and playing an audio track stored in the object memory
device via one or more object outputs secured to one or more of the
body portions, wherein the audio track is assignable to be
activated by one or more of the object inputs.
[0025] In further additional embodiments, the method and apparatus
for interactive play relate to a method of interactive play with a
toy that includes providing a controller having a display screen, a
controller processor, a controller memory, and a wireless
controller transceiver, wherein the controller is configured to
communicate with a toy object via the controller transceiver, and
displaying on the display screen a plurality of selections that
include one or more of selecting an audio track from a library of
audio tracks in the controller memory, downloading an audio track
for storage in the controller memory, and recording an audio track
for storage in the controller memory.
[0026] Other embodiments, aspects, features, objectives and
advantages of the method and apparatus will be understood and
appreciated upon a full reading of the detailed description and the
claims that follow.
DETAILED DESCRIPTION
[0027] Referring to FIG. 1, an exemplary interactive object 102 is
illustrated. In at least some embodiments, as shown in FIG. 1, the
interactive object 102 can include a stuffed animal toy (e.g.,
plush teddy bear). Further embodiments of the interactive object
102 can include any one of various objects, for example, dolls,
action figures, plastic figures, toy vehicles, etc. The interactive
object 102 can be made of various materials, such as plastic,
cloth, stuffing, etc. The interactive object 102 is configured to
interact with a user, either alone, or in combination with a wired
and/or wireless controller 104. In at least some embodiments, as
shown in FIG. 2, the controller 104 can include a mobile device,
such as, a mobile phone, tablet, laptop computer, etc. Further
embodiments of the controller 104 can include any one of various
non-mobile devices as well, such as a desktop computer. A first
control program 190 and/or first control program interface 197 is
installed on and operated by the controller 104 to initiate actions
to be principally performed by the interactive object 102. In
addition, a second control program 198 is installed on and operated
by the interactive object 102 to receive and execute instructions
received from the controller 104, and to receive and transmit
instructions from the interactive object 102 to the controller 104.
In addition, the second control program 198 facilitates
communication from various object inputs 120 and object outputs 122
and can also function independently of the controller 104.
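For illustration only, the division of labor described above — the first control program 190 initiating actions from the controller 104, and the second control program 198 receiving and executing them on the interactive object 102 — can be sketched in Python. All class and function names below are hypothetical and do not appear in the disclosure:

```python
# Illustrative sketch only: the controller-side program initiates an action,
# and the toy-side program receives and executes it. Names are invented.
class SecondControlProgram:
    """Stands in for program 198 on the interactive object 102."""
    def execute(self, command):
        if command == "sing":
            return "object speaker 162 plays audio track"
        return "unrecognized command"

def first_control_program(command, toy_program):
    """Stands in for program 190 on the controller 104."""
    return toy_program.execute(command)

print(first_control_program("sing", SecondControlProgram()))
# → object speaker 162 plays audio track
```

The sketch captures only the direction of control; the real exchange would travel over the object transceivers 132 rather than a direct method call.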
[0028] The interactive object 102 illustrated in FIG. 1 includes a
toy body 103 having a body interior 111 and a body exterior 116,
and further including various object body portions 119, such as
feet 105, arms 106, hands 107, legs 108, head 110, chest 112, ears
114, stomach 115, etc., that can include cavities therein for the
installation of devices, circuit boards, batteries, etc. Any one of
these, or other portions of the interactive object 102, can be
utilized for operation with the object inputs 120 and the object
outputs 122, as discussed further below. The object inputs 120 and
object outputs 122 are in communication with an object control
circuit 124. The object control circuit 124 includes electrical
components and wiring that are utilized to operate the interactive
object 102. In at least some embodiments, the object control
circuit 124 includes one of or both of integrated circuits and
discrete components, mounted together on a printed circuit board
(PCB), or mounted on multiple PCBs, or other fixed locations about
the interactive object 102. As shown in FIG. 3A, an exemplary
object control circuit 124 can further include a power supply 126,
a processor-based PCB 128 having an object processor 129 and in
communication with an input/output (I/O) PCB 130, one or more
object transceivers 132, and an object memory device 134.
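As a rough software analogy of the partitioning described above — a processor coordinating the object memory device 134, transceivers 132, and I/O — the routing of object inputs 120 to object outputs 122 can be modeled as a lookup table. This is a sketch under assumed names, not an implementation from the disclosure:

```python
# Hypothetical model of the object control circuit 124: a table maps an
# activated input (e.g., a tactile sensor) to its assigned output action.
class ObjectControlCircuit:
    def __init__(self):
        self.handlers = {}  # stands in for assignments held in memory 134

    def assign(self, input_id, output_fn):
        """Associate an object input with an object output routine."""
        self.handlers[input_id] = output_fn

    def handle_input(self, input_id):
        """Dispatch an activated input to its assigned output, if any."""
        fn = self.handlers.get(input_id)
        return fn() if fn is not None else None

circuit = ObjectControlCircuit()
circuit.assign("left_foot_sensor", lambda: "play assigned song")
print(circuit.handle_input("left_foot_sensor"))  # → play assigned song
```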
[0029] Referring to FIG. 3B, another exemplary object control
circuit 124 can include a main PCB with the power supply 126
mounted thereon or wired thereto, an object processor 129 (e.g., an
8-Bit Host Microprocessor), an object memory device 134, one or
more object transceivers 132 (e.g., Bluetooth transceiver 135,
near-field-communication (NFC) transceiver 137, etc.), input/output
connection block 138, sound card 140, and a media card input 142
(e.g., SD card reader, Memory Stick reader, USB slot, embedded
Flash memory, etc.). The Bluetooth transceiver 135 can include, for
example, a Bluetooth Radio, such as Bluetooth Radio 4.0 with
Bluetooth Low Energy technology, and a Bluetooth Dongle.
[0030] Referring again to FIGS. 3A and 3B, various object inputs
120 and object outputs 122 can be in communication with the object
control circuit 124. Object inputs 120 can include numerous types
of sensors/switches (e.g., tactile, pressure, proximity,
conductivity, magnetic, etc.), for example, a tactile sensor 150, an
accelerometer 152 (e.g., a tri-axial accelerometer for tilt and
shock sensing), a gyroscope 154, toggle-style switches 156, object
microphones 158, etc. The object inputs 120 are included to allow
for interaction with a user, such that when the interactive object
102 is manipulated by the user, via the object inputs 120, one or
more of various actions and responses are initiated by the
interactive object 102. The object outputs 122 respond to the
instructions from the object control circuit 124 as processed by
the second control program 198, which can also receive instructions
from the first control program 190, via the controller 104. The
object outputs 122 can include numerous types of devices, for
example, body lights (e.g., light emitting diodes (LEDs)) 160,
object speakers 162, vibration motors 164, etc. One or more of the
various object inputs 120 and object outputs 122 can be located
with the object control circuit 124 on a PCB, or otherwise
positioned on the inside or outside of the interactive object
102.
[0031] Referring again to FIG. 1, in at least some embodiments, the
interactive object 102 is a stuffed bear with various object body
portions 119, such as hands 107, feet 105, ears 114, head 110,
chest 112, and stomach 115. Body lights 160 are positioned in or on
various object body portions 119. In addition, sensors 150 are
positioned in or on one or more of the object body portions 119 and
an accelerometer 152 can be placed in an interior cavity (not
shown) of the bear. Many components, such as the power supply 126,
object control circuit 124, object inputs 120, and object outputs
122 are installed in cavities (not shown) residing in the object
body portions 119. The object control circuit 124, along with
associated wiring to and from the object inputs 120 and object
outputs 122, would likewise be positioned in an interior cavity.
The interactive object 102 can further include lenses 170 situated
over the body lights 160. Although the cavities of the object body
portions 119 are not shown, they are understood to reside inside
the interactive object 102.
[0032] Referring to FIGS. 2 and 4, the controller 104 can be a
mobile device, such as a smartphone or smart device (e.g., IPHONE,
IPAD, GALAXY S3, etc.), that includes various components commonly
known and customarily provided in numerous mobile devices. The
components can include one or more controller transceivers 133
(e.g., Bluetooth transceiver 135, near-field-communication (NFC)
transceiver 137, etc.), a controller processor 131 (e.g., a
microprocessor, microcomputer, application-specific integrated
circuit, etc.), a controller memory device 199, a user interface
196, one or more controller outputs 194, and one or more controller
inputs 193 that contribute to the user interface 196. The one or
more controller inputs 193 can include for example, controller
microphone 178, keypad (can be physical or image generated on a
controller display screen 180), controller touch sensors 191,
buttons 181, camera 192, etc. The one or more controller outputs
194 can include for example, controller display screen 180,
controller speaker 179, etc. The various components can also
include a power supply, such as a battery to allow the mobile
device to be portable. All of the components can be coupled to one
another, and in communication with one another, by way of one or
more internal communication links (e.g., an internal bus).
[0033] In at least some embodiments, the controller transceivers
133 utilize a wireless technology for communication, such as, but
not limited to, cellular-based communication technologies such as
analog communications (using AMPS), digital communications (using
CDMA, TDMA, GSM, iDEN, GPRS, EDGE, etc.), and next generation
communications (using UMTS, WCDMA, LTE, IEEE 802.16, etc.), or
variants thereof, or peer-to-peer, or ad hoc communication
technologies such as HomeRF (radio frequency), Bluetooth, near
field communications (NFC), and IEEE 802.11 (a, b, g or n), or
other wireless communication technologies such as Infra-Red (IR).
The controller 104 can utilize one or more of the aforementioned
technologies, as well as other technologies not currently
developed, to communicate with the interactive object 102. In at
least some embodiments, it is preferable that the controller 104
and interactive object 102 communicate utilizing Bluetooth and/or
NFC protocols; this is practical in particular because it does not
require an outside network interface (e.g., WiFi).
[0034] As discussed above, the controller 104 includes the first
control program 190 and/or first control program interface 197,
which reside on the controller 104 to initiate actions to be
performed by the interactive object 102. In addition, the second
control program 198 resides on the interactive object 102 to
receive and execute instructions received from the controller 104,
as well as to receive and transmit instructions from the
interactive object 102 to the controller 104. When the first
control program 190 is resident on the controller 104, the control
program interface 197 is not required. If the first control program
190 is not resident on the controller 104, but merely accessed by
the controller 104 for operation, then the first control program
interface 197 is resident on the controller 104 to facilitate
communication with the first control program 190 installed on
another device/source.
[0035] The first control program 190 is, in at least some
embodiments, a software application. The software application can
be configured to run on various types of controllers 104. The type
of controller 104 typically determines the operating system
utilized. For example, the first control program 190 can be
configured to operate on one or more of IOS, ANDROID, and WINDOWS 8
operating systems. In at least some embodiments, the first control
program 190 is installed directly into the memory of the controller
104. Installation of the first control program 190 can be performed
utilizing one of many installation methods. For example, the first
control program 190 can be downloaded via wired or wireless
communication from Internet stores such as GOOGLE PLAY, APP STORE,
WINDOWS STORE, etc. Once downloaded, the first control program 190
resides on the controller 104 and can be configured for
communication with the interactive object 102. The interfacing and
operation of the controller 104 and the interactive object 102 can
be performed in numerous manners. At least one exemplary embodiment
is illustrated in the flow charts found in FIGS. 5-8, as described
below.
[0036] The first control program 190 includes various screen views
that are configured to include arrangements of selection buttons
113 and other objects, such as an avatar 211, for display on the
controller display screen 180 of the controller 104. Through
selection or other manipulation of the controller 104 by a user,
various different screen views can be displayed offering new, old,
or modified selections (e.g., selection buttons 113). The first
control program 190 includes a home screen 127 that displays an
avatar 211 of the interactive object 102 and a first set of
selection buttons 113, whose identifier or value can dynamically
change based on user selections. It is to be further understood
that the term selection button 113 can include any one of the
identified buttons in the process steps listed below.
[0037] Referring to FIG. 5, the process begins after the first
control program 190 has been installed onto the controller 104,
either by downloading it onto the controller 104, or via another
known method, and the interactive object 102 is powered and ready
to receive a communication from the controller 104. FIG. 5
represents an exemplary pairing process. The application can be
launched on the controller 104 in step 200 by selecting an option
displayed on the controller 104. Once the application is
initialized, a home screen 127 for the first control program 190
can be displayed on the controller display screen 180 of the
controller 104. If the interactive object 102 and controller 104
are equipped with NFC capabilities, the controller 104 can be
touched to the interactive object 102 in step 202, to pair them
together using NFC, wherein "pairing" involves establishing an
agreed protocol for communication therebetween. Alternatively, if
the interactive object 102 and controller 104 are equipped with
Bluetooth and/or NFC capabilities, then in step 204, a "connect"
button, which is displayed on the home screen 127 as a selection
button 113 of the controller 104, can be pressed to
initiate pairing therebetween. Further, if the controller 104 and
interactive object 102 have been previously paired, then the
connection therebetween can occur automatically once they are in
close proximity (Bluetooth antenna range) of each other.
[0038] If the interactive object 102 is not found by the controller
104 in step 206, then in step 208, a message can be displayed on
the controller 104 indicating that no interactive objects 102 have
been found. Alternatively, if the interactive object 102 is found
in step 206, an indication is provided to the user in step 210,
such as by displaying an avatar 211 (FIG. 2) of the interactive
object 102 on the controller 104. In addition, if the interactive
object 102 is found, annunciation can be provided at the
interactive object 102 as well, such as by illuminating one or more
body lights 160 (e.g., a green LED), as found in step 212. Further, as
shown in step 214, audio (i.e., a sound, sound file, or audio track)
can be played through the speaker 162 of the interactive object 102
to annunciate the connection. In addition to displaying an avatar
211 of the interactive object 102 in step 210, various selection
buttons 113 can be displayed for the user to select a subsequent
action. Audio tracks can include digital audio files such as "MP3"
and "WAV" files, as well as various other digital file formats.
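The pairing flow of steps 200-214 can be summarized, for illustration only, in the following sketch. The class, function, and return strings are invented stand-ins for the Bluetooth/NFC machinery and do not reflect any real API:

```python
# Hypothetical sketch of the FIG. 5 pairing flow; all names are invented.
class Toy:
    """Stand-in for the interactive object 102."""
    def __init__(self):
        self.light_color = None
        self.audio = None

def pair(toy_found, toy):
    """Mirror steps 206-214: annunciate success on both devices, or fail."""
    if not toy_found:
        return "no interactive objects found"  # step 208
    toy.light_color = "green"                  # step 212: green LED
    toy.audio = "connection sound"             # step 214: audio annunciation
    return "avatar 211 displayed"              # step 210

bear = Toy()
print(pair(True, bear))  # → avatar 211 displayed
```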
[0039] Referring now to FIG. 6, numerous optional selections can be
displayed on the controller 104 using selection buttons 113. These
selection buttons 113 can be used to initiate a plethora of
actions by the interactive object 102. FIGS. 6-8 provide various
exemplary selections. These exemplary selections are
not intended to be limiting with regard to the capabilities of the
interactive object 102 or the controller 104. In step 300, a "sing
songs" button displayed on the controller 104 is selected by the
user. This prompts the display of additional sub-menu selections in
step 302, for example, a "save songs" button can be displayed, a
"download new songs" button can be displayed, and/or a list of
selectable songs that were previously downloaded can be
displayed.
[0040] Choosing from the selections, a user selects one of the
selectable songs displayed in step 304. This selection prompts the
song to be played from the object speaker 162 in the interactive
object 102 and/or the controller speaker 179 in step 306. In
addition, selection of the song can initiate illumination of one or
more body lights 160 on the interactive object 102, as in step 308.
Further, selection of the song in step 304 can initiate the playing
of an animation on the controller 104 in step 310. Annunciation of
the end of the selected song can be provided by the interactive
object 102, such as by sounding a noise (e.g., a "giggle"), as in
step 312, and/or, illuminating one or more LEDs on the interactive
object 102, as in step 314. In step 316, the user is prompted with
a choice to repeat the song. If the user chooses not to repeat,
then at step 318, the sub-menu is displayed from step 302.
[0041] Beginning again from step 302 with a listing of available
songs displayed, if the user selects the "save songs" button at step
330, then at step 332, the avatar 211 of the interactive object 102
is displayed with avatar body portions 123 on the controller 104.
The avatar body portions 123 are displayed on the controller 104 to
assist a user with identifying the status of the various object
inputs 120. More particularly, the avatar body portions 123 are
displayed with body portion indicators 151 (e.g., illuminated or
contrast colored display screen portions (e.g., pixels)). The body
portion indicators 151 represent unassigned object inputs 120 and
serve as user input points on the controller display screen 180 for
a user to touch to make a selection. For example, an avatar body
portion 123 (e.g., an avatar foot 125) can be shown with a body
portion indicator 151 illuminated green if no song is assigned and
red if a song is already assigned to a particular object body
portion 119. Additionally, as in step 334, body lights 160 on the
interactive object 102 can also be illuminated to correspond with
the body portion indicator 151 on the avatar 211. For example, a
body portion light 160 on the foot 105 of the interactive object
102 would be illuminated with a color that corresponds to the color
displayed on the body portion indicator 151 of the avatar 211. This
provides easy identification of the available choices for assigning
a song. In step 336, the user selects one of the displayed songs
and assigns it to the desired object body portion 119 of the
interactive object 102. The assignment can be accomplished in many
ways, such as by touching the song and then the desired avatar body
portion 123 displayed on the controller 104. After the assignment,
the newly assigned body portion light 160, and/or the body portion
indicator 151 can be illuminated to acknowledge the assignment, as
shown in steps 338 and 339. For example, before the selection of a
song, a green illumination that is shown at a body portion light
160 of the interactive object 102, and/or body portion indicator
151 on the controller 104, can change to red upon a successful
assignment. In step 340, the controller 104 can display an option
to assign further songs to the interactive object 102; if
additional assignments are desired, the process returns to
step 336.
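The green/red indicator bookkeeping of steps 332 through 340 can be sketched in Python. This is only an illustrative model; the class and method names (e.g., `SongAssigner`) are hypothetical and do not appear in the disclosure:

```python
class SongAssigner:
    """Illustrative model of steps 332-339: an object body portion
    shows a green indicator while unassigned and a red indicator once
    a song has been assigned to it."""

    def __init__(self, body_portions):
        # None means no song is assigned yet (green indicator).
        self.assignments = {portion: None for portion in body_portions}

    def indicator_color(self, portion):
        # Green = available for assignment, red = already assigned.
        return "red" if self.assignments[portion] else "green"

    def assign(self, portion, song):
        # Refuse to overwrite an existing assignment (red portion).
        if self.assignments[portion] is not None:
            return False
        self.assignments[portion] = song
        return True
```

In this sketch, a refused assignment models the red indicator of step 332, which identifies a body portion as already taken.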
[0042] Referring to FIG. 7, if no additional assignments are
desired, then the process moves to step 341, wherein the user can
activate the newly assigned object input 120 by activation of one
of the object inputs 120 (e.g., sensor 150) that is associated with
a particular object body portion 119. Upon activation, the process
moves to step 342 where the assigned song is played from the object
speaker 162. In addition, step 343 includes illuminating one or
more body lights 160 on the interactive object 102. At step 344, a
separate sound (e.g., giggle) is played from the object speaker 162
when the song is done playing. Further, in step 346, the one or
more body lights 160 can change color to signify that the
interactive object 102 is ready for the next instruction. At step
348, the user can activate another object body portion 119 to play
another song, thereby returning to step 341. If no action is taken,
the avatar 211 is displayed on the controller 104 with body portion
indicators 151 illuminated to identify assigned and unassigned
object body portions 119 in step 350, and the body lights 160 on
the interactive object 102 are likewise illuminated to correspond
with the avatar 211 body portion indicators 151 on the avatar body
portions 123 in step 352. The process can then return to step 210
or step 302 at this point to provide additional options to the
user.
[0043] Referring again to step 302 in FIG. 6, where the "download
new songs" selection button 113 is displayed on the controller 104,
the user can select "download new songs" in step 402, which brings
up a listing of available media, such as songs, stories,
applications, etc. in step 404. The media can be provided by
numerous sources, such as GOOGLE PLAY, APP STORE, WINDOWS STORE,
etc. In step 406, the media is selected by the user for downloading
to the device. Once downloaded, the media would be displayed by the
controller 104 as available media for selection, such as found in
step 304 and step 336.
[0044] Returning again to step 210, and with reference to FIG. 8,
the user can select "read story" at step 502 from the selection
buttons 113, wherein the controller 104 then displays a selection
of available stories at step 504. Further selection buttons 113 can
be displayed providing additional actions, such as "download new
stories" and "record story voices." If the user selects "download
new stories" at step 506, the process then moves to step 404 to
view the selection of stories available for download. At step 508,
the user can select and play one of the available stories. During
the playing of the story, the animation for the story is displayed
on the controller 104 in step 510, and in step 512, the audio track
of the story is played through the object speakers 162 to provide
the appearance that the interactive object 102 is telling the
story. Additionally, in step 514, the body lights 160 are
illuminated with a designated color, such as blue. At step 516, the
first control program 190 and controller processor 131 check if the
story includes a subsequent page. If not, the process returns to
step 504. If so, the user can advance to the next page by pressing
a "next page" selection button 113 displayed on the controller 104,
as in step 517, and/or by activating an object input 120, such as a
sensor 150 located in the object body portion 119 (e.g., stomach
115, chest 112, etc.), as in step 518. Advancing to
a subsequent page returns the process to step 510. Once the story
is complete, the process returns to step 504, wherein the user can
make a new selection. The user may choose to record a story to
provide a familiar voice to the interactive object 102. To do so,
the user selects "record story voices" at step 520, and a list of
available stories for recording is displayed at step 522. At step
524, the user selects to record one of the stories, and at step
526, the story text is displayed on the controller 104 and the user
reads it into the controller microphone 178. The pages of the story
text can be advanced by a "next page" selection on the controller
104. At the termination of the story, the process returns to step
504 and new options can be displayed related to the newly recorded
story.
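The page-advance behavior of steps 516 through 518 can be sketched as a small Python function. The function and event names are illustrative assumptions, not part of the disclosure:

```python
def advance_story(page, total_pages, event):
    """Steps 516-518 as a transition: either the "next page" button
    (step 517) or an object input such as the stomach sensor (step
    518) turns the page; after the last page the flow returns to the
    story selection menu (step 504)."""
    if event not in ("next_page_button", "object_input"):
        return page, "waiting"          # unrecognized event: stay put
    if page + 1 < total_pages:
        return page + 1, "show_page"    # back to step 510
    return 0, "story_menu"              # story complete, back to step 504
```

Either input source produces the same transition, matching the "and/or" phrasing of steps 517 and 518.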
[0045] Numerous sounds and functions can be preprogrammed into the
object memory device 134 during manufacturing. In addition, sounds
and functions can be downloaded to the interactive object 102 from
the controller 104. Returning to step 210, once the interactive
object 102 is paired with the controller 104, the object inputs 120
can be manipulated to activate various object outputs 122,
including lights, sounds, etc. For example, the user can activate
the sensor 150 positioned in the stomach 115 of the interactive
object 102, and a sound such as a "laugh," can be played from the
object speaker 162. In addition, body lights 160 can be illuminated
and an animation can be displayed on the controller 104 for
viewing by the user. Other actions can include turning the
interactive object 102 upside-down, thereby activating an object
input 120, such as the accelerometer 152 and/or gyroscope 154,
resulting in a preselected or random audio track being played from
the object speaker 162.
[0046] The above exemplary processes have been described with
primary focus on the actions and responses from the controller 104
and interactive object 102 from the perspective of the user. Below
and with reference to FIGS. 9-16, additional exemplary processes
are described with primary focus on the actions as performed from
the perspective of the interactive object 102 and controller 104
(i.e., first control program 190 and second control program 198).
Referring to FIG. 9, the first control program 190 has been
downloaded to the controller 104 via communication with a suitable
program source, such as described above. At step 602, the
controller 104 receives a command to load the first control program
190. At step 604, upon successful loading, the controller 104
displays one or more selection buttons 113, at least one identified
as a "connect" button. It is to be understood throughout, that in
at least some embodiments, a reference to "displays" on the
controller 104 includes displaying on the controller display screen
180. In addition, audio can be played from the controller speaker
179. Pairing of the controller 104 and the interactive object 102
can occur in various ways. In at least some embodiments, the
pairing occurs automatically, as in step 606, when the first
control program 190 is loaded.
[0047] The pairing can also occur via NFC by touching the
controller 104 to the interactive object 102 or placing it in near
proximity to the interactive object 102, wherein the interactive
object 102 can have an NFC tag installed. The NFC tag is, in at
least some embodiments, a programmable device that provides or
triggers an action instruction in the controller 104 when the
controller 104 senses the tag in near proximity. In addition, as
in step 608, the controller 104
receives a connect instruction, via the "connect" button, and
initiates the pairing process. At step 610, the controller 104
verifies that pairing is successful and the process moves to step
612. If pairing is unsuccessful, annunciation is provided by the
controller 104 and the process returns to step 604 for further
instruction. At step 612, the controller 104 displays the avatar
211 of the interactive object 102. Further, the controller 104 can
transmit a command to the interactive object 102, via communication
initiated between the object processor 129 and the controller
processor 131, utilizing the object transceiver(s) 132 and
controller transceiver(s) 133. The command can include an
annunciation of the pairing to be performed by the interactive
object 102, such as illuminating body lights 160, etc. In addition
to displaying the avatar 211, one or more additional or replacement
selection buttons 113 are displayed on the controller 104.
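The pairing sequence of steps 604 through 612 can be sketched as a retry loop in Python. The event names and retry limit are illustrative assumptions:

```python
def pairing_flow(try_pair, max_attempts=3):
    """Steps 604-612: show the "connect" button, attempt pairing, and
    on failure annunciate and return to step 604; on success display
    the avatar and command the toy's body lights to annunciate."""
    events = []
    for _ in range(max_attempts):
        events.append("show_connect_button")      # step 604
        if try_pair():                            # steps 606/608-610
            events.append("display_avatar")       # step 612
            events.append("command_body_lights")  # annunciate pairing
            return True, events
        events.append("annunciate_failure")       # back to step 604
    return False, events
```

A `try_pair` callable stands in for whichever transport (automatic, button-initiated, or NFC-triggered) performs the actual handshake.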
[0048] In at least one embodiment, as shown in FIG. 10, the
controller 104 displays a selection button 113 identified as a
"sing songs" button, as in step 702. In step 704, the controller
104 (controller processor 131) receives an input selection from the
"sing songs" button and in step 706, the controller 104 responds by
displaying a list of songs on the controller 104, optionally with a
play button adjacent each song. In step 708, the controller 104
receives an input selection to play a specific song and transmits
the audio track for the song to the interactive object 102 with a
command to play the song over the object speaker 162, as in step
710. Additionally, the controller 104 can display animation on the
controller 104 utilizing the avatar 211 or another object, as in
step 712. Further, a command can be transmitted from the controller
104 to the interactive object 102 that instructs the interactive
object 102 to illuminate one or more body portion indicators 151,
as in step 714. Optionally, after step 708, the controller 104 can
display a query asking if the song should be played on multiple
toys, as in step 716. If so, then at step 718, the audio track is
transmitted to the interactive object 102 as well as any other
paired interactive objects 102 chosen. Alternatively, the controller
104 can provide a listing of paired objects and then act on the
selected objects only.
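The single-toy versus multi-toy dispatch of steps 716 and 718 can be sketched as follows; the `"all"` sentinel and tuple shape are illustrative assumptions:

```python
def broadcast_song(track, target, paired_objects):
    """Steps 716-718: transmit the audio track either to every paired
    interactive object or only to the one the user selected."""
    recipients = paired_objects if target == "all" else [target]
    # One (toy, command, payload) transmission per recipient.
    return [(toy, "play", track) for toy in recipients]
```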
[0049] During the playing of the audio track on the object speaker
162, the controller 104 can display animation of the avatar 211
associated with the interactive object 102, along with any
additional avatars displayed as a result of pairing with additional
objects, as in step 720. Further, the process moves to step 714,
where a command is transmitted from the controller 104 to one or
more interactive objects 102 that instructs them to illuminate one
or more body portion indicators 151. At step 722, the controller
104 processes that the audio track is finished playing, and at step
724 the controller 104 transmits a command to the interactive
object 102 to annunciate the end of the song, such as playing an audio
track that includes a "giggle", and/or in step 726, the controller
104 transmits a command to the interactive object 102 to illuminate
one or more body lights 160 with a distinctive color. At the end of
the "giggle" audio track in step 728, the process returns to step
706 to display the song list again on the controller 104.
[0050] Still referring to FIG. 10, at step 706, a selection button
113 identified as a "download new songs" button can be displayed on
the controller 104. If the controller 104 receives a signal that
the "download new songs" button has been selected, the process
moves to step 730, wherein a command is initiated on the controller
104 to engage in communication with a network (e.g., internet) to
access a resource for new songs, such as GOOGLE PLAY, APP STORE,
WINDOWS STORE, etc. Assuming a new song has been selected, in step
732 the controller 104 downloads the new song and saves it to the
controller memory device 199 in a folder (e.g., "library") and adds
the new song to the list of available songs for selection.
[0051] Referring now to FIG. 11, and returning to step 612, the
controller 104 can display a selection button 113 identified as a
"store songs to object" button, either at step 612 or on a
subsequent display, as in step 802. At step 804, the controller 104 receives
input from the selector button 113 identified as a "store songs to
object" button and in step 806, displays a song storage screen with
a list of songs. At step 808, the controller 104 queries if
multiple objects are paired and if so, at step 810, the controller
104 prompts for selection of a single interactive object 102 to
receive the song. Once the desired interactive object 102 has been
identified, at step 812, the controller 104 displays the avatar 211
for that interactive object 102 with body portion indicators 151
illuminated to identify available selections for assignment of the
song to an object body portion 119 (e.g., green to indicate an
available body portion and red to indicate an unavailable body
portion (previously assigned)). To assist with assignment of the
song, the controller 104 transmits a command to the interactive
object 102 to illuminate the body lights 160 of the interactive
object 102 with colors similar to the avatar's 211 body portion
indicators 151. At step 816, the controller 104 receives a
selection of one song that was displayed on the list of available
songs and in response, at step 818, prompts for a selection of a
body portion displayed by the body portion indicators 151 on the
controller 104. At step 820, the controller 104 receives the
selection and transmits the song (audio file/track) to the object
memory device 134. At step 822, a tag is assigned to link the song
with the object input 120 associated with the selected object body
portion 119. This tag is stored in the object memory device 134
and/or the controller memory device 199. Then at step 824, the body
portion indicators 151 of the selected avatar body portion 123 on
the avatar 211 changes from a green light to a red light to
annunciate the assignment of that object input 120. Further, in
step 826, the controller 104 transmits a command to change the body
portion light 160 from green to red to annunciate the assignment of
the selected object input 120.
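The transfer-and-tag sequence of steps 820 through 826 can be sketched in Python. The dictionary-based memory model and return value are illustrative assumptions, not the disclosed implementation:

```python
def store_song(object_memory, tags, body_portion, song_name, audio):
    """Steps 820-826: transmit the audio track to the object memory
    device, record a tag linking the selected object input to the
    song, and flip both indicators from green to red."""
    object_memory[song_name] = audio    # step 820: store the track
    tags[body_portion] = song_name      # step 822: link input to song
    # Steps 824-826: annunciate the assignment on both devices.
    return {"avatar_indicator": "red", "body_light": "red"}
```

The tag dictionary stands in for the stored association that later lets a sensor activation find its assigned track without consulting the controller.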
[0052] Referring now to FIG. 12, and continuing from step 612, the
controller 104 displays a selection button 113 identified as a
"read story" button in step 902. At step 904, the controller 104
receives input from the "read story" button, and in step 906,
displays a story selection home screen that includes various
selection buttons 113, such as, a "story selection list" button, a
"record story voices" button, and a "download new stories" button.
At step 908, the controller 104 receives an input from the "story
selection list" button and, at step 910, the controller 104
displays a list of available stories with "play" buttons. The
controller 104 receives a "play" button input selecting one of the
stories to play. At step 914, the controller 104 queries if
multiple interactive objects 102 are paired and if so, at step 916
the controller 104 displays a prompt for selection of a single
interactive object 102 and moves to step 918 after receiving an
interactive object 102 selection. If only one interactive object
102 is paired, then the process moves to step 918. At step 918, the
controller 104 displays an animation of the story. At step 920, the
controller 104 transmits the audio track for the story to the
interactive object 102 with a command to play the audio track. The
audio track is played over the object speaker 162. In addition, the
controller 104 transmits a command to the interactive object 102 to
illuminate one or more body lights 160 at step 922. At step 924,
the first control program 190 and controller processor 131 check
if the story includes a subsequent page. If not, the process returns
to step 906 to receive a new instruction. If so, the
controller 104 displays a selection button 113 identified as a
"next page" button on the controller 104, in step 926. In step 928,
the controller 104 receives input from the "next page" button.
Alternatively, in step 930, the controller 104 can receive an input
communication from the interactive object 102, such as when an
object input 120 (e.g., a sensor 150) located in an object body
portion 119 (e.g., stomach 115, chest 112, etc.), is activated.
Upon receiving an input, at step 928 or step 930, the controller
104 initiates a command to display the next page (animation) on the
controller 104 in step 932, which advances the process to step 918,
wherein the subsequent story page is displayed and corresponding
audio track is played, as in step 920.
[0053] In addition to reading a story with a pre-recorded voice, a
new story voice can be recorded and played with the story.
Returning to step 906 and referring to FIG. 13, the controller 104
receives an input from the "record story voices" button in step
1002, prompting the controller 104 to display a record home screen
with a list of available stories, each with a selection button 113
identified as a "record" button, in step 1004. In
step 1006, the controller 104 receives an input selection of one of
the record buttons. In step 1008, the controller 104 displays the
animation of the first story page on the controller 104 (including
the text for story) and in step 1010, the controller 104 initiates
the controller microphone 178 to begin receiving audio and records
audio received by the controller microphone 178. At step 1012, the
first control program 190 and controller processor 131 check if the
story includes a subsequent page. If not, the process advances to
step 1014, where a command is initiated by the first control
program 190 to save the recorded audio as an audio track to the
controller memory device 199 and add the audio track to the
listing of available stories. In step 1016, after the audio track
has been saved, the process returns to step 1004 to display the
record home screen. If a subsequent page is found in step 1012,
then in step 1018, the controller 104 displays a selection button
113 identified as a "next page" button to advance the pages of the
story being displayed on the controller 104, and waits to receive
input from the "next page" button or an input from the interactive
object 102, via an object input 120. In step 1020, the controller
104 receives input from the "next page" button and in step 1022,
the controller 104 initiates a command to advance the display of
the story to the next page in the story and returns to step
1008.
[0054] Manipulation of the interactive object 102, with or without
pairing to the controller 104, can initiate an object output 122 to
be activated by the second control program 198. Returning to step
610 (FIG. 9) and referring now to FIG. 14, in step 1102, the
interactive object 102 is manipulated to activate an object input
120. For example, the interactive object 102 receives an input from
the accelerometer 152 and/or gyroscope 154 indicating that the
interactive object 102 has been at least partially inverted. Upon
receiving the input at step 1102, one of the object outputs 122 is
commanded to activate either by the first control program 190 (via
transmission of the input to the controller 104), which
communicates the command to the second control program 198, or
directly by the second control program 198 (if the interactive
object 102 is not paired with the controller 104). The object
output 122 can include various
actions, for example, in step 1104, an audio track is transmitted
from the controller 104 (if not stored in the object memory device
134) to the interactive object 102 for playing through the object
speaker 162. Otherwise, if resident, the audio track can be played
from the object memory device 134 without the need for any
communication with the controller 104. In addition, in a like
manner, the object outputs 122 can include the body lights 160,
which are illuminated in one of various colors, as noted in step
1106.
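The resident-versus-transmitted track logic of step 1104 can be sketched as a cache lookup. The function names are illustrative; `fetch_from_controller` stands in for whatever paired-transceiver request the disclosure contemplates:

```python
def resolve_track(track_name, object_memory, fetch_from_controller):
    """Step 1104: use the track from the object memory device when it
    is resident; otherwise request it from the paired controller and
    store it so later activations need no communication."""
    audio = object_memory.get(track_name)
    if audio is None:
        audio = fetch_from_controller(track_name)  # requires pairing
        object_memory[track_name] = audio          # cache locally
    return audio
```

This mirrors the passage above: once a track is resident, the interactive object can play it with no communication to the controller at all.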
[0055] In another example, as shown in FIG. 15, the object input
120 that is sensed by the second control program 198 can originate
from a sensor 150 (e.g., activated by pushing on the stomach) in
step 1202, which, in steps 1204 and 1206, initiates a command for an
audio track to be transmitted from the controller 104 (if not
stored in the object memory device 134) to the interactive object
102 for playing through the object speaker 162, followed by the
instruction to play the audio track. Otherwise, if resident in the
object memory device 134, the audio track can be accessed from the
object memory device 134 without the need for any communication
with the controller 104. In addition, in a like manner, the object
outputs 122 can include the body lights 160, which are illuminated
in one of various colors, as noted in step 1206. Further, as
discussed above, the sensors 150 (as well as other object inputs
120) can be programmed to initiate specific actions, such as playing a
particular audio track (e.g., a song). For example, as discussed in
FIG. 16, at step 1302, if the second control program 198 senses an
input from one of the sensors 150 (e.g., in hand 107 or foot 105 of
the object), then in step 1304, the second control program 198
and/or the first control program 190 checks if a specific output
action (e.g., play an audio track) is assigned to that particular
sensor 150. If not, then in step 1306, no action is taken. If for
example the assigned action was to play an audio track, then the
stored audio track would be accessed by the second control program
198, in step 1308. In step 1310, the audio track would be played by
the second control program 198 from the object speaker 162. In
addition to the audio track being played, a command to illuminate
one or more body lights 160 can be provided by the second control
program 198 in response to sensing the object input 120 (e.g.,
sensor 150). In step 1314, the end of the audio track is sensed by
the second control program 198 and in step 1316, a specific audio
track (e.g., a "giggle" track) is played to annunciate the end of
the audio track. Likewise one or more other object outputs 122 can
be utilized to annunciate the end of the audio track, such as the
body lights 160, as seen in step 1318. Finally, the process ends in
step 1320 when the specific audio track finishes.
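The FIG. 16 dispatch of steps 1302 through 1318 can be sketched as a lookup plus an action sequence. The sensor identifiers and action tuples are illustrative assumptions:

```python
def on_sensor_input(sensor_id, action_map):
    """FIG. 16: check whether an output action is assigned to the
    activated sensor (step 1304); if not, take no action (step 1306);
    otherwise play the assigned track and annunciate its end
    (steps 1308-1318)."""
    track = action_map.get(sensor_id)
    if track is None:
        return []                       # step 1306: nothing assigned
    return [
        ("light_on", sensor_id),        # body lights during playback
        ("play", track),                # steps 1308-1310
        ("play", "giggle"),             # step 1316: end annunciation
        ("lights_annunciate", "end"),   # step 1318
    ]
```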
[0056] Although various references above include actions described
as being performed by the controller 104, it is to be understood
that these various actions are more specifically performed via
instruction from the first control program 190 as operated by the
controller processor 131. Likewise, although various references
above include actions described as being performed by the
interactive object 102, it is to be understood that these various
actions are more specifically performed via instruction from the
second control program 198 as operated by the object processor 129.
It should be noted that although this process is described with
respect to songs, stories, etc., numerous types of media (e.g.,
photos) and numerous functions (e.g., speaking phrases, vibrating
body portions, etc.), can be assigned to operate with various
object body portions 119. With regard to the first control program
190 and second control program 198, one or both of the programs can
be configured (via real-time or pre-programmed instructions) to play various
games and activities to take advantage of the various object inputs
120, object outputs 122, controller inputs 193 and controller
outputs 194.
[0057] The interactive object 102 can, in at least some
embodiments, include numerous objects, such as medical devices,
kitchen appliances, and other consumer products that can benefit
from the interfacing and control afforded by the controller 104.
The first control program 190 and second control program 198 can be
tailored to address the desired features for these additional
objects. It is specifically intended that the method and apparatus
for interactive play is not to be limited to the embodiments and
illustrations contained herein, but include modified forms of those
embodiments including portions of the embodiments and combinations
of elements of different embodiments as come within the scope of
the following claims. Further, the steps described herein with
reference to the method of operation (processes) are not to be
considered limiting and can include variations, such as additional
steps, removed steps, and re-ordered steps.
[0058] It should be appreciated that the present disclosure is
intended to encompass numerous embodiments as disclosed herein and
further described by the following:
[0059] (i). An interactive toy object apparatus comprising: [0060]
a toy body that includes a plurality of object body portions;
[0061] an object control circuit secured to the body that includes
an object processor, an object memory device, and one or more
object transceivers; [0062] a plurality of object inputs and object
outputs secured to one or more of the object body portions and in
communication with the object control circuit; and [0063] a first
control program stored in the object memory device and operable by
the object processor, wherein the interactive object is capable of
communicating with a controller, via the one or more object
transceivers, to receive or transmit at least one of commands,
inputs, and outputs, therebetween.
[0064] (ii). The apparatus of (i), wherein the plurality of object
inputs include a first tactile sensor secured to a first object
body portion and a second tactile sensor secured to a second body
portion, and wherein the plurality of object outputs include a
speaker.
[0065] (iii). The apparatus of any one of (i)-(ii), wherein the toy
body is a stuffed animal comprised at least in part of a cloth
material and a stuffing material.
[0066] (iv). The apparatus of any one of (i)-(iii), wherein the
plurality of object inputs include a microphone for receiving a
voice prompt, and wherein the first control program is configured
to respond to the voice prompt by activating one or more of the
plurality of object outputs to provide an annunciation of at least
one of sound, motion, or light.
[0067] (v). The apparatus of any one of (i)-(iv), wherein the
plurality of object outputs includes the speaker and one or more
lights.
[0068] (vi). The apparatus of any one of (i)-(v), wherein the
plurality of object inputs includes a gyroscope for sensing a
position change of the toy body from a first position to an
inverted second position, wherein the first control program is
configured to activate one or more of the plurality of object
outputs to provide an annunciation of at least one of sound,
motion, or light upon sensing the toy body moving from the first
position to the inverted second position.
[0069] (vii). The apparatus of any one of (i)-(vi), wherein the
plurality of object inputs includes an accelerometer for sensing a
position change of the toy body from a first position to a second
position, wherein the first control program is configured to
activate one or more of the plurality of object outputs to provide
an annunciation of at least one of sound, motion, or light upon
sensing the toy body moving from the first position to the second
position.
[0070] (viii). The apparatus of any one of (i)-(vii), wherein the
one or more transceivers are configured to receive wireless
programming instructions, and wherein the programming instructions
include the assignment of one or more of the plurality of object
inputs to one or more of the object outputs.
[0071] (ix). The apparatus of any one of (i)-(viii), wherein the
one or more transceivers are configured to receive wireless
instructions, and wherein the instructions include the assignment
of one or more of the plurality of object inputs to one or more of
the object outputs, such that activation of one or more of the
plurality of object inputs activates associated object outputs to
provide at least one of sound, motion, or light.
[0072] (x). The apparatus of any one of (i)-(ix), wherein once the
instructions are received, the object outputs respond to the
associated object inputs without further instructions being
received by the transceivers.
[0073] (xi). The apparatus of any one of (i)-(x), wherein the toy
body includes a pair of arms, a pair of legs, and a chest, wherein
the arms and chest each include an object input and an object
output.
[0074] (xii). The apparatus of any one of (i)-(xi), wherein the one
or more transceivers are paired with a controller using a wireless
connection, and the controller includes a display screen that
displays an avatar of the toy body.
[0075] (xiii). A method of interactive play with a toy comprising:
[0076] providing a toy object that includes a toy body and an
object control circuit secured to the body that includes an object
processor, an object memory device, and at least one communication
device; [0077] activating one of a plurality of object inputs secured
to one of a plurality of object body portions forming the toy body;
and [0078] playing an audio track stored in the object memory device
via one or more object outputs secured to one or more of the body
portions, wherein the audio track is assignable to be activated by
one or more of the object inputs.
[0079] (xiv). The method of (xiii), wherein the toy body is a
stuffed animal comprised at least in part of a cloth material and
a stuffing material.
[0080] (xv). The method of any one of (xiii)-(xiv), wherein the
audio track is assignable via a controller capable of communicating
with the communication device.
[0081] (xvi). The method of any one of (xiii)-(xv), wherein the
audio track is chosen from a plurality of audio tracks that are
stored in a controller memory of the controller.
[0082] (xvii). The method of any one of (xiii)-(xvi), further
including displaying an avatar image of the toy body on the
controller, wherein the body portions of the toy body displayed in
the avatar image are selectable on the controller for assignment to
one of the audio tracks.
[0083] (xviii). The method of any one of (xiii)-(xvii), further
including activating one or more object outputs associated with the
body portion during the assignment of a selected audio track.
[0084] (xix). A method of interactive play with a toy comprising:
[0085] providing a controller having a display screen, a controller
processor, a controller memory, and a wireless controller
transceiver, wherein the controller is configured to communicate
with a toy object via the controller transceiver; and [0086]
displaying on the display screen a plurality of selections that
include one or more of selecting an audio track from a library of
audio tracks in the controller memory, downloading an audio track
for storage in the controller memory, and recording an audio track
for storage in the controller memory.
[0087] (xx). The method of (xix), further including: [0088]
selecting one of the audio tracks stored in the library of the
controller memory and communicating the audio track to the toy
object via the controller transceiver; [0089] receiving the
communicated audio track at the toy object; [0090] storing the
communicated audio track in an object memory of the toy object; and
[0091] playing the audio track via a speaker secured to the toy
object upon receipt of an activation of an object input situated on
a body portion of the toy object.
[0092] (xxi). An apparatus comprising: [0093] a controller
comprising: [0094] a controller processor; [0095] a controller
memory device; [0096] a communication device; [0097] a controller
display screen; [0098] one or more controller transceivers; [0099]
a first control program resident in the controller memory device
and operable by the controller processor; and [0100] an interactive
object comprising: [0101] an object control circuit; [0102] an
object processor; [0103] an object memory device; [0104] one or more
object transceivers; and [0105] a second control program resident
in the object memory device and operable by the object processor,
[0106] wherein the controller is capable of communicating with the
interactive object, via the transceivers, to receive and transmit
at least one of commands, inputs, and outputs, therebetween.
[0107] (xxii). The apparatus of (xxi), wherein the controller is
paired with the interactive object using a wireless connection.
[0108] (xxiii). The apparatus of any one of (xxi)-(xxii), wherein
the interactive object is configured to be operated without a
continuous connection to the controller.
[0109] While the principles of the method and apparatus for
interactive play have been described above in connection with a
specific apparatus, it is to be clearly understood that
this description is made only by way of example and not as a
limitation on the scope of the method and apparatus for interactive
play. It is specifically intended that the method and apparatus for
interactive play not be limited to the embodiments and
illustrations contained herein, but include modified forms of those
embodiments, including portions of the embodiments and combinations
of elements of different embodiments as come within the scope of
the following claims. In addition, the various methods of use
described herein can include additional steps not described herein
or can omit steps described herein. Further, the various steps can
be performed in a different order than described herein.
* * * * *