U.S. patent application number 11/339572, for a reflected light controlled vehicle, was filed with the patent office on 2006-01-26 and published on 2007-07-26.
Invention is credited to Yehiel Avraham Olti, Gyora Mihaly Pal Benedek, Shai Seger.
Application Number: 20070173171 / 11/339572
Family ID: 38286156
Publication Date: 2007-07-26
United States Patent Application: 20070173171
Kind Code: A1
Pal Benedek; Gyora Mihaly; et al.
July 26, 2007
Reflected light controlled vehicle
Abstract
The remotely controlled toy vehicle of the present invention is
configured to respond to a tracking signal of a narrow beam of
non-visible light that is projected onto the surface of the ground
in proximity of the toy vehicle. The sensors mounted on the toy
vehicle are configured to receive, and respond to, the light energy
of the beam that is reflected off the surface of the ground. The
controller, which is the source of the tracking signal, is a
handheld component configured to project the beam of non-visible
light to a location as desired by the user. A beam of visible light
is also projected as an indicator of the location of the tracking
spot. Operation of the toy vehicle is controlled by moving the
tracking spot. Preferably, the control circuitry is configured such
that the toy vehicle follows the tracking spot. Alternatively, the
remotely controlled toy vehicle of the present invention is
configured to respond to an auditory tracking signal.
Inventors: Pal Benedek; Gyora Mihaly (Haifa, IL); Seger; Shai (Haifa, IL); Olti; Yehiel Avraham (Karmiel, IL)
Correspondence Address: DR. MARK FRIEDMAN LTD.; C/o Bill Polkinghorn, 9003 Florin Way, Upper Marlboro, MD 20772, US
Family ID: 38286156
Appl. No.: 11/339572
Filed: January 26, 2006
Current U.S. Class: 446/175
Current CPC Class: A63H 30/04 20130101; A63H 33/22 20130101; A63H 17/36 20130101
Class at Publication: 446/175
International Class: A63H 30/00 20060101 A63H030/00
Claims
1. A method for remotely guiding a toy vehicle, the method
comprising: (a) providing a first light source configured to
project at least a first narrow beam tracking signal; (b)
projecting said at least a first tracking signal so as to provide a
tracking spot on a surface; (c) providing a motorized toy vehicle
configured with at least one sensor responsive to a position of
said at least a first tracking spot; wherein output from said
sensor affects at least one operational feature of said motorized
toy vehicle, and said operational feature is effective to change a
location of the toy vehicle; and (d) altering said position of said
at least a first tracking spot so as to alter said at least one
operational feature and thereby guide the toy vehicle to a change
of location.
2. The method of claim 1, wherein said first tracking signal is
implemented as a beam of light within the spectrum of non-visible
light.
3. The method of claim 2, wherein said first tracking signal is
implemented as infrared light.
4. The method of claim 2, further including providing at least a
second light source providing a beam of visible light, configured
to indicate said position of said tracking spot.
5. The method of claim 1, wherein said light source is implemented
in a handheld component operated by a user.
6. The method of claim 1, wherein said light source is implemented
so as to be wearable by a user.
7. The method of claim 1, wherein said tracking spot is projected
onto said surface in proximity to said motorized toy vehicle.
8. The method of claim 1, further including providing a control
circuit in electronic communication with said at least one sensor,
said control circuit configured to receive output from said at
least one sensor and to control said at least one operational
feature.
9. The method of claim 8, wherein said at least one sensor is
implemented as a plurality of sensors and said at least one
operational feature is implemented as a plurality of operational
features.
10. The method of claim 9, wherein said plurality of operational
features are implemented so as to include at least locomotion and
directional steering of said motorized toy vehicle and said control
circuit is in electronic communication with at least a drive motor
and a steering mechanism.
11. A light guided toy vehicle comprising: (a) a first light source
configured to project at least a first narrow beam tracking signal
so as to provide a tracking spot on a surface; and (b) a motorized
toy vehicle configured with at least one sensor responsive to a
position of said at least a first tracking spot; wherein output
from said sensor affects at least one operational feature of said
motorized toy vehicle, and said operational feature is effective to
change a location of the toy vehicle.
12. The light guided toy vehicle of claim 11, wherein said first
tracking signal is configured as a beam of light within the
spectrum of non-visible light.
13. The light guided toy vehicle of claim 12, wherein said first
tracking signal is configured as infrared light.
14. The light guided toy vehicle of claim 13, further including at
least a second light source configured to project a beam of visible
light, so as to indicate said position of said tracking spot.
15. The light guided toy vehicle of claim 11, wherein said light
source is configured in a handheld component operated by a
user.
16. The light guided toy vehicle of claim 11, wherein said at least
one sensor is configured to respond to said tracking spot when said
tracking spot is projected onto said surface in proximity to said
motorized toy vehicle.
17. The light guided toy vehicle of claim 11, further including a
control circuit in electronic communication with said at least one
sensor, said control circuit configured to receive output from said
at least one sensor and control said at least one operational
feature.
18. The light guided toy vehicle of claim 17, wherein said at least
one sensor is configured as a plurality of sensors and said at
least one operational feature is configured as a plurality of
operational features.
19. The light guided toy vehicle of claim 18, wherein said
plurality of operational features include locomotion and
directional steering of said motorized toy vehicle and said control
circuit is in electronic communication with at least a drive motor
and a steering mechanism.
20. A method for remotely guiding a toy vehicle, the method
comprising: (a) providing a source of at least a first auditory
tracking signal configured to emit said at least a first auditory
tracking signal; (b) emitting said at least a first auditory
tracking signal; (c) providing a motorized toy vehicle configured
with at least one sensor responsive to a position of said at least
a first auditory tracking signal; wherein output from said sensor
affects at least one operational feature of said motorized toy
vehicle, and said operational feature is effective to change a
location of the toy vehicle; and (d) altering said position of said
at least a first auditory tracking signal so as to alter said at
least one operational feature and thereby guide the toy vehicle to
a change of location.
21. The method of claim 20, wherein said source of said at least a
first auditory tracking signal is implemented as an electronic
device.
22. The method of claim 20, wherein said at least a first auditory
tracking signal is implemented so as to be at a pre-determined
frequency and said at least one sensor is configured to respond
substantially solely to said pre-determined frequency.
23. The method of claim 20, wherein said at least one sensor is
implemented so as to respond to a loudest auditory signal
received.
24. The method of claim 20, wherein said at least a first auditory
tracking signal is implemented as at least a first voice command.
Description
FIELD AND BACKGROUND OF THE INVENTION
[0001] The present invention relates to remote controlled toy
vehicles and, in particular, it concerns a remotely controlled toy
vehicle that is responsive to a reflected tracking signal.
[0002] Remotely controlled toy vehicles that are responsive to a
tracking signal are known in the art. One such device is disclosed
in U.S. Pat. No. 6,780,077 to Baumgartener et al. The Baumgartener
et al. toy includes a conventionally remotely controlled master toy
vehicle with a transmitter configured to broadcast an IR tracking
signal, and a slave toy vehicle with at least two directional IR
receivers configured to receive a direct signal from the master toy
vehicle in order to follow or evade the master toy vehicle.
[0003] Remotely controlling a toy vehicle with a directly received
tracking signal drastically limits the range of movement of the toy
vehicle, since the source of the tracking signal must be moved from one
location to another in order to direct the movement of the tracking
toy vehicle.
[0004] There is therefore a need for a remotely controlled toy
vehicle that is responsive to a reflected tracking signal.
SUMMARY OF THE INVENTION
[0005] The present invention is a remotely controlled toy vehicle
that is responsive to a reflected tracking signal.
[0006] According to the teachings of the present invention there is
provided, a method for remotely guiding a toy vehicle, the method
comprising: a) providing a first light source configured to project
at least a first narrow beam tracking signal; b) projecting said at
least a first tracking signal so as to provide a tracking spot on a
surface; c) providing a motorized toy vehicle configured with at
least one sensor responsive to a position of said at least a first
tracking spot; wherein output from said sensor affects at least one
operational feature of said motorized toy vehicle, and said
operational feature is effective to change a location of the toy
vehicle; and d) altering said position of said at least a first
tracking spot so as to alter said at least one operational feature
and thereby guide the toy vehicle to a change of location.
[0007] According to a further teaching of the present invention,
said first tracking signal is implemented as a beam of light within
the spectrum of non-visible light.
[0008] According to a further teaching of the present invention,
said first tracking signal is implemented as infrared light.
[0009] According to a further teaching of the present invention,
there is also provided at least a second light source providing a
beam of visible light, configured to indicate said position of said
tracking spot.
[0010] According to a further teaching of the present invention,
said light source is implemented in a handheld component operated
by a user.
[0011] According to a further teaching of the present invention,
said light source is implemented so as to be wearable by a
user.
[0012] According to a further teaching of the present invention,
said tracking spot is projected onto said surface in proximity to
said motorized toy vehicle.
[0013] According to a further teaching of the present invention,
there is also provided a control circuit in electronic
communication with said at least one sensor, said control circuit
configured to receive output from said at least one sensor and to
control said at least one operational feature.
[0014] According to a further teaching of the present invention,
said at least one sensor is implemented as a plurality of sensors
and said at least one operational feature is implemented as a
plurality of operational features.
[0015] According to a further teaching of the present invention,
said plurality of operational features are implemented so as to
include at least locomotion and directional steering of said
motorized toy vehicle and said control circuit is in electronic
communication with at least a drive motor and a steering
mechanism.
[0016] There is also provided according to the teachings of the
present invention, a light guided toy vehicle comprising: a) a
first light source configured to project at least a first narrow
beam tracking signal so as to provide a tracking spot on a surface;
and b) a motorized toy vehicle configured with at least one sensor
responsive to a position of said at least a first tracking spot;
wherein output from said sensor affects at least one operational
feature of said motorized toy vehicle, and said operational feature
is effective to change a location of the toy vehicle.
[0017] According to a further teaching of the present invention,
said first tracking signal is configured as a beam of light within
the spectrum of non-visible light.
[0018] According to a further teaching of the present invention,
said first tracking signal is configured as infrared light.
[0019] According to a further teaching of the present invention,
there is also provided, at least a second light source configured
to project a beam of visible light, so as to indicate said position
of said tracking spot.
[0020] According to a further teaching of the present invention,
said light source is configured in a handheld component operated by
a user.
[0021] According to a further teaching of the present invention,
said at least one sensor is configured to respond to said tracking
spot when said tracking spot is projected onto said surface in
proximity to said motorized toy vehicle.
[0022] According to a further teaching of the present invention,
there is also provided, a control circuit in electronic
communication with said at least one sensor, said control circuit
configured to receive output from said at least one sensor and
control said at least one operational feature.
[0023] According to a further teaching of the present invention,
said at least one sensor is configured as a plurality of sensors
and said at least one operational feature is configured as a
plurality of operational features.
[0024] According to a further teaching of the present invention,
said plurality of operational features include locomotion and
directional steering of said motorized toy vehicle and said control
circuit is in electronic communication with at least a drive motor
and a steering mechanism.
[0025] There is also provided according to the teachings of the
present invention, a method for remotely guiding a toy vehicle, the
method comprising: a) providing a source of at least a first
auditory tracking signal configured to emit said at least a first
auditory tracking signal; b) emitting said at least a first
auditory tracking signal; c) providing a motorized toy vehicle
configured with at least one sensor responsive to a position of
said at least a first auditory tracking signal; wherein output from
said sensor affects at least one operational feature of said
motorized toy vehicle, and said operational feature is effective to
change a location of the toy vehicle; and d) altering said position
of said at least a first auditory tracking signal so as to alter
said at least one operational feature and thereby guide the toy
vehicle to a change of location.
[0026] According to a further teaching of the present invention,
said source of said at least a first auditory tracking signal is
implemented as an electronic device.
[0027] According to a further teaching of the present invention,
said at least a first auditory tracking signal is implemented so as to
be at a pre-determined frequency and said at least one sensor is
configured to respond substantially solely to said pre-determined
frequency.
[0028] According to a further teaching of the present invention,
said at least one sensor is implemented so as to respond to a
loudest auditory signal received.
[0029] According to a further teaching of the present invention,
said at least a first auditory tracking signal is implemented as at
least a first voice command.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The invention is herein described, by way of example only,
with reference to the accompanying drawings, wherein:
[0031] FIG. 1 is a schematic top elevation of a first preferred
embodiment of a toy vehicle constructed and operative according to
the teachings of the present invention, shown here with the body
covering removed;
[0032] FIG. 2 is a schematic block drawing of the primary elements
of a preferred embodiment of a handheld controller constructed and
operative according to the teachings of the present invention;
[0033] FIG. 3A is a schematic of a single sensor control scheme
according to the teachings of the present invention;
[0034] FIG. 3B is a schematic of a dual sensor control scheme
according to the teachings of the present invention;
[0035] FIG. 3C is a schematic of a triple sensor control scheme
according to the teachings of the present invention;
[0036] FIG. 4 is a photograph of a prototype of a second preferred
embodiment of a vehicle constructed and operative according to the
teachings of the present invention, shown here with schematic
representations of the fields of view of each of the sensors;
and
[0037] FIG. 5 is an exploded view of a preferred embodiment of a
handheld controller constructed and operative according to the
teachings of the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0038] The present invention is a remotely controlled toy vehicle
that is responsive to a reflected tracking signal.
[0039] The principles and operation of a remotely controlled toy
vehicle that is responsive to a reflected tracking signal according
to the present invention may be better understood with reference to
the drawings and the accompanying description.
[0040] By way of introduction, the remotely controlled toy vehicle
of the present invention is configured to respond to a reflected
tracking signal. Preferably, a tracking signal of a narrow beam of
non-visible light is projected onto the surface of the ground,
creating a tracking spot in proximity of the toy vehicle. The
sensors mounted on the toy vehicle are configured to receive, and
respond to, the reflected light energy of the beam emanating from
the tracking spot. The output from the sensors is used to control
the operational features of the toy vehicle such as, but not
limited to, locomotion and directional steering. Optionally, other
sensors may be deployed on the toy vehicle that are configured to
effect other operational features such as, but not limited to,
turning on and off lights and sound effects, opening doors, and
firing "weapons".
[0041] In a preferred embodiment of the present invention, the
controller, which is the source of the tracking signal, is a
handheld component configured to project the beam of non-visible
light to a location as desired by the user. Optionally, the
controller also projects a beam of visible light along a path
substantially parallel to the beam of non-visible light as an
indicator of the location of the tracking spot. Optionally, the
controller may be configured so as to enable it to be worn such as,
but not limited to, clipped on a belt, sticking out from a pocket,
tucked in a head or hat band, or attached to a wrist or ankle band,
in such a way that a tracking spot is projected onto the ground in
order to provide hands free control of the toy vehicle.
Alternatively, a separate wearable controller may be provided with
the toy.
[0042] Operation of the toy vehicle is controlled by moving the
tracking spot. Preferably, the control circuitry is configured such
that the toy vehicle follows the tracking spot, as will be
discussed in more detail with regard to the Figures. Optionally,
the control circuitry may be configured such that the toy vehicle
evades the tracking spot.
[0043] Referring now to the drawings, it should be noted that
directional terms such as left, right, forward and reverse are used
with regard to the drawings being discussed and are not intended as
limitations to the principles of the present invention.
[0044] FIG. 1 illustrates the chassis 2 of a preferred embodiment
of the toy vehicle of the present invention. In this embodiment,
the operational feature of locomotion is effected by the rear drive
wheels 4, which are driven by the drive motor 6. The operational
feature of directional steering is effected by the steerable front
wheels 8 by means of the steering motor 20 and associated steering
mechanism, as in conventional remote control vehicles. It is within
the scope of the present invention, however, to provide an
embodiment implementing a two drive-motor, skid steering, mode of
locomotion and directional steering.
[0045] The front 10 and rear 12 sensor arrangements of this
embodiment include three sensors each. This sensor arrangement will
be discussed in greater detail with regard to FIG. 3C. The sensors
are configured such that their receptive field of view is limited
to the ground surface in proximity of the toy vehicle as will be
discussed with regard to FIGS. 3A-3C.
[0046] The control circuit 16 is in electronic communication with
the sensors 10 and 12, the drive motor 6 and the steering motor 20.
Output signals from the sensors are received by the control circuit
16, which in turn controls the operational features of the toy
vehicle such as, but not limited to, locomotion and steering by
operating the drive motor 6 and the steering motor 20.
[0047] FIG. 2 illustrates the basic components of the handheld
controller 30 of the present invention, which includes a battery
32, an "on"/"off" switch 34, a circuit board 36, a source of
non-visible light 38, preferably a high-intensity infrared LED, a
source of visible light 40, preferably a high-intensity LED or laser,
and lenses 42. The exploded view of FIG. 5 illustrates these
components deployed in an exemplary handheld case composed of case
segments 44a, 44b and 44c. It should be noted that the controller
case may be of substantially any design such as, but not limited
to, being similar to a handgun, laser pointer or pen light. It will
be appreciated that the visible light source 40 and the non-visible
light source 38 may be activated independently of each other. That
is, each may have its own on/off switch.
[0048] FIGS. 3A-3C illustrate various sensor arrangements and
schemes for vehicle operation for each arrangement. Each of the
sensors is deployed in an enclosure that limits its field of
sensing "view", that is, the area within which stimuli are detected.
It will be appreciated that the area of the field of view of the
sensor varies with the requirements of each vehicle and the
operational scheme chosen. It should be noted that providing
sensors with an adjustable field of view is within the scope of the
present invention. It will be understood that providing sensor
enclosures that also allow the sensors to detect the signal emitted
by the controller when the controller is pointed directly at the
vehicle is within the scope of the present invention. Such an
arrangement provides the option of carrying or wearing a controller
in such a manner that the vehicle will follow the user. As
mentioned above, this following option may also be achieved by
wearing a controller configured to project a tracking spot on the
ground near the user as the user walks along.
[0049] FIG. 3A illustrates a single sensor arrangement. An example
of an operational scheme for such an arrangement is simple forward
locomotion. When no tracking spot is detected by the sensor 50, the
drive motor is off. When the sensor 50 detects a tracking spot
within region 52, the drive motor is activated and the vehicle will
move straight forward.
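The single-sensor scheme reduces to a one-line rule; the following sketch is illustrative only (the name and command strings are not from the patent):

```python
def single_sensor_drive(spot_in_region: bool) -> str:
    """FIG. 3A scheme: the drive motor runs straight forward only
    while the lone sensor sees the tracking spot in its region 52;
    otherwise the motor is off."""
    return "forward" if spot_in_region else "off"
```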
[0050] FIG. 3B illustrates a dual sensor arrangement. Such an
arrangement affords both locomotion and directional steering. When no
tracking spot is detected by the sensors 60 and 62, the drive motor
is off. When both sensors 60 and 62 detect a tracking spot, i.e.
the tracking spot is projected into region 68, only the drive motor
is activated. When a tracking spot is detected by only one of
sensors 60 and 62, i.e. the tracking spot is projected into either
region 64 or 66, both the drive motor and the steering motor are
activated. Detection of the tracking spot by sensor 60 in region 64
will cause the vehicle to move forward while turning to the left.
Similarly, detection of the tracking spot by sensor 62 in region 66
will cause the vehicle to move forward while turning to the
right.
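The dual-sensor scheme just described can be sketched as a simple decision rule. This is an illustrative sketch only (the patent specifies no implementation); the function name and command strings are assumptions, not part of the disclosure:

```python
def dual_sensor_commands(left_detect: bool, right_detect: bool):
    """FIG. 3B scheme as described: both sensors see the spot -> drive
    straight forward; only one sensor sees it -> drive forward while
    turning toward that sensor's side; neither -> motors off.
    Returns an illustrative (drive, steer) command pair."""
    if left_detect and right_detect:
        return ("forward", "straight")   # spot in the shared region 68
    if left_detect:
        return ("forward", "left")       # spot in region 64
    if right_detect:
        return ("forward", "right")      # spot in region 66
    return ("off", "straight")           # no spot detected
```

A real control circuit would apply this rule continuously as the sensors are sampled, so the vehicle steers toward wherever the spot currently falls.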
[0051] The triple sensor arrangement illustrated in FIG. 3C also
affords both locomotion and directional steering. When no tracking
spot is detected by the sensors 70, 72 and 74, the drive motor is
off. When sensor 70 detects a tracking spot, i.e. the tracking spot
is projected into region 78, only the drive motor is activated. When
a tracking spot is detected by sensor 72, i.e. the tracking spot is
projected into region 76, both the drive motor and the steering
motor are activated, causing the vehicle to move forward while
turning to the left. Similarly, detection of the tracking spot by
sensor 74 in region 80 will cause the vehicle to move forward while
turning to the right. Therefore, the toy vehicle of
FIG. 1, which is configured with triple sensor arrangements on both
the front 10 and rear 12, is afforded full six-direction operation,
forward straight, forward left and forward right, and reverse
straight, reverse left and reverse right.
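The six-direction behavior of the front-and-rear triple-sensor vehicle can be summarized in one mapping. This is a sketch under the stated assumptions (each triple-sensor array reports left/center/right detections; the command strings are illustrative, not part of the disclosure):

```python
def six_direction_command(front, rear):
    """Map detections from the two triple-sensor arrangements to a
    (drive, steer) command. front and rear are (left, center, right)
    boolean triples; the front array takes priority, and a side
    detection steers the vehicle toward that side."""
    for (left, center, right), drive in ((front, "forward"), (rear, "reverse")):
        if left:
            return (drive, "left")
        if right:
            return (drive, "right")
        if center:
            return (drive, "straight")
    return ("off", "straight")  # no tracking spot seen by either array
```

For example, a spot seen only by the rear-right sensor yields reverse-right, giving the six directions enumerated above.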
[0052] The second preferred embodiment 90 of the vehicle of the
present invention illustrated in FIG. 4 is configured with a triple
sensor arrangement in the front and a single sensor arrangement in
the rear. Therefore, it is afforded four-direction operation,
forward straight, forward left and forward right, and reverse
straight. The sensors and regions are numbered according to the
corresponding sensor schemes listed above with regard to FIGS. 3A
and 3C.
[0053] A variant embodiment of the present invention includes the
use of sound detection sensors rather than light sensors. In such
an embodiment, sensor arrangements and schemes for vehicle
operation substantially as described above with regard to FIGS.
3A-3C are provided in order to track the source of an auditory
signal. The auditory signal may be provided by a controller device
configured to emit an auditory tracking signal of a pre-determined
frequency wherein the sensors are configured to respond
substantially only to signals at the pre-set frequency. The
frequency need not be within the normal human auditory range. Such
a controller may be, for example, an electronic device, a whistle,
or substantially any other sound producing device. Alternatively,
the sensors may be configured to respond to the loudest auditory
signal received such as, but not limited to, a human voice,
clapping hands, and a ringing bell.
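As an illustrative sketch only (the patent specifies no algorithm), the loudest-signal variant can be modeled as selecting the sensor with the strongest reading above some floor; the function name and threshold value are assumptions:

```python
def loudest_sensor_index(levels, threshold=0.1):
    """Return the index of the auditory sensor reporting the strongest
    level, or None if no reading exceeds the threshold, so that
    ambient noise alone does not move the vehicle."""
    best = max(range(len(levels)), key=lambda i: levels[i])
    return best if levels[best] >= threshold else None
```

The selected index would then feed the same steering schemes described for FIGS. 3A-3C, with the vehicle turning toward the dominant sound.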
[0054] Yet another variant embodiment of the vehicle of the present
invention configured to respond to auditory signals is configured
to respond to voice commands, which may include, for example,
calling the toy by name. This may be implemented in conjunction
with the sensor arrangements described with regard to FIGS. 3A-3C,
such that the vehicle tracks the position of the source of the
voice. Alternatively, the vehicle may be configured to respond to
voice commands such as, but not limited to, go, turn right, go
straight, turn left, stop, and back up. It should be noted that the
inclusion of voice recognition software in such an embodiment is
within the scope of the present invention.
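The voice-command variant can be sketched as a lookup from recognized commands to drive/steer actions. The mapping below is hypothetical (the patent names example commands but leaves the command set and recognition method open):

```python
# Hypothetical mapping of the example voice commands to (drive, steer)
# actions; any real implementation would pair this with a speech
# recognizer, which the patent leaves unspecified.
VOICE_COMMANDS = {
    "go": ("forward", "straight"),
    "go straight": ("forward", "straight"),
    "turn left": ("forward", "left"),
    "turn right": ("forward", "right"),
    "stop": ("off", "straight"),
    "back up": ("reverse", "straight"),
}

def dispatch_voice_command(command: str):
    """Normalize a recognized command and look up its action;
    unrecognized commands leave the vehicle stopped."""
    return VOICE_COMMANDS.get(command.lower().strip(), ("off", "straight"))
```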
[0055] It will be appreciated that the above descriptions are
intended only to serve as examples and that many other embodiments
are possible within the spirit and the scope of the present
invention.
* * * * *