U.S. patent application number 10/001362 for a hybrid vehicle operations simulator was published by the patent office on 2002-05-02. The invention is credited to Thomas B. Sheridan.
Application Number: 20020052724 (Serial No. 10/001362)
Family ID: 26668927
Publication Date: 2002-05-02

United States Patent Application 20020052724
Kind Code: A1
Sheridan, Thomas B.
May 2, 2002
Hybrid vehicle operations simulator
Abstract
The invention uses an actual mobile vehicle whose operation is
to be simulated, combined with computer-based image generation
devices, and velocity, acceleration, and/or position measurement
tools to improve the simulated operation of the vehicle, including
perception of and response to hazards. Using the actual vehicle
whose operation is to be simulated improves operator vestibular
cues, visual cues, and motion fidelity, thereby producing a safer,
less expensive means to produce the simulation.
Inventors: Sheridan, Thomas B. (West Newton, MA)
Correspondence Address: Ronald J. Kransdorf, Wolf, Greenfield & Sacks, P.C., Federal Reserve Plaza, 600 Atlantic Avenue, Boston, MA 02210, US
Family ID: 26668927
Appl. No.: 10/001362
Filed: October 23, 2001
Related U.S. Patent Documents
Application Number: 60242614
Filing Date: Oct 23, 2000
Current U.S. Class: 703/8
Current CPC Class: G09B 9/05 20130101
Class at Publication: 703/8
International Class: G06G 007/48
Claims
What is claimed is:
1. A vehicle operation simulator comprising: a mobile vehicle
operable in a natural environment and having at least one vehicle
control; a scene generator; a scene display in communication with
said scene generator and viewable by a vehicle operator, an
environment view being presented on said scene display which is
created at least in part by said scene generator; and wherein said
mobile vehicle carries the operator and is controlled by the
operator in accordance with said environment view, said mobile
vehicle responding to actuation of said at least one vehicle
control and said environment view responding to at least one of
operation of said at least one vehicle control, operator movement,
and vehicle movement.
2. The vehicle operation simulator of claim 1 including at least
one of an inertial acceleration measurement unit, a gyroscopic
measurement unit, and a pendulum, responding to motion of said
mobile vehicle in up to six degrees of freedom to provide input to
said scene generator.
3. The vehicle operation simulator of claim 1 including at least
one of an inertial acceleration measurement unit, a gyroscopic
measurement unit, and a pendulum, responding to motion of said
operator's head in up to six degrees of freedom to provide input to
said scene generator.
4. The vehicle operation simulator of claim 1 including a
measurement unit responding to a velocity of said mobile vehicle in
up to six degrees of freedom to provide input to said scene
generator.
5. The vehicle operation simulator of claim 1 including at least
one of a global positioning system unit and a laser triangulation
unit, responding to changes in position of said mobile vehicle in
up to six degrees of freedom to provide input to said scene
generator.
6. The vehicle operation simulator of claim 5 including at least
one of an electromagnetic arm, a mechanical arm with potentiometer,
and magnetic sensors, responding to changes in position of said
operator's head in relation to said mobile vehicle in up to six
degrees of freedom to provide input to said scene generator.
7. The vehicle operation simulator of claim 1 including a
computer-based mathematical model of activity of said vehicle,
responding to at least one of said vehicle control and movement of
said vehicle, to provide data on at least the position of said
vehicle in up to six degrees of freedom as input to said scene
generator.
8. The vehicle operation simulator of claim 1 wherein said
environment view is wholly comprised of elements from said scene
generator.
9. The vehicle operation simulator of claim 8 wherein said scene
display is affixed to at least one of said mobile vehicle and a
head-mounted display worn by the vehicle operator.
10. The vehicle operation simulator of claim 1 wherein said
environment view is a composite of at least one element from said
scene generator, and at least one element from the natural
environment.
11. The vehicle operation simulator of claim 10 wherein said at
least one element from the natural environment is captured with a
video camera and input to said scene generator.
12. The vehicle operation simulator of claim 10 wherein said at
least one element from the natural environment is visible to said
operator through a partially transparent viewing screen.
13. The vehicle operation simulator of claim 12 wherein the viewing
screen is affixed to at least one of the mobile vehicle and a
head-mounted display worn by the vehicle operator.
14. The vehicle operation simulator of claim 10 wherein said scene
generator includes a mechanism maintaining equivalent light
brightness between at least one element from said scene display and
the natural environment.
15. The vehicle operation simulator of claim 1 wherein the at least
one element presented in said environment view differs in a
controlled fashion from the actual behavior of said mobile
vehicle.
16. The vehicle operation simulator of claim 15 wherein the vehicle
responds to operator actuation of vehicle control in accordance
with movement represented in said environment view rather than the
movement of said vehicle.
17. The vehicle operation simulator of claim 1 including one or
more sensors responding to movement of said operator within said
mobile vehicle to provide input to said scene generator and said
scene display.
18. The vehicle operation simulator of claim 1 including secondary
vehicle control for said mobile vehicle, said secondary vehicle
control to be actuated by a second operator.
19. The vehicle operation simulator of claim 18 wherein said mobile
vehicle responds exclusively to said secondary vehicle control when
said secondary vehicle control is actuated.
20. The vehicle operation simulator of claim 18 wherein said mobile
vehicle selectively responds to both said secondary vehicle control
and said vehicle control when said secondary vehicle control is
actuated.
21. The vehicle operation simulator of claim 1 including
parameter-constraining apparatus limiting at least one of the
movement of said mobile vehicle and the actuation of said vehicle
control.
22. The vehicle operation simulator of claim 1 wherein said scene
display includes at least one of a mirror, a flat opaque viewing
screen, a curved opaque viewing screen, an electronic display, and
a partially transparent half-silvered mirror.
23. A method for simulated operation of a vehicle in a natural
environment including the steps: (a) generating an environment
view; (b) presenting the environment view to an operator carried by
a mobile vehicle, said mobile vehicle operating in a natural
environment; (c) the operator actuating controls for said mobile
vehicle, movement of said mobile vehicle responding to said
actuation; and (d) altering said environment view in response to at
least one of vehicle movement, operator actuation of controls,
operator head movement, and operator movement within said
vehicle.
24. The method of claim 23 wherein the elements presented in said
environment view differ in a controlled fashion from those
commensurate with the actual behavior of said mobile vehicle.
25. The method of claim 23 wherein said environment view comprises
at least one element rendered by said scene generator and at least
one element from the natural environment.
26. The method of claim 25 wherein step (a) includes maintaining
equivalent light brightness between at least one element from said
scene display and the natural environment.
27. The method of claim 23 wherein said environment view is wholly
comprised of elements from said scene generator.
28. The method of claim 23 wherein said vehicle has secondary
vehicle control, and including the step of said secondary vehicle
control being actuated by a second operator under selected
conditions.
29. The method of claim 28 wherein said mobile vehicle responds
exclusively to said secondary vehicle control when said secondary
vehicle control is actuated.
30. The method of claim 28 wherein said mobile vehicle selectively
responds to both said secondary vehicle control and said vehicle
control when said secondary vehicle control is actuated.
31. The method of claim 23 wherein said environment view is
presented on at least one of a scene display affixed to said mobile
vehicle and a head-mounted display worn by the vehicle
operator.
32. The method of claim 23 wherein step (c) includes limiting at
least one of the movement of said mobile vehicle and actuation of
said vehicle control.
Description
FIELD OF THE INVENTION
[0001] This invention relates to vehicle simulation, and more
particularly to methods and apparatus for simulating the human
operation of a moving vehicle under selected operating
conditions.
BACKGROUND OF THE INVENTION
[0002] Since computer technology first made them
possible, vehicle simulators have been used for a number of
purposes, including research, training, and vehicle engineering.
Simulators have become increasingly prevalent and useful for
reproducing the experience of operating aircraft, motor vehicles,
trains, spacecraft, and other vehicles. Aviation simulators have
become particularly prevalent, with nearly every airline now using
simulators for training and research--the first time a commercial
pilot flies a new aircraft, it is often filled with passengers. The
military services use simulators extensively for training personnel
in the operation of ships, tanks, aircraft, and other vehicles.
[0003] Despite their varied usage, simulators, especially those
providing the operator(s) with motion cues, have primarily remained
the tools of large organizations with significant financial
resources. The use of motor vehicle (i.e., driving) simulators has
not yet progressed beyond manufacturers, suppliers, government
agencies (including the military), and academic institutions,
largely because of their cost. However, driving simulators have a
number of valuable safety applications. First, they enable research
into driver response behavior. Highways are becoming populated by
more vehicles, moving at greater speeds, with a greater portion of
drivers being older adults with reduced sensory and response
capabilities. Cellular telephones, navigation systems, and other
devices (often developed by third parties and added to vehicles
without suitable integration with manufacturer-supplied devices)
place increased demands on the driver's attention, and drugs
continually arrive on the market that affect driver alertness.
These factors mandate a better understanding of driver limitations,
particularly those of older drivers. Researching driver behavior in
emergencies by examination of real accidents has limited yield,
because every accident is unique to some extent, determining
causation is difficult, and controlled experimental research is
inherently not possible for real accidents. Driving simulators
could provide data on the driver's response to emergency situations
without exposure to actual risk.
[0004] Second, simulators can provide an improved means for
training and evaluating drivers. Most driver training is conducted
either in classrooms or in automobiles in normal traffic, which
rarely exposes trainees to unexpected hazards. Devices that would
allow trainees to experience potential collision situations,
visibility hazards, or other unusual driving situations, without
actual exposure to risk would provide useful training.
[0005] Third, simulators provide manufacturers and suppliers useful
data from which to further develop their products. Vehicle
manufacturers, suppliers and car/truck fleet owners usually perform
developmental tests in actual vehicles, but this is limited to
experiences not involving collision or other hazards. The use of
simulators to perform these functions is costly, particularly for
programming and measuring motion and configuring the simulator to
represent the appropriate vehicle, limiting the usefulness of these
simulators for most research applications.
[0006] Simulators are primarily tasked with recreating the sensory
cues their "real-world experience" counterparts offer. Most state
of the art simulators do a credible job recreating visual and
audible stimuli, but only the most expensive provide credible cues
for the vestibular senses (controlled by semicircular canals in the
inner ear, which sense rotary acceleration, and otolith organs,
which sense translational acceleration) and the muscle and joint
sensors of motion. Motor vehicle simulators, in particular,
struggle to provide a faithful motion representation without
exorbitant cost. Because of the cost of providing such
functionality, relatively few driving simulators even attempt to
provide motion cues, despite the fact that tests reveal subjects
tend to over-steer or experience vertigo because of a mismatch
between visual and motion cues. Those that provide motion cues
usually do so with an expensive hydraulic simulator base, but very
few motion-base driving simulators are in use, and even these lack
the ability to accurately convey translational acceleration. Some
state of the art simulators promise to rectify this problem, but
this capability typically entails a significant cost. Clearly,
simulators that represent motion cues faithfully are not
cost-effective research or training tools for most
applications.
[0007] The same financial barriers that prevent more widespread use
of driving simulators have also prevented the development of
simulators for the operation of wheelchairs, skis, snowboards, and
many other vehicles which move, sometimes at high speeds, when in
actual operation. Clearly the availability of more cost-effective
simulators could enable better research, training, and engineering,
resulting in safer and more user-friendly vehicles, both those
indicated above and others, and in better, safer operation
thereof.
SUMMARY OF THE INVENTION
[0008] The present invention overcomes the cost and motion fidelity
issues of other vehicle operation simulators with a novel
combination of existing devices. In order to generate realistic
motion cues, the operator is carried by and operates an actual
vehicle. Vehicle examples include but are not limited to an
automobile, motorcycle, aircraft, wheelchair, bicycle, skis, and a
ship. According to one aspect of the invention, the vehicle is
operated in a "natural environment" using its normal vehicle
control, and in accordance with visual and audible cues provided by
a virtual reality device. The natural environment may be an open
space (e.g., a large field or a parking lot), a track, an unused or
seldom used roadway, snow-covered mountain slope, air space, or
other environment appropriate for the mobile vehicle being used.
The virtual reality device takes advantage of recent advances in
computer processing and image generation speeds to create a
realistic vehicle operation environment, including hazards which
are not present in the actual/natural environment. Thus the
invention provides realistic motion cues at reasonable cost,
thereby creating training, research, and product development
opportunities not previously possible.
[0009] In its most preferred embodiment, the present invention is
operated in a large, open area free of obstacles and hazards. Since
its intent is to provide a realistic vehicle operation experience,
these areas provide the greatest opportunity to simulate all types
of uninterrupted operation experiences. For example, if the
invention were to simulate the operation of an automobile, it would
be difficult to simulate driving on a highway for any useful period
if the invention were used in an urban area, or even in a small
field. However, for certain uses, it is envisioned that the
invention may be operated on certain less-trafficked roads or
streets.
[0010] The present invention provides both apparatus for a vehicle
operation simulator and a method for simulating vehicle operation.
The apparatus includes a mobile vehicle having vehicle controls, a
scene generator, and a scene display that receives input from the
scene generator and presents at least a partial virtual environment
view to the operator of the vehicle. A method for use includes the
scene generator creating at least one element within an environment
view, transmitting an electronic signal comprising the at least one
element within the environment view to the scene display, the scene
display presenting the environment view to the vehicle operator,
and, based on resulting operator actuation of vehicle control or
vehicle movement, regenerating the environment view to provide
continually updated cues to the operator, including visual and/or
audible cues.
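The generate-display-actuate-regenerate cycle described above can be sketched as a fixed-rate loop. Everything in this sketch (the class name, the 30 Hz frame rate, and the planar pose model) is an illustrative assumption, not part of the application.

```python
import math

class HybridSimulatorLoop:
    """Minimal sketch of the simulation cycle: read vehicle state,
    advance the pose, and regenerate the environment view."""

    def __init__(self, dt=1.0 / 30.0):
        self.dt = dt            # frame period (30 Hz assumed)
        self.heading = 0.0      # radians, planar model
        self.x, self.y = 0.0, 0.0

    def step(self, speed, steer_rate):
        """Advance the vehicle pose from measured speed (m/s) and
        steering-induced yaw rate (rad/s), then regenerate the view."""
        self.heading += steer_rate * self.dt
        self.x += speed * math.cos(self.heading) * self.dt
        self.y += speed * math.sin(self.heading) * self.dt
        return self.render_view()

    def render_view(self):
        # A real scene generator would redraw artificial elements
        # from the new pose; here we simply report the pose.
        return {"x": self.x, "y": self.y, "heading": self.heading}

sim = HybridSimulatorLoop()
view = None
for _ in range(30):                     # one simulated second
    view = sim.step(speed=10.0, steer_rate=0.0)
```

In an actual embodiment the `speed` and `steer_rate` inputs would come from the measurement unit or vehicle controls, and `render_view` would drive the scene display.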
[0011] The scene display may present the operator an environment
view consisting of artificial elements (produced by the scene
generator) or a combination of artificial and natural elements. The
components of the scene display may, for example, include a
head-mounted unit, a projector, and/or a projection screen which is
either partially transparent (e.g., half-silvered) or opaque (e.g.,
a flat screen). Depending on the equipment used, the environment
view may consist of a viewable image presented within the
head-mounted unit worn by the operator, an image projected onto
flat or curved screens shaped like the windshield and/or side and
rear windows, or images projected onto a semi-transparent screen so
as to superimpose artificial elements on the operator's view of the
surrounding natural environment.
[0012] The scene generator transmits an electronic signal to the
scene display comprising at least one element within the
environment view, which includes the location of natural and
artificial images within the display. The scene generator
continually regenerates the environment view for display to the
vehicle operator via the scene display. The scene generator may
alter artificial images within the environment view in response to
vehicle movement, operator actuation of vehicle controls, and
predetermined artificial image movement. Components of the scene
generator may include a general-purpose programmed computer, and a
means for transmitting a signal to the scene display.
[0013] The environment view may be presented to the vehicle
operator to suggest behavior--for example, a velocity--different
from the actual behavior exhibited by the vehicle. In these
embodiments, a mechanism may be employed to allow the vehicle to
respond to control actuation as though the vehicle were behaving as
shown in the environment view. For example, the environment view
might be presented to the operator of an automobile as though it
were traveling at 70 miles per hour, when the vehicle is actually
traveling at only 35 miles per hour. This mechanism might alter
the operator's actuation of the steering wheel, for example, to
cause a much sharper turn than under normal operation at 35 miles
per hour, or at least to provide a simulated view of such a sharper
turn.
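The steering alteration just described can be illustrated with a kinematic bicycle-model sketch: to reproduce, at the actual speed, the yaw rate the displayed speed implies, the commanded steering angle must be sharpened. The function and model here are hypothetical illustrations, not taken from the application.

```python
import math

def scaled_steering_angle(operator_angle_rad, displayed_speed, actual_speed):
    """Map the operator's steering input to an actual steering command
    so the real vehicle yaws as the displayed speed implies.
    Kinematic bicycle model: yaw rate = v * tan(delta) / wheelbase,
    so the wheelbase cancels and only the speed ratio matters."""
    ratio = displayed_speed / actual_speed
    return math.atan(ratio * math.tan(operator_angle_rad))

# Displayed 70 mph vs. actual 35 mph: a 5-degree operator input maps
# to a sharper actual angle, roughly doubling the resulting yaw rate.
cmd = scaled_steering_angle(math.radians(5.0), 70.0, 35.0)
```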
[0014] The scene generator may take one or more forms of vehicle
and operator movement and/or position data as input. These
may include acceleration data from an accelerometer or gyroscopic
inertial measurement unit, velocity data from a unit which measures
either translational or rotational velocity in up to six degrees of
freedom, or position data from a positional measurement unit.
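As one hypothetical illustration of how acceleration data might feed the scene generator, velocity can be dead-reckoned by integrating accelerometer samples over time; the function below is a single-axis sketch, not a method specified in the application.

```python
def integrate_acceleration(accel_samples, dt, v0=0.0):
    """Dead-reckon velocity from translational accelerometer samples
    along one axis; a full measurement unit would repeat this per
    axis for up to six degrees of freedom (three translational,
    three rotational)."""
    velocities = []
    v = v0
    for a in accel_samples:
        v += a * dt          # simple Euler integration
        velocities.append(v)
    return velocities

# Ten samples of 1 m/s^2 at 10 Hz: velocity climbs to about 1 m/s.
trace = integrate_acceleration([1.0] * 10, dt=0.1)
```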
[0015] A mechanism may be used to maintain equivalent light
brightness between a natural environment seen outside the vehicle
and an image projected onto that natural environment. For example,
embodiments in which images are projected onto a semi-transparent
screen so as to superimpose artificial elements on a natural
environment will need to calibrate the brightness of such images to
the brightness of the natural environment in order to present a
realistic operating environment. This mechanism may include a
component mounted on the exterior of the vehicle to take light
brightness measurements of the natural environment. This mechanism
provides these measurements to the scene generator, continually
incorporating any shifting brightness of the environment into the
generation of artificial images so that they remain realistic under
changing conditions.
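The brightness-matching mechanism might amount to a gain applied to projected-image intensity as a function of the measured ambient level. In this sketch the reference level and clamp limits are assumed calibration constants, not values from the application.

```python
def display_gain(ambient_lux, reference_lux=1000.0,
                 min_gain=0.1, max_gain=4.0):
    """Scale projected-image intensity with measured ambient
    brightness so superimposed artificial elements stay plausible
    against the natural background. reference_lux is the ambient
    level at which the gain is 1.0 (an assumed calibration value);
    the clamp keeps the projector within its usable range."""
    gain = ambient_lux / reference_lux
    return max(min_gain, min(max_gain, gain))

# Brighter surroundings call for brighter projection, within limits.
dusk = display_gain(100.0)      # dim scene, low gain
noon = display_gain(50000.0)    # clamped at max_gain
```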
[0016] The vehicle may employ secondary vehicle controls to enhance
operator safety, such that the vehicle responds exclusively to the
secondary controls, or to both controls when the secondary controls
are actuated. This secondary vehicle control might be used in
instances where only a secondary operator can see the actual
movement of the vehicle in relation to the natural environment, and
might, in an automobile for example, include conventional dual
controls used in driver training vehicles.
[0017] The vehicle may employ parameter-constraining apparatus that
act to restrict the movement of the vehicle or the actuation of
vehicle control. In one example, on a wheelchair this apparatus
might restrict movement to prevent it from exceeding certain
speeds. In another example, on an airplane this apparatus might
restrain control actuation to prevent a roll so sharp it would
disorient the operator.
[0018] Further features and advantages of the present invention, as
well as the structure and operation of various embodiments of the
present invention, are described in detail below with reference to
the accompanying drawings. In these drawings, the same or
equivalent reference numerals are used to represent the same
element in the various figures.
IN THE DRAWINGS
[0019] FIG. 1 is a block diagram of an illustrative embodiment of
the invention.
[0020] FIG. 2 is a view from the operator's perspective of an
illustrative embodiment.
[0021] FIG. 3 is a flowchart illustrating operation of the
invention.
[0022] FIGS. 4A, 4B, 4C, and 4D are semi-block diagrams
illustrating four different ways in which the invention may be
implemented.
DETAILED DESCRIPTION
[0023] Referring to FIG. 1, the vehicle operation simulator for an
illustrative embodiment includes a mobile vehicle 100, which may be
any vehicle that moves with at least one degree of freedom, for
which movement represents an ordinary feature of operation, and
which includes at least one component for regulation or control of
said movement. Examples include, but are not limited to, an
automobile, aircraft, ship, truck, railroad train, motorcycle,
wheelchair, bicycle, snowboard, roller skates, and skis. Mobile
vehicle 100 may be operated in a natural environment; for example,
in an open space appropriate to the mobile vehicle. This open space
should be large and preferably free of other vehicles, potential
hazards, and pedestrians. However, it is also contemplated that the
invention be practiced on unused or seldom-used tracks, streets,
air space, snow slopes, roadways, or other environments on which the
particular vehicle 100 might normally be operated.
[0024] A scene generator 130 is provided which generates an
electronic signal and transmits it to scene display 140, which
presents an environment view 170 to human operator 120. Typically,
the scene generator includes a programmed general-purpose computer
to generate images and sounds associated with a virtual environment
which may include obstacles and hazards, including, but not limited
to, other vehicles, animals, people, or fixed objects. These
computer-generated elements typically exhibit behavior and
characteristics of their real-world counterparts. For example, a
computer-generated image of a person might be animated so as to
appear to cross the street in front of the mobile vehicle 100. It
should be noted that other equipment might also provide the
functionality of the scene generator 130 including, for example, an
array of projectable photographic or video images.
[0025] In some embodiments, the environment view is completely
artificial. In one example, the environment view may include a
computer-generated artificial background, at least one
computer-generated artificial element, and have no elements taken
from the natural environment surrounding the mobile vehicle. In
other embodiments, the environment view may be comprised of a
composite of natural elements and artificial elements. In one
example, computer-generated artificial elements might be
superimposed on a display screen that also allows the view of a
natural environment to pass through. In another example, the scene
generator would superimpose computer-generated artificial elements
against a backdrop of a color video signal of the actual natural
environment, or of a selectively modified natural environment. For
example, a simulation conducted during the day may be modified to
simulate night driving. The scene generator may also receive input
on the state of the natural environment from, for example,
vehicle-mounted cameras, and use this input in generating at least
one element within an environment view that is related in
predetermined ways to the actual environment.
[0026] The scene display 140 may take many forms. In some
embodiments, the scene display is a head-mounted display that
presents the environment view in the field of vision of the
operator, and allows for a realistic field of vision no matter how
the operator's head is oriented or where in his field of vision the
operator's eyes are focused. In other embodiments, the scene
display includes an electronic display and/or a projection unit
affixed to the vehicle. Either the head-mounted or fixed display
may include a half-silvered mirror, allowing the items projected on
to the half-silvered mirror and the natural environment 180 behind
it to comprise the environment view. In other embodiments, the
scene display includes a projection unit and a flat screen,
constructing an environment view 170 consisting entirely of
projected elements. The environment view may consist of images
projected on a single surface or, where appropriate, multiple
surfaces. For example, the simulation of the operation of a
helicopter might require the use of multiple display surfaces to
present the operator simulated views above the aircraft and on
either side, as well as in front.
[0027] In some embodiments the operator's actuation of vehicle
control 110 is input to a computerized mathematical model 135,
which may run on the same computer as the scene generator. This
mathematical model may then provide input to scene generator 130,
causing the scene generator to alter the environment view presented
on the scene display as appropriate to compensate for vehicle
orientation and/or position, and the operating environment to be
simulated.
[0028] Data on vehicle activity may also be provided to the scene
generator via the measurement unit 150. This unit may measure the
velocity of the vehicle (by measuring, for example, an automobile's
wheel rotation and angle), measure its translational or rotational
acceleration (for example, with an accelerometer, inertial
acceleration measurement unit, or gyroscopic sensors), or measure
changes in its position (using, for example, a global positioning
system device, or laser triangulation in the operating area).
Regardless of the measurement device used, however, this velocity,
acceleration, or position data will encompass up to six degrees of
freedom including translation and rotation. In one example, the
measurement unit might discern a ship's velocity by combining
measurements of water flow past a predetermined point on the ship's
hull with measurements of the rudder angle over time. In another
example, the measurement unit might discern an automobile's
acceleration or deceleration relative to the ground and/or
gyroscopic changes in its heading over time. (The use of inertial,
position, and velocity measurement units is well known to those
skilled in the art.) Data from any of these measurement
units may supplement or replace input from vehicle control 110 to a
mathematical model and/or the scene generator. The scene generator
may then alter the environment view as appropriate given the mobile
vehicle's movement (i.e., changes in angle or position relative to
the earth) using conventional computer graphic transformations of
image geometry.
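The "conventional computer graphic transformations of image geometry" mentioned above begin with a rigid-body transform of world-frame points into the vehicle frame. The planar function below is an illustrative two-degree-of-freedom sketch of the up-to-six-degree-of-freedom case in the text.

```python
import math

def world_to_vehicle(point, vehicle_pos, vehicle_heading):
    """Transform a world-frame point (x, y) into the vehicle frame,
    given the measured position and heading: translate by the
    vehicle position, then rotate by the negative heading. This is
    the conventional first step before re-rendering artificial
    elements from the vehicle's new pose."""
    dx = point[0] - vehicle_pos[0]
    dy = point[1] - vehicle_pos[1]
    c, s = math.cos(-vehicle_heading), math.sin(-vehicle_heading)
    return (c * dx - s * dy, s * dx + c * dy)

# A landmark 10 m along +y from a vehicle heading along +y sits
# 10 m straight ahead (positive x) in the vehicle frame.
ahead = world_to_vehicle((0.0, 10.0), (0.0, 0.0), math.pi / 2)
```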
[0029] Operator 120 actuates vehicle control 110 to control mobile
vehicle 100, triggering cues to the operator's motion sense organs.
Some embodiments may employ additional features to ensure the
safety of the operator. For example, air bags and lap belts may be
used to secure the operator in place during operation. Either
vehicle control 110, or the motion of mobile vehicle 100, may be
constrained by parameter-constraining apparatus 160. The
parameter-constraining apparatus may comprise a computer system
designed to assume control of the vehicle under certain hazardous
conditions, a governor mechanism designed to limit vehicle
velocity, or a mechanism limiting turn radius, angle of descent
and/or other motion parameters. This apparatus may restrain motion
either totally or in a manner dependent on vehicle operating
conditions. The constraints may limit actuation of vehicle
controls, but preferably limit the response of the vehicle to the
controls.
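A governor-style constraint of the kind described can be sketched as a throttle limiter that tapers the operator's command near a speed limit. The limit and soft-band values below are illustrative (e.g., for a wheelchair), not figures from the application.

```python
def constrain_throttle(throttle, current_speed,
                       speed_limit=15.0, soft_band=3.0):
    """Governor-style parameter constraint: pass the operator's
    throttle command through unchanged at low speed, taper it
    linearly inside a soft band below the limit, and cut it
    entirely at or above the limit. Units (km/h) and values are
    assumed for illustration."""
    if current_speed >= speed_limit:
        return 0.0
    if current_speed > speed_limit - soft_band:
        return throttle * (speed_limit - current_speed) / soft_band
    return throttle
```

As the text notes, a constraint of this form limits the vehicle's response to the controls rather than the operator's ability to actuate them.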
[0030] Depending on the embodiment, scene generator 130 may also
take input from light brightness measurement unit 190 and video
camera 200. A light brightness measurement unit may provide data
enabling the scene generator to maintain consistent brightness
between the natural environment and any artificial elements that
are superimposed. Therefore, this unit may be mounted or otherwise
affixed to the vehicle so as to enable measuring the light
brightness of the environment view as seen by the operator, as will
be appreciated by those skilled in the art.
[0031] One or more video cameras may provide one or more video
signals depicting the natural environment, for use when the natural
environment is not otherwise visible to the operator. Therefore the
video camera(s) may also be mounted or otherwise positioned on the
vehicle's exterior or on the operator's head so as to capture the
visible elements of the natural environment from a perspective
collinear with the operator's field of vision; methods for
appropriate capture of the natural environment using video camera
apparatus are also well known to those skilled in the art.
While the camera(s) may provide a video image directly to scene
display 140, it is preferable that camera output be provided, as
shown, to scene generator 130, where it may be used to reproduce
either the actual--or a modified version of--the natural
environment.
[0032] FIG. 2 depicts the interior of mobile vehicle 100 which, for
the illustrative embodiment, is automobile 200 with controls 210
including a steering wheel, an accelerator, a brake, and other
suitable controls such as a gear shift, clutch, de-fogger, etc.
(controls not shown). Scene generator 130 may be a programmed
general-purpose computer stored within automobile 200. A
half-silvered mirror 220, integrated with or separate from the
vehicle's windshield, or attached to the head-mounted display,
receives either projected images from a projector (not shown)
situated within automobile 200 (in the case of a screen display),
or a signal from the scene generator (in the case of the
head-mounted display). Either the image projector or the
head-mounted unit, combined with the half-silvered mirror 220, form
scene display 140. Obstacles 240 are placed in the environment view
such that they appear superimposed on the natural environment also
viewable through the half-silvered mirror 220.
[0033] Some embodiments may also include a secondary vehicle
control 230 to promote the safe operation of the automobile.
Secondary vehicle control 230 is operated by a secondary vehicle
operator, who monitors the primary operator's actions and corrects
or overrides any vehicle control actuation that would result in
danger or injury. The
secondary operator may experience the same environment view as the
operator, may experience only the natural environment, may
experience both environments (for example, on a split screen view),
or may experience some other view appropriate to maximize safe
operation of the vehicle.
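The override behavior of the secondary vehicle control can be sketched as follows. This is an illustrative simplification, not part of the application; the selection rule and all names are hypothetical:

```python
# Hypothetical sketch: the secondary operator's input, when present,
# takes precedence over the primary operator's control actuation.

def effective_control(primary_input, secondary_input=None):
    """Return the control value actually applied to the vehicle."""
    return primary_input if secondary_input is None else secondary_input
```

In practice the override condition would depend on the danger detected, but the precedence shown is the essential point.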
[0034] FIG. 3 is a flow diagram of a method for simulating vehicle
operation utilizing the apparatus of FIG. 1. In step 310 an
environment view is created, which may consist of artificial
elements designed to wholly comprise the environment view, or
artificial elements intended to be superimposed on natural elements
to comprise the environment view. The scene generator transmits
these elements to the scene display. In step 320, the scene display
presents the environment view to the operator. In some embodiments,
if scene display is accomplished via projection on a viewing
surface, the viewing surface may encompass the field of vision
regardless of the operator's head movement--i.e., the viewing
surface will allow the operator to see a projected image in all
relevant directions for the particular vehicle.
[0035] In step 330, the operator actuates vehicle control in
accordance with the environment view. The actuation of vehicle
control will include at least one operator act--for example,
applying rotational force to a steering wheel, controlling force on
an accelerator, applying force to a brake pedal, applying force on
one edge of a snowboard, and/or applying force on the control stick
of an airplane.
[0036] As shown in step 340, the vehicle responds to actuation of
the vehicle control. In some instances, parameter-restraining
apparatus may be employed to restrict vehicle movement, to enhance
operator safety or for other reasons. This apparatus may act to
restrain control actuation by, for example, preventing the operator
from applying more than a predetermined downward force on the
accelerator, or from applying more than a predetermined rotational
force on the steering wheel. This apparatus may alternatively (or
in addition) restrict vehicle movement resulting from operation of
the control by, for example, preventing the vehicle from exceeding
a predetermined speed or executing an overly sharp turn. The scene
generator may react to the controls as operated, or to the
constrained control operation or vehicle movement.
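A minimal sketch of such a parameter-restraining step, assuming hypothetical limits and names (the specific values are not from the application):

```python
# Hypothetical sketch: clamp operator inputs and resulting vehicle
# state to predetermined safety limits, per paragraph [0036].

def restrain(value, limit):
    """Clamp a signed control value to the range [-limit, +limit]."""
    return max(-limit, min(limit, value))

MAX_STEER_TORQUE_NM = 4.0   # hypothetical rotational-force limit
MAX_PEDAL_FORCE_N = 60.0    # hypothetical accelerator-force limit
MAX_SPEED_MPH = 35.0        # hypothetical vehicle-speed limit

def restrained_state(steer_torque, pedal_force, speed):
    """Apply all three restraints; speed is non-negative, so only capped."""
    return (restrain(steer_torque, MAX_STEER_TORQUE_NM),
            restrain(pedal_force, MAX_PEDAL_FORCE_N),
            min(speed, MAX_SPEED_MPH))
```

The scene generator could then be fed either the raw or the restrained values, matching the two alternatives described above.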
[0037] The actuation of vehicle control in step 330 and/or vehicle
movement in step 340 will provide input to the regeneration of the
environment view in step 310. If the environment view responds to
control actuation, sensors on one or more vehicle controls may
provide input to a mathematical model of vehicle activity, which in
turn provides input to the scene generator. If the environment view
responds to vehicle movement, a measurement unit mounted on the
vehicle may provide input to the scene generator. In either case,
the scene generator processes this input to update at least some
elements within the environment view, and the scene display
presents the environment view to the operator. The frequency of
this update will vary based on the processing power of the computer
within the scene generator, but may take place thousands of times
per second.
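The feedback loop of steps 310-340 can be sketched as follows. The point-mass model and all names are hypothetical stand-ins for the mathematical model of vehicle activity described above, not part of the application:

```python
# Hypothetical sketch of one iteration of the simulation loop:
# sensed controls -> vehicle model -> regenerated environment view.

def vehicle_model(state, controls, dt):
    """One step of a minimal point-mass model: accelerator sets acceleration."""
    speed = state["speed"] + controls["accel"] * dt
    position = state["position"] + speed * dt
    return {"speed": speed, "position": position}

def regenerate_view(state):
    """Stand-in for the scene generator: derive view elements from state."""
    return {"horizon_offset": state["position"], "motion_blur": state["speed"]}

def simulation_step(state, controls, dt=0.001):
    state = vehicle_model(state, controls, dt)   # step 340: vehicle responds
    view = regenerate_view(state)                # step 310: view regenerated
    return state, view                           # step 320: view presented
```

Run at a small `dt`, this loop corresponds to the high-frequency update described above.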
[0038] In some embodiments, the scene generator may create elements
of an environment view that do not coincide with the actual
behavior of the vehicle. In these embodiments, a mechanism may
supplement, detract from, or otherwise alter the force applied by
the operator to actuate vehicle control and/or the vehicle response
to such actuation, in order to simulate vehicle control actuation
under the conditions presented in the environment view. For
example, if the environment view is presented to simulate an
automobile moving at 70 miles per hour, but the vehicle is actually
moving at 35 miles per hour, a mechanism may translate a rotational
force the operator applies to the steering wheel to a much sharper
actual turn of the front axle, consistent with a velocity of 70
miles per hour. Also, for example, if the environment view is
presented to simulate an automobile traveling in the snow, a
mechanism may translate the downward force applied to the brake
pedal to a much weaker force, or otherwise alter the force,
actually applied to the brake pads to simulate deceleration in
slippery conditions. Those skilled in the art will be able to offer
several methods through which this may be accomplished. Regardless
of the method employed, data on the operator's actuation of vehicle
control will be fed to the scene generator for continual
regeneration of the environment view.
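The two examples above (a 70 mph view with the vehicle at 35 mph, and braking on simulated snow) can be sketched as simple gain translations. The formulas and names are hypothetical, chosen only to show the direction of the scaling; they are not the application's method:

```python
# Hypothetical sketch of the control-translation mechanism of
# paragraph [0038]: operator forces are rescaled so the vehicle's
# response matches the simulated rather than the actual conditions.

def translated_steering(wheel_torque, simulated_speed, actual_speed):
    """Sharper actual turn when the simulated speed exceeds the actual."""
    return wheel_torque * (simulated_speed / actual_speed)

def translated_braking(pedal_force, friction_coefficient):
    """Weaker brake-pad force to mimic a low-friction (snowy) surface."""
    return pedal_force * friction_coefficient
```

Either translated value would then be reported to the scene generator for regeneration of the environment view, as the paragraph concludes.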
[0039] FIGS. 4A-4D depict alternative components suitable for use
in implementing the apparatus depicted in FIG. 1. Components within
FIGS. 4A-4D are numbered according to the corresponding component
from FIG. 1 and given alphabetic suffixes corresponding to the
specific figure. In some instances, particular components shown in
FIG. 1 comprise more than one component shown in FIGS. 4A-4D; in
these instances identifiers are assigned in FIGS. 4A-4D so as to
indicate a numeric association between the components. For example,
scene display 140 in FIG. 1 equates to scene display half-silvered
mirror 143B and scene display projector 145B in FIG. 4B.
[0040] Referring to FIG. 4A, operator 120A wears head-mounted scene
display 140A. This head-mounted display receives a signal from
scene generator 130A. Depending on the vehicle whose operation is
to be simulated, the displayed view may consist of, for example, a roughly
planar surface and three-dimensional elements therein (for
simulation of automobile operation, for instance), or a relatively
unobstructed view of the open space before the vehicle (for an
airplane, for instance). The head-mounted display and scene
generator are capable of presenting the vehicle operator with an
environment view commensurate with head movement toward the left,
right, up, or down, and commensurate with vehicle movement, since
the operator remains in a relatively fixed position within the
vehicle. At any one time, however, the display presents the operator
with an environment view corresponding to the operator's field of
vision given his or her head orientation. Thus, the operator's environment view
varies as a function of both vehicle movement or position, and of
head movement or position.
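The dependence of the view on both vehicle and head orientation can be sketched as a composition of the two. The flat, single-axis composition below is a hypothetical simplification of a full three-dimensional rotation, not the application's method:

```python
# Hypothetical sketch: the rendered gaze direction is the composition
# of vehicle heading and head yaw, so the environment view varies
# with both, per paragraph [0040]. Angles are in degrees.

def gaze_heading(vehicle_heading_deg, head_yaw_deg):
    """World-frame direction of the operator's line of sight."""
    return (vehicle_heading_deg + head_yaw_deg) % 360.0
```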
[0041] The inertial measurement unit (IMU) 150A ascertains
acceleration of both the vehicle and the operator's head, and
provides this input to scene generator 130A so as to regenerate the
environment view for rendering on the head-mounted scene display.
The scene generator maintains a realistic simulation of the
operator's field of vision by accepting data on head and vehicle
acceleration from the IMU, regenerating the environment view based
on this data, and transmitting it to the head-mounted scene display
140A. Those skilled in the art will be able to offer several
alternatives for how the transmission of acceleration data from the
IMU to the scene generator might be accomplished. Those skilled in the
art will also be able to offer suitable power sources for the scene
generator, IMU, head display, and video camera, such that the risk
of equipment failure and resulting operator danger due to power
outage is minimized.
[0042] Depending on the vehicle used, and other factors, the scene
generator may be secured within the vehicle, or may be a portable
unit that can be worn or otherwise held by the operator while the
mobile vehicle is in motion.
[0043] FIG. 4B depicts another illustrative embodiment of the
invention, wherein operator 120B observes the environment view
through half-silvered mirror 143B, which is sufficiently
transparent to allow the operator to view the natural environment
180B through it, and sufficiently opaque to allow the operator to
view artificial images projected by scene display projector 145B.
The scene generator 130B transmits a signal consisting of
artificial elements to be displayed and their location in the
environment view, among other data, to the scene display projector,
and the scene display projector projects the image on half-silvered
mirror 143B. Thus the composite image/environment view 170B viewed
by an operator is a combination of natural elements from the scene
ahead and superimposed artificial elements projected by the scene
display projector.
[0044] Although the half-silvered mirror is depicted in FIG. 4B as
a flat, windshield-like screen, other embodiments might employ a
cylindrical half-silvered mirror mounted to the vehicle structure,
a cylindrical half-silvered mirror mounted to the operator's head,
or other variations.
[0045] The measurement unit 150B provides input on the vehicle's
velocity to the scene generator so that artificial elements within
the environment view can be updated appropriately for presentation
by the scene display projector. The scene generator accepts this
velocity input to continually regenerate the environment view.
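One way the scene generator might use successive velocity samples is simple dead reckoning, sketched below. The Euler step and all names are hypothetical illustrations, not the application's method:

```python
# Hypothetical sketch: integrate velocity samples from measurement
# unit 150B into position, then shift artificial view elements
# opposite to the vehicle's advance, per paragraph [0045].

def integrate_position(position, velocity, dt):
    """One Euler step of position from a velocity sample."""
    return position + velocity * dt

def shift_elements(element_positions, displacement):
    """Move artificial elements toward the operator as the vehicle advances."""
    return [p - displacement for p in element_positions]
```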
[0046] A light brightness equivalence mechanism 190B measures the
intensity of light outside the vehicle and provides this input to
the scene generator. The scene generator then adjusts the
brightness of images to be superimposed by scene display projector
145B, so that composite image 170B constitutes a realistic
rendering of an operating environment. This aspect of the invention
may be especially important for vehicle operation during periods of
low sunlight, during periods of especially bright daylight, or in
instances of high glare.
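The brightness adjustment might be sketched as a clamped linear mapping from measured ambient intensity to projector brightness. The mapping, its constants, and the lux units are hypothetical assumptions, not specified by the application:

```python
# Hypothetical sketch of the light brightness equivalence mechanism
# of paragraph [0046]: scale projected-image brightness toward the
# measured exterior light level so superimposed elements blend in.

def projected_brightness(ambient_lux, gain=0.001, floor=0.2, ceiling=1.0):
    """Map exterior light intensity to a projector brightness in [floor, ceiling]."""
    return min(ceiling, max(floor, ambient_lux * gain))
```

The floor keeps artificial elements visible at night; the ceiling prevents washout in bright daylight or high glare, the conditions noted above.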
[0047] The scene generator, the scene display projector, the
measurement unit and the light brightness equivalence mechanism may
be stored within or mounted upon the vehicle.
[0048] FIG. 4C depicts another embodiment of the invention, which
is the same as FIG. 4A except that the scene generator 130C
receives input from video camera 200C, which is mounted on the
operator's head so as to be collinear with the operator's view.
This video signal may depict the natural environment, or it may be
altered before presentation to the operator in a predetermined
fashion. In an illustrative embodiment, scene generator 130C alters
the signal sent by video camera 200C to insert artificial elements
and their location into the environment view, and in some cases
also makes selected variations in the natural environment. Thus,
operation of the vehicle at night might be simulated during
daylight hours. This altered signal is then input to head-mounted
scene display 140C. A scene generator may be mounted on or within
the vehicle, or may be a portable unit that can be worn or
otherwise held by the operator while the vehicle is in motion.
[0049] The inertial measurement unit 150C affixed to the
head-mounted display provides input on the acceleration of the
vehicle and/or the operator's head. When the head-mounted display
is used, additional measurement of head orientation or position,
and/or of operator position within the vehicle, may be provided by
means of an electromagnetic sensor and/or mechanical linkage sensor
with a potentiometer (not shown) affixed to the vehicle. This may
prove useful for simulating the operation of a vehicle which may
require the operator to move about within the vehicle's interior
(e.g., a ship or railroad car). The scene generator will combine
data provided by the sensor(s) and other measurements of the
vehicle's and operator's position to provide an accurate
environment view to the operator.
[0050] FIG. 4D depicts another embodiment of the invention, which
is largely the same as FIG. 4B except that video camera 200D
provides input to scene generator 130D in the form of a video image
collinear with the operator's view, and operator 120D views an
image projected on flat screen 140D. A measurement unit 150D
transmits input on vehicle position, provided by means which may
include a global positioning system (GPS) unit, laser triangulation
within the operating area, or other position measurement
techniques, to the scene generator. In a preferred embodiment, the
scene generator manipulates the signal sent by the video camera,
which may depict the natural environment, to insert artificial
elements and their location. This altered signal is then fed to
the scene display projector, which projects environment view 170D
onto flat screen 140D.
[0051] While in the discussion above, head and vehicle movement and
position have been measured to control the scene generator, in some
applications other movements may be monitored, as appropriate, to
provide a realistic simulation of vehicle operation. Thus, while
the invention has been particularly shown and described with
reference to specific embodiments, and variations thereon have been
indicated, it should be understood by those skilled in the art that
various additional changes in form and detail may be made therein
without departing from the spirit and scope of the invention as
defined by the following claims.
* * * * *