U.S. patent application number 15/606934, published on 2017-09-14, is directed to automatic correction of keystone distortion and other unwanted artifacts in projected images.
The applicants listed for this patent are John G. Posa and Barry H. Schwab, to whom the invention is also credited.
United States Patent Application 20170264875
Kind Code: A1
Posa; John G.; et al.
September 14, 2017
AUTOMATIC CORRECTION OF KEYSTONE DISTORTION AND OTHER UNWANTED
ARTIFACTS IN PROJECTED IMAGES
Abstract
A system for manual or automatic correction of geometric or
video image distortions introduced by projection onto obliquely
angled or imperfect surfaces. Sensors may be disposed at the
projector itself, at the projected image surface, or in a portable
remote-control unit. Corrections may be applied as part of an
original set-up process, or dynamically as the configuration
changes. Image-stabilization techniques are utilized where
applicable, as is the use of test patterns and the incorporation of
"helper" signals into the video image. Tactile-sensing capabilities
provide an interactive environment, and various packaging
configurations for projection units and sensor provisions are
disclosed.
Inventors: Posa; John G. (Ann Arbor, MI); Schwab; Barry H. (West Bloomfield, MI)

Applicants: Posa; John G. (Ann Arbor, MI, US); Schwab; Barry H. (West Bloomfield, MI, US)
Family ID: 46600364
Appl. No.: 15/606934
Filed: May 26, 2017
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
13095569 | Apr 27, 2011 | 9667932
15606934 | |
61439208 | Feb 3, 2011 |
Current U.S. Class: 1/1
Current CPC Class: H04N 5/144 20130101; H04N 9/3185 20130101; H04N 9/3194 20130101; G09G 2320/06 20130101; H04N 5/23258 20130101; G09G 3/002 20130101
International Class: H04N 9/31 20060101 H04N009/31
Claims
1. Projected image interaction apparatus, comprising: a device
including a projector operative to project an image defining a
display area onto a surface, and wherein the projected image
includes textual or graphical imagery; a sensor operative to view
the projected image on the surface and any overlying objects
interposed between the projector and the surface display area;
gesture recognition apparatus operative to recognize an overlying
object interacting with the textual or graphical imagery in the
projected image; and a processor operative to interpret a
recognized gesture and control the device or projected image in
response to the interaction.
2. The projected image interaction apparatus of claim 1, wherein:
the projected image is a still or moving image; and the gesture
recognition apparatus is operative to recognize the position or
movement of the overlying object.
3. The projected image interaction apparatus of claim 1, wherein
the object includes a human form.
4. The projected image interaction apparatus of claim 3, wherein
the object includes a finger, hand or other body part.
5. The projected image interaction apparatus of claim 4, wherein:
the projected image has x and y coordinates associated therewith;
and the processor is operative to compare the position or movement
of the finger, hand or other body part to corresponding x and y
coordinates associated with the projected image to control the
device or the projected image.
6. The projected image interaction apparatus of claim 1, wherein:
the sensor is operative to detect infrared objects; and the
processor is operative to determine the position or movement of a
finger, hand or other body part based upon infrared detection.
7. The projected image interaction apparatus of claim 1, further
including: a tactile sensor operative to generate a wireless signal
if a point of the surface is touched by a user; a wireless receiver
operative to receive the wireless signal; and apparatus operative
to receive the image viewed by the sensor, determine coordinates of
the point touched by the user in response to receipt of the
wireless signal, compare the coordinates to coordinates of the
projected image, and select a region of the projected image
associated with the point touched by the user.
8. The projected image interaction apparatus of claim 7, wherein
the tactile sensor is worn by a user.
9. The projected image interaction apparatus of claim 7, wherein
the tactile sensor is worn on a finger of a user.
10. The projected image interaction apparatus of claim 7, wherein
the wireless signal is an optical, infrared or RF signal.
11. The projected image interaction apparatus of claim 1, further
including: a display buffer storing a raw image to be seen by a
viewer, a projection buffer storing a version of the raw image to
be projected onto the surface, and a sensor buffer storing an image
of the projected image as viewed by the sensor; and wherein the
processor is operative to recognize an interaction by comparing an
image stored in the sensor buffer to an image stored in the display
buffer.
12. The projected image interaction apparatus of claim 11, wherein
the sensor is operative to view an area of the surface which is
larger than the display area of the projected image and store an
image of the larger area in the sensor buffer.
13. The projected image interaction apparatus of claim 11, wherein
the projection buffer is selectively turned off or the frame rate
is reduced to facilitate movement compensation.
14. The projected image interaction apparatus of claim 1, wherein
the device is a smartphone.
15. The projected image interaction apparatus of claim 1, wherein
the sensor forms part of the device.
16. The projected image interaction apparatus of claim 1, wherein
the sensor is a video camera.
Description
REFERENCE TO RELATED APPLICATION
[0001] This application is a continuation of U.S. application Ser.
No. 13/095,569, filed Apr. 27, 2011, now U.S. Pat. No. 9,667,932,
which claims priority from U.S. Provisional Patent Application Ser.
No. 61/439,208, filed Feb. 3, 2011. The entire content of both
applications is incorporated herein by reference.
FIELD OF THE INVENTION
[0002] This invention relates to image and video projection
systems, and, more particularly, to a system for correcting
geometric and video distortions introduced by projection onto imperfect
surfaces.
BACKGROUND OF THE INVENTION
[0003] With the advent of super bright and ultra-bright
light-emitting diodes and portable electronic devices that are too
compact to provide displays which show a great deal of information,
projected video and still images will become more popular. At the
same time, since it might not be known upon which surface the light
will be projected, certain undesirable image artifacts and
distortions, such as keystone effects, will need to be addressed.
Although there are many proposed solutions, including patents which
are directed to keystone correction, they all make assumptions
about the relationship between the projector and the projected
image; if these assumptions are incorrect, a geometrically
unacceptable projection may still result. This invention corrects
for keystone effects and other unwanted image artifacts and
deficiencies such as brightness, contrast, color uniformity, etc.,
in a projected image, regardless of the surface chosen for
projection.
SUMMARY OF THE INVENTION
[0004] The instant invention comprises a video image projection
system capable of sensing and correcting for image distortions,
such as keystone effects and image artifacts and deficiencies,
which are introduced by imperfect projection configurations. In
particular, provisions are included for off-angle projection, or
image projection onto imperfect surfaces.
[0005] Projected image correction apparatus constructed in
accordance with the invention includes a device with a display
generator and a projector projecting an image onto a surface. A
sensor views the projected image and communicates information to the
device, enabling the display generator to correct for keystone
effects or incorrect aspect ratio; variations in brightness,
contrast, color rendition, saturation, hue, focus, or sharpness; or
movement of the projector. The device may be a video projector,
camcorder, portable computer, or smart phone.
[0006] The sensor, which may be a two-dimensional image sensor, may
be disposed in the projector device, or in a hand-held remote
control unit which communicates the information to the device
through a wireless connection. The correction may occur in
conjunction with an initial projection, in response to a user
input, or on a continuous basis.
[0007] The projection device may include a detachable display
screen providing the surface, for example a detachable rigid or
semi-rigid panel or a roll-out screen. The screen may include
corner or edge indicia to assist the sensor in determining the
periphery of the screen. Hardware or software may be included to
generate a test pattern for correction purposes.
[0008] With the image geometry sensor disposed on or in the
projection device, the unit senses geometric and video distortions,
and provides corrective information to projection circuitry,
enabling the unit to manipulate the shape of the projected image so
as to produce a "squared" image with corrections to video
parameters such as brightness, color rendition, and so forth. The
corrections may be either manually applied, or automatically
applied by the projection circuitry.
[0009] Alternatively, the sensor may be disposed on or in a portable
remote control unit in wireless communication with the projection
unit. This simplifies the adjustments to be applied to the
projected image where there is a desire to apply these corrections
from the point of view of the audience, or when the projector
itself may not be in a fixed location. If a detachable projection
screen is utilized, it may include sensing provisions located either
on the projection unit or, alternatively, on the projection screen
itself.
[0010] One preferred implementation of the invention includes a
display buffer storing a raw or `desired` image to be seen by a
viewer. A projection buffer is used to store a version of the raw
image to be projected onto the surface, and a sensor buffer is used
to store an image of the projected image as viewed by the sensor.
The display processor is operative to compare the image stored in
the sensor buffer to the image stored in the display buffer and
adjust the image stored in the projection buffer so that the
projected image seen by the sensor corresponds to the image stored
in the display buffer. The display processor may be operative to
perform one or more of the following adjustments to the image
stored in the projection buffer: flipping, stretching, luminance or
color curve adjustment, color replacement, pixel interpolation, or
rotation.
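The three-buffer loop above can be sketched in code. The following is a minimal model, not the patent's implementation: the surface is reduced to a per-pixel attenuation (e.g. a side-to-side washout), buffers are flat lists of luminance values, and all names are hypothetical.

```python
# Iteratively adjust the projection buffer until the simulated sensor view
# matches the display buffer (the "raw" image the viewer should see).

def correct_projection(display, surface_gain, iterations=50, rate=0.5):
    """Return a projection buffer whose sensed result approximates 'display'."""
    projection = display[:]  # start by projecting the raw image unmodified
    for _ in range(iterations):
        # what the camera sees: projected value attenuated by the surface
        sensor = [p * g for p, g in zip(projection, surface_gain)]
        # per-pixel mismatch between desired and sensed images
        error = [d - s for d, s in zip(display, sensor)]
        # nudge the projection buffer to cancel the mismatch
        projection = [p + rate * e for p, e in zip(projection, error)]
    return projection

display = [0.8, 0.8, 0.8, 0.8]       # uniform raw image
surface_gain = [1.0, 0.9, 0.8, 0.7]  # washout toward one side
proj = correct_projection(display, surface_gain)
sensor = [p * g for p, g in zip(proj, surface_gain)]
# 'sensor' now approximates 'display' despite the uneven surface
```

The same loop structure extends to the geometric adjustments listed above (flipping, stretching, rotation), with the error term computed over warped coordinates rather than per-pixel luminance.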
[0011] Further aspects of the invention enable the projected image
to function as a "touch screen," enabling a user to "select" points
on the displayed image, enlarge and reduce dimensions, scroll, turn
pages, and so forth. This feedback feature can be used to "stretch"
the corners of the image (much like the "rubber-band" feature in
graphic and drawing software enables the manipulation of the shape
of an object), thereby enabling the user to "click-and-drag" the
corners of the projected image to manually correct for keystone
effects, or to assist in the auto-correction process for geometric
correction. These embodiments include gesture recognition apparatus
operative to recognize a gesture made by a finger, hand or other
body part overlying the projected image in the view of the sensor,
and control the device or projected image in response to the
recognized gesture.
[0012] Assuming the projected image has x and y coordinates
associated therewith, the gesture recognition apparatus is
operative to determine the coordinates of a finger, hand or other
body part overlying the projected image in the view of the sensor.
The system is then operative to compare the position or movement
of the finger, hand or other body part overlying the projected
image to corresponding coordinates associated with the projected
image to control the device or the projected image. The sensor may
include an infrared imaging capability to assist in detecting a
finger, hand or other body part overlying the projected image.
[0013] A tactile sensor may be provided to generate a wireless
signal if a point of the surface is touched by a user. A wireless
receiver receives the wireless signal, with the system being
operative to receive the image viewed by the sensor, determine the
coordinates of the point touched by the user in response to receipt
of the wireless signal, and compare the coordinates to the
coordinates of the projected image to select a region of the
projected image associated with the point. The wireless signal may
be an optical signal, such as an infrared signal which the sensor is
operative to detect, or an RF or acoustical
signal.
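Once the touched point's coordinates are known, comparing them against the projected image reduces to a hit-test over the image's regions. A brief sketch under assumed names (the region layout is purely illustrative):

```python
# Map a touched point, in projected-image coordinates, to whichever
# rectangular region of the image contains it.

def select_region(touch_xy, regions):
    """Return the name of the region containing the point, or None.
    Each region is (name, x0, y0, x1, y1)."""
    x, y = touch_xy
    for name, x0, y0, x1, y1 in regions:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

# Hypothetical on-screen controls at the left and right image edges
regions = [("slide-back", 0, 0, 100, 600),
           ("slide-forward", 540, 0, 640, 600)]
print(select_region((580, 300), regions))  # -> slide-forward
```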
[0014] The various embodiments of the invention provide for
numerous features, including automatic or manual correction of
geometric distortions, video parameters, and screen texture and
color. Provisions also are included for automatic "horizontal
level-sensing," image-stabilization, and a tactile-sensing
interactive environment, as well as the use of internally-generated
test patterns and an artificial image-frame to assist in manual and
automatic correction of image deficiencies. The disclosure
anticipates a variety of packaging configurations, from small,
portable projection devices (such as cellular phones or PDAs), to
large scale stand-alone video projectors.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] FIG. 1A depicts a first embodiment of the invention, in
which the sensor device is disposed at the projector unit;
[0016] FIG. 1B depicts an alternative embodiment of the invention,
in which the sensor device is disposed on a portable remote-control
unit in communication with the projector unit;
[0017] FIG. 2 depicts an alternative embodiment of the invention,
in which the sensor device is disposed on the projector unit, and a
detachable screen is used for the image projection surface;
[0018] FIG. 3 depicts a functional block diagram displaying many of
the features of the preferred embodiment of the invention;
[0019] FIGS. 4A and 4B are simplified drawings which illustrate the
way in which the correction process operates;
[0020] FIGS. 5A and 5B are simplified drawings which illustrate the
way in which the invention provides image stabilization;
[0021] FIG. 6 shows a device such as a smart phone having a
projector producing an image on a surface and a sensor viewing the
projected image along with gesture recognition which determines the
position and/or movement of a finger, hand, or other body part for
control purposes; and
[0022] FIG. 7 illustrates how selection of a part of a projected
image may be implemented with a fingertip device that generates a
wireless signal when a surface is touched.
DETAILED DESCRIPTION OF THE INVENTION
[0023] FIG. 1A illustrates one embodiment of the invention
including an image projector 104 and sensor 108 integrated into a
single unit 102. For the purposes of this invention, "image
projector" should be taken to mean still or moving pictures (i.e.,
video), regardless of aspect ratio or resolution. The invention is
not limited in terms of projector technology, which includes
light-transmissive (i.e., LCD) and light-reflective (i.e., DLP)
approaches. "Sensor" should be taken to mean any type of suitable
image-gathering device, preferably a two-dimensional image sensor
based upon charge-coupled devices (CCDs), for example. Although a
relatively high resolution color sensor is preferred, that is not
necessary to the basic implementation of the invention. To correct
artifacts such as keystone effects alone, for example, a relatively
low-resolution image sensor may be used, even a monochrome sensor.
Also the "unit" 102 should be taken to include a dedicated or
stand-alone projector adapted for connection to a source of imagery
to be projected, or any device that would benefit from a projected
display, including laptop and tablet computers, smart phones,
camcorders, and so forth.
[0024] Continuing the reference to FIG. 1A, projector 104 produces
an image 106 on a surface to be viewed by an audience. The image
106 may include various undesirable artifacts, including keystone
effects or other aspect-ratio issues; brightness, contrast, color,
focus or sharpness variations, and so forth. To alleviate such
problems, this invention utilizes an image sensor which observes
the projected image to obtain a true, real-time picture including
any undesirable artifacts that may be present. Information about
the projected image is then communicated to a graphics processing
unit (GPU) or other electronic circuitry within the projection unit
to correct for the artifacts detected, as discussed in further
detail below. In all embodiments, the correction may occur at a
given time during start-up or by pressing a "correct" button, for
example, or adjustments may automatically occur at preset
intervals or in real time to account for changes in position or
lighting over time.
[0025] In FIG. 1A, the projector and sensor are integrated, such
that the projected image is corrected from the perspective of the
projector itself. There may be situations, however, where it is
desirable to correct the image from a perspective other than that
of the projector, such as that seen by a particular viewer. FIG. 1B
illustrates an alternative embodiment of the invention wherein the
image sensor/camera is housed in a hand-held remote control unit
110 which communicates wirelessly to projection device 102. The
remote 110 may simply be an image correction device, including a
still or video camera 112; however, in the preferred configuration,
the remote device 110 forms the same remote control device used to
change images and control the projector 102. With this embodiment
the remote unit may be operated at a desired location associated
with an audience or selected viewer, thereby correcting for
artifacts relative to a particular viewing location. In addition,
this embodiment allows for corrections to be implemented when the
projector 102 may not maintain a fixed position.
[0026] Thus, in each of the embodiments disclosed herein,
corrections can be applied to provide the desired image as viewed
from various points of view: rectilinear or "squared" image as seen
by the projection source; "squared" image as seen by the audience;
or "squared" image as seen by a dynamically changing point of view.
The applicability of the invention is not limited by the size of
the projected image, the texture or nature of the surface onto
which the image is to be projected, or the particular position of
the projection unit relative to the surface onto which the image is
to be projected, and any variations in these variables should be
considered to be within the scope of the invention.
[0027] FIG. 2 illustrates the applicability of the invention to
a portable electronic device 202 having a detachable screen 204 and
a built-in projector projecting an image 206 onto the screen 204.
The image sensor can be built into the device 202, or in a remote
unit as discussed with respect to FIGS. 1A and 1B. Feet 210 may be
used to stabilize the screen 204 at a desired distance from the
projecting device 202. The projecting device 202 may be any kind of
portable electronic device, including a laptop or tablet computer,
an MP3/video player, a device utilizing a display for video content or
controls, or the like. The screen 204 may have corner dots 208 or
other indicia, such that regardless of where it is placed relative
to the projecting unit 202, the sensor may better discern the outer
periphery of the screen 204 and project the image to completely
fill that screen, if desired. Alternatively, the indicia 208 may be
sensors connected to the projector unit through a communication link,
thereby providing direct feedback from the surface as adjustments
are made.
[0028] As opposed to a rigid screen which detaches from the
portable electronic device as shown in the upper portion of FIG. 2,
a flexible unrolling-type screen may be used, much like the
portable screens used with slide projectors and movie cameras of
the past. This would allow the screen to be installed and removed
from the portable electronic device as a small cylindrical tube.
Alternatively, of course, the projection device 202 may project
onto a wall or other surface.
[0029] FIG. 3 illustrates a simplified block diagram of the
preferred embodiments. In essence, a `feedback loop` 300 is
established in accordance with the invention, whereby the projected
image 302 is detected and analyzed by the sensor/camera 304 and
corrected, either automatically and/or in accordance with user
viewing preferences. A display 314 is shown because the invention
does not preclude the use of a display panel other than or in
addition to the projected image. A computer or smart phone, for
example, would typically have both a display screen and a
projector. To enhance operation, the projecting device may be equipped
with the capability of producing an image at 106 which is
specifically intended for image correction; that is, it may have a
test pattern, with edges, color or brightness better "seen" by the
sensor. These kinds of test materials are known to those skilled in
the art of video camera set-up.
[0030] There may be some difficulties involved in the detection of
the corners of the projected images by the sensor. For example,
when the sensor is co-located within the projector, if a
perpendicular to the surface that is to receive the image is at an
oblique angle relative to the projection axis, then the amount of
light reflected back towards the sensors will be reduced
significantly. Due to such circumstances, it is advantageous for
the sensor to have sufficient sensitivity to detect low levels of
reflected light. In addition, the amount of light reflected will be
affected by many other factors depending on the smoothness of the
surface, the color of the surface, the brightness of the projected
image, etc. In such cases, the use of "fuzzy logic" as applied in
commercially available consumer video cameras may be utilized to
provide guidance to the correction process, as described herein
below.
[0031] There are several options which may be implemented to
improve the utility and performance of the system. For example,
provisions may be included to automatically identify "level" or
true horizontal, to assist in the "squaring" process. Additionally,
appending a "frame" line surrounding the projected image pixels can
assist in detecting when the projected image is "square," and also
assist in the "leveling" process. For example, in the embodiment
depicted in FIG. 2, the corner detection can be utilized to locate
the "edge" of the image, either from the "frame" markings, or from
detection of the apparent image edge based on content (or lack
thereof), and the information then can be used to automatically
adjust the geometry of the projected image.
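The edge-detection step described above can be reduced to finding the bounding box of the projected content (or its appended "frame" line) in a sensor capture. A hedged sketch with illustrative names, treating the capture as a 2-D list of pixel values:

```python
# Locate the apparent extent of the projected image in a sensor frame by
# scanning for rows and columns containing non-background pixels.

def detect_image_bounds(sensor_frame, background=0):
    """Return ((x0, y0), (x1, y1)) bounding the non-background pixels."""
    rows = [y for y, row in enumerate(sensor_frame)
            if any(v != background for v in row)]
    cols = [x for x in range(len(sensor_frame[0]))
            if any(row[x] != background for row in sensor_frame)]
    return (min(cols), min(rows)), (max(cols), max(rows))

frame = [[0, 0, 0, 0, 0],
         [0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1],
         [0, 0, 0, 0, 0]]
corners = detect_image_bounds(frame)  # ((2, 1), (4, 3))
```

A real implementation would add thresholding and sub-pixel corner fitting, but the geometric adjustment only needs these four extremes to begin "squaring" the image.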
[0032] In addition, internally generated "test patterns," such as
the traditional "convergence pattern," "color bars," and other
test signals utilized in broadcast and other professional
applications, can assist the alignment of the system, in much the
same way that these signals are utilized to align and calibrate the
geometric, luminance, and chrominance characteristics of
professional cameras and video monitors. Many of the same
techniques applied in systems utilized for automatic alignment and
matching of cameras and monitors could advantageously be applied
for aligning the projected image system of the instant
invention.
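Generating such a pattern internally is straightforward; the sketch below produces a simple crosshatch in the spirit of a convergence pattern (the grid spacing and binary values are illustrative assumptions, not a broadcast standard):

```python
# Build a crosshatch test pattern: 1 on grid lines, 0 elsewhere. Projecting
# this and observing how the lines bend reveals geometric distortion.

def convergence_pattern(width, height, spacing=8):
    """2-D list with value 1 wherever x or y falls on a grid line."""
    return [[1 if x % spacing == 0 or y % spacing == 0 else 0
             for x in range(width)] for y in range(height)]

pattern = convergence_pattern(16, 16)
```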
[0033] With the use of a standardized test signal, such as the
"convergence pattern" signal, it would be possible to detect the
texture and/or imperfections of the surface on which the image is
to be projected. Knowing this information, corrections can be
applied at whatever level of detail is desired by the user, and
even complicated distortions, such as would be encountered in
projecting onto a brick wall or hanging drapes, could be applied to
optimize the visible image. Similarly, corrections to luminance
and/or chrominance imperfections can be applied, to optimize the
visible image at whatever level of detail is desired by the user.
In the case of uneven surfaces, particularly those with repeating
patterns such as brick walls, etc., a white or uniform-color
display may be projected, enabling the sensor to gather an image of
the surface texture which may then be used by the graphics
processor to "subtract off" irregularities in luminance, color,
focus, sharpness, and so forth.
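The "subtract off" step for luminance can be sketched as a two-stage calibration: project a uniform field, estimate a per-pixel surface gain from the capture, then pre-boost later content by the inverse gain. This is a simplified model under assumed names, with a floor to avoid dividing by near-zero gains:

```python
# Stage 1: estimate per-pixel surface gain from a capture of a uniform
# (white) projection. Stage 2: compensate image content by the inverse gain.

def build_surface_map(white_capture, projected_level=1.0):
    """Per-pixel gain of the surface, from a capture of a uniform projection."""
    return [v / projected_level for v in white_capture]

def compensate(image, surface_map, floor=0.05, ceiling=1.0):
    """Pre-boost each pixel by the inverse surface gain, clamped to [0, ceiling]."""
    return [min(ceiling, v / max(g, floor)) for v, g in zip(image, surface_map)]

surface = build_surface_map([1.0, 0.5, 0.8])  # e.g. a dark mortar line at index 1
out = compensate([0.5, 0.25, 0.4], surface)   # a uniform mid-gray after projection
```

Pixels whose required boost exceeds the projector's dynamic range are clamped, which is why the correction can only optimize, not fully eliminate, surface irregularities.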
[0034] FIGS. 4A and 4B are simplified drawings which better
illustrate the way in which the correction process operates. In the
preferred embodiments different buffer memories are used for
different purposes, namely for "raw" and "corrected" images. In
reference to FIG. 4A, the image in the display buffer represents a
raw image to be displayed without any undesirable artifacts
generated through color or luminance variations or projection onto
uneven or imperfect surfaces. If the raw image is simply
transferred to the projection buffer used to project the image, the
actual image seen by the sensor as shown in the sensor buffer
suffers from horizontal and vertical keystone distortion and
side-to-side washout (depicted with gray tone variation) due to
uncontrollable lighting effects and/or projected surface
unevenness.
[0035] To correct for these distortions, a graphics processor
analyzes the image in the sensor buffer and compares it to the
desired raw image in the display buffer. The graphics processor
modifies the image in the projection buffer so that the image in
the sensor buffer matches the raw image in the display buffer, at
least as closely as is practical under the circumstances. Typically the
projection buffer will generate an intentionally distorted and/or
mirror image of the raw image, such that when the intentionally
distorted or mirror image is projected, the image in the sensor
buffer more closely resembles the desired image in the display
buffer. The graphics processor may use various techniques to create
the intentionally distorted and/or mirror image of the raw image,
including flipping, stretching, luminance or color curve
adjustment, color replacement, pixel interpolation, rotation and so
forth.
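The intentional pre-distortion can be illustrated with a deliberately simplified keystone model: each row of the image is optically stretched by a row-dependent factor, so the graphics processor pre-shrinks each row by the inverse factor. This sketch (nearest-neighbour resampling, hypothetical names) stands in for the full perspective warp a real system would use:

```python
# Pre-warp an image so that a row-dependent horizontal stretch introduced by
# oblique projection cancels out, leaving rows of equal apparent width.

def prewarp_row(row, scale):
    """Resample one row so its content shrinks by 1/scale about the centre."""
    w = len(row)
    cx = (w - 1) / 2.0
    out = []
    for x in range(w):
        src = cx + (x - cx) * scale       # inverse mapping into the source row
        xi = int(round(src))
        out.append(row[xi] if 0 <= xi < w else 0)  # pad outside with black
    return out

def prewarp_frame(frame, top_scale, bottom_scale):
    """Interpolate the per-row scale linearly from top to bottom of the frame."""
    h = len(frame)
    out = []
    for y, row in enumerate(frame):
        t = y / (h - 1) if h > 1 else 0.0
        out.append(prewarp_row(row, top_scale + (bottom_scale - top_scale) * t))
    return out
```

With `scale` above 1 for the rows that the projection geometry stretches most, the optical stretch and the digital shrink cancel, which is the essence of the "intentionally distorted" projection-buffer image described above.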
[0036] To correct for movement of the projector the invention may
incorporate image-stabilization technology. In many consumer
cameras and camcorders, provisions are included for automatic
image-stabilization, to mitigate the artifacts introduced by the
user while simply trying to hold the camera steady. Mechanical
provisions, such as gyroscopes, may be included, or the correction
may be based on electronic methods. In the case of the electronic
approach to this problem, movement is detected by comparison of
consecutive frames (or fields), to determine which objects are
moving, and which movements actually are related to unsteadiness by
the holder. Movement of the entire frame of the image is attributed
to unsteadiness, while movement of individual objects relative to
other objects is attributed to true motion by these objects.
Movement of the entire image frame can be canceled out by
correcting for the detected motion, leading to a more stable image.
Importantly, this also provides an opportunity to apply noise
reduction, to improve the quality of the image as projected.
[0037] FIGS. 5A and 5B illustrate this aspect of the invention. In
FIG. 5A, the entire image in the sensor buffer is shifted towards
the upper left, indicating that the projector has moved. This may
occur with any hand-held device such as a smart phone projector
being used in a moving car, as one example of many. The graphics
processor detects this movement through comparison of the image in
the display buffer to the shifted image in the sensor buffer. To
stabilize the image, the image in the projection buffer is
intentionally shifted to compensate for the movement such that the
images in the sensor and display buffers now correspond. In the
event the graphics processor is unable to compensate for the motion
if it is extreme or otherwise unexpected, the projection buffer
may be selectively turned off or the frame rate reduced to
facilitate movement compensation.
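The whole-frame shift detection behind this stabilization step can be sketched as a small search over candidate displacements, scoring each by mean absolute difference over the overlapping region. This is a minimal illustration (exhaustive integer search, hypothetical names), not the electronic stabilization circuitry itself:

```python
# Estimate the global (dx, dy) motion of the whole frame; whole-frame motion
# is attributed to projector unsteadiness and can then be cancelled by
# shifting the projection buffer in the opposite direction.

def estimate_shift(ref, cur, max_shift=3):
    """Return (dx, dy) minimising mean |ref - shifted cur| over the overlap.
    Pixels of 'cur' marked None (uncovered by the shift) are skipped."""
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            total, count = 0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w and cur[sy][sx] is not None:
                        total += abs(ref[y][x] - cur[sy][sx])
                        count += 1
            if count:
                err = total / count
                if best_err is None or err < best_err:
                    best_err, best = err, (dx, dy)
    return best

# Synthetic frames: 'cur' is 'ref' shifted right 2 and down 1 (None = uncovered)
ref = [[y * 10 + x for x in range(6)] for y in range(6)]
cur = [[ref[y - 1][x - 2] if y >= 1 and x >= 2 else None for x in range(6)]
       for y in range(6)]
dx, dy = estimate_shift(ref, cur)
# the projection buffer would then be shifted by (-dx, -dy) to compensate
```

A production system would instead use block matching on a subsampled image or gyroscope data, but the principle of separating whole-frame motion from object motion is the same.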
[0038] A further aspect of the invention enables the projected
image to function as a "touch screen," enabling a user to "select"
points on the displayed image, enlarge and reduce dimensions,
scroll, turn pages, and so forth, much like operations now possible
with the iPhone and pointing devices such as a mouse or touchpad.
This feedback feature can be used to "stretch" the corners of the
image (much like the "rubber-band" feature in graphic and drawing
software enables the manipulation of the shape of an object),
thereby enabling the user to "click-and-drag" the corners of the
projected image to manually correct for keystone effects, or to
assist in the auto-correction process for geometric correction.
[0039] Since the invention incorporates a sensor operative to look
at the projected image, the sensor may be used to detect a finger,
hand or other body part in the projection and interpret position or
movements made as selections of control inputs. FIG. 6 shows a
device 602 such as a smart phone having a projector 604 producing
an image 605 on a surface and a sensor 606 viewing the projected
image. The hand 610 of a user intersects the projected image, and
the user is using their fingers to enlarge a region 612 of the
projected image. This is possible because the x-y pixel coordinates
of the projected image are known by the graphics processor. Even if
the graphics processor has manipulated or corrected the projected
image, the x-y coordinates are known through comparison(s) with the
raw image in the display buffer.
[0040] Continuing the reference to FIG. 6, the coordinates of the
hand 610 (or other body part) may also be determined in several
ways. Since the projected image should "look like" the raw image in
the display buffer, gross differences in the area seen by the sensor
may be presumed to be an object in the path of the projected light,
particularly if the object (i.e., hand or finger) is moving. In
addition, if the sensor is able to detect in the infrared (which is
often the case), the sensor (or a separate sensor) can be used to
identify warm objects that may be presumed to be body parts. Once a
body part is detected, position and movements may be interpreted
with gesture recognition hardware or software known to those of
skill in the art, and by comparing the coordinates of the gesture
with the underlying projected display, control inputs such as
enlarging, reducing, scrolling, page turning, opening, closing, and
so forth may be implemented, much like a touch screen.
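The "gross difference" detection described above amounts to differencing the display buffer against the sensed frame and tracking the resulting blob. A hedged sketch with hypothetical names and toy frame sizes:

```python
# Flag pixels where the sensed frame departs grossly from the expected
# (display-buffer) frame; such pixels are presumed to belong to an
# interposed object like a hand, whose centroid can then be tracked.

def detect_object_pixels(expected, sensed, threshold=0.3):
    """List of (x, y) pixels with a large expected-vs-sensed difference."""
    return [(x, y)
            for y, (erow, srow) in enumerate(zip(expected, sensed))
            for x, (e, s) in enumerate(zip(erow, srow))
            if abs(e - s) > threshold]

def centroid(points):
    """Rough position of the detected object, for gesture tracking."""
    if not points:
        return None
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

expected = [[0.9] * 4 for _ in range(4)]         # raw image: uniform bright
sensed = [row[:] for row in expected]
sensed[1][2] = sensed[2][2] = 0.1                # shadow cast by a finger
pixels = detect_object_pixels(expected, sensed)  # [(2, 1), (2, 2)]
```

Tracking the centroid across frames yields the position/movement stream that the gesture recognizer interprets.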
[0041] Selection presents a technical challenge, since the user is
touching the projected image on a remote surface as opposed to a
display screen. However, this problem may be solved if the user 710
wears a fingertip device 712, shown in FIG. 7. This device includes a
button at the tip of the finger which is activated when a surface
is touched. Activation of the button causes the transmission of a
wireless signal 716 received by the projector device 702. Using the
location/movement detection capabilities discussed with reference
to FIG. 6, the coordinates of the touched point may be determined
and compared to the x-y coordinates of the projected images to
facilitate a selection of the point, area, region or icon
underlying the touched point. The wireless signal generated by the
fingertip device may be an RF or acoustical signal or,
alternatively, an infrared light may be generated which is picked
up by the sensor 706.
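For the infrared variant, the arrival of the wireless signal tells the system *when* to look; locating the fingertip's emission in the sensor frame tells it *where*. A minimal sketch (all names are assumptions, not the patent's implementation), using a brightest-pixel search as a stand-in for blob detection:

```python
# On receipt of the fingertip device's wireless signal, scan the infrared
# sensor frame for the brightest spot to recover the touched coordinates.

def locate_ir_spot(ir_frame):
    """Return (x, y) of the brightest pixel in the sensor frame."""
    best, best_v = None, float("-inf")
    for y, row in enumerate(ir_frame):
        for x, v in enumerate(row):
            if v > best_v:
                best_v, best = v, (x, y)
    return best

ir = [[0, 0, 0],
      [0, 0, 9],
      [0, 1, 0]]
touch_xy = locate_ir_spot(ir)  # (2, 1)
```

The returned coordinates would then be compared to the projected image's x-y coordinates, exactly as in the gesture case, to select the underlying point, region, or icon.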
[0042] The selection device need not be worn by a person and may
alternatively be placed on the end of a pointer used to tap the
surface. As a further alternative, pressure-sensitive surfaces for
the projected images, or touch-screen frames can provide
information as to where a user may be touching the image. Various
applications can benefit from such tactile or other feedback
techniques enabling the various embodiments to function in an
interactive environment. These kinds of signals can be used to
trigger other events, such as advancing the slides in a
presentation, or activating a link to another portion of a control
program or a web site.
* * * * *