U.S. patent application number 14/234030 was filed with the patent office on 2014-05-22 for projection capture system and method.
The applicant listed for this patent is David Bradley Short. Invention is credited to David Bradley Short.
Publication Number: 20140139668
Application Number: 14/234030
Family ID: 47629565
Filed Date: 2014-05-22

United States Patent Application 20140139668
Kind Code: A1
Short; David Bradley
May 22, 2014
PROJECTION CAPTURE SYSTEM AND METHOD
Abstract
In one example, a projection capture system includes: a digital
camera, a projector, and a mirror housed together as a single unit
in which, when the unit is deployed for use with a work surface:
the camera is positioned above the projector; the projector is
positioned below the camera; and the mirror is positioned above the
projector and configured to reflect light from the projector into
the camera capture area. In one example, a projection capture
method includes: establishing a camera capture area within which a
camera can capture an image of an object; establishing a projector
display area overlapping the capture area and into which a
projector can project light; lighting the camera capture area with
the projector; and positioning a specular glare spot from the
projector lighting outside the camera capture area.
Inventors: Short; David Bradley (San Diego, CA)

Applicant: Short; David Bradley, San Diego, CA, US

Family ID: 47629565
Appl. No.: 14/234030
Filed: August 2, 2011
PCT Filed: August 2, 2011
PCT No.: PCT/US2011/046253
371 Date: January 21, 2014

Current U.S. Class: 348/143
Class at Publication: 348/143
International Class: H04N 7/18 20060101 H04N007/18
Current CPC Class: G06F 3/042 (20130101); G03B 21/2033 (20130101); H04N 5/2252 (20130101); H04N 7/18 (20130101); G03B 21/147 (20130101); G03B 21/28 (20130101); G03B 5/02 (20130101); G06F 3/0304 (20130101); G03B 33/06 (20130101); G06F 3/03545 (20130101); G03B 15/00 (20130101); H04N 5/2256 (20130101); G03B 17/54 (20130101); H04N 9/3111 (20130101); G03B 27/323 (20130101)
Claims
1. A projection capture system for use with a work surface, the
system comprising: a digital camera, a projector, and a mirror
housed together as a single unit in which, when the unit is
deployed for use with the work surface: the camera is positioned
above the projector and the camera defines a capture area within
which the camera can acquire images on the work surface; the
projector is positioned below the camera; and the mirror is
positioned above the projector and configured to reflect light from
the projector down on to the work surface, and the projector and
the mirror define a display area on the work surface overlapping at
least part of the capture area.
2. The system of claim 1, wherein: the camera and the projector are
operatively connected to one another such that the projector
provides a light source for the camera capturing images; and the
camera, the projector, and the mirror are positioned with respect
to one another such that, when the unit is deployed for use with
the work surface, a glare spot from projector light reflected off
the mirror lies outside the camera capture area.
3. The system of claim 2, wherein, when the unit is deployed for
use with the work surface: the camera is positioned over the
capture area at a location offset from a center of the capture
area; and the projector is positioned outside the capture area and
outside the display area.
4. The system of claim 2, further comprising a controller housed
together with the digital camera, the projector, and the mirror as
part of the single unit, the controller operatively connecting the
camera and the projector and configured to control the camera and
the projector for: capturing an image of an object positioned on
the work surface in the capture area; and projecting the object
image onto the work surface in the display area.
5. The system of claim 4, further comprising a user input device
operatively connected to the controller and configured to enable a
user to interact with the system on the work surface.
6. The system of claim 5, wherein the unit is configured for
deployment with a flat, horizontal work surface.
7. The system of claim 6, further comprising a portable mat
deployable with the unit as the work surface.
8. The system of claim 2, wherein the display area and the capture
area are substantially the same.
9. A projection capture system for use with a work surface, the
system comprising: a digital camera, a projector, a controller and
a mirror housed together as a single unit in which, when the unit
is deployed for use with the work surface: the camera defines a
capture area within which the camera can acquire images on the work
surface; and the projector and the mirror define a display area on
the work surface overlapping at least part of the capture area; and
the controller operatively connecting the camera and the projector
and configured to control the camera and the projector for: the
projector illuminating the capture area; the camera capturing an
image of an object positioned on the work surface in the capture
area and illuminated by the projector; and the projector projecting
the object image onto the work surface in the display area.
10. The system of claim 9, wherein the camera, the projector, and the
mirror are positioned with respect to one another such that, when
the unit is deployed for use with the work surface, a glare spot
from projector light reflected off the mirror lies outside the
camera capture area.
11. The system of claim 9, further comprising a user input device
operatively connected to the controller and configured to enable a
user to interact with the system on the work surface.
12. An interactive projection capture system for use with a three
dimensional workspace having a horizontal work surface, the system
comprising: a digital camera, a projector, and a mirror housed
together as a single unit in which, when the unit is deployed for
use with the workspace: the camera defines a three dimensional
capture space within which the camera can effectively acquire
images, the capture space bounded in two dimensions by a capture
area on the work surface; the projector and the mirror define a
three dimensional display space bounded in two dimensions by a
display area on the work surface overlapping at least part of the
capture area; and the camera, the projector, and the mirror are
positioned with respect to one another such that light from the
projector reflected off the mirror illuminates the capture space
and a glare spot from the reflected light lies outside the capture
space; and a user input device operatively connected to the camera
and the projector and configured to enable a user to interact with
the system in the workspace.
13. The system of claim 12, further comprising a controller
operatively connecting the camera and the projector and configured
to control the camera and the projector for: the projector
illuminating the capture space; the camera capturing an image of an
object positioned on the work surface in the capture area and
illuminated by the projector; the projector projecting the object
image onto the work surface in the display area; and the camera
capturing an image of the projected object image.
14. The system of claim 12, wherein the user input device includes
an infrared digital stylus for selectively emitting infrared light
within the workspace and an infrared camera for capturing infrared
light emitted by the stylus within the workspace.
15. The system of claim 14, wherein a capture space of the infrared
camera is coincident with the display space.
16. A projection capture method, comprising: establishing a camera
capture area within which a camera can capture an image of an
object; establishing a projector display area overlapping the
capture area and into which a projector can project light; lighting
the camera capture area with the projector; and positioning a
specular glare spot from the projector lighting outside the camera
capture area.
17. The method of claim 16, wherein positioning the specular glare
spot outside the camera capture area includes the projector
projecting light up toward a mirror and the mirror reflecting light
down into the camera capture area.
18. The method of claim 16, wherein positioning the specular glare
spot outside the camera capture area includes folding a light path
from the projector to the camera capture area along a fold
line.
19. The method of claim 18, wherein the fold line is defined by a
light reflecting surface.
Description
BACKGROUND
[0001] A new projection capture system has been developed in an
effort to improve digitally capturing images of documents and other
objects and in an effort to improve the interactive user experience
working with real objects and projected objects on a physical work
surface.
DRAWINGS
[0002] FIGS. 1A and 1B are perspective, exterior views illustrating
a new projection capture system, according to one example of the
invention. In FIG. 1A, the image of a two dimensional object (a
hardcopy photograph) has been captured and displayed. In FIG. 1B,
the image of a three dimensional object (a cube) has been captured
and displayed.
[0003] FIG. 2 is a perspective, interior view illustrating a
projection capture system, such as the system of FIG. 1, according
to one example of the invention.
[0004] FIG. 3 is a block diagram of the projection capture system
shown in FIG. 2.
[0005] FIG. 4 is a block diagram illustrating one example of a user
input device in the system shown in FIGS. 2 and 3.
[0006] FIGS. 5 and 6 are side and front elevation views,
respectively, illustrating the positioning of the camera and the
projector in the projection capture system shown in FIGS. 2 and
3.
[0007] FIGS. 7-11 are a progression of side elevation views showing
various positions for the projector and the camera in a projection
capture system, illustrating some of the problems associated with
moving the glare spot out of the camera capture area.
[0008] FIGS. 12 and 13 illustrate one example of the camera in the
projection capture system shown in FIGS. 2 and 3.
[0009] FIG. 14 illustrates one example of the projector in the
projection capture system shown in FIGS. 2 and 3.
[0010] FIGS. 15 and 16 illustrate examples of the user input device
in the projection capture system shown in FIGS. 2 and 3.
[0011] The same part numbers are used to designate the same or
similar parts throughout the figures.
DESCRIPTION
[0012] The examples shown in the figures and described below
illustrate but do not limit the invention, which is defined in the
Claims following this Description.
[0013] In one example of the new projection capture system, a
digital camera, a projector, and a mirror are housed together as a
single unit in which, when the unit is deployed for use with a work
surface, the camera is positioned above the projector, the
projector is positioned below the camera, and the mirror is
positioned above the projector and configured to reflect light from
the projector into the camera capture area. In one example, the
projector provides a light source for the camera capturing images
where the camera, the projector, and the mirror are positioned with
respect to one another such that the glare spot from the projector
light lies outside the camera capture area.
[0014] FIGS. 1A and 1B are perspective, exterior views illustrating
one example of a new projection capture system 10 and an
interactive workspace 12 associated with system 10. FIG. 2 is a
perspective view illustrating one example of a projection capture
system 10 with exterior housing 13 removed. FIG. 3 is a block
diagram of system 10 shown in FIG. 2. Referring to FIGS. 1A, 1B, 2,
and 3, projection capture system 10 includes a digital camera 14, a
projector 16, and a controller 18. Camera 14 and projector 16 are
operatively connected to controller 18 for camera 14 capturing an
image of an object 20 in workspace 12 and projector 16 projecting
the object image 22 into workspace 12 and, in some examples, for
camera 14 capturing an image of the projected object image 22. The
lower part of housing 13 includes a transparent window 21 over
projector 16 (and infrared camera 30).
[0015] In the example shown in FIG. 1A, a two dimensional object 20
(a hardcopy photograph) placed onto a work surface 24 in workspace
12 has been photographed by camera 14 (FIG. 2), object 20 removed
to the side of workspace 12, and object image 22 projected onto a
work surface 24 where it can be photographed by camera 14 (FIG. 2)
and/or otherwise manipulated by a user. In the example shown in
FIG. 1B, a three dimensional object 20 (a cube) placed onto work
surface 24 has been photographed by camera 14 (FIG. 2), object 20
removed to the side of workspace 12, and object image 22 projected
into workspace 12 where it can be photographed by camera 14 and/or
otherwise manipulated by a user.
[0016] System 10 also includes a user input device 26 that allows
the user to interact with system 10. A user may interact with
object 20 and/or object image 22 in workspace 12 through input
device 26; object image 22 may be transmitted to other workspaces 12
on remote systems 10 (not shown) for collaborative user interaction;
and, if desired, object image 22 may be photographed by camera 14
and re-projected into local and/or remote workspaces 12 for further
user interaction. In FIG. 1A, work surface 24 is part of the
desktop or other underlying support structure 23. In FIG. 1B, work
surface 24 is on a portable mat 25 that may include touch sensitive
areas. In FIG. 1A, for example, a user control panel 27 is
projected on to work surface 24 while in FIG. 1B control panel 27
may be embedded in a touch sensitive area of mat 25. Similarly, an
A4, letter or other standard size document placement area 29 may be
projected onto work surface 24 in FIG. 1A or printed on a mat 25 in
FIG. 1B. Of course, other configurations for work surface 24 are
possible. For example, it may be desirable in some applications for
system 10 to use an otherwise blank mat 25 to control the color,
texture, or other characteristics of work surface 24, and thus
control panel 27 and document placement area 29 may be projected on
to the blank mat 25 in FIG. 1B just as they are projected on to the
desktop 23 in FIG. 1A.
[0017] In the example shown in FIG. 4, user input device 26
includes an infrared digital stylus 28 and an infrared camera 30
for detecting stylus 28 in workspace 12. Although any suitable user
input device may be used, a digital stylus has the advantage of
allowing input in three dimensions, including along work surface
24, without a sensing pad or other special surface. Thus, system 10
can be used on a greater variety of work surfaces 24. Also, the
usually horizontal orientation of work surface 24 makes it useful
for many common tasks. The ability to use traditional writing
instruments on work surface 24 is advantageous over vertical or
mobile computing interfaces. Projecting an interactive display onto
a working desktop mixes computing tasks with the standard objects
that may exist on a real desktop, so physical objects can coexist
with projected objects. As such, the comfort of using real writing
instruments as well as their digital counterparts (like stylus 28)
makes for an effective use model. A three-dimensional pad-free
digital stylus enables annotation on top of or next to physical
objects without having a sensing pad get in the way of using
traditional instruments on work surface 24.
[0018] In one example implementation for system 10, projector 16
serves as the light source for camera 14. Camera capture area 32
(FIG. 12) and projector display area 34 (FIG. 14) overlap on work
surface 24. Thus, a substantial operating efficiency can be gained
using projector 16 both for projecting images and for camera
lighting. The light path from projector 16 through workspace 12 to
work surface 24 should be positioned with respect to camera 14 to
enable user display interaction with minimal shadow occlusion while
avoiding specular glare off work surface 24 and objects in
workspace 12 that would otherwise blind camera 14. The system
configuration described below avoids the glare induced artifacts
that would result from a conventional camera lighting geometry
while still maintaining a sufficiently steep incident angle for the
projector light path desired for proper illumination and projection
of two and three dimensional objects in workspace 12.
[0019] Ideally, projector 16 would be mounted directly over
workspace 12 at an infinite height above work surface 24 to ensure
parallel light rays. This configuration, of course, is not
realistic. Even if projector 16 were moved down to a realistic
height above work surface 24 (but still pointing straight down),
the projector's light would be reflected off glossy and semi-glossy
surfaces and objects straight back into camera 14, creating a
blinding specular glare. Thus, the glare spot must be moved out of
camera capture area 32. (Specular glare refers to glare from
specular reflection in which the angle of incidence of the incident
light ray and the angle of reflection of the reflected light ray
are equal and the incident, reflected, and normal directions are
coplanar.)
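The specular-reflection constraint above can be sketched numerically. The helper below uses the standard mirror-image construction for a flat surface at z = 0; the positions are hypothetical, loosely based on the Table 1 dimensions, and the function name is illustrative rather than anything from the patent.

```python
# Sketch of the specular-glare geometry: for a point light (the projector)
# and a mirror-like horizontal surface at z = 0, the glare spot is the
# surface point whose specular reflection of the projector lands in the
# camera. Positions are hypothetical, loosely based on Table 1.

def glare_spot(projector, camera):
    """Return the (x, y) surface point where a projector ray reflects
    directly into the camera: reflect the camera across the surface
    plane and intersect the projector-to-reflected-camera line with
    z = 0 (angle of incidence then equals angle of reflection)."""
    px, py, pz = projector
    cx, cy, cz = camera
    t = pz / (pz + cz)          # parameter where the line crosses z = 0
    return (px + t * (cx - px), py + t * (cy - py))

projector = (0.0, 330.0, 670.0)  # 670 mm high, shifted 330 mm in Y (hypothetical)
camera = (0.0, 150.0, 450.0)     # 450 mm high, shifted 150 mm in Y (hypothetical)

gx, gy = glare_spot(projector, camera)
# A 320 mm capture area centered on the origin spans y in [-160, +160] mm;
# the glare spot must land outside that span to avoid blinding the camera.
print((gx, round(gy, 1)), "outside capture area:", abs(gy) > 160.0)
```

With these example offsets the glare spot lands beyond the capture-area edge, which is the outcome the geometry of FIGS. 5 and 6 is arranged to produce.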
[0020] To achieve a commercially reasonable solution to this
problem of specular glare, camera 14 and projector 16 are shifted
away from the center of capture and display areas 32, 34 and
projector 16 is positioned low, near base 36, as shown in FIGS. 5
and 6, and a fold mirror 38 is introduced into the projector's
light path to simulate a projector position high above work surface
24. The simulated position of projector 16 and the corresponding
light path above mirror 38 are shown in phantom lines in FIGS. 5
and 6. However, before describing the configuration shown in FIGS.
5 and 6 in more detail, it is helpful to consider the problems
associated with other possible configurations for moving the glare
spot out of camera capture area 32.
[0021] In FIG. 7, camera 14 is positioned at the center of capture
area 32 with an overhead projector 16 slightly off center so that
camera 14 does not block the projector light path. In the
configuration of FIG. 7, the specular glare spot 39 (at the
intersection of incident light ray 41 and reflected light ray 43)
falls within capture area 32 and, thus, will blind camera 14 to
some objects and images in capture area 32. In addition, for the
configuration shown in FIG. 7, where camera 14 and projector 16 are
both positioned high above the base, system 10 would be top heavy
and, thus, not desirable for a commercial product implementation.
If projector 16 is positioned to the side the distance needed to
move glare spot 39 out of camera capture area 32, as shown in FIG.
8, the corresponding projector lens offset required would not be
feasible. Also, any product implementation for the configuration of
system 10 shown in FIG. 8 would be undesirably broad and top
heavy.
[0022] Moving camera 14 off center over capture area 32 brings
projector 16 in to make the system less broad, as shown in FIG. 9,
but the projector lens offset is still too great and the product
still top heavy. In the configuration shown in FIG. 10, projector
16 is raised to a height so that it may be brought in close enough
for an acceptable lens offset but, of course, the product is now
too tall and top heavy. The most desirable solution is a "folded"
light path for projector 16, shown in FIGS. 5 and 11, in which the
"high and tight" configuration of FIG. 10 is simulated using fold
mirror 38. In FIGS. 5 and 11, projector 16 and the upper light path
are folded over the reflecting surface of mirror 38 to project the
same light path on to work surface 24 as in the configuration of
FIG. 10. This folding effect is best seen in FIG. 5 where fold
angles θ1 = θ2 and φ1 = φ2.
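The phantom-line construction of FIGS. 5 and 6 amounts to reflecting the real (low) projector position across the mirror plane to obtain the simulated (high) position. A minimal sketch, with a hypothetical 45-degree mirror plane and made-up coordinates:

```python
# Sketch of the fold-mirror construction: reflecting the real projector
# position across the mirror plane gives the simulated overhead position
# shown in phantom lines. Mirror placement and positions are hypothetical.
import math

def reflect_across_plane(point, plane_point, normal):
    """Reflect a 3-D point across the plane through plane_point with the
    given unit normal: p' = p - 2((p - q) . n) n."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    return tuple(p - 2.0 * d * n for p, n in zip(point, normal))

# A 45-degree fold mirror above the projector (hypothetical placement).
n = (0.0, math.sqrt(0.5), -math.sqrt(0.5))   # unit normal of the mirror plane
q = (0.0, 0.0, 500.0)                        # a point on the mirror plane

real_projector = (0.0, 200.0, 100.0)         # low, near the base
virtual_projector = reflect_across_plane(real_projector, q, n)
print(tuple(round(c, 6) for c in virtual_projector))
```

Because reflection preserves angles and path lengths, the light path folded over mirror 38 reaches work surface 24 exactly as if it came from the virtual overhead position, which is why θ1 = θ2 and φ1 = φ2 in FIG. 5.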
[0023] As shown in FIGS. 5 and 6, camera 14 is placed in front of
the mirror 38 over workspace 12 so that it does not block the
projector's light path. Camera 14 is positioned off center in the Y
direction (FIG. 5) as part of the overall geometry to keep glare
spot 39 out of capture area 32 with an acceptable offset for both
camera 14 and projector 16. Projector 16 is focused on mirror 38 so
that light from projector 16 is reflected off mirror 38 into
workspace 12. By moving projector 16 down low and introducing a
fold mirror 38 into the projector light path, glare spot 39 is kept
out of capture area 32 with an acceptable projector offset and
system 10 is sufficiently narrow, short and stable (not top heavy)
to support a commercially attractive product implementation.
[0024] Thus, and referring again to FIGS. 1A, 1B, and 2, the
components of system 10 may be housed together as a single device
40. Referring also to FIG. 3, to help implement system 10 as an
integrated standalone device 40, controller 18 may include a
processor 42, a memory 44, and an input/output 46 housed together
in device 40. For this configuration of controller 18, the system
programming to control and coordinate the functions of camera 14
and projector 16 may reside substantially on controller memory 44
for execution by processor 42, thus enabling a standalone device 40
and reducing the need for special programming of camera 14 and
projector 16. While other configurations are possible, for example
where controller 18 is formed in whole or in part using a computer
or server remote from camera 14 and projector 16, a compact
standalone appliance such as device 40 shown in FIGS. 1A, 1B and 2
offers the user full functionality in an integrated, compact mobile
device 40.
[0025] Referring now to FIG. 12, camera 14 is positioned in front
of mirror 38 above workspace 12 at a location offset from the
center of capture area 32. As noted above, this offset position for
camera 14 helps avoid specular glare when photographing objects in
workspace 12 without blocking the light path of projector 16. While
camera 14 represents generally any suitable digital camera for
selectively capturing still and video images in workspace 12, it is
expected that a high resolution digital camera will be used in most
applications for system 10. A "high resolution" digital camera as
used in this document means a camera having a sensor array of at
least 12 megapixels. Lower resolution cameras may be acceptable for
some basic scan and copy functions, but resolutions below 12
megapixels currently are not adequate to generate a digital image
sufficiently detailed for a full range of manipulative and
collaborative functions. Small size, high quality digital cameras
with high resolution sensors are now quite common and commercially
available from a variety of camera makers. A high resolution sensor
paired with the high performance digital signal processing (DSP)
chips available in many digital cameras affords sufficiently fast
image processing times, for example a click-to-preview time of less
than a second, to deliver acceptable performance for most system 10
applications.
[0026] Referring now also to FIG. 13, in the example shown, camera
sensor 50 is oriented in a plane parallel to the plane of work
surface 24 and light is focused on sensor 50 through a shift lens
52. This configuration for sensor 50 and lens 52 may be used to
correct keystone distortion optically, without digital keystone
correction in the object image. The field of view of camera 14
defines a three dimensional capture space 51 in work space 12
within which camera 14 can effectively capture images. Capture
space 51 is bounded in the X and Y dimensions by camera capture
area 32 on work surface 24. Lens 52 may be optimized for a fixed
distance, fixed focus, and fixed zoom corresponding to capture
space 51.
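Why the parallel-sensor arrangement avoids keystone distortion can be seen with a simple pinhole model: when the sensor plane is parallel to the work surface, every surface point is scaled by the same factor, so rectangles image as rectangles and the lens shift merely re-centers the off-axis image. This is an illustrative model, not the actual optics of lens 52.

```python
# Pinhole sketch of the parallel-sensor geometry of FIG. 13: with the
# sensor plane parallel to the work surface, magnification is uniform
# (-f/H in both axes), so no digital keystone correction is needed.
# The focal length is a hypothetical value; 450 mm is CH from Table 1.

def image_on_parallel_sensor(x_mm, y_mm, height_mm=450.0, focal_mm=30.0):
    """Project a surface point through a pinhole at height_mm onto a
    sensor plane parallel to the surface: uniform scale -f/H."""
    scale = -focal_mm / height_mm
    return (scale * x_mm, scale * y_mm)

corners = [(0, 0), (427, 0), (0, 320), (427, 320)]   # capture area (Table 1)
imaged = [image_on_parallel_sensor(x, y) for x, y in corners]

# Opposite sides of the imaged rectangle remain equal -> no keystone.
width_top = abs(imaged[1][0] - imaged[0][0])
width_bottom = abs(imaged[3][0] - imaged[2][0])
print(width_top, width_bottom)
```

A tilted sensor, by contrast, scales near and far edges differently, producing the trapezoidal keystone that would otherwise have to be corrected digitally in the object image.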
[0027] Referring to FIG. 14, projector 16 is positioned near base
36 outside projector display area 34 and focused on mirror 38 so
that light from projector 16 is reflected off mirror 38 into
workspace 12. Projector 16 and mirror 38 define a three dimensional
display space 53 in workspace 12 within which projector 16 can
effectively display images. Projector display space 53 overlaps
camera capture space 51 (FIG. 12) and is bounded in the X and Y
dimensions by display area 34 on work surface 24. While projector
16 represents generally any suitable light projector, the compact
size and power efficiency of an LED or laser based DLP (digital
light processing) projector will be desirable for most applications
of system 10. Projector 16 may also employ a shift lens to allow
for complete optical keystone correction in the projected image. As
noted above, the use of mirror 38 increases the length of the
projector's effective light path, mimicking an overhead placement
of projector 16, while still allowing a commercially reasonable
height for an integrated, standalone device 40.
[0028] One example of suitable characteristics for system 10 as a
standalone device 40 are set out in Table 1. (Dimension references
in Table 1 are to FIGS. 5 and 6.)
TABLE 1

  CAMERA                                   PROJECTOR
  Sensor Mpixel             12 Mp
  Sensor aspect ratio X/Y   1.333
  Pixel size                .00175 mm
  CX Object full size X     427 mm         PX Illum full-field X     310 mm
  CY Object full size Y     320 mm         PY Illum full-field Y     310 mm
  CH Camera height          450 mm         PH Projector height       670 mm
  CS Camera shift in Y      150 mm         PS Projector shift in Y   330 mm
  Magnification^-1          66
  Sensor pixels X           4016           Lens offset               216%
  Sensor pixels Y           3016           Lens shift                108%
  Sensor size X             7.028 mm       Max Y-fan angle           35.76 deg
  Sensor size Y             5.278 mm       Min Y-fan angle           14.84 deg
  Image size X              6.470 mm       Half-field X              203.5 mm
  Image size Y              4.848 mm       Half-field Y              482.5 mm
  Half-field X              213.5 mm       Throw ratio               1.65
  Half-field Y              280 mm         Max throw angle           38.01 deg
                                           Full-field angle          76.08 deg
  CC Camera clearance       51.6 mm
     distance
  Sampling resolution       220 ppi        GC Glare spot clearance   44.4 mm
                                              distance
  Capture length X          464.85 mm
  Capture length Y          348.35 mm
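Several of the Table 1 camera values can be cross-checked against one another. The sampling resolution follows from the sensor pixel counts and the capture lengths, and the pixel counts confirm the 12 Mp sensor figure:

```python
# Consistency check of Table 1: sensor pixels over capture length gives
# the sampling resolution, and pixel counts give the megapixel figure.

MM_PER_INCH = 25.4

sensor_px_x, sensor_px_y = 4016, 3016            # Sensor pixels X, Y
capture_len_x_mm, capture_len_y_mm = 464.85, 348.35   # Capture lengths

ppi_x = sensor_px_x / capture_len_x_mm * MM_PER_INCH
ppi_y = sensor_px_y / capture_len_y_mm * MM_PER_INCH
megapixels = sensor_px_x * sensor_px_y / 1e6

print(f"sampling: {ppi_x:.1f} x {ppi_y:.1f} ppi")   # ~220 ppi, as tabulated
print(f"sensor: {megapixels:.1f} Mp")               # ~12.1 Mp
```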
[0029] Since projector 16 acts as the light source for camera 14
for still and video capture, the projector light must be bright
enough to swamp out any ambient light that might cause defects from
specular glare. It has been determined that a projector light output
of 200 lumens or greater will be sufficiently bright to swamp out
ambient light for the typical desktop application for system 10 and
device 40. For video capture and real-time video collaboration, projector
16 shines white light into workspace 12 to illuminate object(s) 20.
For an LED projector 16, the time sequencing of the red, green, and
blue LEDs that make up the white light is synchronized with the
video frame rate of camera 14. The refresh rate of projector 16 and
each LED sub-frame refresh period should be an integral multiple of
the camera's exposure time for each captured frame to avoid
"rainbow banding" and other unwanted effects in the video image.
Also, the camera's video frame rate should be synchronized with the
frequency of any ambient fluorescent lighting that typically
flickers at twice the AC line frequency (e.g., 120 Hz for a 60 Hz
AC power line). An ambient light sensor can be used to sense the
ambient light frequency and adjust the video frame rate for camera
14 accordingly. For still image capture, the projector's red,
green, and blue LEDs can be turned on simultaneously for the
camera flash to increase light brightness in workspace 12, helping
swamp out ambient light and allowing faster shutter speeds and/or
smaller apertures to reduce noise in the image.
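One way to satisfy a timing constraint of the kind described above is to pick the shortest exposure that contains a whole number of LED sub-frames and a whole number of ambient flicker cycles. The rates below are hypothetical examples (60 Hz refresh with three LED sub-frames, 60 Hz mains), not values from the patent:

```python
# Sketch: shortest camera exposure that is an integer multiple of both the
# LED sub-frame period and the ambient flicker period, so neither source
# produces banding in the captured video. Rates are hypothetical examples.
from fractions import Fraction
from math import gcd

def min_common_exposure(subframe_hz, flicker_hz):
    """Smallest duration that is an integer multiple of both 1/subframe_hz
    and 1/flicker_hz, i.e. the least common multiple of the two periods."""
    p1, p2 = Fraction(1, subframe_hz), Fraction(1, flicker_hz)
    # lcm of two fractions: lcm(numerators) / gcd(denominators)
    num = p1.numerator * p2.numerator // gcd(p1.numerator, p2.numerator)
    den = gcd(p1.denominator, p2.denominator)
    return Fraction(num, den)

# 60 Hz projector refresh x 3 LED sub-frames -> 180 Hz sub-frame rate;
# 60 Hz AC mains -> 120 Hz fluorescent flicker.
exposure = min_common_exposure(180, 120)
print(exposure, "seconds")   # 1/60 s: 3 whole sub-frames and 2 flicker cycles
```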
[0030] The example configuration for system 10 integrated into a
standalone device 40 shown in the figures and described above
achieves a desirable balance among product size, performance,
usability, and cost. The folded light path for projector 16 reduces
the height of device 40 while maintaining an effective placement of
the projector high above workspace 12 to prevent specular glare in
the capture area of camera 14. The projector's light path shines on
a horizontal work surface 24 at a steep angle enabling 3D object
image capture. This combination of a longer light path and steep
angle minimizes the light fall off across the capture area to
maximize the light uniformity for camera flash. In addition, the
folded light path enables the placement of projector 16 near base
36 for product stability.
[0031] Suitable input devices and techniques for use in system 10
include, for example, finger touch, touch gestures, stylus, in-air
gestures, voice recognition, head tracking and eye tracking. A
touch pad can be used to enable a multi-touch interface for
navigating a graphical user interface or performing intuitive
gesture actions like push, flick, swipe, scroll, pinch-to-zoom, and
two-finger-rotate. Depth cameras using structured light,
time-of-flight, disturbed light pattern, or stereoscopic vision
might also be used to enable in-air gesturing or limited touch and
touch gesture detection without a touch pad. A touch-free digital
stylus is particularly well suited as a user input 26 for system
10. Thus, in the example shown in the figures, user input 26
includes an infrared digital stylus 28 and an infrared camera 30
for detecting stylus 28 in workspace 12. As noted above, a
touch-free digital stylus has the advantage of allowing input in
three dimensions, including along work surface 24, without a
sensing pad or other special surface.
[0032] Referring now to FIGS. 4 and 15, input device 26 includes
infrared stylus 28, infrared camera 30 and a stylus charging dock
54. Stylus 28 includes an infrared light 56, a touch sensitive nib
switch 58 to turn on and off light 56 automatically based on touch,
and a manual on/off switch 60 to manually turn on and off light 56.
(Nib switch 58 and manual switch 60 are shown in the block diagram
of FIG. 4.) Light 56 may be positioned, for example, in the tip of
stylus 28 as shown in FIG. 15 to help maintain a clear
line-of-sight between camera 30 and light 56. Light 56 may also
emit visible light to help the user determine if the light is on or
off.
[0033] Nib switch 58 may be touch sensitive to about 2 gr of force,
for example, to simulate a traditional writing instrument. When the
stylus's nib touches work surface 24 or another object, nib switch
58 detects the contact and turns on light 56. Light 56 turning on
is detected by camera 30 which signals a touch contact event
(similar to a mouse button click or a finger touch on a touch pad).
Camera 30 continues to signal contact, tracking any movement of
stylus 28, as long as light 56 stays on. The user can slide stylus
28 around on any surface like a pen to trace the surface or to
activate control functions. When the stylus nib is no longer in
contact with an object, light 56 is switched off and camera 30
signals no contact. Manual light switch 60 may be used to signal a
non-touching event. For example, when working in a three
dimensional workspace 12 the user may wish to modify, alter, or
otherwise manipulate a projected image above work surface 24 by
manually signaling a "virtual" contact event.
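The nib-switch behavior described above reduces to a small state model: the IR light is on whenever the nib force exceeds the threshold or the manual switch is held, and the infrared camera signals contact for as long as the light stays on. The class and function names below are illustrative, not from the patent:

```python
# Minimal state sketch of stylus 28 and the contact signal from camera 30.
# Names and the exact 2 g threshold handling are illustrative.

class InfraredStylus:
    """Nib switch turns the IR light on at about 2 g of force; a manual
    switch can hold the light on to signal a 'virtual' contact in air."""
    NIB_THRESHOLD_G = 2.0

    def __init__(self):
        self.nib_pressed = False
        self.manual_on = False

    def apply_nib_force(self, grams):
        self.nib_pressed = grams >= self.NIB_THRESHOLD_G

    def set_manual(self, on):
        self.manual_on = on

    @property
    def light_on(self):
        return self.nib_pressed or self.manual_on

def camera_signal(stylus):
    """What the infrared camera reports: contact while the light is on."""
    return "contact" if stylus.light_on else "no contact"

stylus = InfraredStylus()
print(camera_signal(stylus))        # no contact
stylus.apply_nib_force(2.5)         # nib touches the work surface
print(camera_signal(stylus))        # contact
stylus.apply_nib_force(0.0)         # nib lifted
stylus.set_manual(True)             # 'virtual' contact above the surface
print(camera_signal(stylus))        # contact
```

While the light stays on, camera 30 keeps tracking stylus movement, which is what makes drag-style gestures (tracing a surface, sliding a control) possible.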
[0034] Infrared camera 30 and mirror 38 define a three dimensional
infrared capture space 61 in workspace 12 within which infrared
camera 30 can effectively detect light from stylus 28. Capture
space 61 is bounded in the X and Y dimensions by an infrared camera
capture area 62 on work surface 24. In the example shown, as best
seen by comparing FIGS. 14 and 15, infrared camera capture space 61
is coextensive with projector display space 53. Thus, infrared
camera 30 may capture stylus activation anywhere in display space
53.
[0035] In one example implementation shown in FIG. 16, camera 30 is
integrated into the projection light path such that the projector
field-of-view and the infrared camera field-of-view are coincident
to help make sure stylus 28 and thus the tracking signal from
infrared camera 30 is properly aligned with the projector display
anywhere in workspace 12. Referring to FIG. 16, visible light 64
generated by red, green and blue LEDs 66, 68, and 70 in projector
16 passes through various optics 72 (including a shift lens 74) out
to mirror 38 (FIG. 14). Infrared light 75 from stylus 28 in
workspace 12 reflected off mirror 38 toward projector 16 is
directed to infrared camera sensor 76 by an infrared beam splitter
78 through a shift lens 80. (Similar to the example configuration
for camera 14 described above, infrared light sensor 76 for camera
30 may be oriented in a plane parallel to the plane of work surface
24 and light focused on sensor 76 through shift lens 80 for full
optical keystone correction.)
[0036] It may be desirable for some commercial implementations to
house projector 16 and infrared camera 30 together in a single
housing 82 as shown in FIG. 16. The geometrical configuration for
infrared camera 30 shown in FIG. 16 helps ensure that the stylus
tracking signal is aligned with the display no matter what height
stylus 28 is above work surface 24. If the projector field-of-view
and the infrared camera field-of-view are not coincident, it may be
difficult to calibrate the stylus tracking at more than one height
above work surface 24, creating the risk of a parallax shift
between the desired stylus input position and the resultant
displayed position.
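The parallax risk described above can be made concrete with a pinhole sketch: a sensor whose viewpoint is offset from the projector's "sees" a raised stylus at a different surface point, so a calibration done at one stylus height is wrong at another. All numbers below are hypothetical:

```python
# Sketch of the parallax shift when the infrared camera's view is NOT
# coincident with the projector's. 2-D (x, height) model; numbers are
# hypothetical, with 670 mm echoing the Table 1 projector height.

def apparent_surface_point(sensor_x, sensor_h, stylus_x, stylus_h):
    """Where a pinhole sensor at (sensor_x, sensor_h) sees the stylus tip
    hitting the surface: extend the sensor->stylus ray down to height 0."""
    t = sensor_h / (sensor_h - stylus_h)
    return sensor_x + t * (stylus_x - sensor_x)

stylus_x = 100.0   # mm, true lateral position of the stylus tip
coincident = apparent_surface_point(0.0, 670.0, stylus_x, 50.0)
offset_cam = apparent_surface_point(80.0, 670.0, stylus_x, 50.0)

# On the surface (height 0) both viewpoints agree; 50 mm up they differ.
print(round(coincident, 2), round(offset_cam, 2))
print("parallax shift:", round(abs(offset_cam - coincident), 2), "mm")
```

The shift grows with stylus height, which is why making the infrared camera's field of view coincident with the projector's (FIG. 16) keeps the tracking signal aligned with the display at any height above work surface 24.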
[0037] Although it is expected that workspace 12 usually will
include a physical work surface 24 for supporting an object 20,
work space 12 could also be implemented as a wholly projected work
space without a physical work surface. In addition, workspace 12
may be implemented as a three dimensional workspace for working
with two and three dimensional objects or as a two dimensional
workspace for working with only two dimensional objects. While the
configuration of workspace 12 usually will be determined largely by
the hardware and programming elements of system 10, the
configuration of workspace 12 can also be affected by the
characteristics of a physical work surface 24. Thus, in some
examples for system 10 and device 40 it may be appropriate to
consider that workspace 12 is part of system 10 in the sense that
the virtual workspace accompanies system 10 to be manifested in a
physical workspace when device 40 is operational, and in other
examples it may be appropriate to consider that workspace 12 is not
part of system 10.
[0038] The system 10 examples shown in the figures, with one camera
14 and one projector 16, do not preclude the use of two or more
cameras 14 and/or two or more projectors 16. Indeed, it may be
desirable in some applications for a system 10 to include more than
one camera, more than one projector or more than one of other
system components. Thus, the articles "a" and "an" as used in this
document mean one or more.
[0039] As noted at the beginning of this Description, the examples
shown in the figures and described above illustrate but do not
limit the invention. Other examples, embodiments and
implementations are possible. Therefore, the foregoing description
should not be construed to limit the scope of the invention, which
is defined in the following claims.
* * * * *