U.S. patent application number 12/606728 was filed on 2009-10-27 and published by the patent office on 2010-02-25 as publication number 20100045601 for interaction with a multi-component display.
This patent application is currently assigned to PURE DEPTH LIMITED. Invention is credited to Gareth Paul Bell and Gabriel Engel.
Publication Number | 20100045601 |
Application Number | 12/606728 |
Family ID | 19927326 |
Publication Date | 2010-02-25 |
United States Patent
Application |
20100045601 |
Kind Code |
A1 |
Engel; Gabriel; et al. |
February 25, 2010 |
INTERACTION WITH A MULTI-COMPONENT DISPLAY
Abstract
A system and method for interacting with at least one display
screen of a multi-component display is disclosed. A system includes
a first display screen operable to display a first image, wherein
the first display screen includes a first plurality of pixels. A
second display screen is operable to display a second image,
wherein the second display screen includes a second plurality of
pixels, wherein the second display screen overlaps the first
display screen, wherein the second display screen is further
operable to display the second image simultaneously with the
display of the first image, and wherein a portion of the first
image is viewable through the second display screen. A user
interface is operable to enable interaction with at least one
display screen selected from a group consisting of the first
display screen and the second display screen.
Inventors: |
Engel; Gabriel; (Hamilton 2001, NZ); Bell; Gareth Paul; (Hamilton, NZ) |
Correspondence
Address: |
MURABITO, HAO & BARNES, LLP
TWO NORTH MARKET STREET, THIRD FLOOR
SAN JOSE
CA
95113
US
|
Assignee: |
PURE DEPTH LIMITED
Auckland
NZ
|
Family ID: |
19927326 |
Appl. No.: |
12/606728 |
Filed: |
October 27, 2009 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
10/048,638 | Feb 1, 2002 | 7,626,594
PCT/NZ00/00143 | Aug 1, 2000 |
12/606,728 | |
Current U.S.
Class: |
345/161 ;
345/1.1; 345/520 |
Current CPC
Class: |
H04N 13/344 20180501;
H04N 13/10 20180501; H04N 13/339 20180501; H04N 13/189 20180501;
G09F 19/12 20130101; H04N 13/286 20180501; G02B 30/34 20200101;
H04N 13/398 20180501; H04N 13/194 20180501; H04N 13/341 20180501;
H04N 13/395 20180501 |
Class at
Publication: |
345/161 ;
345/1.1; 345/520 |
International
Class: |
G06F 13/14 20060101
G06F013/14; G09G 5/12 20060101 G09G005/12; G06F 3/033 20060101
G06F003/033 |
Foreign Application Data
Date |
Code |
Application Number |
Aug 1, 1999 |
NZ |
336212 |
Claims
1. A system comprising: a first display screen operable to display
a first image, wherein said first display screen comprises a first
plurality of pixels; a second display screen operable to display a
second image, wherein said second display screen comprises a second
plurality of pixels, wherein said second display screen overlaps
said first display screen, wherein said second display screen is
further operable to display said second image simultaneously with
said display of said first image, and wherein a portion of said
first image is viewable through said second display screen; and a
user interface operable to enable interaction with at least one
display screen selected from a group consisting of said first
display screen and said second display screen.
2. The system of claim 1, wherein said user interface overlaps a
viewable area of at least one display screen selected from a group
consisting of said first display screen and said second display
screen.
3. The system of claim 1, wherein said user interface is separate
from at least one display screen selected from a group consisting
of said first display screen and said second display screen.
4. The system of claim 1, wherein said user interface is located on
a panel overlapping said first display screen and said second
display screen.
5. The system of claim 1, wherein said user interface comprises at
least one user interface element.
6. The system of claim 5, wherein said at least one user interface
element is selected from a group consisting of a button and a
joystick.
7. The system of claim 5, wherein said at least one user interface
element is mechanically actuated.
8. The system of claim 5, wherein said user interface comprises a
touch-sensitive material, and wherein said at least one user
interface element is associated with a portion of said
touch-sensitive material.
9. The system of claim 5, wherein said at least one user interface
element is operable to adjust the display of an image selected from
a group consisting of said first image and said second image.
10. The system of claim 5, wherein said at least one user interface
element is operable to transition display of portions of images
between said first and second display screens.
10. The system of claim 1, wherein said second image is associated
with a user interface element operable to enable a user to interact
with said first image displayed on said first display screen.
11. The system of claim 1, wherein said first image is associated
with a user interface element operable to enable a user to interact
with said second image displayed on said second display screen.
12. The system of claim 1, wherein said first and second images
represent a single, three-dimensional object.
13. The system of claim 1, wherein said first image and said second
image each represent a different object with a different respective
depth.
14. A method of interacting with a multi-component display, said
method comprising: displaying a first image on a first display
screen of said multi-component display, wherein said first display
screen comprises a first plurality of pixels; displaying a second
image on a second display screen of said multi-component display,
wherein said second display screen comprises a second plurality of
pixels, wherein said first display screen and said second display
screen overlap, wherein said displaying said second image further
comprises displaying said second image simultaneously with said
displaying said first image, and wherein a portion of said first
image is viewable through said second display screen; in response
to an interaction with a user interface, adjusting a display of an
image selected from a group consisting of said first image and said
second image.
15. The method of claim 14, wherein said user interface overlaps a
viewable area of at least one display screen selected from a group
consisting of said first display screen and said second display
screen.
16. The method of claim 14, wherein said user interface is separate
from at least one display screen selected from a group consisting
of said first display screen and said second display screen.
17. The method of claim 14, wherein said user interface is located
on a panel overlapping said first display screen and said second
display screen.
18. The method of claim 14, wherein said user interface comprises
at least one user interface element.
19. The method of claim 18, wherein said at least one user
interface element is selected from a group consisting of a button
and a joystick.
20. The method of claim 18, wherein said at least one user
interface element is mechanically actuated.
21. The method of claim 18, wherein said user interface comprises a
touch-sensitive material, and wherein said at least one user
interface element is associated with a portion of said
touch-sensitive material.
23. The method of claim 18, wherein said adjusting said display
comprises transitioning display of portions of images between said
first display screen and said second display screen responsive to
an interaction with said user interface element.
24. The method of claim 14, wherein said second image is associated
with a user interface element, and wherein said adjusting said
display further comprises adjusting a display of said first image
responsive to an interaction with said user interface element.
25. The method of claim 14, wherein said first image is associated
with a user interface element, and wherein said adjusting said
display further comprises adjusting a display of said second image
responsive to an interaction with said user interface element.
26. The method of claim 14, wherein said first and second images
represent a single, three-dimensional object.
27. The method of claim 14, wherein said first image and said
second image each represent a different object with a different
respective depth.
28. A system comprising: means for displaying a first image on a
first display screen of a multi-component display, wherein said
first display screen comprises a first plurality of pixels; means
for displaying a second image on a second display screen of said
multi-component display, wherein said second display screen
comprises a second plurality of pixels, wherein said first display
screen and said second display screen overlap, wherein said means
for displaying said second image further comprises means for
displaying said second image simultaneously with said displaying
said first image, and wherein a portion of said first image is
viewable through said second display screen; means for adjusting,
in response to an interaction with a user interface, a display of
an image selected from a group consisting of said first image and
said second image.
29. The system of claim 28, wherein said user interface overlaps a
viewable area of at least one display screen selected from a group
consisting of said first display screen and said second display
screen.
30. The system of claim 28, wherein said user interface is separate
from at least one display screen selected from a group consisting
of said first display screen and said second display screen.
31. The system of claim 28, wherein said user interface is located
on a panel overlapping said first display screen and said second
display screen.
32. The system of claim 28, wherein said user interface comprises
at least one user interface element.
33. The system of claim 32, wherein said at least one user
interface element is selected from a group consisting of a button
and a joystick.
34. The system of claim 32, wherein said at least one user
interface element is mechanically actuated.
35. The system of claim 32, wherein said user interface comprises a
touch-sensitive material, and wherein said at least one user
interface element is associated with a portion of said
touch-sensitive material.
36. The system of claim 32, wherein said means for adjusting said
display comprises means for transitioning display of portions of
images between said first display screen and said second display
screen responsive to an interaction with said user interface
element.
37. The system of claim 28, wherein said second image is associated
with a user interface element, and wherein said means for adjusting
said display further comprises means for adjusting a display of said
first image responsive to an interaction with said user interface
element.
38. The system of claim 28, wherein said first image is associated
with a user interface element, and wherein said means for adjusting
said display further comprises means for adjusting a display of said
second image responsive to an interaction with said user interface
element.
39. The system of claim 28, wherein said first and second images
represent a single, three-dimensional object.
40. The system of claim 28, wherein said first image and said
second image each represent a different object with a different
respective depth.
Description
RELATED APPLICATIONS
[0001] The present application is a continuation of U.S. patent
application Ser. No. 10/048,638, filed Feb. 1, 2002, naming Gabriel
D. Engel and Pita Witehira as inventors, assigned to the assignee
of the present invention, and having attorney docket number
PURE-P004, which claims the benefit of International Application
Number PCT/NZ00/00143, filed Aug. 1, 2000, which claims the benefit
of New Zealand Patent Number 336212, filed Aug. 1, 1999. Each of
these applications is incorporated herein by reference in its
entirety and for all purposes.
BACKGROUND OF THE INVENTION
[0002] Because human eyes naturally perceive depth, it is a
disadvantage that most display systems are two dimensional.
Furthermore, there are many display applications in which the
realism of depth would improve the effectiveness of the display.
Many attempts have therefore been made to create display systems
with depth.
[0003] A number of display systems that present an image of depth
have been developed.
[0004] One class of such displays requires the viewer to wear some
form of eye-shield system that, by various means, allows the
viewer's two eyes to see different images concurrently displayed on
the same two-dimensional screen. However, many users find it
unsatisfactory to wear eye shields, and the method of providing two
different images on the same screen is cumbersome and inconvenient
for many applications.
[0005] A related but different class of displays presents a
different image to each eye by means of a binocular image system in
close proximity to both eyes. This method, however, is restricted
in the number of viewers who can use the system and again many
users find it unsatisfactory and uncomfortable to use.
[0006] A third class of display uses modifications of the two
dimensional screen surface: two images are created on the screen,
and multiple refractors on the screen surface direct one image into
the right eye and the other image into the left eye of a viewer in
the correct position.
[0007] This system requires users to be carefully positioned, is
inflexible, and has not found favor with many users.
[0008] The major problems with these systems were overcome by the
innovative screen techniques disclosed in PCT Patent Application
Nos. PCT/NZ98/00098 and PCT/NZ99/00021, which detail a screen
system producing a perception of depth comprising at least two
screens placed such that their axes are approximately co-linear,
each screen separated from the others in the direction of the
normal, wherein an image, or part of an image, displayed on one or
more screens can be selectively made transparent, opaque or
partially opaque as desired.
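The selective-transparency scheme described above amounts to per-pixel blending of stacked layers. The following sketch models it in Python (all names are illustrative assumptions, not terminology from the application): a front pixel with opacity 0.0 lets the rear screen show through completely, while opacity 1.0 hides it.

```python
# Minimal model of two stacked screens. Each layer is a list of
# (gray_value, alpha) tuples; alpha is the front pixel's opacity.
# Purely an illustrative sketch of the layering idea.

def composite(front, rear):
    """Blend a front layer over a rear layer, pixel by pixel.

    Returns the gray value a viewer would see looking through the
    front screen at the rear screen.
    """
    assert len(front) == len(rear)
    out = []
    for (fv, fa), (rv, _) in zip(front, rear):
        out.append(fv * fa + rv * (1.0 - fa))
    return out

# A rear image is fully visible wherever the front screen is transparent:
rear = [(0.2, 1.0), (0.8, 1.0), (0.5, 1.0)]
front = [(1.0, 0.0), (1.0, 1.0), (1.0, 0.5)]
print(composite(front, rear))  # [0.2, 1.0, 0.75]
```

The middle pixel shows only the front screen (alpha 1.0); the first shows only the rear (alpha 0.0); the last is an even mix, giving the partial opacity the cited applications describe.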
[0009] It has been found however that even with these types of
screens there are some applications where the operator needs more
control of the images, such as with computer games and other
interactive programs--for example training programs.
[0010] It is an object of the present invention to address the
foregoing problems or at least to provide the public with a useful
choice.
[0011] Further aspects and advantages of the present invention will
become apparent from the ensuing description which is given by way
of example only.
SUMMARY OF THE INVENTION
[0012] According to one aspect of the present invention there is
provided an interactive imaging system with depth, including at
least two screens configured to show a 3-dimensional image,
characterized in that a user can manipulate one or more parts of a
displayed image by using one or more on-screen touch control
means.
[0013] It should be understood that in preferred embodiments of the
present invention the three-dimensional composite image, spread over
two or more screens, can be made interactive with any sort of
controls, in particular with "touch" controls on a screen or on a
clear panel in front of the front screen.
[0014] It should be further understood that in preferred
embodiments the "touch" control can be activated by a variety of
items including, but not limited to, pointers, pens, fingers or
pencils.
[0015] One form of touch control means can be an image of a
"button" on the front and/or rear screens which, when touched, can
flip the display between two or more screens to show the
information relating to the button, or can perform an operation
associated with that button.
[0016] According to another aspect of the present invention there
is provided an interactive imaging system with depth, including at
least two screens configured to show a 3-dimensional image
characterized in that at least one part of the image, displayed on
one or more of the screens, can be manipulated by the actions of
the user by using one or more control means.
[0017] In preferred embodiments of the present invention a user can
manipulate one or more parts of an image by using one or more
control means located on or near the screens. These control means
can be in the form of a standard "keypress" button, a joystick-type
control, or "touch" controls located on at least one touchpad
adjacent to the screen, any of which can be readily purchased "off
the shelf."
[0018] It would be clear to anyone skilled in the art that these
are all "off the shelf" items that are readily available.
[0019] According to a further aspect of the present invention there
is provided an interactive imaging system which creates a
perception of depth, including at least two screens configured to
show a 3-dimensional image characterized in that at least one part
of the image, displayed on one or more of the screens, can be
manipulated by the actions of the user by using one or more control
means, and the information necessary to generate at least part of
an image can be transmitted from or received by the display
apparatus via the internet or by another suitable communications
means.
[0020] In preferred embodiments of the present invention there is
provided a method of controlling at least part of an image
displayed on an interactive imaging system which creates a
perception of depth including at least two screens configured to
show a 3-dimensional image, characterized by the step of
manipulating, by the actions of the user, one or more parts of an
image displayed on the interactive imaging system.
[0021] In some preferred embodiments of the present invention there
is provided a method of controlling at least part of an image
displayed on an interactive imaging system which creates a
perception of depth including at least two screens configured to
show a 3-dimensional image characterized by the step of sending or
receiving the information necessary to generate the image on the
interactive imaging system via the internet or any other suitable
communication means.
[0022] In preferred embodiments of the present invention the
images, or the data corresponding to the images, may be transmitted
over the Internet or by other communication means for display at
any compatible display unit, or in the absence of a suitable
display unit, as one or more separate images simultaneously on a
single screen display. The data corresponding to the images may be
stored at any compatible remote location for processing or
display.
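The separately held front and rear image data described above could, for illustration, be framed into a single message for transmission and unpacked at the receiving display. The JSON framing below is an assumption made for this sketch, not a protocol from the application:

```python
# Illustrative framing of front/rear layer data into one message.
# The field names "front" and "rear" are assumptions for this sketch.
import json

def pack_frames(front_pixels, rear_pixels):
    """Serialize both layers into a single JSON payload."""
    return json.dumps({"front": front_pixels, "rear": rear_pixels})

def unpack_frames(payload):
    """Recover the two layers from a received payload."""
    frames = json.loads(payload)
    return frames["front"], frames["rear"]

msg = pack_frames([0, 1, 2], [9, 8, 7])
print(unpack_frames(msg))  # ([0, 1, 2], [9, 8, 7])
```

Keeping the layers as separate fields also reflects the point in the text that a receiver without a layered display could fall back to showing the two images side by side on a single screen.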
[0023] Therefore the present invention has huge advantages over the
display systems currently available as a far greater amount of data
can be displayed on the display system.
[0024] Generally, data for front and rear images can be obtained
and stored separately.
[0025] Applications where this is appropriate can be in kiosks,
games, simulators, training devices and the like.
[0026] For example, a flight simulator in its simplest form may
consist of two screens wherein the front screen may display the
cockpit instruments, control settings and generally illustrate the
interior of the cockpit, while the rear screen shows the image as
seen through the cockpit windscreen--such as other aircraft, sky,
cloud, ground, the runway and so on, thereby giving the operator
a sense of true perspective as different maneuvers are
simulated.
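The two-screen simulator example can be pictured as simple layer composition, with the front (cockpit) layer transparent wherever the rear (scenery) layer should show through. The names below are hypothetical; this is a sketch of the idea, not of any actual simulator:

```python
# Toy two-layer flight-simulator frame: the rear screen carries the
# out-the-windscreen view, and the front screen carries cockpit
# instruments, being transparent (None) everywhere else.

def render_frame(rear_row, front_row):
    """Front cells that are None let the rear screen show through."""
    return [f if f is not None else r for f, r in zip(front_row, rear_row)]

scenery = ["sky", "cloud", "runway", "ground"]
cockpit = [None, "altimeter", None, "throttle"]
print(render_frame(scenery, cockpit))
# ['sky', 'altimeter', 'runway', 'throttle']
```

Because the instruments sit on a physically nearer screen than the scenery, the operator gets true depth between cockpit and world rather than a flat painted overlay.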
[0027] Either conventional instrument displays or "head-up"
displays can be simulated with this invention, with the
"touch-screen" ability improving the "playability" of these
applications.
[0028] Display kiosks in stores and elsewhere can be configured to
show images of products, their use, and other pictorial data
describing and promoting the product on the rear screen, while
written or symbolic information about the product is shown on the
front screen.
[0029] Alternatively, this order may be reversed, combined or
arranged as appropriate for the preferred method of
presentation.
[0030] The front screen may also have interactive functions such as
touch controls, selectors and the like which allow the viewer to
select or control either or all of the display screens.
[0031] Alternatively the controls may be separate from, but in
close proximity to, the screen and still allow the user to
manipulate or select separately or simultaneously what is displayed
on each screen.
[0032] Kiosks based on the invention may be used for a variety of
advertising and information presentation purposes. For example, a
customer may be attracted to the kiosk by attractive three
dimensional images, which can then show advertising in an
unobtrusive manner principally on one screen while other screens at
different depths continue to hold the viewer's attention. The
viewer may be encouraged to concentrate on action occurring on one
screen while advertising or other messages are unobtrusively shown
on parts of another screen, typically the front screen, which may
be mostly transparent.
[0033] This has a significant advantage over prior systems in that
far more information can be displayed at any one time; for
instance, a two-screen system makes twice as much information
available to the operator as a single-screen system.
[0034] The use of kiosks based on the invention allows the
dissemination of more advertising within the same footprint or
floor area, while also enabling the advertising to be made less
obtrusive and more acceptable to customers, allowing the
advertising to be more effective.
[0035] In effect, the available screen size within the same
footprint or floor area is expanded, allowing more information to
be displayed in a form that is easier to absorb.
[0036] One major advantage of the present invention over previously
available systems is that, with either on-screen touch controls or
controls located adjacent to the screen system, the operator does
not need to take their gaze away from the screen area in order to
perform a control function.
[0037] This not only means that their concentration is not broken
but also that they will be able to cope with a higher information
rate.
[0038] There are a number of applications which are ideally suited
to this aspect, in particular computer gaming, where taking one's
eyes from the screen can severely hurt performance.
[0039] The use of the present invention means that a computer gamer
for instance will have a much faster response time to any given
situation and less likelihood of missing any on-screen event.
[0040] This has even further advantages when the image or images
are transferred over the internet as the advantages disclosed
previously can be applied to on-line applications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0041] FIG. 1 shows a first perspective representation in
accordance with one embodiment of the present invention.
[0042] FIG. 2 shows a second perspective representation in
accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0043] With reference to FIG. 1 there is illustrated an interactive
imaging system with a perception of depth, generally indicated by
arrow 1. The interactive imaging system 1 comprises a number of
parallel screens 2 configured so that they give a perception of
depth.
[0044] An image, or part of an image, contained on one or more of
the screens 2 can be manipulated by use of the on-screen touch
controls 3.
[0045] It should be appreciated that the on-screen touch controls
are of a known off-the-shelf type.
[0046] The on-screen touch controls 3 can be configured to perform
a variety of functions including the switching of the screens to
the foreground and the manipulation of part of an image from one
screen to another.
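One way to picture moving part of an image from one screen to another is to fade it out on one layer while fading it in on the other, so that it appears to change depth. This sketch is an assumption about one possible implementation, not a mechanism stated in the application:

```python
# Sketch of transitioning an image element between depth layers:
# over a number of steps the element's opacity on the front screen
# falls from 1.0 to 0.0 while its opacity on the rear screen rises
# from 0.0 to 1.0.

def transition_steps(steps):
    """Yield (front_alpha, rear_alpha) pairs for a front-to-rear move."""
    for i in range(steps + 1):
        t = i / steps
        yield (1.0 - t, t)

pairs = list(transition_steps(4))
print(pairs[0], pairs[-1])  # (1.0, 0.0) (0.0, 1.0)
```

Running the same schedule in reverse would bring an element forward, which is one way the "switching of the screens to the foreground" mentioned above could be animated.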
[0047] It is envisaged that in some preferred modes of operation
the interactive imaging system 1 will display three dimensional
images on the screens 2 that have been transmitted to the
interactive imaging system 1 via the internet.
[0048] With reference to FIG. 2 there is shown an interactive
imaging system with a perception of depth where an image, or part
of an image, contained on one or more of the screens 2 can be
manipulated by use of controls at the side of the screen which work
in the same manner as those in FIG. 1.
[0049] It should also be appreciated that these can be replaced by
other controls such as an off-the-shelf type joystick.
[0050] Aspects of the present invention have been described by way
of example only and it should be appreciated that modifications and
additions may be made thereto without departing from the scope
thereof.
* * * * *