U.S. patent application number 12/964660 was filed with the patent office on 2010-12-09 and published on 2011-12-01 for apparatus and method for gesture control of a dual panel electronic device. This patent application is currently assigned to KNO, INC. Invention is credited to Kimberly Cameron, Paul S. Chambers, Babur Habib, Osman Rashid, Kyrie Robinson, David M. Straus, Ann Sydeman, William G. Tsui.
Publication Number: 20110291964
Application Number: 12/964660
Family ID: 45021681
Publication Date: 2011-12-01
United States Patent Application 20110291964
Kind Code: A1
Chambers; Paul S.; et al.
December 1, 2011

Apparatus and Method for Gesture Control of a Dual Panel Electronic Device
Abstract
An electronic device includes a processor, a first touch screen
and a second touch screen. The first touch screen displays an
object. An object transition module executed by the processor
includes executable instructions to map a gesture applied to the
object to a set of object movement parameters and then move the
object from the first touch screen to the second touch screen in
accordance with the object movement parameters.
Inventors: Chambers; Paul S.; (San Jose, CA); Tsui; William G.; (Oakland, CA); Sydeman; Ann; (Woodside, CA); Robinson; Kyrie; (Palo Alto, CA); Straus; David M.; (Los Altos, CA); Rashid; Osman; (Fremont, CA); Habib; Babur; (San Francisco, CA); Cameron; Kimberly; (Sunnyvale, CA)
Assignee: KNO, INC. (Santa Clara, CA)
Family ID: 45021681
Appl. No.: 12/964660
Filed: December 9, 2010
Related U.S. Patent Documents

Application Number: 61/396,789
Filing Date: Jun 1, 2010
Current U.S. Class: 345/173
Current CPC Class: G06F 1/1643 20130101; G06F 1/1647 20130101; G06F 3/0488 20130101; G06F 3/0482 20130101; G06F 3/04883 20130101; G06F 1/1616 20130101; G06F 3/04842 20130101; G06F 3/0483 20130101
Class at Publication: 345/173
International Class: G06F 3/041 20060101 G06F003/041
Claims
1. An electronic device, comprising: a processor, a first touch
screen and a second touch screen, wherein the first touch screen
displays an object; and an object transition module executed by the
processor, the object transition module including executable
instructions to: map a gesture applied to the object to a set of
object movement parameters, and move the object from the first
touch screen to the second touch screen in accordance with the
object movement parameters.
2. The electronic device of claim 1 wherein the gesture is a single
touch gesture.
3. The electronic device of claim 2 wherein the single touch
gesture is a flick gesture characterized by some combination of
position, velocity magnitude and velocity direction at the point of
release from a contact point.
4. The electronic device of claim 3 wherein the object transition
module applies a physics-based computation to the object movement
parameters.
5. The electronic device of claim 4 wherein the physics-based
computation utilizes assigned parameters to additional objects of
the first touch screen and the second touch screen.
6. The electronic device of claim 4 wherein the physics-based
computation utilizes assigned parameters to portions of the first
touch screen and the second touch screen.
7. The electronic device of claim 1 wherein the object transition
module models a discontinuity between the first touch screen and
the second touch screen as a hill that an object must traverse.
8. An electronic device, comprising: a processor; individual touch
screens; and an object transition module executed by the processor,
the object transition module including executable instructions to
map a gesture to a set of object movement parameters and thereby
trigger an exchange of contents between touch screens based on the
object movement parameters.
9. The electronic device of claim 8 wherein the gesture is a pinch
gesture across a discontinuity between two touch screens.
10. An electronic device, comprising: a processor; individual touch
screens; and an object transition module executed by the processor,
the object transition module including executable instructions to:
identify a first gesture that designates a selected object on a
first screen, and identify a second gesture that moves the content
of a second screen beneath the selected object on the first
screen.
11. The electronic device of claim 10 wherein the object transition
module receives a third gesture to attach the selected object to
the content of the second screen.
12. The electronic device of claim 11 wherein the object transition
module replaces the content of the second screen with content from
the page following the content of the second screen.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent
Application 61/396,789 filed Jun. 1, 2010, entitled "Electronic
Device for Education", the contents of which are incorporated
herein by reference.
FIELD OF THE INVENTION
[0002] The invention relates generally to electronic devices. More
particularly, the invention relates to methods and devices for
manipulating objects near or across discontinuities between touch
sensitive screens of an electronic device.
BACKGROUND OF THE INVENTION
[0003] In many applications, it is desirable to have multiple touch
screen displays. For example, many electronic book readers are
known to provide multiple screens. It is apparent that for devices
that have discontinuous touch sensitive displays it is desirable to
be able to move objects not only within one display but also
between discontinuous displays. Traditionally for multiple display
devices, a standard pointing device such as a mouse is manipulated
on a flat continuous surface and software maps the position on the
continuous surface to the entire display of multiple screens. For
devices with touch sensitive displays, however, the pointing device
itself must cross a discontinuity, not just the object on the
screen.
[0004] It would be desirable to provide an intuitive and seamless
movement of objects between screens. More particularly, it would be
desirable to provide an easier, intuitive and seamless way to
manipulate objects near and across a touch screen
discontinuity.
SUMMARY OF THE INVENTION
[0005] One embodiment of the invention includes an electronic
device with a processor, a first touch screen and a second touch
screen. The first touch screen displays an object. An object
transition module executed by the processor includes executable
instructions to map a gesture applied to the object to a set of
object movement parameters and then move the object from the first
touch screen to the second touch screen in accordance with the
object movement parameters.
[0006] Another embodiment of the invention includes an electronic
device with a processor, individual touch screens and an object
transition module executed by the processor. The object transition
module includes executable instructions to map a gesture to a set
of object movement parameters and thereby trigger an exchange of
contents between touch screens based on the object movement
parameters.
[0007] Another embodiment of the invention includes an electronic
device with a processor, individual touch screens and an object
transition module executed by the processor. The object transition
module includes executable instructions to identify a first gesture
that designates a selected object on a first screen and a second
gesture that moves the content of a second screen beneath the
selected object on the first screen.
BRIEF DESCRIPTION OF THE FIGURES
[0008] The invention is more fully appreciated in connection with
the following detailed description taken in conjunction with the
accompanying drawings, in which:
[0009] FIG. 1 is a view of an electronic device with a dual
display. Movement of an object between the two displays is shown in
accordance with an embodiment of the invention.
[0010] FIG. 2 is a flow diagram of a process of moving an object
near and across a touch discontinuity in accordance with an
embodiment of the invention.
[0011] FIG. 3 is a view of an electronic device with a dual
display. The gesture depicted exchanges the content of the displays
according to an embodiment of the invention.
[0012] FIG. 4 is a view of an electronic device with a dual
display. The gesture depicted holds one object stationary and
replaces the content of the display underneath with the contents of
an adjacent screen according to an embodiment of the invention.
[0013] FIG. 5 is an electronic device configured in accordance with
an embodiment of the invention.
[0014] Like reference numerals refer to corresponding parts
throughout the several views of the drawings.
DETAILED DESCRIPTION OF THE INVENTION
[0015] FIG. 1 illustrates an electronic device 100 with a first
touch screen 102 and a second touch screen 104. The screens are
separated by a hinge or discontinuity 106. A hand gesture on the
first touch screen 102 determines the speed, position, trajectory,
behavior at the discontinuity, and final location of an object 108
on the second touch screen 104. The gesture provides sufficient
parameters to determine the complete motion of the object 108 from
its initial position to its final position.
[0016] For example, a flick gesture includes contacting the touch
screen, moving the contact point across the screen and releasing
the contact with the screen before the contact point has come to
rest. The parameters of this gesture can accomplish the selection
of an object to be moved, the trajectory and speed of the object
along the trajectory, the behavior across the discontinuity and the
final position of the object. The object is animated across the
screen according to the computed parameters and trajectory.
[0017] Parameters of the flick gesture such as the position of the
point where contact was removed, the velocity of the contact point
at the time of the release from the contact point, and the
direction of the velocity of the contact point at the time of the
release can be used in conjunction with a physics based algorithm
that includes deceleration or damping to determine the speed and
trajectory of the object. For example, physical properties may be
simulated to give the visual element the appearance of behaving as
a physical object might in the real world (e.g., as if the visual
element were lying on a low-friction surface and flicked with the
finger). Additionally, some degree of resistance may be associated
with a screen edge, so that an object flicked with insufficient
velocity/momentum to make it entirely across from one display to
another is not left astride the two when it comes to a stop. One
possible implementation is to simulate a `hump` between the
adjacent panels such that a visual element that is just short of
the required velocity will `slide back` onto the original display,
and one with just above the required velocity will continue to
slide entirely onto the adjacent display.
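The flick-with-"hump" behavior described above can be illustrated with a short one-dimensional simulation. This is a hypothetical sketch, not code from the disclosure; the friction factor, hump cost, and panel width are assumed values.

```python
# Hypothetical one-dimensional sketch of the flick/"hump" behavior.
# All constants are illustrative assumptions, not values from the patent.

FRICTION = 0.95       # per-step velocity damping (assumed)
HUMP_COST = 3.0       # velocity consumed crossing the hinge (assumed)
PANEL_WIDTH = 100.0   # logical width of each panel; hinge at x == PANEL_WIDTH

def flick(x, vx):
    """Advance a flicked object until it comes to rest; return final x.

    x  -- position along the flick axis (panel 1 spans 0..PANEL_WIDTH)
    vx -- release velocity taken from the flick gesture
    """
    crossed = x >= PANEL_WIDTH
    while abs(vx) > 0.01:
        x += vx
        vx *= FRICTION
        if not crossed and x >= PANEL_WIDTH:
            if abs(vx) > HUMP_COST:
                vx -= HUMP_COST          # enough momentum: slide onto panel 2
                crossed = True
            else:
                x = PANEL_WIDTH - 1.0    # just short: slide back onto panel 1
                vx = -abs(vx) * 0.5
    return x
```

Under these assumed constants, a fast flick from panel 1 (`flick(50.0, 10.0)`) clears the hump and settles on panel 2, a slow one (`flick(50.0, 2.0)`) stops short of the hinge, and a slow flick released near the hinge slides back onto the original panel.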
[0018] As the object nears the boundary, the object can be displayed
as split across the boundary. For situations where the object is
dragged in the traditional way, whether the object moves across the
boundary can then be based not just upon the position of the touch
indicator (finger, stylus, etc.) but also upon the boundaries of the
visual display of the object. For example, if the object display does
not move more than halfway into the target screen before the contact
point is released, the object will not cross to the target screen but
will return to the source screen.
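The halfway rule above reduces to a simple width comparison; the function and parameter names here are hypothetical.

```python
def settles_on_target(obj_left, obj_width, boundary_x):
    """Decide where a dragged object released astride the boundary settles.

    Per the halfway rule: the object crosses to the target (right-hand)
    screen only if more than half of its width lies past boundary_x;
    otherwise it returns to the source screen.
    """
    overlap = (obj_left + obj_width) - boundary_x  # width past the boundary
    return overlap > obj_width / 2.0
```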
[0019] A flow chart of one embodiment of the flicking algorithm as
well as the behavior near the edge of the panels and near the
discontinuity is given in FIG. 2. Throughout the flowchart, the
term finger is used but could equally well be a pen, stylus or
other device.
[0020] Upon receiving a touch event 201 the process then determines
whether the finger is contacting the display screen 202. If it is,
then the process next determines whether the finger was contacting
the display screen in the previous check 203. If the finger was not
contacting the display screen in the previous check then a timer is
started to determine how long the finger has been contacting the
display screen 204, the object underneath the finger is determined
205 and the relative position of the finger and object is
determined 206. This information is later used to determine whether
the finger has moved a distance greater than a specified threshold
value 208 and whether the timer has expired 209.
[0021] If the object is not floating, the finger has moved a
distance greater than the threshold and the timer has expired, then
the object state is set to floating 210. Once an object is floating
and the finger is contacting the display and was contacting the
display immediately prior, the object is displaced with the finger
211. If the object then extends beyond the display edges 212 a
correction is calculated 213 and is applied to bring the object
back onto the display 214. An inverse correction is applied to the
calculated finger position relative to the object 215. In this
manner, objects can be placed into a floating state and dragged
around the display corresponding to the motion of a finger in
contact with the display.
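Steps 211 through 215 above can be sketched as a single drag-update function. The dictionary-based object representation and the function name are assumptions for illustration.

```python
def drag_update(obj, finger_pos, offset, display_w, display_h):
    """One drag step for a floating object (flowchart steps 211-215).

    offset is the finger position relative to the object's corner,
    captured when the drag began (step 206). Returns the possibly
    adjusted offset.
    """
    # Step 211: displace the object with the finger.
    obj["x"] = finger_pos[0] - offset[0]
    obj["y"] = finger_pos[1] - offset[1]
    # Steps 212-214: if the object extends beyond the display edges,
    # compute a correction and bring it back onto the display.
    cx = cy = 0.0
    if obj["x"] < 0:
        cx = -obj["x"]
    elif obj["x"] + obj["w"] > display_w:
        cx = display_w - (obj["x"] + obj["w"])
    if obj["y"] < 0:
        cy = -obj["y"]
    elif obj["y"] + obj["h"] > display_h:
        cy = display_h - (obj["y"] + obj["h"])
    obj["x"] += cx
    obj["y"] += cy
    # Step 215: apply the inverse correction to the finger-relative offset.
    return (offset[0] - cx, offset[1] - cy)
```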
[0022] Once the finger is released from the display, such that the
finger is not contacting the display but was immediately prior 216,
the process next evaluates whether the object was floating 217. If
the object was not floating, the gesture is treated as a standard
tap 218. If the object was floating, then the process determines
whether the finger was moving immediately before the contact was
released 219. If the finger was moving then the flick motion vector
is calculated 220, and the object position is updated based on some
combination of the motion vector, position, velocity, friction etc.
221. The update 221 is repeated after an update time period 222
until the new position is the same as the old 223. If the finger
was not moving and the object straddles displays 224, then the
process determines which display the object should move to 225 and
calculates the motion vector to correct the object position 226. If
the finger was not moving and the object did not straddle the
displays then the object state is changed to not floating 227.
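The release-time branch of the flowchart (steps 219 through 227) might look like the following sketch. The damping constant, the dictionary representation, and the stopping test are illustrative assumptions, not details from the patent.

```python
DAMPING = 0.9  # assumed per-update damping factor

def straddles(obj, boundary_x):
    """True if the object spans the boundary between the two displays."""
    return obj["x"] < boundary_x < obj["x"] + obj["w"]

def on_release(obj, was_moving, release_velocity, boundary_x):
    """Resolve a floating object's final position when the finger lifts."""
    if was_moving:
        # Steps 220-223: apply the flick motion vector, damped each update
        # period, until the new position equals the old one.
        vx = release_velocity
        while True:
            old_x = obj["x"]
            obj["x"] += vx
            vx *= DAMPING
            if round(obj["x"], 2) == round(old_x, 2):
                break
    elif straddles(obj, boundary_x):
        # Steps 224-226: snap to whichever display holds more of the object.
        mid = obj["x"] + obj["w"] / 2.0
        if mid >= boundary_x:
            obj["x"] = boundary_x             # settle onto the right display
        else:
            obj["x"] = boundary_x - obj["w"]  # settle onto the left display
    obj["floating"] = False  # step 227: object is no longer floating
```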
[0023] In addition, other objects on the screen can affect the
trajectory of the flicked object. For example, properties may be
assigned to other on-screen objects such that an on-screen object
can mimic the effect of gravity or magnetism on the flicked object.
An on-screen object can attract slow-moving flicked objects but have
a lesser effect on fast-moving flicked objects. Alternatively,
particular areas of the screen or on-screen objects may be assigned
properties, such as coefficients of friction, that affect the
trajectory of the flicked object and make particular screen areas
more likely to be the final resting position of the flicked
object.
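The speed-dependent attraction could be modeled as an inverse-square pull scaled down by object speed. This is one possible model with assumed constants and names; the patent does not specify a formula.

```python
import math

def attraction_accel(obj_pos, obj_speed, attractor_pos, strength=50.0):
    """Acceleration imparted on a flicked object by an on-screen attractor.

    Gravity-like 1/r^2 falloff, divided by (1 + speed) so slow-moving
    objects are pulled harder than fast-moving ones, as described above.
    """
    dx = attractor_pos[0] - obj_pos[0]
    dy = attractor_pos[1] - obj_pos[1]
    dist = math.hypot(dx, dy) or 1e-6   # avoid division by zero
    pull = strength / (dist * dist) / (1.0 + obj_speed)
    return (pull * dx / dist, pull * dy / dist)
```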
[0024] In another aspect of the invention, using a gesture in a
particular screen location can move the contents of one display to
another, change the display mode from one screen to multiple
screens or exchange the contents of adjacent displays. In one
embodiment, using a flick gesture on the title bar of an
application or program in one page mode can cause the application
or program to switch panels if the direction of the flick is in the
direction of the opposite panel. Likewise, a pinch close motion on
the title bar of a two-page application can change the mode to
one-page and a pinch open motion of the title bar of a one-page
application can change the mode to two-page. A pinch close across
the spine when two one page applications are displayed switches the
two applications that are running in one-page mode. In another
embodiment the pinch close could also be used to switch the pages
across the spine within an application running in two-page
mode.
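The gesture-to-mode mappings in this paragraph can be summarized as a dispatch table. The string names below are illustrative, and the flick's direction test and the in-application two-page swap are omitted for brevity.

```python
def handle_layout_gesture(gesture, target, mode):
    """Map a layout gesture to an action, per the mappings described above.

    gesture -- "flick", "pinch_close", or "pinch_open"
    target  -- "title_bar" or "spine"
    mode    -- current display mode: "one_page" or "two_page"
    Returns the resulting action name, or None if the gesture has no effect.
    """
    if gesture == "flick" and target == "title_bar" and mode == "one_page":
        return "switch_panel"   # app moves to the opposite panel
    if gesture == "pinch_close" and target == "title_bar" and mode == "two_page":
        return "set_one_page"   # collapse the app to one-page mode
    if gesture == "pinch_open" and target == "title_bar" and mode == "one_page":
        return "set_two_page"   # expand the app to two-page mode
    if gesture == "pinch_close" and target == "spine" and mode == "one_page":
        return "swap_panels"    # exchange the two one-page apps
    return None
```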
[0025] FIG. 3 depicts the exchange of screen contents based on a
gesture. In FIG. 3, before the pinch gesture 300, page 1 was
displayed on the left side display 102 and page 2 was displayed on
the right side display 104. The pinch gesture causes the exchange
of the pages so that page 1 is on the right side display 104 and
page 2 is on the left side display 102.
[0026] In yet another aspect of the invention, moving objects
between pages is accomplished by moving the display rather than the
object. For example, the object could be held stationary on one
screen while another gesture could be used to replace everything on
the screen with the contents of the opposite panel, except the
selected stationary object. The selected stationary object would
appear on top of the new display content. This could be implemented
by using a gesture to put the object into a floating mode where the
object is detached from the page. While the object is in this
state, other objects within the displays can be changed in the
traditional manner (page turning, navigating etc.). The selected
object could be reattached to a different page (either in the same
application or a different one) by another gesture such as tapping
on the object.
[0027] FIG. 4 depicts the movement of the display contents beneath
a selected stationary object 400. In particular, page 2 of the
second touch screen 104 is slid over to the first touch screen 102,
while the selected stationary object remains in position.
[0028] FIG. 5 illustrates an electronic device 500 configured in
accordance with an embodiment of the invention. The electronic
device 500 includes a processor 510 connected to a set of
input/output devices 512 via a bus 514. The input/output devices
512 include at least two touch screens. In addition, the
input/output devices 512 may include a keyboard, mouse, speaker,
printer and the like. A network interface circuit 516 is also
connected to the bus 514 so that the electronic device 500 may
operate in a networked environment. A memory 520 is also connected
to the bus. The memory 520 includes executable instructions to
implement operations of the invention. For example, an object
transition module 522 includes executable instructions to implement
operations described throughout this specification and accompanying
figures.
[0029] An embodiment of the present invention relates to a computer
storage product with a computer readable storage medium having
computer code thereon for performing various computer-implemented
operations. The media and computer code may be those specially
designed and constructed for the purposes of the present invention,
or they may be of the kind well known and available to those having
skill in the computer software arts. Examples of computer-readable
media include, but are not limited to: magnetic media such as hard
disks, floppy disks, and magnetic tape; optical media such as
CD-ROMs, DVDs and holographic devices; magneto-optical media; and
hardware devices that are specially configured to store and execute
program code, such as application-specific integrated circuits
("ASICs"), programmable logic devices ("PLDs") and ROM and RAM
devices. Examples of computer code include machine code, such as
produced by a compiler, and files containing higher-level code that
are executed by a computer using an interpreter. For example, an
embodiment of the invention may be implemented using JAVA.RTM.,
C++, or other object-oriented programming language and development
tools. Another embodiment of the invention may be implemented in
hardwired circuitry in place of or in combination with,
machine-executable software instructions.
[0030] The foregoing description, for purposes of explanation, used
specific nomenclature to provide a thorough understanding of the
invention. However, it will be apparent to one skilled in the art
that specific details are not required in order to practice the
invention. Thus, the foregoing descriptions of specific embodiments
of the invention are presented for purposes of illustration and
description. They are not intended to be exhaustive or to limit the
invention to the precise forms disclosed; obviously, many
modifications and variations are possible in view of the above
teachings. The embodiments were chosen and described in order to
best explain the principles of the invention and its practical
applications, thereby enabling others skilled in the art to best
utilize the invention and various embodiments with various
modifications as are suited to the particular use contemplated. It
is intended that the following claims and their equivalents define
the scope of the invention.
* * * * *