United States Patent Application 20150009165, Kind Code A1, Bychkov; Eyal; et al., published January 8, 2015.

U.S. patent application number 14/485815, for a counter-tactile keypad, was filed with the patent office on September 15, 2014. The applicant listed for this patent is Google Inc. The invention is credited to Eyal Bychkov and Hagay Katz.

COUNTER-TACTILE KEYPAD
Abstract
An electronic device, including electronic circuitry contained
within a housing, an orientation sensor for detecting an
orientation of the housing, a button-based user interface on a
first side of the housing including physical buttons for enabling
button-press user inputs to the electronic circuitry, and a
touch-sensitive screen on a second side of the housing for enabling
touch-based user inputs to the electronic circuitry, the second
side being opposite the first side, wherein the electronic
circuitry is operative (i) to process button-press user inputs and
to ignore touch-based user inputs, when the orientation sensor
detects an orientation of the housing wherein the second side is
facing down and the first side is facing up, and (ii) to process
touch-based user inputs and to ignore button-press user inputs,
when the orientation sensor detects an orientation of the housing
wherein the first side is facing down and the second side is facing
up.
Inventors: Bychkov; Eyal (Hod Hasharon, IL); Katz; Hagay (Moshav Herut, IL)

Applicant: Google Inc., Mountain View, CA, US

Family ID: 39790755

Appl. No.: 14/485815

Filed: September 15, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
12190599 | Aug 13, 2008 | 8836637
14485815 | |
60964872 | Aug 14, 2007 |
Current U.S. Class: 345/173

Current CPC Class: G06F 3/041 20130101; H04M 2250/22 20130101; Y02D 30/70 20200801; Y02D 70/122 20180101; G06F 1/1626 20130101; G06F 3/0489 20130101; H04M 1/236 20130101

Class at Publication: 345/173

International Class: G06F 3/041 20060101 G06F003/041
Claims
1.-21. (canceled)
22. An electronic device, comprising: a housing; electronic
circuitry contained within said housing; an orientation sensor
coupled to said electronic circuitry, for detecting an orientation
of said housing; a button-based user interface mounted on a first
side of said housing comprising physical buttons for enabling
button-press user inputs to said electronic circuitry; and a
touch-sensitive screen mounted on a second side of said housing for
enabling touch-based user inputs to said electronic circuitry, the
second side being opposite the first side, wherein said electronic
circuitry is operative (i) to process button-press user inputs and
to ignore touch-based user inputs, when said orientation sensor
detects an orientation of said housing wherein the second side is
facing down and the first side is facing up, and (ii) to process
touch-based user inputs and to ignore button-press user inputs,
when said orientation sensor detects an orientation of said housing
wherein the first side is facing down and the second side is facing
up.
23. The electronic device of claim 22 wherein said button-based
user interface enables substantially the same input commands, at
any given time, as the touch-sensitive screen.
24. A method performed by an electronic device, the method
comprising: receiving, by the electronic device, notification that
a user has pressed a physical input button on a first side of the
electronic device; further receiving, by the electronic device,
notification that a user has touched a touch-sensitive screen on a
second side of the electronic device, the second side being
opposite the first side; detecting, by the electronic device, an
orientation of the electronic device; and processing, by the electronic
device, said receiving and ignoring, by the electronic device, said
further receiving, when said detecting detects an orientation of
the electronic device wherein the second side is facing down and
the first side is facing up; and processing, by the electronic
device, said further receiving and ignoring, by the electronic
device, said receiving, when said detecting detects an orientation
of the electronic device wherein the first side is facing down and
the second side is facing up.
25. A non-transitory computer readable storage medium storing
program code for causing an electronic device to perform a method,
the method comprising: receiving notification that a user has
pressed a physical input button on a first side of the electronic
device; further receiving notification that a user has touched a
touch-sensitive screen on a second side of the electronic device,
the second side being opposite the first side; detecting an
orientation of the electronic device; and processing said receiving and
ignoring said further receiving, when said detecting detects an
orientation of the electronic device wherein the second side is
facing down and the first side is facing up; and processing said
further receiving and ignoring said receiving, when said detecting
detects an orientation of the electronic device wherein the first
side is facing down and the second side is facing up.
26. An electronic device, comprising: a housing; electronic
circuitry contained within said housing; a button-based user
interface mounted on a first side of said housing comprising
physical buttons for enabling button-press user inputs to said
electronic circuitry; a touch-sensitive screen mounted on a second
side of said housing for enabling touch-based user inputs to said
electronic circuitry, the second side being opposite the first
side; and a button-only button mounted on said housing comprising a
physical button, for activating and de-activating a button-only state
of said electronic circuitry at any time; wherein said electronic
circuitry is operative (i) to process button-press user inputs and
to ignore touch-based user inputs, when the button-only state is
currently activated, and (ii) to process touch-based user inputs
and to ignore button-press user inputs, when the button-only state
is currently de-activated.
27. The electronic device of claim 26 wherein said button-based
user interface enables substantially the same input commands, at
any given time, as the touch-sensitive screen.
28. A method performed by an electronic device, the method
comprising: receiving, by the electronic device, notification that
a user has pressed a physical input button on a first side of the
electronic device; further receiving, by the electronic device,
notification that a user has touched a touch-sensitive screen on a
second side of the electronic device, the second side being
opposite the first side; detecting, by the electronic device,
whether a button-only state of the electronic device is currently
activated or de-activated; and processing, by the electronic
device, said receiving and ignoring, by the electronic device, said
further receiving, when the electronic device detects that the
button-only state is currently activated; and processing, by the
electronic device, said further receiving and ignoring, by the
electronic device, said receiving, when the electronic device
detects that the button-only state is currently de-activated.
29. A non-transitory computer readable storage medium storing
program code for causing an electronic device to perform a method,
the method comprising: receiving notification that a user has
pressed a physical input button on a first side of an electronic
device; further receiving notification that a user has touched a
touch-sensitive screen on a second side of the electronic device,
the second side being opposite the first side; detecting whether a
button-only state of the electronic device is currently activated
or de-activated; and processing said receiving and ignoring said
further receiving, when the button-only state is currently
activated; and processing said further receiving and ignoring said
receiving, when the button-only state is currently de-activated.
Description
PRIORITY REFERENCE TO RELATED APPLICATIONS
[0001] This application claims benefit of U.S. Provisional
Application No. 60/964,872, entitled COUNTER-TACTILE KEYPAD, filed
on Aug. 14, 2007 by inventors Eyal Bychkov and Hagay Katz.
FIELD OF THE INVENTION
[0002] The present invention relates to touch-based user interfaces
for electronic devices.
BACKGROUND OF THE INVENTION
[0003] Handheld electronic devices have benefited from touch screen
technology. Touch screens are used for both input and output. They
enable device manufacturers to reduce the area of the device used
for off-screen input controls, such as buttons and keypads, and to
enlarge the screen area, thereby enhancing the user experience.
[0004] For input, users interact with touch screens using visual
control elements. Control elements correspond to user commands, are
displayed on a screen, and provide areas for a user to press on.
Control elements may appear as buttons, scroll bars, slide bars and
wheels. Users can press or tap on control elements such as buttons,
or drag control elements such as scroll bars, slide bars and wheels
to a desired location. Pressing, tapping or dragging control
elements activates their corresponding commands.
[0005] For output, touch screens display graphics, similar to
conventional LCD displays.
[0006] Reference is now made to FIG. 1, which is a prior art
illustration of a touch screen. Shown in FIG. 1 is a handheld
electronic device 100 having a touch screen 110. Device 100
displays various buttons 120 in touch screen 110, which a user can
press in order to enter numbers and commands.
[0007] An advantage of touch screens is the flexibility of
displaying a wide variety of control elements, such as buttons,
icons and selection menus, for a corresponding wide variety of
modes of operation. Thus, while in a dialer mode of operation, a
touch screen may display a numeric keypad, and while in an SMS mode
of operation, the touch screen may display an alphabet keypad.
Areas on the screen thus produce different actions when pressed,
depending on the control elements being displayed therein.
[0008] A drawback with touch screens is the lack of a tactile
feeling, as a result of which many people find them difficult to
use. Prior art methods of overcoming this drawback include
graphical methods, audio methods, force feedback methods and
vibration methods. Graphical methods make control elements appear
to be pressed and released, similar to physical button presses,
thus creating a perception of a physical button press. Audio
methods provide sounds in response to elements being pressed. The
TouchSense® system of Immersion Corporation of San Jose,
Calif., includes both graphical and audio feedback when touch
screens are pressed.
[0009] Force feedback methods operate by mounting a touch screen on
a linear flexure, which allows the screen to bend inwards when
pressed. Force feedback for touch screens is described in U.S. Pat.
No. 7,113,177 to Franzen. The '177 patent describes a
touch-sensitive display with tactile feedback, comprised of three
layers; namely, a display layer, a layer that includes receptors,
and a layer that includes controllable actuators.
[0010] Vibration methods cause a device to vibrate in response to a
control element being pressed, as a tactile feedback. Pantech Group
of Seoul, Korea, developed such a touch screen for its dual-LCD
sliding phones.
SUMMARY OF THE DESCRIPTION
[0011] The present invention provides a way to generate tactile
feedback for screens that display user interface control elements.
The present invention uses both front and back sides of an
electronic device; one side for housing a screen, and the other
side for housing physical buttons. The screen is positioned
substantially opposite the buttons. Pressing a button on the device
activates a control element that is displayed opposite the button
on the other side of the device.
[0012] Three embodiments of the invention are described. In the
first embodiment, the screen is a touch screen and the buttons are
not electrically connected to the device; i.e. the buttons are
merely used for their tactile feedback. In the second embodiment,
the screen is a non-touch screen and the buttons are fully
functional. In the third embodiment, the screen is a touch screen
and the buttons are fully functional.
[0013] There is thus provided in accordance with an embodiment of
the present invention a touch-based user interface for an
electronic device, including a housing including electronic
circuitry, a plurality of buttons mounted within a first area on a
first side of the housing, and a screen mounted on a second area of
a second side of the housing, the second side being opposite to the
first side, and the second area being opposite to at least a
portion of the first area, wherein the electronic circuitry is
operative (i) to display on the screen at least one user interface
control element that corresponds respectively to at least one
button, each such user interface control element having a command
associated therewith, and (ii) to perform the command associated
with a designated user interface control element when its
corresponding button is pressed.
[0014] There is additionally provided in accordance with an
embodiment of the present invention a method for a touch-based user
interface for an electronic device, including receiving
notification that a user has pressed a button on a first side of an
electronic device, and performing a command associated with a
control element displayed on a screen on the electronic device,
wherein the screen is located on a second side of the electronic
device, the second side being opposite to the first side, and
wherein the control element is displayed opposite the location of
the button that was pressed.
[0015] There is moreover provided in accordance with an embodiment
of the present invention a computer readable storage medium storing
program code for causing an electronic device to receive
notification that a user has pressed a button on a first side of
the electronic device, and to perform a command associated with a
control element displayed on a screen on the electronic device,
wherein the screen is located on a second side of the electronic
device, the second side being opposite to the first side, and
wherein the control element is displayed opposite the location of
the button that was pressed.
[0016] There is further provided in accordance with an embodiment
of the present invention a method for touch-based user interface
for an electronic device, including receiving a notification that a
user has pressed an area of a touch screen on an electronic device
where a control element is displayed, verifying that the user has
also pressed an off-screen button corresponding to the control
element, and performing a command associated with the control
element after and only after the receiving and the verifying have
been performed.
[0017] There is yet further provided in accordance with an
embodiment of the present invention a computer readable storage
medium storing program code for causing an electronic device to
receive a notification that a user has pressed an area of a touch
screen on the electronic device where a control element is
displayed, to verify that the user has also pressed an off-screen
button corresponding to the control element, and to perform a
command associated with the control element after and only after
both the receiving and the verifying have been performed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The present invention will be more fully understood and
appreciated from the following detailed description, taken in
conjunction with the drawings in which:
[0019] FIG. 1 is a prior art illustration of a touch screen;
[0020] FIG. 2 is an illustration of a touch-based user interface
that uses two opposite sides of an electronic device, in accordance
with an embodiment of the present invention;
[0021] FIG. 3 is a simplified cross-sectional front, side and back
view of an electronic device that has a touch-based user interface,
in accordance with an embodiment of the present invention; and
[0022] FIG. 4 is an illustration of two opposite sides of an
electronic device that has a touch-based user interface, in
accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
[0023] The present invention relates to touch-based user interfaces
for electronic devices. The present invention uses two opposite
sides of the devices; one side for a screen, and the opposite side
for physical buttons. The screen is located substantially opposite
the buttons.
[0024] Reference is now made to FIG. 2, which is an illustration of
a touch-based user interface that uses two opposite sides of an
electronic device, in accordance with an embodiment of the present
invention. A user presses a screen on the front side of the device
with his thumb, and presses a button on the back side of the device
with his index finger. The user's thumb and index finger are
aligned substantially opposite one another.
[0025] Reference is now made to FIG. 3, which is a simplified
cross-sectional front, side and back view of an electronic device
that has a touch-based user interface, in accordance with an
embodiment of the present invention. As shown in FIG. 3, a screen
is housed on the front side of the device and physical buttons are
positioned on the back side of the device. The side view indicates
that the buttons protrude from the housing. It will be appreciated
by those skilled in the art, however, that the buttons may be flush
with or indented in the back surface of the housing.
[0026] The screen housed on the front side of the device is
positioned in an area that is substantially opposite the area where
the buttons are located on the back side of the device.
[0027] Reference is now made to FIG. 4, which is an illustration of
two opposite sides of an electronic device that has a touch-based
user interface, in accordance with an embodiment of the present
invention. Shown in FIG. 4 are control elements 410 displayed on a
screen 400 on the front side of the device, and physical buttons
420 positioned on the back side of the device. Each control element
410 on the front side is positioned substantially opposite a
corresponding button 420 on the back side. The shapes of buttons
420 need not be the same as those of the areas of their
corresponding control elements 410. In FIG. 4, for example, buttons
420 are oval shaped, and their corresponding control elements 410
are rectangular shaped.
[0028] When a button 420 on the back side is pressed, a command
associated with its corresponding control element 410 on the front
side is performed. Thus, to activate a specific control element
410, a user may position his thumb on the control element, and
press its corresponding button 420 with his index finger. That is,
pressing a button 420 on the back side, behind the screen,
corresponds to pressing its corresponding control element 410 on
the screen.
[0029] A motivation for the present invention is the fact that,
neurologically, people are able to accurately align the tips of
their thumbs and index fingers. Indeed, neurological examinations
often assess a patient's accuracy in bringing these two fingertips
together.
[0030] In a first embodiment of the present invention, screen 400
is a touch screen, and buttons 420 are physically functional but
not electrically connected to the device. Buttons 420 serve to
supplement the touch screen with a tactile effect, with inexpensive
mechanical buttons.
[0031] In a second embodiment of the present invention, screen 400
is an inexpensive conventional non-touch screen, and buttons 420
are fully functional. Buttons 420 serve to provide the non-touch
screen with the flexibility of a touch screen.
[0032] The following pseudo-code is a simplified description of the
second embodiment, in accordance with the present invention.
TABLE-US-00001
x = 0
if (button == 1) { x = Find_control_element(location == 1); }
if (button == 2) { x = Find_control_element(location == 2); }
.....
if (x > 0) { do Control_element_function(x); }
[0033] In a third embodiment of the present invention, screen 400
is a touch screen, and buttons 420 are fully functional. In this
embodiment, operation of the device is configured so that a control
element is activated only when both the control element is touched
on the screen and its corresponding button is pressed. The device
thus ensures that a user is using two fingers, which is useful in
avoiding unintended presses of the screen, and eliminates the need
to lock the screen.
[0034] The following pseudo-code is a simplified description of the
third embodiment, in accordance with the present invention.
TABLE-US-00002
if (button-only() == FALSE) {
    if ((screen-control-element == 1) AND (button == 1)) { do Control_element_function 1 }
    if ((screen-control-element == 2) AND (button == 2)) { do Control_element_function 2 }
    .....
    else { do nothing; }
} else {
    if (button == 1) { do Control_element_function 1 }
    if (button == 2) { do Control_element_function 2 }
    .....
    else { do nothing; }
}
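The third embodiment's two-sided gate can be expressed as a small Python function. This is an illustrative sketch, not the patent's implementation; the element identifiers and parameter names are assumptions.

```python
# Third-embodiment logic: a control element fires only when the screen
# is touched at that element AND its corresponding back-side button is
# pressed, unless the device is in button-only mode.

def activate(touched_element, pressed_button, button_only=False):
    """Return the element to activate, or None if the gate is not satisfied."""
    if button_only:
        # Screen input is ignored; the button alone selects the element.
        return pressed_button
    # Require the touch and the button press to designate the same element.
    if touched_element is not None and touched_element == pressed_button:
        return touched_element
    return None  # a one-sided or mismatched press: do nothing
```

Because both inputs must agree, an accidental brush of the screen alone (`activate(1, None)`) activates nothing, which is how this embodiment avoids unintended presses without a screen lock.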
[0035] In a fourth embodiment of the present invention, screen 400
is a touch screen, and buttons 420 are fully operational, as in the
third embodiment. In this embodiment, operation of the device is
configured so that a control element is activated when either the
control element is touched on the screen, or when its corresponding
button is pressed.
[0036] The following pseudo-code is a simplified description of the
fourth embodiment, in accordance with the present invention.
TABLE-US-00003
if (button-only() == FALSE) {
    if ((screen-control-element == 1) OR (button == 1)) { do Control_element_function 1 }
    if ((screen-control-element == 2) OR (button == 2)) { do Control_element_function 2 }
    .....
    else { do nothing; }
} else {
    if (button == 1) { do Control_element_function 1 }
    if (button == 2) { do Control_element_function 2 }
    .....
    else { do nothing; }
}
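The fourth embodiment replaces the AND gate with an OR: either input path activates the element. A hedged Python sketch, with hypothetical names, makes the difference explicit.

```python
# Fourth-embodiment logic: a touch on the control element OR a press of
# its corresponding button activates the element. In button-only mode,
# only the button is honored.

def activate_either(touched_element, pressed_button, button_only=False):
    """Return the element to activate, or None if neither input arrived."""
    if button_only:
        return pressed_button
    if touched_element is not None:
        return touched_element
    if pressed_button is not None:
        return pressed_button
    return None  # no input on either side: do nothing
```

Here a touch alone suffices, so this variant trades the third embodiment's protection against accidental presses for quicker one-finger operation.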
[0037] In all four of the above embodiments, graphical and audio
feedback may be incorporated, to notify a user that his action is
acknowledged.
[0038] In accordance with an embodiment of the present invention, a
controller of the device in FIGS. 2-4 is programmed to map each
button 420 to a specific area of screen 400. Control elements have
buttons associated therewith, and are displayed by the controller
on screen 400 at positions within the screen areas to which their
buttons map. Different control elements may be displayed in
different modes of operation of the device, but for each mode the
control elements are positioned within the screen areas to which
their buttons map.
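One way to model paragraph [0038] is a fixed button-to-area map plus per-mode element tables. The areas, mode names, and element labels below are purely illustrative assumptions.

```python
# Controller model for paragraph [0038]: the hardware mapping from each
# button to a screen area is fixed, while the control element displayed
# inside that area changes with the device's mode of operation.

# Fixed mapping: button index -> rectangular screen area (x0, y0, x1, y1).
BUTTON_AREAS = {
    1: (0, 0, 40, 40),
    2: (0, 50, 40, 90),
}

# Per-mode control elements, each positioned within its button's area.
MODES = {
    "dialer": {1: "digit_1", 2: "digit_2"},
    "sms":    {1: "letter_a", 2: "letter_b"},
}

def element_for_button(mode_name, button):
    """Return the control element currently displayed over the given button."""
    return MODES[mode_name].get(button)
```

Switching modes changes what a button does (`element_for_button("dialer", 1)` vs. `element_for_button("sms", 1)`) without changing which screen area the button maps to.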
[0039] In the third embodiment described hereinabove, wherein a
control element and its button must both be pressed in order to
activate the control element, the controller is programmed to
detect both presses before activating the control element.
[0040] In accordance with an embodiment of the present invention,
the device in FIGS. 2-4 may also be operated in certain cases
without the display, in a "button-only" mode. The button-only mode
may be activated manually, by a user pressing a "button-only"
button, or automatically when the device is held in an orientation
with the screen facing down and the buttons facing the user. Such
an orientation may be automatically detected by an orientation
sensor within the device.
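The orientation-gated input selection of paragraph [0040] (claimed in claim 22) can be sketched as follows; the orientation labels and function names are assumptions, not the patent's terminology.

```python
# Orientation-gated input: when the button side faces up (screen facing
# down), button presses are processed and touches are ignored; when the
# screen faces up, touches are processed and button presses are ignored.

def select_input(orientation, button_event, touch_event):
    """Return the one event to process for the given orientation.

    orientation: "buttons_up" (screen facing down) or "screen_up"
                 (buttons facing down).
    """
    if orientation == "buttons_up":
        return button_event   # button-only mode: touch input is ignored
    if orientation == "screen_up":
        return touch_event    # touch mode: button input is ignored
    return None               # indeterminate orientation: process nothing
```

In a real device the `orientation` argument would come from the orientation sensor mentioned in the text; here it is simply passed in.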
[0041] The buttons are engraved with symbols, such as alphanumeric
symbols, which represent their default functions. The default
button functions are operational when the device is in the
button-only mode.
[0042] In the foregoing specification, the invention has been
described with reference to specific exemplary embodiments thereof.
It will, however, be evident that various modifications and changes
may be made to the specific exemplary embodiments without departing
from the broader spirit and scope of the invention as set forth in
the appended claims. Accordingly, the specification and drawings
are to be regarded in an illustrative rather than a restrictive
sense.
* * * * *