U.S. patent application number 15/095537 was published by the patent office on 2016-08-04 for detecting a user input with an input device.
The applicant listed for this patent is Studer Professional Audio GmbH. Invention is credited to Robert HUBER, Rene WUSSLER.
Publication Number | 20160224129 |
Application Number | 15/095537 |
Document ID | / |
Family ID | 40352075 |
Published Date | 2016-08-04 |
United States Patent Application | 20160224129 |
Kind Code | A1 |
WUSSLER; Rene; et al. |
August 4, 2016 |
DETECTING A USER INPUT WITH AN INPUT DEVICE
Abstract
An input device and methods for detecting user input using an
input device. An example input device includes a multi-touch
sensing display configured to detect multiple simultaneous triggers
on a surface of the multi-touch sensing display as distinct input
events. The input device also includes at least one mechanical
control element arranged on the surface of the multi-touch sensing
display. The at least one mechanical control element is configured
to generate an input event. The input event is detected by the
multi-touch sensing display in response to actuation of the at
least one mechanical control element.
Inventors: |
WUSSLER; Rene; (Watt,
CH) ; HUBER; Robert; (Geroldswil, CH) |
|
Applicant: |
Name | City | State | Country | Type |
Studer Professional Audio GmbH | Regensdorf | | CH | |
Family ID: |
40352075 |
Appl. No.: |
15/095537 |
Filed: |
April 11, 2016 |
Related U.S. Patent Documents
Application Number | Filing Date | Patent Number |
12621388 | Nov 18, 2009 | 9310901 |
15095537 | | |
Current U.S.
Class: |
1/1 |
Current CPC
Class: |
G06F 3/038 20130101;
G06F 2203/04106 20130101; G06F 3/039 20130101; G06F 2203/04104
20130101; G06F 3/0421 20130101; G06F 3/0393 20190501; G06F 3/0416
20130101; G06F 3/044 20130101; G06F 3/0362 20130101 |
International
Class: |
G06F 3/038 20060101
G06F003/038; G06F 3/044 20060101 G06F003/044; G06F 3/0362 20060101
G06F003/0362; G06F 3/041 20060101 G06F003/041; G06F 3/042 20060101
G06F003/042 |
Foreign Application Data
Date |
Code |
Application Number |
Nov 18, 2008 |
EP |
08 020 092.6 |
Claims
1. An input device comprising: a multi-touch sensing display
configured to detect multiple simultaneous triggers on a surface of
the multi-touch sensing display as distinct input events; a
plurality of mechanical control elements arranged on the surface of
the multi-touch sensing display, a first mechanical control element
of the plurality of mechanical control elements is configured to
control a function in response to a first user input; and a
touch-sensitive area positioned adjacent to the first mechanical
control element and configured to receive a second user input,
wherein the second user input corresponds to a detected touch on
the touch-sensitive area, and wherein the touch-sensitive area is
configured to change the function provided by the first mechanical
control element based on a position of the detected touch on the
touch-sensitive area.
2. The input device of claim 1 wherein the plurality of mechanical
control elements includes a plurality of rotary knobs fixedly
mounted to the surface of the multi-touch sensing display.
3. The input device of claim 2 wherein each rotary knob comprises a
movable component and a shaft.
4. The input device of claim 3 wherein each rotary knob is
configured to revolve around the shaft that is attached to a base
that is attached to the surface of the multi-touch sensing
display.
5. The input device of claim 1 where the multi-touch sensing
display is configured to detect multiple simultaneous triggers that
include touches, near-touches, or a combination thereof as distinct
input events.
6. The input device of claim 1 wherein the touch-sensitive area
surrounds the first mechanical control element.
7. A method of detecting a user input with an input device
comprising a multi-touch sensing display adapted to detect distinct
input events and a plurality of mechanical control elements
arranged on a surface of the multi-touch sensing display, the
method comprising: in response to an actuation of a first
mechanical control element of the plurality of mechanical control
elements related to a first user input, generating an input event
to control a function and detecting the input event by the
multi-touch sensing display; adjusting a parameter associated with
the first mechanical control element in accordance with the
detected input event; detecting, at a touch sensitive area that is
positioned adjacent to the first mechanical control element, a
second user input that corresponds to a detected touch on the
touch-sensitive area, and changing the function provided by the
first mechanical control element based on a position of the
detected touch on the touch-sensitive area.
8. The method of claim 7 wherein the plurality of mechanical
control elements includes a plurality of rotary knobs fixedly
mounted to the surface of the multi-touch sensing display.
9. The method of claim 8 wherein each rotary knob comprises a
movable component and a shaft.
10. The method of claim 9 further comprising revolving each rotary
knob around the shaft that is attached to a base that is attached
to the surface of the multi-touch sensing display.
11. The method of claim 7 further comprising detecting, via the
multi-touch sensing display, multiple simultaneous triggers that
include touches, near-touches, or a combination thereof as distinct
input events.
12. The method of claim 7 wherein the touch-sensitive area
surrounds the first mechanical control element.
13. An audio console comprising: an input device including a
multi-touch sensing display configured to detect multiple
simultaneous triggers on a surface of the multi-touch sensing
display as distinct input events, a plurality of mechanical control
elements arranged on the surface of the multi-touch sensing
display, a first mechanical control element of the plurality of
mechanical control elements configured to generate an input event
in response to a first user input to control a function, the input
event being detected by the multi-touch sensing display in response
to actuation of the first mechanical control element; and a
touch-sensitive area positioned adjacent to the first mechanical
control element and being configured to receive a second user
input, wherein the second user input corresponds to a detected
touch on the touch-sensitive area, and wherein the touch-sensitive
area is configured to change the function controlled by the first
mechanical control element based on a position of the detected
touch on the touch-sensitive area.
14. The audio console of claim 13 wherein the plurality of
mechanical control elements includes a plurality of rotary knobs
fixedly mounted to the surface of the multi-touch sensing
display.
15. The audio console of claim 14 wherein each rotary knob
comprises a movable component and a shaft.
16. The audio console of claim 15 wherein each rotary knob is
configured to revolve around the shaft that is attached to a base
that is attached to the surface of the multi-touch sensing
display.
17. The audio console of claim 13 wherein the multi-touch sensing
display is configured to detect multiple simultaneous triggers that
include touches, near-touches, or a combination thereof as distinct
input events.
18. The audio console of claim 13 wherein the touch-sensitive area
surrounds the first mechanical control element.
Description
RELATED APPLICATIONS
[0001] This application is a continuation of U.S. application Ser.
No. 12/621,388 filed Nov. 18, 2009, now U.S. Pat. No. ______,
which, in turn, claims priority to European Patent application
Serial No. 08 020 092.6 filed Nov. 18, 2008, the disclosures of
which are hereby incorporated in their entirety by reference
herein.
BACKGROUND
[0002] 1. Field of the Invention
[0003] The invention relates to devices and methods for detecting
user input, and more particularly, to an input device comprising a
multi-touch sensing display.
[0004] 2. Related Art
[0005] Modern electronic devices often use a plurality of control
elements to allow a user to adjust parameters relevant to the
operation of the device. An example of an input unit that may be
used in an electronic device includes a console having a plurality
of mechanical control elements. Such an input device may be used to
control, for example, audio equipment, video equipment, or a
central control station including, for example, a power plant, a
factory, or a traffic system. Control elements employed in these
systems include analog input elements.
[0006] Analog input elements have a predefined function. Their
function cannot be changed or adjusted once implemented, limiting
their application in the input unit. Control elements that operate
using encoders, such as, for example, rotary encoders, are
programmable as to their function. However, in operation, it may be
difficult or even impossible to determine the function that is
assigned to the control element at any given time. It is even more
difficult if the function and value of an associated parameter is
displayed on a separate screen remote from the control element.
Control elements that use encoders and other complex
electromechanical input elements also tend to be relatively
expensive and overly complex. Input units that use such
electromechanical input elements must typically accommodate a fair
amount of space underneath the cover plate of the input device,
further adding to their cost and making them difficult to mount.
Cost, complexity, and mounting difficulty present substantial
problems for large input consoles that may include tens or even
hundreds of control elements.
[0007] Touchscreens are input devices often implemented in compact
electronic devices, such as personal digital assistants (PDAs) or,
more recently, mobile phones. Touchscreens may use one of several
known technologies for detecting a touch or a near-touch to a
surface. One example includes a resistive touchscreen panel
composed of several layers. When the panel is touched, a change in
the electrical current through the layers may be detected as a
touch event. A controller may derive the position of the touch
event on the panel based on the change in current, which is
different at any given position. Other touchscreen technologies
include capacitive touchscreen panels based on detecting a
distortion of an electromagnetic field, or frustrated total
internal reflection (FTIR). Some FTIR touchscreen panels use a
light path internal to a glass plate to provide a sensitive
surface; pressing an object against the surface disturbs the
internal reflection, and the disturbance may be detected. These
touchscreens can be operated with objects like a
finger or a pen. Some touchscreen panels may trigger input events
upon a near touch. For example, a capacitive touchscreen may
trigger an input event if an object comes to within a predetermined
distance of the touchscreen surface.
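The near-touch triggering just mentioned can be sketched as a simple predicate. This is an illustrative sketch only, not an implementation from the application; the threshold value and names are hypothetical:

```python
# Hypothetical near-touch triggering: a capacitive panel fires an input
# event once an object comes within a predetermined distance of the
# surface. The threshold is an assumed, illustrative value.

NEAR_TOUCH_THRESHOLD_MM = 5.0  # assumed trigger distance

def is_input_event(distance_mm: float) -> bool:
    """Return True when an object is close enough to trigger an event."""
    return distance_mm <= NEAR_TOUCH_THRESHOLD_MM
```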
[0008] Touchscreens were originally designed to detect a single
touch at a time. Touchscreens have since evolved to detect
simultaneous multiple touches as separate input events. Such
multi-touch screens allow a user to use two or more fingers to
simultaneously manipulate two or more objects. Despite their
flexibility, multi-touch screens are not well-suited for
applications involving setting a larger number of parameters. The
screens are generally small in size and operated by one hand. The
adjustment of a graphical control element on the touchscreen using,
for example, a finger or a pen may demand substantial motor skills
from a user and yet still be rather imprecise. A graphical control
element typically requires a substantial amount of space on the
screen, limiting the number of such elements displayed at any given
time. A plurality of small control elements would be difficult and
time-consuming to operate. Adjusting a plurality of parameters with
a conventional touchscreen is thus not ergonomic, particularly if
such adjustments are to be performed over a prolonged time.
[0009] Accordingly, there is a need for an ergonomic input device
that allows for flexible precise adjustment of parameters and that
informs a user of the parameter being adjusted.
SUMMARY
[0010] In view of the above, an input device is provided for
detecting user input. An example input device includes a
multi-touch sensing display configured to detect multiple
simultaneous triggers on a surface of the multi-touch sensing
display as distinct input events. The input device also includes at
least one mechanical control element arranged on the surface of the
multi-touch sensing display. The at least one mechanical control
element is configured to generate an input event. The input event
is detected by the multi-touch sensing display in response to
actuation of the at least one mechanical control element.
[0011] A method for detecting user input with an input device is
also provided. An example method may be implemented using an input
device having a multi-touch sensing display adapted to detect
multiple simultaneous touches or near touches to a surface of the
multi-touch sensing display as distinct input events. The input
device includes at least one mechanical control element arranged on
the surface of the multi-touch sensing display. An example method
includes generating an input event and detecting the input event by
the multi-touch sensing display in response to an actuation of the
at least one mechanical control element. A parameter associated
with the at least one mechanical control element is then adjusted
in accordance with the detected input event.
[0012] Those skilled in the art will appreciate that the features
mentioned above and those yet to be explained below can be used not
only in the respective combinations indicated, but also in other
combinations or in isolation, without leaving the scope of the
present invention. The above-described methods may be implemented
in a device for processing audio signals, or example
implementations may include the steps described with respect to the
device for processing audio signals.
[0013] Other devices, apparatus, systems, methods, features and
advantages of the invention will be or will become apparent to one
with skill in the art upon examination of the following figures and
detailed description. It is intended that all such additional
systems, methods, features and advantages be included within this
description, be within the scope of the invention, and be protected
by the accompanying claims.
BRIEF DESCRIPTION OF THE FIGURES
[0014] Example implementations of the invention are described below
with reference to the following figures. The components in the
figures are not necessarily to scale, emphasis instead being placed
upon illustrating the principles of the invention. In the figures,
like reference numerals designate corresponding parts throughout
the different views.
[0015] FIG. 1 is a schematic diagram of an example of an input
device.
[0016] FIG. 2 is a schematic diagram of an example of an input
device having a rotary knob as a mechanical control element.
[0017] FIG. 3 is a schematic diagram of an example of an input
device having a sliding controller as a mechanical control
element.
[0018] FIG. 4 is a schematic diagram of an example of an input
device having a push button as a control element.
[0019] FIG. 5 is a schematic diagram of an example of an input
device using photosensitive elements for detecting actuation of a
control element.
[0020] FIG. 6 is a schematic diagram of an example of an input
device using a capacitive element for triggering an input
event.
[0021] FIG. 7 is a flowchart illustrating operation of an example
method for detecting multiple simultaneous touches or near touches
of a multi-touch sensing device.
[0022] FIG. 8 is a schematic diagram of an example of an audio
console.
DETAILED DESCRIPTION
[0023] FIG. 1 is a schematic diagram of an example of an input
device 100. The input device 100 includes a multi-touch sensing
display 101 and two mechanical control elements illustrated in FIG.
1 as including two rotary knobs 102 and 103. The rotary knobs 102
and 103 in FIG. 1 are fixedly mounted to a surface 104 of the
multi-touch sensing display 101. Below the surface 104, the
multi-touch sensing display 101 includes an array of optical
sensors or photosensitive elements represented in FIG. 1 as a line
105. The multi-touch sensing display 101 may for example be a thin
film transistor (TFT) LCD display. The TFT LCD display includes
integrated photosensitive elements. Such displays are known in the
art. Documents describing TFT LCD displays having integrated
photosensitive elements include an article titled "Active matrix
LCD with integrated optical touchscreen,"
www.planar.com/advantages/whitepapers/docs/planar-AMLCD-Optical-Touchscreen.pdf,
which is incorporated by reference. Light emitted by the
multi-touch sensing display 101 may be absorbed, scattered or
reflected by trigger elements 106 and 107. If one of the control
elements 102 or 103 is actuated by turning, the intensity of light
reflected onto photosensitive elements located underneath the
trigger element 106 or 107 at the previous and the new position of
the trigger element changes. The change in intensity generates an
input event. Such an input event may be detected by the multi-touch
sensing display 101 as a change of photocurrent, or a change of the
current through the array of photosensitive elements 105.
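The detection of input events as intensity changes across the array of photosensitive elements 105 can be sketched as follows. This is a minimal illustration, not the patented implementation; the frame representation, names, and threshold are assumptions:

```python
# Sketch: compare successive intensity readouts of the photosensitive
# array and report positions where the detected light changed beyond a
# threshold. The 2D-list frame format and threshold are illustrative.

from typing import List, Tuple

Frame = List[List[float]]  # grid of light intensities from the sensor array

def detect_input_events(prev: Frame, curr: Frame,
                        threshold: float = 0.2) -> List[Tuple[int, int]]:
    """Return (row, col) positions where intensity changed significantly."""
    events = []
    for r, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                events.append((r, c))
    return events
```

Because every changed position is reported, two knobs actuated at once yield two separate positions, mirroring the distinct-input-event behavior described above.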
[0024] It is to be understood that the multi-touch sensing display
101 may be configured to include mechanisms for determining the
position where the input event has occurred on its surface as well
as for generating a position-dependent signal in response to an
input event. Such mechanisms may include a processor and other
hardware and/or software suitably configured. Referring to FIG. 1,
the multi-touch sensing display 101 may therefore deliver signals
corresponding to sensor data that may be used by a processing unit
108 to determine an occurrence and position of an input event. The
multi-touch sensing display 101 may also directly deliver the
position of a detected input event, such as for example, as two
dimensional (for example, x and y) coordinates relative to the
surface. The setting of the control element that generates the
input event may then be determined by the processing unit 108.
[0025] The processing unit 108 is connected to provide the
multi-touch sensing display 101 with processing resources. The
processing unit 108 in FIG. 1 provides the display signal to the
multi-touch sensing display 101 and reads out the state of the
array of photosensitive elements of the multi-touch sensing display
101. A readout of the array of photosensitive elements 105 may be
performed at predetermined times. At such times, the processing
unit 108 may obtain an image of light intensities detected by the
photosensitive elements 105 at their respective positions on the
surface 104 of display 101. The processing unit 108 may analyze the
image data to determine a position in the data at which a change in
intensity occurred. The processing unit 108 may be provided with
information that includes the type of control element located at a
given position, and the function currently assigned to the
respective control element. By determining the position of an input
event and the position of the trigger element relative to the
surface 104, the processing unit 108 may determine the setting of
the control element and assign a corresponding value to a parameter
of the function controlled by the control element. It is to be
understood that a particular setting of the control element need
not correspond to a particular value of an associated parameter,
but that activation of the control element by, for example,
rotation through a particular angle may define a corresponding
change of the parameter value.
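The relative adjustment described above, where rotation through a particular angle defines a change of the parameter value rather than an absolute setting, can be sketched as below. The scaling factor and parameter range are illustrative assumptions, not values from the application:

```python
# Sketch: a knob rotation maps to a relative parameter change, clamped
# to an assumed parameter range. Scaling and range are illustrative.

def adjust_parameter(value: float, angle_deg: float,
                     units_per_degree: float = 0.5,
                     lo: float = 0.0, hi: float = 100.0) -> float:
    """Apply a relative change derived from a knob rotation, clamped to range."""
    return max(lo, min(hi, value + angle_deg * units_per_degree))
```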
[0026] The actual position of trigger elements 106 and 107 may be
detected by the photosensitive elements of the multi-touch sensing
display 101 and determined by processing unit 108. The input device
100 may detect simultaneous actuation of the control elements 102
and 103 as separate or distinct input events, which may involve
resolving the positions relative to the surface at which the input
events occurred. The input device 100 may also detect a touch or a
near-touch to the surface 104 in areas of the multi-touch sensing
display 101 that are not covered by the control elements or
provided with the optical sensors. The processing unit 108 may also
control the multi-touch sensing display 101 to display information,
such as for example, the type and the value of the parameter
controlled by the control element 102 or 103. The information may
be displayed next to the respective control element 102 or 103.
FIG. 1 shows an example implementation that uses optical sensors so
that the surface 104 is made of a transparent material, such as
glass. With a glass surface, the rotary knobs 102 and 103 may be
mechanically mounted on the surface 104 using an adhesive, for
example.
[0027] The input device 100 in FIG. 1 is coupled to an audio mixing
device 109 to allow the user to control parameters for operating
the audio mixing device 109. For example, the values of parameters
that may be adjusted using the control elements 102 and 103 are
provided to the audio mixing device 109. The audio mixing device
109 includes a plurality of audio inputs 110 and outputs 111 for
communicating audio signals. The audio mixing device 109 processes
the audio input signals 110 in accordance with parameters received
from processing unit 108. Audio mixing devices, such as for
example, a digital mixer, are known in the art and require no
further description.
[0028] The example input device 100 in FIG. 1 has been described as
including the multi-touch sensing display 101 having optical
sensors 105, however, other types of touchscreens may be used as
well. For example, capacitive or resistive touchscreen panels may
also be used. Parameter values may also be provided to any type of
device by the processing unit 108. For example, parameters may be
relevant to a control station for a machine, a power plant, or any
other electronic device, such as a computer or a station for video
processing, or any other device connected to the input device 100.
The multi-touch sensing display 101 of input device 100 may display
information relating to the function controlled by a control
element as well as data and information provided by a device
connected to the input device 100.
[0029] FIG. 2 is a schematic diagram of an example of an input
device having a rotary knob 201 as a mechanical control element.
The rotary knob 201 may be turned in two directions as indicated by
arrow 202. The rotary knob 201 includes a movable component 203 and
a shaft with a base 204 fixedly mounted to a surface 205 of the
multi-touch sensing display 206. The shaft and base 204 are mounted
by an adhesive to the surface 205. The movable component 203
rotates on the shaft 204. This rotation moves the trigger element 207 in a
plane substantially parallel to the surface 205. The distance
between the trigger element 207 and the surface 205 is determined to
permit detection of the position of the trigger element 207 by the
multi-touch sensing display 206. The turning of the rotary knob 201
generates subsequent input events at positions lying on a circle
around the rotary axis 208 of the rotary knob 201.
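Since turning the rotary knob generates input events at positions on a circle around the rotary axis, the turn angle can be recovered from two detected trigger positions. A sketch, with coordinates and the axis location as illustrative assumptions:

```python
import math

# Sketch: infer the angle swept by the knob from two trigger-element
# positions detected on the circle around the rotary axis. All
# coordinates are illustrative.

def turn_angle_deg(axis, p_old, p_new):
    """Angle (degrees) swept between two trigger positions around the axis."""
    a_old = math.atan2(p_old[1] - axis[1], p_old[0] - axis[0])
    a_new = math.atan2(p_new[1] - axis[1], p_new[0] - axis[0])
    delta = math.degrees(a_new - a_old)
    # Normalize to [-180, 180) so the direction of the turn is preserved.
    return (delta + 180.0) % 360.0 - 180.0
```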
[0030] FIG. 3 is a schematic diagram of an example of an input
device using a sliding controller 301 as a mechanical control
element. The sliding controller 301 includes a movable component
303 that slides linearly in a direction horizontal with respect to
a surface 305 of the multi-touch sensing display 306 (along arrow
302). The moveable component 303 is movable within a support
structure 304 that is fixedly mounted to the surface 305. As
described above with reference to FIG. 1, the support structure 304
may be mounted to the surface 305 using a variety of techniques and
fixing components, including: gluing or cementing; engaging
elements of the support structure 304 with a recess formed on the
surface 305; or providing one or more holes through the multi-touch
sensing display 306 to attach the support structure 304 using
bolts, screws, and the like. Actuation of the control element, by moving
the sliding control 301, results in a movement of a trigger element
307 fixed to the movable component 303 in a horizontal direction
relative to the surface 305. The spacing between the trigger
element 307 and the surface 305 is again determined to permit
detection of the trigger element 307 by the multi-touch sensing
display 306. The spacing will depend on the particular detection
mechanism employed. When using optical sensors or a capacitive
touch screen panel, the trigger element 307 may not touch surface
305. When using a resistive touchscreen panel or a method based on
total internal reflection, the trigger element 307 may touch the
surface 305. Actuation of the sliding control 301 results in the
generation of input events at positions on the surface 305 along a
line. The movement of the sliding controller 301 may be inferred by
detecting the positions of the input events. A value of an
associated parameter may then be changed accordingly.
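The inference described above, deriving the sliding controller's setting from the positions of input events along a line, amounts to a linear mapping. A sketch under assumed track endpoints and parameter range:

```python
# Sketch: map a detected position along the slider's track linearly onto
# a parameter range. Track endpoints and range are illustrative.

def slider_value(pos: float, track_start: float, track_end: float,
                 lo: float = 0.0, hi: float = 1.0) -> float:
    """Map a position along the slider track to a parameter value."""
    frac = (pos - track_start) / (track_end - track_start)
    frac = max(0.0, min(1.0, frac))  # clamp to the physical track
    return lo + frac * (hi - lo)
```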
[0031] FIG. 4 is a schematic diagram of an example of an input
device using a push button 401 as a control element. The push
button 401 includes a movable component 403 that moves in a
direction indicated by arrow 402, which is substantially
perpendicular to a surface 405 of the multi-touch sensing display
406. A trigger element 407 is mounted on the movable component 403
at a variable distance from the surface 405; the distance is varied
by moving the movable component 403. The movable component 403 of the
push button 401 is supported by a supporting structure 404 fixedly
mounted to surface 405. The push button 401 is actuated by applying
pressure to the movable component 403. The distance between the
trigger element 407 and the surface 405 is decreased, eventually
triggering an input event. The distance between the trigger element
407 and the surface 405 depends on the specific multi-touch sensing
display 406, with a first distance corresponding to an un-pushed
state and a second distance corresponding to a pushed state. A calibrating procedure may be
implemented to adjust the first and second distances. In one
example, the intensity of light detected by optical sensors
underneath the surface 405 may increase or decrease in the pushed
position, without having the trigger element 407 touch the surface
405. In the non-actuated state, the multi-touch sensing display 406
may still be able to determine the position of the trigger element
407. The distance to the surface 405 is sufficient to allow the
trigger element 407 to generate an input event when the push button
401 is actuated.
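The calibrated first and second distances described above suggest a simple two-state detection for the push button. A sketch with illustrative distance values:

```python
# Sketch: after calibration, the button counts as pushed when the
# trigger element's measured distance falls below a threshold placed
# midway between the calibrated un-pushed and pushed distances.
# All distance values are illustrative assumptions.

def make_push_detector(unpushed_mm: float, pushed_mm: float):
    """Return a predicate using the midpoint of the calibrated distances."""
    threshold = (unpushed_mm + pushed_mm) / 2.0
    return lambda distance_mm: distance_mm < threshold

is_pushed = make_push_detector(unpushed_mm=4.0, pushed_mm=1.0)
```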
[0032] FIG. 5 is a schematic diagram of an example of an input
device 500 using photosensitive elements for detecting actuation of
a control element. The input device 500 of FIG. 5 includes a
control element implemented as a rotary knob 501. The rotary knob
501 is mounted to a surface 502 of a multi-touch sensing display
503. The multi-touch sensing display 503 includes photosensitive
pixels 505 (shown as black squares) and display pixels 506 (shown
as white squares). The display pixels 506 emit light when
displaying an image, as indicated by arrows. The emitted light is reflected
by a reflective trigger element 504 mounted to the rotary knob 501.
The reflected light is detected by the photosensitive pixels 505
(as indicated by the arrows received by the photosensitive pixels
505). The position of the reflective trigger element 504 relative
to the surface 502 may be detected by the photosensitive pixels 505
and determined by reading out the detected intensity values and
analyzing the intensity distribution. As shown in FIG. 5, the
remaining surface of the rotary knob 501 facing surface 502 may be
non-reflective, or light absorptive, for the light emitted by the
multi-touch sensing display 503. The emission of light by the
display pixels 506 located in the area of the surface 502 over
which the trigger element 504 may be moved may be controlled such
that the display pixels 506 emit light with a predetermined
intensity, for example, near maximum intensity, so that a high
signal may be received from photosensitive pixels 505, and the
position of the trigger element may be precisely determined. It is
to be understood that other implementations are also possible, such
as providing an absorptive trigger element and a reflective surface
of the control element 501 facing the surface 502 of the display
503.
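Reading out the detected intensity values and analyzing the intensity distribution, as described above, can be sketched as an intensity-weighted centroid over the photosensitive pixels. The grid representation is an illustrative assumption:

```python
# Sketch: estimate the reflective trigger element's position from the
# readout of the photosensitive pixels by taking an intensity-weighted
# centroid. The 2D-list layout is illustrative.

from typing import List, Tuple

def trigger_centroid(intensities: List[List[float]]) -> Tuple[float, float]:
    """Estimate (row, col) of the brightest reflection by weighted centroid."""
    total = wr = wc = 0.0
    for r, row in enumerate(intensities):
        for c, v in enumerate(row):
            total += v
            wr += r * v
            wc += c * v
    if total == 0.0:
        raise ValueError("no reflected light detected")
    return wr / total, wc / total
```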
[0033] FIG. 6 is a schematic diagram of an example of an input
device 600 using a capacitive element for triggering an input
event. The input device 600 includes at least one control element
implemented in FIG. 6 as a rotary knob 601. The multi-touch sensing
display 603 includes a capacitive touch-screen panel having a
capacitance sensitive layer 605. Capacitive multi-touch sensing
displays are known to those of ordinary skill in the art and will
not be explained in further detail. More details on the operation
of a capacitive multipoint touchscreen can be found, for example,
in U.S. Patent Publication No. 2006/0097991 A1, which is
incorporated by reference in its entirety.
[0034] A conventional capacitive touchscreen panel may for example
include a capacitive sensing layer of a metal oxide, such as indium
tin oxide, which conducts an electrical current across the sensor
panel. The current is applied by electrodes on each corner of the
panel, in one example, with a square wave signal. When the panel is
touched, a charge transport occurs, which can be measured as a
current at the corners of the panel. The position of the touch
event may be determined by evaluating the resulting currents at the
corners of the panel. To detect multiple simultaneous touches, the
touchscreen panel may include a plurality of transparent sensor
nodes which may again be formed of a conductive medium such as a
metal oxide, spatially separated into electrodes and traces.
Different coordinates on the display may then be represented by the
different electrodes, and the traces are used to connect the
electrodes to a capacitive sensing circuit. A change of a
capacitance occurring at a particular electrode may then be
recognized, and by using a plurality of electrodes, the positions
of simultaneously occurring touches may be resolved. Referring to
FIG. 6, a capacitive trigger element 604 is provided to trigger an
input event. The trigger element 604 disturbs an electrical field
established adjacent to a sensing node of capacitive sensitive
layer 605 at a position underneath the trigger element 604. The
disturbance may be detected as a change in capacitance at the
sensing node. The position of the trigger element 604 relative to
the surface 602 may then be determined. Actuation of the control
element 601 results in a change of capacitance of another sensing
node, which again generates an input event at a position relative
to surface 602, which can be determined by a capacitive sensing
circuit. The capacitive trigger element 604 may be grounded, or may
be grounded when a user touches the control element 601. The
sensing nodes of the capacitive multi-touch sensing panel may also
be arranged to achieve a high resolution of the positioning of
trigger element 604. For example, high resolution may be achieved
by closely spacing the sensing nodes in proximity to the control
element. Again, multi-touch sensing display 603 is capable of
sensing a simultaneous actuation of the control element 601 and a
touch to the surface 602 while also displaying information.
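The corner-current evaluation described earlier in this paragraph for a conventional surface-capacitive panel can be sketched as follows. The formula is a simplified illustration only; a real controller's evaluation is device-specific:

```python
# Simplified sketch: estimate a single touch position from the currents
# measured at the four corner electrodes of a surface-capacitive panel.
# Current is larger at corners nearer the touch, so the position follows
# from the current ratios. Normalization and naming are illustrative.

def touch_position(i_ul: float, i_ur: float,
                   i_ll: float, i_lr: float) -> tuple:
    """Return normalized (x, y) in [0, 1], origin at the upper-left corner."""
    total = i_ul + i_ur + i_ll + i_lr
    if total <= 0.0:
        raise ValueError("no touch current measured")
    x = (i_ur + i_lr) / total  # larger right-corner currents: touch at right
    y = (i_ll + i_lr) / total  # larger lower-corner currents: touch lower
    return x, y
```

Multi-touch resolution, as the paragraph notes, instead uses an array of spatially separated sensing nodes rather than the four-corner scheme sketched here.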
[0035] FIG. 7 is a flowchart illustrating operation of an example
method for detecting multiple simultaneous touches or near touches
of a multi-touch sensing device. In an example, the method may be
performed using the input device of FIG. 1 or FIG. 5. In step 701,
two control elements are actuated simultaneously. It is to be
understood that these elements may be any type of control elements,
such as rotary knobs, sliders, rockers, push buttons, and similar
devices. By actuating the control elements, trigger elements of the
control elements are moved relative to the display surface at step
702. Optical sensors located in the display may detect light
emitted by the display and reflected by the trigger elements. The
movement of the trigger element results in a change in the
intensity of the light detected by the optical sensors, which is
detected in step 703. The locations or positions on the display at
which the intensity changes occurred may be determined in step 704.
A new setting for each control element is determined in step 705 on
the basis of the respective intensity change and its location. For
example, it may be determined that a slider was moved a particular
distance or that a rotary knob was turned through a particular
angle. Alternatively, the absolute setting of the control element
may be determined, such as for example, a new position of a slider
or of a rotary knob. A new value for a parameter associated with
the control element is then calculated in step 706 on the basis of
the derived new setting for each control element. For example, a
particular switching function may have been assigned to a push
button, and an associated parameter value may be changed upon
actuation from `1`, signifying an `ON` position, to `0`, signifying
an `OFF` position. The parameter value may also be adjusted according
to a determined travel distance or turn angle of a control element,
or to the determined absolute new setting of the control element.
The parameters with their values are then provided to a device
connected to the input device in step 707. It is to be understood
that the above method may include additional steps, such as for
example a step of detecting a touch to a surface adjacent to a
control element and adjusting a parameter on the basis of the
detected touch, or changing the function of a control element in
accordance with a position of a detected touch. Graphical control
elements may also be provided and functions of the mechanical
control elements changed accordingly.
[0036] FIG. 8 is a schematic diagram of an example of an audio
console 800. The audio console 800 in FIG. 8 includes two input
devices 801 and 802. The input device 801 of audio console 800
includes a plurality of mechanical control elements implemented as
rotary knobs 803. The input devices 801 and 802 are shown in a
view from above, as indicated by arrow 205 in FIG. 2.
[0037] The portions of input devices 801 and 802 that are visible
to a user are touch-sensitive and are configured to display
information. The input device 801 includes areas 804, 805 and 806
adjacent to rotary knobs 803. The areas 804, 805, 806 may be used
to display the type of parameter and the parameter value that is
currently being adjusted by the respective rotary knob 803. In the
example shown in FIG. 8, area 804 indicates the adjustment of a
numerical value for a particular channel, area 805 indicates the
adjustment of a high frequency equalizer using a needle indicator,
and area 806 indicates the adjustment of a bandwidth.
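The association between a display area and the parameter currently adjusted by the adjacent rotary knob might be modeled as in the following sketch. The area identifiers, parameter names, and units are hypothetical and chosen only to mirror the example of FIG. 8.

```python
# Hypothetical mapping of display areas 804-806 to the parameters shown
# next to the rotary knobs; names and units are assumptions.

AREAS = {
    "area_804": {"param": "Channel 3 Gain", "unit": "dB"},
    "area_805": {"param": "HF EQ",          "unit": "kHz"},
    "area_806": {"param": "Bandwidth",      "unit": "oct"},
}

def render_area(area_id, value):
    """Return the text shown in the area adjacent to a rotary knob:
    the parameter type together with its current value."""
    area = AREAS[area_id]
    return f"{area['param']}: {value} {area['unit']}"

print(render_area("area_805", 12.5))  # → HF EQ: 12.5 kHz
```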
[0038] The input device 802 includes sliding controls 807 and 808,
which may be, for example, faders, with graphical indications of the
channel to be adjusted and of the present setting provided next to
them. The input device 802 also includes push buttons 809 and 810 with
their present setting indicated graphically in an area adjacent to
them. Although control elements 807 to 810 are mechanical control
elements, it is to be understood that some of these may also be
implemented as graphical control elements, which may be actuated by
touching the surface of input device 802 at a position where the
control element is displayed.
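Actuating a graphical control element by touching the surface where it is displayed amounts to a hit test against the element's displayed region. The following sketch illustrates this under assumed coordinates and element names; none of these appear in the application.

```python
# Hypothetical hit test for graphical control elements: a touch actuates
# an element when it falls inside the rectangle where the element is
# displayed. Coordinates and element names are assumptions.

GRAPHICAL_ELEMENTS = {
    "fader_gfx":  (100, 200, 120, 300),  # (x_min, y_min, x_max, y_max)
    "button_gfx": (150, 200, 180, 230),
}

def hit_test(x, y):
    """Return the graphical control element displayed at (x, y), if any."""
    for name, (x0, y0, x1, y1) in GRAPHICAL_ELEMENTS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # touch did not land on a graphical control element

print(hit_test(110, 250))  # → fader_gfx
print(hit_test(0, 0))      # → None
```

Because the elements are drawn by the display itself, the same rectangle table can be updated whenever the functions of the control elements are changed.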
[0039] Those of ordinary skill in the art will understand that
different types of mechanical and graphical control elements may be
arranged on a touch-sensitive surface of the input device, and that
mechanical control elements other than the ones mentioned above may
be used. Apart from being used in an audio console 800, input
devices according to example implementations may also be used in
other devices such as control stations of a factory or a power
plant.
[0040] Those of ordinary skill in the art will also understand that
the types of multi-touch sensing displays used are not limited to
those described above. Other types of displays may be used, such as
for example, infrared touchscreen panels, strain gauge touchscreen
panels, surface acoustic wave or diffused laser imaging touchscreen
panels, and the like. These panels should be adapted in a manner
similar to the examples described above to recognize multiple
simultaneous touches.
[0041] It will be understood, and is appreciated by persons skilled
in the art, that one or more processes, sub-processes, or process
steps described in connection with FIGS. 1-8 may be performed by
hardware and/or software under the control of a processor, such as
processing unit 108 in FIG. 1. If the process is performed by
software, the software may reside in software memory (not shown) in
a suitable electronic processing component or system such as the
processing unit 108 in FIG. 1. The software in software memory may
include an ordered listing of executable instructions for
implementing logical functions (that is, "logic" that may be
implemented either in digital form such as digital circuitry or
source code or in analog form such as analog circuitry or an analog
source such as an analog electrical, sound or video signal), and may
selectively be embodied in any computer-readable medium for use by
or in connection with an instruction execution system, apparatus,
or device, such as a computer-based system, processor-containing
system, or other system that may selectively fetch the instructions
from the instruction execution system, apparatus, or device and
execute the instructions. In the context of this disclosure, a
"computer-readable medium" is any means that may contain, store or
communicate the program for use by or in connection with the
instruction execution system, apparatus, or device. The
computer-readable medium may selectively be, for example, but is not limited
to, an electronic, magnetic, optical, electromagnetic, infrared, or
semiconductor system, apparatus or device. More specific examples,
but nonetheless a non-exhaustive list, of computer-readable media
would include the following: a portable computer diskette
(magnetic), a RAM (electronic), a read-only memory "ROM"
(electronic), an erasable programmable read-only memory (EPROM or
Flash memory) (electronic) and a portable compact disc read-only
memory "CDROM" (optical). Note that the computer-readable medium
may even be paper or another suitable medium upon which the program
is printed, as the program can be electronically captured, via for
instance optical scanning of the paper or other medium, then
compiled, interpreted or otherwise processed in a suitable manner
if necessary, and then stored in a computer memory.
[0042] The foregoing description of example implementations has
been presented for purposes of illustration and description. It is
not exhaustive and does not limit the claimed inventions to the
precise form disclosed. Modifications and variations are possible
in light of the above description or may be acquired from
practicing the invention. The claims and their equivalents define
the scope of the invention.
* * * * *