U.S. patent application number 12/537823 was filed with the patent office on 2009-08-07 and published on 2011-02-10 for apparatus for associating physical characteristics with commands.
This patent application is currently assigned to STEELSERIES HQ. Invention is credited to Arnie Grever, Bruce Hawver.
Application Number: 12/537823
Publication Number: 20110034248
Family ID: 43535233
Filed Date: 2009-08-07

United States Patent Application 20110034248
Kind Code: A1
Grever; Arnie; et al.
February 10, 2011

APPARATUS FOR ASSOCIATING PHYSICAL CHARACTERISTICS WITH COMMANDS
Abstract
A system that incorporates teachings of the present disclosure
may include, for example, a biometric accessory having a controller
to detect at least one of navigation information and biometric
information associated with a user of the accessory, and transmit
at least one of the navigation information and biometric
information to a software application, wherein an input function of
the accessory which is correlated with at least one of the
navigation information and the biometric information is assigned to
an action of a plurality of associable actions by the software
application, wherein a stimulation of the input function is
detected by the software application, wherein the action is
retrieved by the software application based on the stimulation
being detected, and wherein the retrieved associable action is
transmitted by the software application to an operating system.
Additional embodiments are disclosed.
Inventors: Grever; Arnie (Palatine, IL); Hawver; Bruce (Hawthorn Woods, IL)
Correspondence Address:
    SteelSeries Docket
    304 Indian Trace Rd, #750
    Weston, FL 33326 US
Assignee: STEELSERIES HQ (Valby, DK)
Family ID: 43535233
Appl. No.: 12/537823
Filed: August 7, 2009
Current U.S. Class: 463/36; 463/43
Current CPC Class: A63F 13/22 20140902; A63F 2300/1018 20130101; A63F 2300/1012 20130101; A63F 13/42 20140902; A63F 13/73 20140902; A63F 2300/401 20130101; A63F 13/21 20140901; A63F 13/212 20140902; A63F 13/215 20140902
Class at Publication: 463/36; 463/43
International Class: A63F 9/24 20060101 A63F009/24
Claims
1. A computer-readable storage medium, comprising computer
instructions to: present in a graphical user interface a plurality
of associable actions and a biometric sensing accessory, wherein a
fingerprint is detectable by the biometric sensing accessory and
the fingerprint is correlated to an input function associated with
the biometric sensing accessory; associate one of the plurality of
associable actions with the input function; detect a stimulation of
the input function by monitoring the biometric sensing accessory,
wherein the stimulation of the input function occurs based on a
detection of the fingerprint; retrieve the action associated with
the input function; and transmit the action to an operating
system.
2. The computer-readable storage medium of claim 1, wherein the
fingerprint is a plurality of fingerprints, and wherein each
fingerprint or a combination of fingerprints of the plurality of
fingerprints is correlated to a different input function.
3. The computer-readable storage medium of claim 2, comprising
computer instructions to associate other associable actions of the
plurality of associable actions with the different input
functions.
4. The computer-readable storage medium of claim 1, wherein the
biometric sensing accessory is further configured to detect at
least one of an eye movement, a heart rate, a blood pressure, and a
body movement.
5. The computer-readable storage medium of claim 4, comprising
computer instructions to associate an associable action of the
plurality of associable actions to an input function correlated
with at least one of the detected eye movement, heart rate, blood
pressure, and body movement.
6. The computer-readable storage medium of claim 5, comprising
computer instructions to detect a stimulation of the input function
correlated with at least one of the eye movement, heart rate, blood
pressure, and body movement, wherein the stimulation of the input
function occurs based on a detection of at least one of the eye
movement, heart rate, blood pressure, and body movement.
7. The computer-readable storage medium of claim 1, comprising
computer instructions to authenticate into a software application
based on the detection of the fingerprint.
8. The computer-readable storage medium of claim 7, wherein the
operating system provides a signal representative of the action to
the software application.
9. The computer-readable storage medium of claim 7, wherein the
software application is a gaming application which invokes a gaming
feature based on the signal received from the operating system.
10. The computer-readable storage medium of claim 9, wherein at
least a portion of the plurality of associable actions are associated
with one or more actions that control the gaming application.
11. The computer-readable storage medium of claim 1, wherein the
operating system launches a software application based on the
action.
12. The computer-readable storage medium of claim 1, comprising
computer instructions to associate associable actions of the
plurality of associable actions to input functions of a plurality
of other accessories, wherein the plurality of other accessories
comprise at least one of a keyboard, a gaming pad, a mouse, a
gaming console controller, a joystick, a microphone, and a headset
with a microphone.
13. The computer-readable storage medium of claim 1, comprising
computer instructions to store the association of the action with
the input function.
14. The computer-readable storage medium of claim 1, comprising
computer instructions to store the input function correlated with
the fingerprint and the associated action in a profile.
15. The computer-readable storage medium of claim 14, comprising
computer instructions to associate the profile to at least one
software application.
16. The computer-readable storage medium of claim 1, comprising
computer instructions to associate a sequence of stimulations of
the input function to a macro.
17. The computer-readable storage medium of claim 16, wherein the
macro corresponds to at least one of the plurality of associable
actions.
18. The computer-readable storage medium of claim 1, comprising
computer instructions to calculate and present a statistical
frequency of stimulation of the input function correlated with the
fingerprint.
19. A biometric accessory, comprising a controller to: detect at
least one of navigation information and biometric information
associated with a user of the accessory; and transmit at least one
of the navigation information and biometric information to a
software application, wherein an input function of the accessory
which is correlated with at least one of the navigation information
and the biometric information is assigned to an action of a
plurality of associable actions by the software application,
wherein a stimulation of the input function is detected by the
software application, wherein the action is retrieved by the
software application based on the stimulation being detected, and
wherein the retrieved associable action is transmitted by the
software application to an operating system.
20. The accessory of claim 19, wherein the biometric information
comprises at least one among a fingerprint, an eye movement, a
blood pressure, a body movement, and a heart rate.
21. The accessory of claim 19, comprising an integrated display
configured to present a graphical user interface which displays the
plurality of associable actions and a plurality of accessories of
distinct operational types, wherein the plurality of accessories
are for interacting with a software application operable in a
computer system.
22. The accessory of claim 19, wherein the biometric information is
a plurality of physical characteristics, and wherein each physical
characteristic or a combination of physical characteristics of the
plurality of physical characteristics is correlated to a different
input function of the accessory.
23. The accessory of claim 22, wherein the software application
associates actions of the plurality of associable actions with the
different input functions.
24. The accessory of claim 22, wherein the operating system
authenticates into a gaming application based on the detection of
the physical characteristic.
25. The accessory of claim 24, wherein the operating system
transmits the assigned action or an aspect thereof to the gaming
application.
26. The accessory of claim 24, wherein the stimulation of the input
function correlated with the physical characteristic is utilized to
manipulate in-game entities of the gaming application.
27. A computer-readable storage medium, comprising computer
instructions to: receive from a software application operably
coupled to a biometric sensing accessory an action associated with
an input function of the biometric sensing accessory, wherein the
input function is correlated to at least one of navigation
information and biometric information detected by the biometric
sensing accessory, wherein a stimulation of the input function is
detected by the software application, and wherein the action is
retrieved by the software application when the stimulation is
detected; and perform the received action.
28. The computer-readable storage medium of claim 27, wherein the
computer instructions represent a gaming application.
29. The computer-readable storage medium of claim 27, wherein the
physical characteristic comprises at least one of a fingerprint, an
eye movement, a body movement, a heart rate, and a blood
pressure.
30. The computer-readable storage medium of claim 27, comprising
authenticating a user into the gaming application based on the
physical characteristic.
31. The computer-readable storage medium of claim 27, wherein the
action is associated with one or more actions that control the
gaming application.
32. The computer-readable storage medium of claim 27, wherein the
software application is an operating system.
Description
FIELD OF THE DISCLOSURE
[0001] The present disclosure relates generally to accessory
management applications, and more specifically to an apparatus for
associating physical characteristics with commands.
BACKGROUND
[0002] It is common today for gamers to utilize more than one
gaming accessory while utilizing a gaming or other software
application. This is especially true of gamers who play Massively
Multiplayer On-line (MMO) games in a team or individual
configuration. Gamers can have at their disposal accessories such
as a keyboard, a general purpose gaming pad, a mouse, a gaming
console controller, a headset with a built-in microphone to
communicate with other players, a joystick, a computer display, or
other common gaming accessories.
[0003] In addition, a gamer can utilize biometric sensing devices
to serve as another option to control and/or manage gaming and
other software applications. A gamer can frequently use a
combination of these accessories during a game (e.g., biometric
sensing devices, headset, a keyboard, and mouse) or even use one
accessory to replace the function of another accessory. Efficient
management and utilization of these accessories can frequently
impact the gamer's experience during a game.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] FIGS. 1-3 depict illustrative embodiments of a Graphical
User Interface (GUI) generated by an Accessory Management Software
(AMS) application according to the present disclosure;
[0005] FIGS. 4-6 depict illustrative methods describing the
operation of the AMS application;
[0006] FIGS. 7-9 depict a biometric sensing device featuring
various detectable finger configurations, which can be associated
with various actions; and
[0007] FIG. 10 depicts an illustrative diagrammatic representation
of a machine in the form of a computer system within which a set of
instructions, when executed, may cause the machine to perform any
one or more of the methodologies disclosed herein.
DETAILED DESCRIPTION
[0008] One embodiment of the present disclosure entails a
computer-readable storage medium having computer instructions to
present in a graphical user interface a plurality of associable
actions and a biometric sensing accessory, wherein a fingerprint is
detectable by the biometric sensing accessory and the fingerprint
is correlated to an input function associated with the biometric
sensing accessory, associate one of the plurality of associable
actions with the input function, detect a stimulation of the input
function by monitoring the biometric sensing accessory, wherein the
stimulation of the input function occurs based on a detection of
the fingerprint, retrieve the action associated with the input
function, and transmit the action to an operating system.
[0009] One embodiment of the present disclosure entails a biometric
accessory having a controller to detect at least one of navigation
information and biometric information associated with a user of the
accessory, and transmit at least one of the navigation information
and biometric information to a software application, wherein an
input function of the accessory which is correlated with at least
one of the navigation information and the biometric information is
assigned to an action of a plurality of associable actions by the
software application, wherein a stimulation of the input function
is detected by the software application, wherein the action is
retrieved by the software application based on the stimulation
being detected, and wherein the retrieved associable action is
transmitted by the software application to an operating system.
[0010] One embodiment of the present disclosure entails a
computer-readable storage medium having computer instructions to
receive from a software application operably coupled to a biometric
sensing accessory an action associated with an input function of
the biometric sensing accessory, wherein the input function is
correlated to at least one of navigation information and biometric
information detected by the biometric sensing accessory, wherein a
stimulation of the input function is detected by the software
application, and wherein the action is retrieved by the software
application when the stimulation is detected, and perform the
received action.
[0011] FIGS. 1-3 depict illustrative embodiments of a Graphical
User Interface (GUI) generated by an Accessory Management Software
(AMS) application according to the present disclosure. The AMS
application can operate in a computing device such as a desktop
computer, a laptop computer, a server, a mainframe computer, or a
gaming console. A gaming console can represent a gaming device such
as a Playstation 3™, a Wii™, or an Xbox360™. Other present
and next generation gaming consoles are contemplated. The AMS
application can also operate in other computing devices with less
computing resources such as a cellular phone, a personal digital
assistant, or a media player (such as an iPOD™). From these
illustrations it would be apparent to an artisan with ordinary
skill in the art that the AMS application can operate in any device
with computing resources.
[0012] FIGS. 4-6 depict illustrative methods 400-600 describing the
operation of the AMS application as shown in FIGS. 1-3. Method 400
can begin with step 402 in which the AMS application is invoked in
a computing device. The invocation step can result from a user
selection of the AMS application from a menu or iconic symbol
presented on a desktop of the computing device by an operating
system (OS) managing operations thereof. In step 404, the AMS
application can detect by way of drivers in the OS a plurality of
operationally distinct accessories communicatively coupled to the
computing device. However, the accessories do not necessarily have
to be operationally distinct, and can have similar features and/or
operational capabilities. The accessories can be coupled to the
computing device by a tethered interface (e.g., USB cable), a
wireless interface (e.g., Bluetooth or Wireless Fidelity--WiFi), or
combinations thereof.
[0013] In the present context, an accessory can represent any type
of device which can be communicatively coupled to the computing
device and which can control aspects of the OS and/or a software
application operating in the computing device. An accessory can
represent for example a keyboard, a biometric sensing device, a
gaming pad, a mouse, a gaming console controller, a joystick, a
microphone, or a headset with a microphone--just to mention a few.
The keyboard and gaming pad represent accessories of a similar
category since their operational parameters are alike.
[0014] A mouse, on the other hand, represents an accessory which
can have disparate operational parameters from the keyboard or
gaming pad. For instance, the operational parameters of a keyboard
generally consist of alphanumeric keys, control keys (e.g., Shift,
Alt, Ctrl), and function keys while the operational parameters of a
mouse consist of navigation data generated by a tracking device
such as a laser sensor, buttons to invoke GUI selections, and
settings thereof (e.g., counts or dots per inch, acceleration,
scroll speed, jitter control, line straightening control, and so
on). Such distinctions can be used to identify disparate categories
of accessories.
[0015] Additionally, a biometric sensing device or other device
capable of recognizing and/or sensing the physical characteristics
of or detecting actions performed by a person can have different
operational parameters as well. In the case of a biometric sensing
device, such as a fingerprint detection pad, the operational
parameters can include, but are not limited to including,
navigation or input data generated by sensing a finger or fingers
dragged across the surface of the pad and input or other data
generated by sensing a finger pressing against the pad. The
navigation and input data can also be generated by sensing other
actions performed with respect to the pad. The pad, for example,
can include a touchscreen, which can detect a user's touch through
the use of capacitive, resistive, surface acoustic wave, projected
capacitance, infrared, strain gauge, optical, and/or acoustic pulse
touchscreen technologies.
[0016] Notably, the biometric sensing device can be configured to
detect each fingerprint of a user or other users and each
fingerprint can be correlated with a particular input function of
the biometric sensing device. In other words, the fingerprints can
serve as inputs. Additionally, combinations of fingers can be
correlated with input functions of the biometric sensing device as
well. For example, a combination of a user's thumb and index finger
can be correlated to an input function. The biometric sensing
device can also be configured to detect and correlate eye
movements, blood pressure, heart rates, body movements, and other
physical characteristics of a person to various other input
functions of the sensing device.
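As a concrete illustration of the correlation just described, the table of fingerprints and fingerprint combinations can be pictured as a dictionary keyed by sets of detected fingers. The following is a minimal sketch only; the FingerId values and the InputFunctionMap class are illustrative assumptions, not structures taken from this disclosure.

    # Illustrative sketch only: the disclosure specifies behavior, not data
    # structures. FingerId values and InputFunctionMap are hypothetical.
    from enum import Enum, auto

    class FingerId(Enum):
        LEFT_THUMB = auto()
        LEFT_INDEX = auto()
        RIGHT_THUMB = auto()
        RIGHT_INDEX = auto()

    class InputFunctionMap:
        """Correlates a fingerprint, or a combination of fingerprints,
        with a named input function of the biometric sensing accessory."""
        def __init__(self):
            self._table = {}  # frozenset[FingerId] -> input-function name

        def correlate(self, fingers, input_function):
            self._table[frozenset(fingers)] = input_function

        def lookup(self, detected_fingers):
            return self._table.get(frozenset(detected_fingers))

    # Per paragraph [0016]: a thumb-and-index combination can serve as one
    # input function, a single fingerprint as another.
    m = InputFunctionMap()
    m.correlate({FingerId.LEFT_THUMB, FingerId.LEFT_INDEX}, "input_function_1")
    m.correlate({FingerId.RIGHT_THUMB}, "input_function_2")
    print(m.lookup({FingerId.LEFT_INDEX, FingerId.LEFT_THUMB}))  # input_function_1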
[0017] Furthermore, the sensing device can be configured to detect
a user's act of touching the surface of the pad or touchscreen, the
user's dragging of a finger or fingers on the surface of the
screen, and other actions which are independent of the physical
characteristics of the user and utilize these actions as inputs.
The biometric sensing device can exist as a single device, multiple
devices, a pad integrated with a display, and in other
configurations. The sensing device can be further configured to
have a keypad or a touchscreen keypad and can have touch zones,
where each zone can be tailored to perform a particular function.
The joysticks, game controllers or any other input devices
represent additional categories of accessories supported by the
AMS.
[0018] In step 406, the AMS application presents a GUI 101 such as
depicted in FIG. 1 with operationally distinct accessories such as
the keyboard 108, mouse 110, headset 114, game controller 115, and
biometric sensing device (or other sensing device) 116. In an
embodiment, the GUI 101 can be displayed on the biometric sensing
device 116 if the device has a display. The GUI 101 presents the
accessories 108-116 in a scrollable section 117. One or more
accessories can be selected by a user with a common mouse pointer
or by tapping or dragging a finger on the touch screen/pad of the
sensing device 116. In this illustration, the keyboard 108 and the
biometric sensing device 116 were selected with a pointer for
customization. Upon selecting the keyboard 108 and biometric
sensing device 116 in section 117, the AMS application presents the
keyboard 108 and biometric sensing device 116 in split windows 118,
120, respectively, to help the user during the customization
process.
[0019] In step 408, the AMS application can be programmed to detect
a user-selection of a particular software application such as a
game. This step can be the result of the user entering in a Quick
Search field 160 the name of a gaming application (e.g., World of
Warcraft™). Upon identifying a gaming application, the AMS
application can retrieve in step 410 from a remote or local
database gaming application actions which can be presented in a
scrollable section 139 of the GUI represented as "Actions" 130. The
actions can be tactical actions 132, communication actions 134,
menu actions 136, and movement actions 138, or any other types of
actions, which can be used to invoke and manage features of the
gaming application.
[0020] The actions presented descriptively in section 130 of the
GUI can represent a sequence of accessory input functions which a
user can stimulate by button depressions, navigation, performing
actions with the sensing device 116, or speech. For example,
depressing the left index finger on the biometric sensing device
116 can represent the tactical action "Reload", while the
simultaneous keyboard depressions "Ctrl A" can represent the
tactical action "Melee Attack". For ease of use, the "Actions" 130
section of the GUI is presented descriptively rather than by a
description of the input function(s) of a particular accessory.
[0021] Any one of the Actions 130 can be associated with one or
more input functions of the accessories by way of a simple drag and
drop action. For instance, a user can select a "Melee Attack" by
placing a pointer 133 over an iconic symbol associated with this
action by utilizing the mouse 110 or by dragging the pointer 133
using a pad/touchscreen of the biometric sensing device 116. Upon
doing so, the symbol can be highlighted to indicate to the user
that the icon is selectable. At this point, the user can select the
icon by holding the left mouse button and drag the symbol (or by
utilizing the touch screen of the sensing device 116) to any of the
input functions (e.g., buttons) of the keyboard 108, mouse 110, or
biometric sensing device 116 to make an association with an input
function of one of these accessories.
[0022] For example, the user can drag the Melee Attack symbol to a
particular region of a touchscreen of the sensing device 116
thereby causing an association between the associated region and
the gaming action of a Melee Attack. When the region of the sensing
device 116 is tapped or otherwise touched during normal operation
of a game, the AMS application can detect the selection as a
"trigger" to generate the key sequence "Ctrl A" which is understood
by the gaming application as a request for a Melee Attack. The gaming
application receives from the AMS application by way of an
operating system the "Ctrl A" sequence as if it had been generated
by a Qwerty keyboard.
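Functionally, the association turns a region of the touchscreen into a trigger whose stimulation the AMS application replaces with the key sequence the game already understands. The sketch below shows that substitution under assumed names; the region identifiers, the association table, and the caller-supplied send_keys function are all hypothetical.

    # Hedged sketch: the disclosure describes the behavior only; these
    # names and structures are assumptions.
    associations = {}  # touch region -> key sequence forwarded to the OS

    def associate(region, key_sequence):
        associations[region] = key_sequence

    def on_region_tapped(region, send_keys):
        """Called when the sensing device reports a tap; the AMS layer
        substitutes the tap with the associated key sequence."""
        seq = associations.get(region)
        if seq is not None:
            send_keys(seq)  # e.g., the OS delivers "Ctrl A" to the game

    associate("region_upper_left", "Ctrl A")      # "Melee Attack"
    on_region_tapped("region_upper_left", print)  # prints: Ctrl A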
[0023] Additionally, the sensing device 116 and/or the AMS
application can be configured to record the fingerprints, a
combination of fingerprints, eye movements, body movements, heart
rates, blood pressure, and other physical characteristics of the
user and correlate the characteristics to input functions of the
sensing device 116. The user can then associate any one of the
actions 130 to an input function correlated with a particular
physical characteristic. As an example, the biometric sensing
device 116 is illustratively shown in split window 120 of FIG. 1
with a view from underneath the surface of the sensing device 116.
A user's hands are placed on the top surface of the sensing device
116 so that the user's right hand is illustratively shown on the
left and the user's left hand is illustratively shown on the right.
Each fingerprint 116a-e of the right hand and each fingerprint
116f-j of the left hand can be detected by the sensing device 116
and can be transmitted to the AMS application. The AMS application
can then be utilized to associate actions 130 to each fingerprint
116f-j or to a combination of the fingerprints 116f-j.
[0024] For example, the fingerprint corresponding to the left ring
finger 116i of the user can be associated with the "Night Vision"
action under the Tactics 132 menu and the right thumb can be
assigned to "Melee Attack." Upon tapping a touch screen of the
sensing device 116 with the user's left ring finger 116i during a
game, a night vision mode can be triggered during game play. By
tapping the screen again with the ring finger 116i, the night
vision mode can be toggled off or even lead to another action. Of
course, each finger or combination of fingers can be associated
with a particular action. Similarly, a user's eye movement to the
left can be associated with the "Move Left" action and eye movement
to the right can be associated with the "Move Right" action under
the Movement 138 menu.
[0025] With this in mind, attention is directed to step 412 where
the AMS application can respond to a user selection of a profile. A
profile can be a device profile or master profile invoked by
selecting GUI button 156 or 158, each of which can identify the
association of actions with input functions of one or more
accessories. If a profile selection is detected in step 412, the
AMS application can retrieve macro(s) and/or prior associations of
actions with the accessories as defined by the profile. For
example, if a certain set of fingers or finger combinations are
associated with certain actions for a particular video game in a
selected profile, the associations of actions stored in the profile
can be retrieved in anticipation of playing the video game. The
actions and/or macros defined in the profile can also be presented
in step 416 by the AMS application in the actions column 130 of the
GUI 101 to modify or create new associations.
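A profile of this kind can be pictured as a small document mapping input functions, including fingerprint-correlated ones, to actions and macros for one or more games. The layout below is a hypothetical sketch; the disclosure does not define a storage schema, and every key name here is an assumption.

    # Hypothetical profile layout; not a schema taken from the disclosure.
    import json

    profile = {
        "name": "example profile",
        "applications": ["World of Warcraft"],
        "associations": {
            "fingerprint:left_ring": "Night Vision",              # single print
            "fingerprint:left_ring+left_middle": "Melee Attack",  # combination
            "key:Ctrl A": "Melee Attack",
        },
        "macros": {},
    }

    def load_profile(path):
        """Retrieve prior associations when a profile selection is
        detected (step 412)."""
        with open(path) as f:
            return json.load(f)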
[0026] In step 418, the AMS application can also respond to a user
selection to create a macro. A macro in the present context can
represent a subset of actions that can be presented in the Actions
column 130. Any command which can be recorded by the AMS
application can be used to define a macro. A command can represent
a sequence of input functions of an accessory, identification of a
software application to be initiated by an operating system (OS),
or any other recordable stimulus to initiate, control or manipulate
software applications. For instance, a macro can represent a user
entering the identity of a software application (e.g., instant
messaging tool) to be initiated by an OS. A macro can also
represent recordable speech delivered by a microphone singly or in
combination with a headset for detection by another software
application through speech recognition or for delivery of the
recorded speech to other parties. In an embodiment, the macro can
represent recordable taps, finger drags, or other actions performed
on a touchscreen/pad of the sensing device 116. In yet another
embodiment, a macro can represent recordable navigation of an
accessory such as a mouse or joystick, recordable selections of
buttons on a keyboard, a mouse, or a mouse pad, and so on. Macros
can also be combinations of the above illustrations. Macros can be
created from the GUI 101 by selecting a "Record Macro" button 148.
The macro can be given a name and category in user-defined fields
140 and 142.
[0027] Upon selecting the Record Macro button 148, a macro can be
generated by selection of input functions on an accessory (e.g.,
Ctrl A, speech, touch screen region, etc.) and/or by manual entry
in field 144 (e.g., typing the name and location of a software
application to be initiated by an OS). Once the macro is created,
it can be tested by selecting button 150 which can repeat the
sequence specified in field 144. The clone button 152 can be
selected to replicate the macro sequence if desired. Fields 152 can
also present timing characteristics of the stimulation sequence in
the macro with the ability to customize such timing. Once the macro
has been fully defined, selection of button 154 records the macro
in step 420. The recording step can be combined with a step for
adding the macro to the associable items Actions column 130,
thereby providing the user the means to associate the macro with
input functions of the accessories.
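A recorded macro can thus be pictured as an ordered list of stimulations with per-step timing that the user can customize. The dataclasses below are an illustrative assumption about representation, not the patent's implementation.

    # Sketch of a recorded macro; field names are assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class MacroStep:
        stimulus: str      # e.g. "Ctrl A", "tap:region_3", 'launch:"im_tool"'
        delay_ms: int = 0  # timing captured during recording; user-editable

    @dataclass
    class Macro:
        name: str          # user-defined name (field 140)
        category: str      # user-defined category (field 142)
        steps: list = field(default_factory=list)

        def record(self, stimulus, delay_ms=0):
            self.steps.append(MacroStep(stimulus, delay_ms))

    m = Macro("Reload and message", "Tactics")
    m.record("Ctrl A")
    m.record('launch:"im_tool"', delay_ms=120)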
[0028] In step 422, the AMS application can respond to drag and
drop associations between actions and input functions of the
keyboard 108 and the biometric sensing device 116. If an
association is detected, the AMS application can proceed to step
424 where it can determine if a profile has been identified in step
412 to record the association(s) detected. If a profile has been
identified, the associations are recorded in said profile in step
If a profile has not been identified in step 412, the AMS
application can create a profile in step 428 for recording the
detected associations. In the same step, the user can name the
newly created profile as desired. The newly created profile can
also be associated with one or more software applications in step
430 for future reference.
[0029] The GUI 101 presented by the AMS application can have other
functions. For example, the GUI 101 can provide options for layout
of the accessory selected (button 122), how the keyboard is
illuminated when associations between input functions and actions
are made (button 124), and configuration options for the accessory
(button 126). Configuration options can include operational
settings of the mouse 110 such as Dots Per Inch or Counts Per Inch,
and so on. The AMS application can adapt the GUI 101 to present
more than one functional perspective. For instance, by selecting
button 102, the AMS application can adapt the GUI 101 to present a
means to create macros and associate actions to accessory input
functions as depicted in FIG. 1. Selecting button 104 can cause the
AMS application to adapt the GUI 101 to present statistics in
relation to the usage of accessories as depicted in FIGS. 2-3.
Selecting button 106 can cause the AMS application to adapt the GUI
101 to present promotional offers and software updates.
[0030] It should be noted that the steps of method 400 in whole or
in part can be repeated until a desirable pattern of associations
of actions to input functions of the selected accessories has been
accomplished. It would be apparent to an artisan with ordinary
skill in the art that there can be numerous other approaches to
accomplish similar results. These undisclosed approaches are
contemplated by the present disclosure.
[0031] FIG. 5 depicts a method 500 in which the AMS application can
be programmed to recognize unknown accessories so that method 400
can be applied to them as well. Method 500 can begin with step 502
in which the AMS application detects an unknown accessory such as a
new keyboard from an unknown vendor by way of a communicative
coupling to a computing device from which the AMS application
operates. The AMS application in this instance can receive an
identity from the keyboard or the operating system which is not
known to the AMS application. Upon detecting an unknown accessory, the
AMS application in step 504 can present a depiction of an accessory
of similar or same category in response to a user providing
direction as to the type of accessory (by selecting for example a
drop-down menu). Alternatively, or in combination with the user
instructions, the AMS application can determine from the
information received from the unknown accessory an accessory
type.
[0032] In step 506 the AMS application can receive instructions
describing all or a portion of the input functions of the unknown
accessory. These instructions can come from a user who defines each
input function individually or responds to inquiries provided by
the AMS application. The AMS application can for example make an
assumption as to the keyboard layout and highlight each key with a
proposed function which the user can verify or modify. Once the AMS
application has been provided instructions in step 506, the AMS
application can create an accessory identity in step 508 which can
be defined by the user. In steps 510 and 512, the AMS application
can associate and record the accessory instructions with the
identity for future recognition of the accessory. In step 514, the
AMS application can present a depiction of the new accessory with
its identity along with the other selectable accessories in section
117.
[0033] Method 500 can provide a means for universal detection and
identification of any accessory which can be used to control or
manage software applications operating in a computing device.
[0034] FIG. 6 depicts a method 600 for illustrating the AMS
application responding to input function stimuli (triggers) of
accessories. Method 600 can begin with step 602 in which the AMS
application monitors the use of accessories, such as the biometric
sensing device 116. This step can represent monitoring the
stimulation of input functions of one or more accessories
communicatively coupled to a computing device from which the AMS
application operates. The input functions can correspond to button
depressions on a keyboard, gaming pad, or navigation device such as
a mouse, or can correspond to fingerprints, eye movements, heart
rate increases, or blood pressure changes detected by the sensing
device 116. The input functions can also represent navigation
instructions such as eye, finger, mouse, or joystick movements. The
input functions can further represent speech supplied by a
microphone singly or in combination with a headset. Other existing
or future input functions of an accessory detectable by the AMS
application are contemplated by the present disclosure. The AMS
application can monitor input functions by for example processing
human interface device (HID) reports supplied by the accessories to
the computing device.
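One way to picture step 602 is a loop that asks each coupled accessory for reports and normalizes them into named stimulations. The sketch below is a rough assumption: read_hid_report stands in for whatever driver interface actually supplies the reports, and real code would block on events rather than poll.

    # Illustrative monitoring loop; the driver interface is assumed.
    import time

    def monitor(accessories, on_stimulation, read_hid_report):
        """Step 602: watch every coupled accessory for input-function
        stimulations reported to the computing device."""
        while True:
            for accessory in accessories:
                report = read_hid_report(accessory)  # HID report, if any
                if report is not None:
                    # Normalize a raw report (key press, tap, fingerprint,
                    # heart-rate change, ...) into a named stimulation.
                    on_stimulation(accessory, report)
            time.sleep(0.001)  # poll interval; a sketch, not a design choice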
[0035] Once one or more stimulations have been detected in step
604, the AMS application can proceed to step 606 to determine if
action(s) have been associated with the detected stimulation(s).
If, for example, the stimulations detected correspond to keyboard
108 and/or taps/drags on the touchscreen of the sensing device 116,
the AMS application can determine if actions have been associated
and recorded for such stimulations. If these stimulations "trigger"
one or more actions, the AMS application can proceed to step 608
where it retrieves the stimulation definition of these actions for
each accessory reporting a stimulation. In step 610, the AMS
application can substitute the detected stimulations with the
stimulations defined by the action.
[0036] To illustrate this substitution, suppose for example that
the detected stimulation was "Ctrl A" simultaneously depressed on a
keyboard. Suppose further that an action associated with this
stimulus consists of a macro that combines finger dragging with a
navigation of the sensing device 116 (e.g., moving one's finger
quickly in a backward motion for a given distance), and a request
to invoke an instant messaging (IM) session with a particular
individual using Skype™ or some other common IM tool. In step
610, the AMS application would substitute "Ctrl A" for stimulations
consisting of the finger drags, navigation and a request for an IM
application. The substitute stimulations would then be reported in
step 612 to an operating system (OS).
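The substitution of steps 606-612 reduces to a table lookup: a detected stimulation either maps to the stimulations its associated action defines or, failing an association, passes through unchanged (step 614, discussed below). A minimal sketch with hypothetical table entries:

    # Sketch of steps 606-612; the entries and names are assumptions.
    action_definitions = {
        "Ctrl A": ["drag:back_fast", "nav:back", 'launch:"im_tool"'],  # macro
    }

    def substitute(stimulation):
        """Return the stimulations reported to the OS for this input."""
        # No association: pass the raw stimulation through (step 614).
        return action_definitions.get(stimulation, [stimulation])

    print(substitute("Ctrl A"))  # macro stimulations replace the keystroke
    print(substitute("Ctrl Z"))  # unassociated: forwarded unchanged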
[0037] In step 616, the OS can determine whether to pass the
substitute stimulations to an active software application in
operation (e.g., a gaming application) and/or to invoke another
software application. The active software application can be
operating from the same computer system from which the OS and the
AMS application operate or can be operating at a remote system such
as an on-line server or family of servers (e.g., World of Warcraft)
awaiting stimulation data from the computer system. In this
illustration, the macro comprises both stimulation feedback for the
active software application and a request to initiate an IM
session.
[0038] Accordingly, the OS conveys in step 618 the sensing device's
116 stimulation signals to the active software application (e.g.,
gaming application), and in a near simultaneous fashion invokes the
IM session in step 620 with a specific individual (or
organization). In another example, if a user's right thumb is
assigned to the "Melee Attack" action and the user's left thumb is
assigned to "Throw Frag," the OS can transmit the actions to a
gaming application which is currently executing and an in-game
entity (such as a video game character) can perform the received
actions.
[0039] To provide a further example, reference is now made to
FIGS. 7-9, which are illustrations depicting the use of various
fingerprint combinations, which can be associated with various
actions. FIG. 7 illustrates a user pressing his/her index finger
702a, middle finger 702b, and ring finger 702c onto the surface of
a sensing device. The combination of the three fingers 702a-c can
be assigned to the "Throw Special" action from FIG. 1 and when the
user presses the three fingers 702a-c on the sensing device, the
sensing device can transmit a signal to the AMS application that
three fingers 702a-c were pressed. Once the AMS application
receives the signal, the AMS application can query a database and
find that the three fingers 702a-c were associated with the action
"Throw Special." The AMS application can then transmit the action
to a software application which can utilize the action. The action
may cause, in the case of the software application being a video
game, an in-game entity, such as a video game character, to throw
some special weapon or special object.
[0040] Similarly, FIGS. 8 and 9 feature fingerprint combinations
802a-b and 902a-d, which can likewise be associated with
actions. For example, the combination 802a-b can be associated with
the "Menu" action from FIG. 1, and when depressing fingers 802a-b
on the sensing device, the AMS application can transmit the action
to a software application, which can then activate the "Menu"
action and open a menu. Finger combination 902a-d can be associated
with multiple actions or a macro and the actions or macro can be
similarly utilized by the software application.
[0041] Referring back to step 606, the illustrations above cover a
scenario in which the AMS application has detected an association
of actions to accessory stimuli. If however the AMS application
does not detect such an association, then the detected stimulus (or
stimuli) supplied by one or more accessories is transmitted to the
OS in step 614. For example, it may be that a stimulation based on
the depressions of "Ctrl A" has no particular association to an
action. In this case, the AMS application passes this stimulation
to the OS with no substitutes. In step 616 the OS can determine if
this stimulation invokes a new software application in step 620 or
is conveyed to the previously initiated software application.
[0042] Contemporaneous to the embodiments described above, the AMS
application can also record in step 622 statistics relating to the
detected accessory stimulations. A portion of the AMS application
can operate as a background process which performs statistical
analysis on the stimulations detected. By selecting button 104 in
FIG. 1, the AMS application can provide an updated GUI which
illustrates the usage of input functions of one or more accessories
for which stimulations were detected in step 604. For ease of
illustration, only a keyboard accessory is shown. In this
illustration, certain keys (references 204, 206, 208, 210) on the
keyboard are color-coded to illustrate the frequency of usage of
these keys.
[0043] A color scale 203 defines the frequency of usage of the
input functions of the keyboard. The first end of the scale (navy
blue) represents a single detected depression, while an opposite
end of the scale (bright red) represents 500 detected depressions.
Based on this scale, the AMS application maps by color in step 624
stimulations of the keyboard. For example, the key grouping 208
depicts a color coding with the highest detectable usage, while the
F7 key (reference 210) indicates the fewest depressions. Keys
having zero depressions are not color coded to readily identify the
color mapping of keys which were used at least once.
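The color mapping of step 624 amounts to interpolating each key's press count between the two ends of the scale. The sketch below assumes simple linear RGB interpolation and representative RGB values for navy blue and bright red; neither detail is specified by the disclosure, which names only the colors and the endpoints of 1 and 500 depressions.

    # Frequency-to-color sketch; RGB endpoints and linear blend are assumed.
    NAVY, RED = (0, 0, 128), (255, 0, 0)

    def usage_color(presses, max_presses=500):
        if presses <= 0:
            return None  # keys with zero depressions are not color coded
        t = min(presses, max_presses) / max_presses
        return tuple(round(a + t * (b - a)) for a, b in zip(NAVY, RED))

    print(usage_color(1))    # near navy blue: a single detected depression
    print(usage_color(500))  # bright red: 500 detected depressions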
[0044] The AMS application provides additional functions in a
playback panel of the GUI which can help a user understand how the
color coded keys were used during an active software application
such as a video game. In this section of the GUI, the AMS
application can present the user with a playback control function
202 which the user can select to replay, pause, forward or rewind
the usage of these keys. When usage playback is selected, the user
can for instance see the color coded keys highlighted in real-time
with a temporary white border to visualize how the keys were
selected. A time clock 204 provides the user the elapsed time of
the playback sequence. Button 212 allows the user to retrieve
statistics from other sessions, while button 214 provides the user
a means to save statistics from a given session.
[0045] The GUI of FIG. 2 could have been shown as a split screen
with all accessories which generated one or more detected
stimulations (e.g., keyboard, biometric sensing device, mouse, and
microphone), each providing statistical symbolic results as
described above for the keyboard. For example, the sensing device
116 can be shown and color coding can illustrate where the user
tapped the touchscreen of the sensing device 116 with the highest
frequency. Red regions could represent heavily tapped areas of the
touch screen, while blue areas can represent rarely tapped areas.
Additionally, if fingerprints or other physical characteristics of
the user are recorded, the GUI can display which fingerprints are
used the most, which combination of fingerprints are used the most,
etc. Although not shown, split screen embodiments are contemplated
by the present disclosure for the GUI of FIG. 2.
[0046] In addition to a symbolic representation as shown in FIG. 2,
the AMS application can provide the user a means to visualize raw
statistics in a table format such as shown in FIG. 3 by selecting
button 212. The table format shows raw data in section 302 and
possible suggestions in section 304 for improving user performance
which can be generated by the AMS application in step 626. Section
302 can be presented in a table format with a column identifying
the key being analyzed, its usage, and number of key presses. The
user can ascertain from this table the most and least frequently
used keys as well as other identifiable patterns. Similarly, the
table can include statistics on how many times a particular finger
or finger combination was used. For example, a user's first left
finger, FP1, may have had a usage duration of 03:05:23 and may have
been detected 295 times by the AMS application. Additionally, the
table can include what heart rate was the most frequent, what eye
direction is most frequently used, and other similar statistics for
any type and number of detectable physical characteristics.
[0047] The AMS application can utilize an understanding of the
layout of the accessory (in this case, the keyboard or sensing
device) to determine from the statistics ways that the user can
improve response time or ergonomic use. For example, the AMS
application can determine from a layout analysis that the key
combination <Alt .> can be reassigned to a macro based on the
trigger <Ctrl F> which could provide the user a faster
response time and free up the user's right hand for other tasks.
The AMS application can also provide alternative suggestions. For
example, the AMS application can also suggest creating single
button macros for each of the key combinations <Alt .> and
<Ctrl A> which can be assigned to keys on the keyboard or
left and right buttons of a mouse. The latter suggestion of
assigning macros to the mouse can help the user free up his/her
left hand.
[0048] Similarly, with regard to the sensing device 116, the AMS
application can utilize information about the sensing device 116 to
recommend to the user better or optimal associations. For example,
the AMS application may determine that a particular region of the
sensing device's 116 touchscreen is over-utilized and that
distributing finger touches or drags over a larger portion of the
screen would be more advantageous. Additionally, the AMS
application may determine that particular combinations of fingers
associated with certain actions would be more advantageous than
current associations. For example, as shown in FIG. 3, the AMS
application can determine that the user should replace the
combination of finger two and finger three of the user's left hand
with finger one of the user's right hand. The AMS application may
suggest such a change to increase response time, reduce the total
number of combinations/fingerprints associated with the user's left
hand, take advantage of the user's right hand, or for other
reasons. Furthermore, the application can determine that the user
should be using fingers in conjunction with eye movements or other
body movements.
[0049] The AMS application can utilize present and next generation
algorithms to determine how to improve response times and ergonomic
usage of accessory devices. The AMS application can for example
have at its disposal an understanding of the layout of each
accessory, the type of software being controlled by the accessory
(e.g., World of Warcraft), type of operations commonly used to
control the software (e.g., known actions as shown in the actions
column 130 of FIG. 1), an understanding of the associations made by
other users (e.g., gamers) to improve their performance when
controlling the software, and so on. The AMS application can also
be adapted to communicate with the active software application by
way of an Application Programming Interface (API) to receive
additional usage statistics from the software which it can in turn
use to improve the user's performance. The AMS application can also
utilize common statistical and behavior modeling techniques to
predict the behavior of the user and responses from the software
application to identify possible ways to improve the user's
performance.
[0050] From these illustrations, it would be apparent to an artisan
of ordinary skill in the art that innumerable algorithms can be
developed to analyze accessory usage and thereby suggest
improvements. These undisclosed embodiments are contemplated by the
present disclosure.
[0051] From the foregoing descriptions, it would be evident to an
artisan with ordinary skill in the art that the aforementioned
embodiments can be modified, reduced, or enhanced without departing
from the scope and spirit of the claims described below. For
example, method 400 can be adapted to define more than one
programmable layer for an accessory. Such a feature can extend the
functionality of an accessory into multi-layer paradigms of input
functions. The GUI of FIG. 1 can be adapted so that a user can
specify more than one programmable layer for a specific
accessory.
[0052] The user can also specify which layer to present in FIG. 1
while associating actions. If for instance layer 1 is shown, the
GUI of FIG. 1 can present the actions associated in this layer by
presenting descriptors superimposed on the input functions (e.g.,
buttons or keys or regions of a touchscreen). When the user
switches to layer 2 (e.g., by selecting from a drop-down menu the
layer of interest) the accessory can be shown in the GUI with a
different set of associated actions. The user can define a macro or
identify a key sequence to switch between layers when the accessory
is in use.
[0053] The trigger for switching between layers can be a toggle
function (e.g., selecting the tab key on a Qwerty keyboard or
tapping a certain region of a touchscreen of the sensing device) to
switch between layers in a round robin fashion (layer
1 → layer 2 → layer 3 → layer 1 → and so
on). Alternatively, the user can define a hold and release trigger
to switch between layers. In this embodiment, the user moves to
another layer while pressing a button (e.g., a "Shift" key) or
portion of a touchscreen of the sensing device 116 and returns to
the preceding layer upon its release. In yet another embodiment,
the trigger to switch layers can be defined differently per layer.
The user can for example select the letter "A" in layer 1 to
proceed to layer 2, and select the letter "B" in layer 2 to return
to layer 1 or proceed to yet another layer 3. There can be numerous
combinations of layers and triggers which can be defined to
substantially expand the capability of a single accessory.
Additionally, triggers can be of any kind, tactile, speech,
etc.
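The two trigger styles just described, round-robin toggling and hold-and-release, can be sketched as a small layer manager. The class below is an illustrative assumption about structure, not the disclosed implementation.

    # Layer-switching sketch; the class and its methods are hypothetical.
    class LayerManager:
        def __init__(self, layers):
            self.layers = layers  # each layer: input function -> action
            self.current = 0
            self._held_from = None

        def toggle(self):
            """Round robin trigger: layer 1 -> 2 -> 3 -> 1 -> ..."""
            self.current = (self.current + 1) % len(self.layers)

        def hold(self, layer):
            """Hold-and-release trigger: move to another layer while a
            button or touchscreen portion is pressed..."""
            self._held_from = self.current
            self.current = layer

        def release(self):
            """...and return to the preceding layer upon release."""
            if self._held_from is not None:
                self.current, self._held_from = self._held_from, None

        def action_for(self, input_function):
            return self.layers[self.current].get(input_function)

    lm = LayerManager([{"A": "Melee Attack"}, {"A": "Night Vision"}])
    lm.hold(1)
    print(lm.action_for("A"))   # Night Vision (layer 2 while held)
    lm.release()
    print(lm.action_for("A"))   # Melee Attack (back on layer 1)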
[0054] In another embodiment, method 400 can be adapted so that a
user can define super macros and/or super profiles. A super macro
can represent nested macros (combinations of macros). Method 400
can be adapted so that the user can customize the timing for
executing nested macros. Similarly, a super profile can represent
nested profiles (combinations of profiles). A super profile can for
example comprise sub-profiles, each sub-profile defining
associations of actions to input functions of a particular
accessory.
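A super macro can accordingly be pictured as a list whose items are either plain stimulations or nested macros, with timing entries in between. The flattening below is a hypothetical sketch; the disclosure names the concept without prescribing a representation.

    # Nested (super) macro sketch; representation and timing syntax assumed.
    import time

    def run_macro(macro, emit):
        """macro: list of stimulations (str) or nested macros (list)."""
        for step in macro:
            if isinstance(step, list):
                run_macro(step, emit)             # execute the nested macro
            elif step.startswith("pause:"):
                time.sleep(int(step[6:]) / 1000)  # user-customized timing (ms)
            else:
                emit(step)

    reload_macro = ["R"]
    attack_macro = ["Ctrl A"]
    super_macro = [reload_macro, "pause:100", attack_macro]
    run_macro(super_macro, print)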
[0055] In yet another embodiment, method 400 can be adapted to
establish audio profiles for headset accessories. When a user
selects a headset accessory such as 114, GUI 101 can be adapted to
provide the user options to establish a sound output (equalizer)
setting to optimize performance for a particular gaming
application. For instance, GUI 101 can present an equalizer so that
the user can raise the volume of high frequencies to hear an enemy's
footsteps from a longer distance in a gaming application.
[0056] In still another embodiment, the method 400 can be adapted
to allow a user to authenticate into the AMS application and/or a
software application accessible by the AMS application based on a
detection of an authorized fingerprint or other physical
characteristic detected by the sensing device 116.
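A minimal sketch of such an authentication gate follows; the template storage, the hashed identifiers, and the matching step are all assumptions, as the disclosure describes only the behavior.

    # Hypothetical authentication gate for the adaptation in [0056].
    authorized = {"fp_template_1", "fp_template_2"}  # enrolled fingerprints

    def authenticate(detected_template):
        """Admit the user into the AMS application, or a software
        application it can access, only on an authorized fingerprint."""
        return detected_template in authorized

    print(authenticate("fp_template_1"))  # True: access granted
    print(authenticate("fp_template_9"))  # False: access denied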
[0057] The foregoing embodiments are a subset of possible
embodiments contemplated by the present disclosure. Other suitable
modifications can be applied to the present disclosure.
Accordingly, the reader is directed to the claims for a fuller
understanding of the breadth and scope of the present
disclosure.
[0058] FIG. 10 depicts an exemplary diagrammatic representation of
a machine in the form of a computer system 1000 within which a set
of instructions, when executed, may cause the machine to perform
any one or more of the methodologies discussed above. In some
embodiments, the machine operates as a standalone device. In some
embodiments, the machine may be connected (e.g., using a network)
to other machines. In a networked deployment, the machine may
operate in the capacity of a server or a client user machine in
server-client user network environment, or as a peer machine in a
peer-to-peer (or distributed) network environment.
[0059] The machine may comprise a server computer, a client user
computer, a personal computer (PC), a tablet PC, a laptop computer,
a desktop computer, a control system, a network router, switch or
bridge, or any machine capable of executing a set of instructions
(sequential or otherwise) that specify actions to be taken by that
machine. It will be understood that a device of the present
disclosure includes broadly any electronic device that provides
voice, video or data communication. Further, while a single machine
is illustrated, the term "machine" shall also be taken to include
any collection of machines that individually or jointly execute a
set (or multiple sets) of instructions to perform any one or more
of the methodologies discussed herein.
[0060] The computer system 1000 may include a processor 1002 (e.g.,
a central processing unit (CPU), a graphics processing unit (GPU),
or both), a main memory 1004 and a static memory 1006, which
communicate with each other via a bus 1008. The computer system
1000 may further include a video display unit 1010 (e.g., a liquid
crystal display (LCD), a flat panel, a solid state display, or a
cathode ray tube (CRT)). The computer system 1000 may include an
input device 1012 (e.g., a keyboard), a cursor control device 1014
(e.g., a mouse), a disk drive unit 1016, a signal generation device
1018 (e.g., a speaker or remote control) and a network interface
device 1020.
[0061] The disk drive unit 1016 may include a machine-readable
medium 1022 on which is stored one or more sets of instructions
(e.g., software 1024) embodying any one or more of the
methodologies or functions described herein, including those
methods illustrated above. The instructions 1024 may also reside,
completely or at least partially, within the main memory 1004, the
static memory 1006, and/or within the processor 1002 during
execution thereof by the computer system 1000. The main memory 1004
and the processor 1002 also may constitute machine-readable
media.
[0062] Dedicated hardware implementations including, but not
limited to, application specific integrated circuits, programmable
logic arrays and other hardware devices can likewise be constructed
to implement the methods described herein. Applications that may
include the apparatus and systems of various embodiments broadly
include a variety of electronic and computer systems. Some
embodiments implement functions in two or more specific
interconnected hardware modules or devices with related control and
data signals communicated between and through the modules, or as
portions of an application-specific integrated circuit. Thus, the
example system is applicable to software, firmware, and hardware
implementations.
[0063] In accordance with various embodiments of the present
disclosure, the methods described herein are intended for operation
as software programs running on a computer processor. Furthermore,
software implementations, including but not limited to
distributed processing or component/object distributed processing,
parallel processing, or virtual machine processing, can also be
constructed to implement the methods described herein.
[0064] The present disclosure contemplates a machine readable
medium containing instructions 1024, or that which receives and
executes instructions 1024 from a propagated signal so that a
device connected to a network environment 1026 can send or receive
voice, video or data, and to communicate over the network 1026
using the instructions 1024. The instructions 1024 may further be
transmitted or received over a network 1026 via the network
interface device 1020.
[0065] While the machine-readable medium 1022 is shown in an
example embodiment to be a single medium, the term
"machine-readable medium" should be taken to include a single
medium or multiple media (e.g., a centralized or distributed
database, and/or associated caches and servers) that store the one
or more sets of instructions. The term "machine-readable medium"
shall also be taken to include any medium that is capable of
storing, encoding or carrying a set of instructions for execution
by the machine and that cause the machine to perform any one or
more of the methodologies of the present disclosure.
[0066] The term "machine-readable medium" shall accordingly be
taken to include, but not be limited to: solid-state memories such
as a memory card or other package that houses one or more read-only
(non-volatile) memories, random access memories, or other
re-writable (volatile) memories; magneto-optical or optical medium
such as a disk or tape; and carrier wave signals such as a signal
embodying computer instructions in a transmission medium; and/or a
digital file attachment to e-mail or other self-contained
information archive or set of archives is considered a distribution
medium equivalent to a tangible storage medium. Accordingly, the
disclosure is considered to include any one or more of a
machine-readable medium or a distribution medium, as listed herein
and including art-recognized equivalents and successor media, in
which the software implementations herein are stored.
[0067] Although the present specification describes components and
functions implemented in the embodiments with reference to
particular standards and protocols, the disclosure is not limited
to such standards and protocols. Each of the standards for Internet
and other packet switched network transmission (e.g., TCP/IP,
UDP/IP, HTML, HTTP) represent examples of the state of the art.
Such standards are periodically superseded by faster or more
efficient equivalents having essentially the same functions.
Accordingly, replacement standards and protocols having the same
functions are considered equivalents.
[0068] The illustrations of embodiments described herein are
intended to provide a general understanding of the structure of
various embodiments, and they are not intended to serve as a
complete description of all the elements and features of apparatus
and systems that might make use of the structures described herein.
Many other embodiments will be apparent to those of skill in the
art upon reviewing the above description. Other embodiments may be
utilized and derived therefrom, such that structural and logical
substitutions and changes may be made without departing from the
scope of this disclosure. Figures are also merely representational
and may not be drawn to scale. Certain proportions thereof may be
exaggerated, while others may be minimized. Accordingly, the
specification and drawings are to be regarded in an illustrative
rather than a restrictive sense.
[0069] Such embodiments of the inventive subject matter may be
referred to herein, individually and/or collectively, by the term
"invention" merely for convenience and without intending to
voluntarily limit the scope of this application to any single
invention or inventive concept if more than one is in fact
disclosed. Thus, although specific embodiments have been
illustrated and described herein, it should be appreciated that any
arrangement calculated to achieve the same purpose may be
substituted for the specific embodiments shown. This disclosure is
intended to cover any and all adaptations or variations of various
embodiments. Combinations of the above embodiments, and other
embodiments not specifically described herein, will be apparent to
those of skill in the art upon reviewing the above description.
[0070] The Abstract of the Disclosure is provided to comply with 37
C.F.R. § 1.72(b), requiring an abstract that will allow the
reader to quickly ascertain the nature of the technical disclosure.
It is submitted with the understanding that it will not be used to
interpret or limit the scope or meaning of the claims. In addition,
in the foregoing Detailed Description, it can be seen that various
features are grouped together in a single embodiment for the
purpose of streamlining the disclosure. This method of disclosure
is not to be interpreted as reflecting an intention that the
claimed embodiments require more features than are expressly
recited in each claim. Rather, as the following claims reflect,
inventive subject matter lies in less than all features of a single
disclosed embodiment. Thus the following claims are hereby
incorporated into the Detailed Description, with each claim
standing on its own as a separately claimed subject matter.
* * * * *