U.S. patent application number 15/581957 was filed with the patent office on 2017-04-28 and published on 2018-11-01 for sensor driven enhanced visualization and audio effects.
The applicant listed for this patent is Intel Corporation. Invention is credited to Manan Goel, Swarnendu Kar, Lakshman Krishnamurthy, Saurin Shah, Arthur D. Webb.
Application Number | 15/581957
Publication Number | 20180315405
Family ID | 63761309
Publication Date | 2018-11-01

United States Patent Application | 20180315405
Kind Code | A1
Shah; Saurin; et al.
November 1, 2018
SENSOR DRIVEN ENHANCED VISUALIZATION AND AUDIO EFFECTS
Abstract
Systems and methods may be used to provide effects corresponding
to movement of instrument objects or other objects. A method may
include receiving sensor data from an object based on movement of
the object, recognizing a gesture from the sensor data, and
determining an effect, such as a visualization or audio effect
corresponding to the gesture. The method may include causing the
effect to be output in response to the determination.
Inventors: | Shah; Saurin; (Portland, OR); Kar; Swarnendu; (Hillsboro, OR); Krishnamurthy; Lakshman; (Portland, OR); Webb; Arthur D.; (Santa Clara, CA); Goel; Manan; (Hillsboro, OR)

Applicant:
Name              | City        | State | Country | Type
Intel Corporation | Santa Clara | CA    | US      |
Family ID: | 63761309
Appl. No.: | 15/581957
Filed: | April 28, 2017
Current U.S. Class: | 1/1
Current CPC Class: | G06T 11/60 20130101; G10H 2230/281 20130101; G10H 1/0083 20130101; G10H 2220/191 20130101; G10H 2220/201 20130101; G10H 1/0066 20130101; G10H 2220/005 20130101; G10H 2220/395 20130101; G06F 3/167 20130101; G10H 2220/185 20130101; G06F 3/0346 20130101; G10H 2220/206 20130101; G10H 1/0008 20130101; G10H 1/0075 20130101; G10H 2220/365 20130101; G06F 3/011 20130101; G06F 3/165 20130101; G06F 3/017 20130101
International Class: | G10H 1/00 20060101 G10H001/00; G06F 3/01 20060101 G06F003/01; G06F 3/16 20060101 G06F003/16; G06T 11/60 20060101 G06T011/60
Claims
1. A server in communication with a pair of drum sticks, the server
comprising: a processor to: receive sensor data from a sensor of at
least one drum stick of the pair of drum sticks, the sensor data
based on movement of the at least one drum stick; recognize a
gesture from the sensor data; determine, from the gesture, a
visualization effect corresponding to the gesture and an audio
effect including a drum sound corresponding to the gesture; and
cause the visualization effect and the audio effect to be output in
response to the determination, wherein the visualization effect
includes a virtual drum set and is output to a display to be
displayed in coordination with the audio effect.
2. The server of claim 1, wherein to determine the visualization
effect, the processor is to use a visualization engine.
3. The server of claim 1, wherein the sensor data includes data
from sensors of both of the pair of drum sticks and wherein the
gesture includes movement of the pair of drum sticks in
coordination with each other.
4. The server of claim 1, wherein to cause the visualization effect
to be output, the processor is to send the visualization effect to
a virtual reality headset of a user controlling the pair of drum
sticks to be displayed on the virtual reality headset.
5. The server of claim 1, wherein to cause the audio effect to be
output, the processor is to send the audio effect to a speaker to
play the audio effect.
6. The server of claim 1, wherein the output to the display
includes captured video of a person performing the movement with
the at least one drum stick.
7. The server of claim 1, wherein the processor is further to
receive data from the sensor indicating an initial position of the
at least one drum stick, and wherein to recognize the gesture, the
processor is to determine a final position of the at least one drum
stick.
8. The server of claim 7, wherein the drum sound is determined
based on the initial position and the final position.
9. The server of claim 1, wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the at least one drum stick identified in the sensor
data.
10. The server of claim 1, wherein to determine the visualization
effect, the processor is to determine the visualization effect
based on a series of previously recognized gestures.
11. The server of claim 1, wherein the processor is further to:
receive additional sensor data from a second sensor attached to an
ankle or a foot of a user controlling the at least one drum stick;
recognize a second gesture from the additional sensor data;
determine, from the second gesture, a second audio effect including
a second drum sound corresponding to the second gesture; and cause
the second audio effect to be output with the visualization effect
and the audio effect.
12. The server of claim 1, wherein the processor is further to:
receive wearable sensor data from a plurality of wearable devices
within a predetermined proximity of the at least one drum stick;
and modify the visualization effect based on the wearable sensor
data.
13. The server of claim 1, wherein the visualization effect
includes a lighting effect, and wherein to cause the visualization
effect to be output, the processor is to send the lighting effect
to a plurality of wearable devices within a predetermined proximity
of the at least one drum stick to be displayed at the plurality of
wearable devices.
14. The server of claim 1, wherein the gesture includes at least
one of a linear movement, a tapping movement, a sweeping movement,
a minimum acceleration, or a minimum deceleration.
15. The server of claim 1, wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
16. A method for providing effects corresponding to movement of
drum sticks, the method comprising: receiving sensor data from a
sensor of at least one drum stick of a pair of drum sticks, the
sensor data based on movement of the at least one drum stick;
recognizing a gesture from the sensor data; determining, from the
gesture, a visualization effect corresponding to the gesture and an
audio effect including a drum sound corresponding to the gesture;
and causing the visualization effect and the audio effect to be
output in response to the determination, wherein the visualization
effect includes a virtual drum set and is output to a display to be
displayed in coordination with the audio effect.
17. The method of claim 16, wherein determining the visualization
effect includes using a visualization engine.
18. The method of claim 16, wherein the sensor data includes data
from sensors of both of the pair of drum sticks and wherein the
gesture includes movement of the pair of drum sticks in
coordination with each other.
19. The method of claim 16, wherein causing the visualization
effect to be output includes sending the visualization effect to a
virtual reality headset of a user controlling the pair of drum
sticks to be displayed on the virtual reality headset.
20. At least one non-transitory machine-readable medium including
instructions for providing effects corresponding to movement of
drum sticks, which when executed by a machine, cause the machine
to: receive sensor data from a sensor of at least one drum stick of
a pair of drum sticks, the sensor data based on movement of the at
least one drum stick; recognize a gesture from the sensor data;
determine, from the gesture, a visualization effect corresponding
to the gesture and an audio effect including a drum sound
corresponding to the gesture; and cause the visualization effect
and the audio effect to be output in response to the determination,
wherein the visualization effect includes a virtual drum set and is
output to a display to be displayed in coordination with the audio
effect.
21. The at least one non-transitory machine-readable medium of
claim 20, further comprising instructions to: receive wearable
sensor data from a plurality of wearable devices within a
predetermined proximity of the at least one drum stick; and modify
the visualization effect based on the wearable sensor data.
22. The at least one non-transitory machine-readable medium of
claim 20, wherein the visualization effect includes a lighting
effect, and wherein the instructions to cause the visualization
effect to be output include instructions to send the lighting
effect to a plurality of wearable devices within a predetermined
proximity of the at least one drum stick to be displayed at the
plurality of wearable devices.
23. A virtual drum set system comprising: a pair of drum sticks
each including: a sensor to provide data based on movement of a
drum stick of the pair of drum sticks; and a transceiver to
transmit the sensor data; a device including a processor to:
recognize a gesture from the sensor data; and determine, from the
gesture, a visualization effect corresponding to the gesture and an
audio effect including a drum sound corresponding to the gesture;
and a display device to display the visualization effect, wherein
the visualization effect includes a virtual drum set and is output
to a display to be displayed in coordination with the audio effect;
and a speaker to play the audio effect.
24. The virtual drum set system of claim 23, wherein the display
device is a virtual reality headset and the visualization effect
includes a virtual drum set.
25. The virtual drum set system of claim 23, wherein the sensor
includes a nine-axis sensor including a magnetometer, an
accelerometer, and a gyroscope.
Description
BACKGROUND
[0001] Playing and listening to live music has been captivating
humans for millennia. Traditionally, live performances featured
little in the way of visual effects. More recently, live
performances have been augmented by video, lighting effects,
pyrotechnics, and props. While these effects have been
entertaining, they do not let the audience experience the
musician's point of view. These effects are further limited in that
they are not controllable by the musician during the
performance.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] In the drawings, which are not necessarily drawn to scale,
like numerals may describe similar components in different views.
Like numerals having different letter suffixes may represent
different instances of similar components. The drawings illustrate
generally, by way of example, but not by way of limitation, various
embodiments discussed in the present document.
[0003] FIG. 1 illustrates a pair of drum sticks for creating
visualization or audio effects in accordance with some
embodiments.
[0004] FIGS. 2A-2C illustrate example visualization effects in
accordance with some embodiments.
[0005] FIGS. 3A-3C illustrate instrumentation objects for creating
visualization or audio effects in accordance with some
embodiments.
[0006] FIG. 4 illustrates a system for providing effects
corresponding to movement of an instrumentation object in
accordance with some embodiments.
[0007] FIG. 5 illustrates a flowchart showing a technique for
providing effects corresponding to movement of an instrumentation
object in accordance with some embodiments.
[0008] FIG. 6 illustrates generally an example of a block diagram
of a machine upon which any one or more of the techniques (e.g.,
methodologies) discussed herein may perform in accordance with some
embodiments.
DETAILED DESCRIPTION
[0009] Systems and methods for providing virtual instrument
visualization effects corresponding to movement of physical
objects, such as drum sticks, a violin bow, a guitar pick, a
conductor baton, or the like are described herein. The systems and
methods described herein are used to augment and enrich experiences
from traditional musical instruments by communicating with a device
to perform motion sensing, gesture detection, and wireless
communication.
[0010] The systems and methods described herein are used for music
performance with enhanced experiences. Motion sensing is added to
one or more musical instruments, such as a drum stick, a
violin/viola/cello bow, a guitar pick, an end of a guitar (e.g.,
headstock), a conductor's baton, or the like. The musical
experience of an artist or audience may be enhanced, and may
include real instruments or virtual instruments, such as those used
in virtual reality (VR) or augmented reality (AR) systems.
[0011] Some traditional systems may use one or more cameras to
track musical instruments. An issue with these traditional systems
is that the camera's view of the instrument may be occluded by a
player's own hand. The presently
described systems and methods use a sensor, such as a nine-axis
gyroscope and accelerometer, which may measure rotation, location,
or orientation. The sensor may be used without a camera, which
allows uninterrupted rotation, location, or orientation information
to be available while a user plays, without concern for a camera
line of sight.
[0012] In an example, the systems and methods described herein
provide audible and visual feedback to be played and displayed,
respectively, when an action, motion, or gesture occurs at a device
(e.g., a drum stick, a violin/viola/cello bow, a guitar pick, an
end of a guitar, a conductor's baton, or the like). In an example,
the audible feedback may include sound created at a musical
instrument by the action, motion, or gesture (e.g., when a violin
bow traverses a violin string, the string vibrates, causing sound
to be produced), which may be augmented (e.g., amplified and played
by a speaker, changed, or distorted) or the created sound may be
added to by the audible feedback (e.g., additional sound may be
played not caused by the musical instrument). In another example,
the audible feedback may include sound created by a computer
system, such as a Musical Instrument Digital Interface (MIDI)
system, a drum kit, etc. This type of audible feedback may be
created by, for example, tracking a gesture of one or more drum
sticks (e.g., without hitting an actual drum), tracking movement of
a conductor's baton, etc.
[0013] The visual feedback provided in the system may be displayed
for an audience, a performer, a remote viewer, or a combination
thereof. The visual feedback may include using a display screen, a
VR or AR display, etc. Other visual feedback may include fireworks,
animated or video sequences, lighting effects on a wearable device,
or the like. The visual or audible feedback described herein may be
based on additional activity of a user, such as dancing, gestures,
predetermined effects, or the like.
[0014] FIG. 1 illustrates a pair of drum sticks 100 for creating
visualization or audio effects in accordance with some embodiments.
The pair of drum sticks 100 includes a first drum stick 102A and a
second drum stick 102B. Each of the pair of drum sticks 100
includes a respective motion sensor 104A and 104B and respective
circuitry 106A and 106B. Although a single sensor 104A, 104B is
illustrated on each drum stick in FIG. 1, it is understood that
multiple sensors may be used on each drum stick 102A and 102B. The circuitry 106A and
106B may include a transceiver, a processor, memory, or a system on
a chip. In an example, the first drum stick 102A may be a parent
and the second drum stick 102B may be a child. The parent may
receive information from the sensor 104B of the child, and the
parent may forward that sensor information along with information
from the sensor 104A of the parent to a device, such as a mobile
device, a computer, a server, etc., for further processing. In
another example, the pair of drum sticks 100 may independently send
information to a device.
[0015] The pair of drum sticks 100 may be paired by assigning one
to a left hand of a user and one to a right hand of a user (or
simulating such). The left hand assigned drum stick may be assigned
to output a first set of audible or visual feedback and the right
hand assigned drum stick may be assigned to output a second set of
audible or visual feedback. For example, either drum stick may be
used to cause drum sound as audible feedback based on the location
and motion of the drum stick, and one of the drum sticks may be
assigned to a first visual effect (e.g., a flashing light on a
proximate wearable device), and the other of the drum sticks may be
assigned to a second visual effect (e.g., a visual effect on a
display screen).
[0016] The sensors 104A, 104B may include a magnetometer,
accelerometer, or gyroscope. For example, the sensors 104A, 104B
may include a nine-axis sensor with a magnetometer, accelerometer,
and a gyroscope for detecting location, position, and movement. The
sensors 104A, 104B may be initiated at a starting location,
position, or orientation, such that the sensors 104A, 104B may
determine relative locations, positions, movement, or orientations
in response to changes by the pair of drum sticks 100.
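As an illustration of how relative orientation might be derived from such a calibrated starting pose, the following sketch integrates gyroscope rates against a stored reference; the field names, units, and gyroscope-only integration are assumptions made for illustration rather than details from this application, and a production filter would also fuse accelerometer and magnetometer readings from the nine-axis sensor.

    class StickTracker:
        """Tracks a drum stick's orientation relative to a calibrated start pose."""

        def __init__(self):
            self.reference = [0.0, 0.0, 0.0]    # roll, pitch, yaw at calibration (degrees)
            self.orientation = [0.0, 0.0, 0.0]

        def calibrate(self, initial_orientation):
            # Record the starting orientation reported by the sensor.
            self.reference = list(initial_orientation)
            self.orientation = list(initial_orientation)

        def update(self, gyro_dps, dt):
            # Integrate angular rate (degrees per second) over the sample interval.
            self.orientation = [angle + rate * dt
                                for angle, rate in zip(self.orientation, gyro_dps)]
            return self.relative()

        def relative(self):
            # Orientation change since calibration, used for gesture recognition.
            return [angle - ref for angle, ref in zip(self.orientation, self.reference)]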
[0017] The sensor 104A may provide data based on movement of the
drum stick 102A. The transceiver of circuitry 106A may transmit the
sensor data to a device, such as a wearable device (e.g., a smart
watch), a mobile device, a computer, a remote server, or the like.
The device receiving the sensor data may include a processor to
recognize a gesture from the sensor data. The gesture may be used
to determine a visualization effect corresponding to the gesture or
an audio effect including a drum sound corresponding to the
gesture. The gesture may include movement from the pair of drum
sticks 100 in coordination with each other. In an example, timing
information may be sent to coordinate displaying the visualization
effect and playing the audio effect. The gesture may be based on an
orientation, location, movement, or force of one or both of the
pair of drum sticks 100, for example as determined by one or more
of the first sensor 104A or the second sensor 104B.
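A minimal sketch of such a gesture-to-effect lookup, with a shared timestamp used to coordinate the display of the visualization effect and the playback of the audio effect, is given below; the gesture names, effect payloads, and dictionary-based mapping are illustrative assumptions rather than details of the application.

    from dataclasses import dataclass
    import time

    # Illustrative gesture-to-effect table; gesture names and payloads are assumptions.
    EFFECTS = {
        "snare_strike": {"audio": "snare.wav", "visual": "flash_virtual_snare"},
        "cymbal_sweep": {"audio": "crash.wav", "visual": "ripple_virtual_cymbal"},
    }

    @dataclass
    class EffectEvent:
        gesture: str
        audio: str
        visual: str
        timestamp: float  # shared timestamp so audio and visuals stay coordinated

    def determine_effects(gesture: str) -> EffectEvent:
        # Look up the visualization and audio effects that correspond to a gesture.
        entry = EFFECTS[gesture]
        return EffectEvent(gesture, entry["audio"], entry["visual"], time.time())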
[0018] The gesture may be determined based on additional input,
such as an additional sensor attached to an ankle or a foot of a
user controlling the drum stick 102A. The additional sensor may
output data to cause a second audio effect, such as a second drum
sound (e.g., a bass drum sound) corresponding to a second gesture.
The second audio effect may be played by a speaker. The gesture may
include one or more of a linear movement, a tapping movement, a
sweeping movement, a minimum acceleration, a minimum deceleration,
or the like.
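One simple way such gestures might be distinguished is by thresholding the sensed acceleration; the sketch below labels a stroke as a tap when a sharp deceleration is detected, with the threshold value and axis convention being illustrative assumptions.

    def classify_stroke(accel_samples, strike_threshold=-30.0):
        """Classify a stroke from a window of stick-axis acceleration samples.

        Assumes samples are in m/s^2 with gravity removed; a deceleration
        sharper than the threshold is treated as a tap (strike), and anything
        gentler is treated as a sweep.
        """
        peak_decel = min(accel_samples)
        if peak_decel <= strike_threshold:
            return "tap"
        return "sweep"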
[0019] A display device may be used to display the visualization
effect or a speaker may be used to play the audio effect. The
speaker may be controlled by a MIDI player. The audio effect may
include Multidimensional Polyphonic Expression instructions for use
by the MIDI player. In an example, the processor of circuitry 106A
may be used to determine the gesture. In an example, a
visualization effect may be determined based on one or more
previously recognized gestures (e.g., a series). The visualization
effect may include using a plurality of wearable devices within a
predetermined proximity of the drum stick 102A to display the
visualization effect. Devices within the predetermined proximity
may include devices on a network, within range of a Bluetooth
device or devices, within a specified distance, within a room,
etc.
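As a sketch of how a recognized drum gesture could be turned into sound through a MIDI player, the example below sends a percussion note using the mido package; the port selection, the use of channel 9 (the conventional General MIDI percussion channel), and the note number are assumptions, and a Multidimensional Polyphonic Expression-capable player would additionally receive per-note expression messages.

    import mido  # assumes the mido package and an available MIDI output port

    def play_drum_hit(note=38, velocity=100, port_name=None):
        # Note 38 is the General MIDI acoustic snare; channel 9 is the
        # conventional percussion channel. Both values are illustrative.
        with mido.open_output(port_name) as port:
            port.send(mido.Message("note_on", channel=9, note=note, velocity=velocity))
            port.send(mido.Message("note_off", channel=9, note=note, velocity=0))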
[0020] FIGS. 2A-2C illustrate example visualization effects
200A-200C in accordance with some embodiments. The first
visualization effect 200A includes sensor visualization effects 202
and orientation visualization effects 204 and 206. The second
visualization effect 200B includes a front-facing view of a drum
set and a drummer. The third visualization effect 200C includes a
point-of-view perspective display of a drum set 208. The components
of the visualization effects 200A-200C may include virtual
components or augmented reality components. For example, the drum
sets displayed in visualization effects 200B-200C may be virtual
(e.g., displayed using a VR display).
[0021] Motion data or gesture primitives detected by a sensor may
be used to create visualizations or visual effects to accompany a
performance, such as to enhance an audience experience. For
example, enhanced visual experiences may be shown, such as
capturing activity level (e.g., sensor visualization effects 202
related to an accelerometer or gyroscope of a drum stick or other
musical instrument), orientation of the musical instrument device
(e.g., drum sticks, as shown in the orientation visualization
effects 204-206), or other aspects of a performance. A system to
predetermine the visualization effect 200A may include a
customizable platform to select a background or visual effects,
which may be manipulated by a user. Audience interaction may be
enabled, such as at wearable devices (e.g., wrist bands) where
lights may turn on and off, which may be controlled using the
musical instrument device (e.g., drum sticks).
[0022] In an example, the visualization effects 200A-200B may be
presented to an audience of a user controlling a musical instrument
device. For example, a display screen may be used to display
visualization effects 200A or 200B. For visualization effect 200C,
the point-of-view drum set 208 may be presented, in an example, on
a display screen to an audience. In an example, the visualization
effect 200C may be presented using a VR display to a controller of
the musical instrument device (e.g., drum sticks). The
visualization effect 200C within the VR display may show the
point-of-view drum set 208 and may include virtual drum sticks 210
or a virtual pedal 212. The virtual drum sticks 210 may be
displayed virtually at a location corresponding to real drum sticks
based on sensors and location information of the real drum sticks.
The user of the real drum sticks, while wearing the VR display, may
see the virtual drum sticks 210 as if the user was holding the
virtual drum sticks 210 (and hands may also be shown to further
this effect). The user may wear an ankle or foot device with a
sensor to detect motion of the ankle or foot. The detected motion
may cause the foot pedal 212 to move (e.g., in the VR environment)
and may cause an audible or visual effect to occur. The user
wearing the VR display may play the drum set 208 virtually with the
virtual drum sticks 210 by controlling the real drum sticks (and
optionally the foot pedal 212). The drum set 208 may move or
display a visualization according to motion of the real drum sticks
(e.g., the cymbals may crash, a drum head may appear to vibrate, a
played drum may light up, etc.).
[0023] FIGS. 3A-3C illustrate instrumentation objects 300A-300C for
creating visualization or audio effects in accordance with some
embodiments.
[0024] For example, the instrumentation object 300A may be a violin
bow, viola bow, cello bow, or other stringed instrument bow. The
instrumentation object 300A includes a sensor 304 and may include
circuitry 306, such as a transceiver, a processor, memory, or a
system on a chip. In the example shown in FIG. 3A, the sensor 304
is located at the tip of the instrumentation object 300A. In other
examples, the sensor 304 may be located in the middle of the
instrumentation object 300A or at the frog. The sensor 304 may
include a gyroscope, an accelerometer, or a magnetometer to
determine position, orientation, or movement of the instrumentation
object 300A. In an example, a plurality of sensors may be disposed
on the instrumentation object 300A (e.g., one at the tip, one in
the middle, and one at the frog).
[0025] The sensor 304 may track back and forth movement of the
instrumentation object 300A. The sensor 304 may track bow tapping
movements (e.g., in a perpendicular or partially perpendicular
movement to the back and forth traditional bow movement on a
stringed instrument). The tracked movement (or position or
orientation) of the instrumentation object 300A may be used to
create or identify visual effects to be shown. The visual effects
may be matched to the music created by playing the stringed
instrument with the instrumentation object 300A. In another
example, augmented audible feedback may be created or identified by
the tracked movement, which may be played in addition to the music
created. For example, a real violin may be played using the
instrumentation object 300A as a bow, and the sensor on the bow may
add to a performance experience by integrating visual or audio
effects in addition to the music created by playing the violin. For
example, when the bow moves in a first direction, a first visual or
audible effect may be created or identified and when the bow moves
in a second direction, a second visual or audible effect may be
created or identified. In an example, mixers may be used to add in
augmented sound. For example, a Multidimensional Polyphonic
Expression for use with a MIDI player may be used to create or play
the augmented sound.
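A minimal sketch of mapping the bow's direction of travel to a first or second effect is shown below; the axis convention, the sign (positive taken as an up-bow), and the threshold are assumptions used only to illustrate direction-dependent effect selection.

    def bow_effect(bow_velocity, threshold=0.05):
        """Select an effect from the bow's sensed direction of travel."""
        if bow_velocity > threshold:
            return "up_bow_effect"
        if bow_velocity < -threshold:
            return "down_bow_effect"
        return None  # bow nearly still; keep the current effect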
[0026] In an example, a player of an instrument using the
instrumentation object 300A may have a sensor on a finger or
fingers of the player. For example, a violin player may place a
sensor or sensors on one or more fingers used to play violin (e.g.,
a fourth finger of the player's left hand). The movement of the
pinky finger may indicate a particular visual effect. For example,
when playing a stringed instrument, notes may often be played
interchangeably with different fingers (e.g., the fourth finger in
first position on a first string, an open second string, or a
second finger in a third position on the first string may all
correspond to a single note). By playing with a particular finger,
a specific visual (or audible) effect may be identified or
created.
[0027] The instrumentation object 300A may be in communication with
a server. The server may include a processor to receive sensor data
from the sensor 304 of the instrumentation object 300A, the sensor
data may be based on movement of the instrumentation object 300A.
The processor may recognize a gesture from the sensor data, such as
a back or forth movement, a tapping of the instrumentation object
300A on a string, etc. The processor may determine, such as from
the gesture, a visualization effect corresponding to the gesture or
an audio effect corresponding to the gesture. The visualization
effect may be determined using a visualization engine. In an
example, the processor may cause the visualization effect or the
audio effect to be output in response to the determination. The
audio effect may include a natural sound caused by the movement of
the instrumentation object 300A.
[0028] In an example, causing the visualization effect to be output
may include sending the visualization effect to a virtual reality
headset of a user controlling the instrumentation object 300A, for
example, to be displayed on the VR headset. Causing the audio
effect to be output may include sending the audio effect to a
speaker to play the audio effect. The processor may send the
visualization effect to a display to be displayed in coordination
with the audio effect played by the speaker. The processor may
receive data from the sensor 304 indicating an initial position of
the instrumentation object 300A and recognize the gesture based on
a determined final position of the instrumentation object 300A. The
visualization effect or the audio effect may be determined based on
an orientation of the instrumentation object 300A identified in the
sensor data. The visualization effect may be based on one or more
(e.g., a series) of previously recognized gestures. The
visualization effect may include a lighting effect. Outputting the
visualization effect may include sending the lighting effect to a
plurality of wearable devices, such as within a predetermined
proximity of the instrumentation object 300A to be displayed, for
example at the plurality of wearable devices. Devices within the
predetermined proximity may include devices on a wi-fi network,
within range of a Bluetooth device or devices, within a specified
distance, within a room, etc. The gesture may include a linear
movement, a tapping movement, a sweeping movement, a minimum or
maximum acceleration, a maximum or minimum deceleration, or the
like. The audio effect may include a Multidimensional Polyphonic
Expression instruction for a MIDI player.
[0029] The instrumentation object 300B may be a guitar pick. The
instrumentation object 300B includes a sensor 310 and may include
circuitry 312, such as a transceiver, a processor, memory, or a
system on a chip. The instrumentation object 300B may be used to
strum a stringed instrument, such as a guitar. Movement of the
instrumentation object 300B may correspond with a visual or audible
effect to be produced. For example, when the instrumentation object
300B is used to strum a guitar upward, a first visual or audible
effect may be identified and when the instrumentation object 300B
is used to strum the guitar downward, a second visual or audible
effect may be identified and used.
[0030] The instrumentation object 300B may be in communication with
a server. The server may include a processor to receive sensor data
from the sensor 310 of the instrumentation object 300B, the sensor
data may be based on movement of the instrumentation object 300B.
The processor (e.g., circuitry 312) may recognize a gesture from
the sensor data, such as a strumming movement, a slapping movement,
etc. The processor may determine, such as from the gesture, a
visualization effect corresponding to the gesture or an audio
effect corresponding to the gesture. The visualization effect may
be determined using a visualization engine. In an example, the
processor may cause the visualization effect or the audio effect to
be output in response to the determination. The audio effect may
include a natural sound caused by the movement of the
instrumentation object 300B.
[0031] In an example, causing the visualization effect to be output
may include sending the visualization effect to a virtual reality
headset of a user controlling the instrumentation object 300B, for
example, to be displayed on the VR headset. Causing the audio
effect to be output may include sending the audio effect to a
speaker to play the audio effect. The processor may send the
visualization effect to a display to be displayed in coordination
with the audio effect played by the speaker. The processor may
receive data from the sensor 310 indicating an initial position of
the instrumentation object 300B and recognize the gesture based on
a determined final position of the instrumentation object 300B. The
visualization effect or the audio effect may be determined based on
an orientation of the instrumentation object 300B identified in the
sensor data. The visualization effect may be based on one or more
(e.g., a series) of previously recognized gestures. The
visualization effect may include a lighting effect. Outputting the
visualization effect may include sending the lighting effect to a
plurality of wearable devices, such as within a predetermined
proximity of the instrumentation object 300B to be displayed, for
example at the plurality of wearable devices. Devices within the
predetermined proximity may include devices on a wi-fi network,
within range of a Bluetooth device or devices, within a specified
distance, within a room, etc. The gesture may include a linear
movement, a tapping movement, a sweeping movement, a minimum or
maximum acceleration, a maximum or minimum deceleration, or the
like. The audio effect may include a Multidimensional Polyphonic
Expression instruction for a MIDI player.
[0032] The instrumentation object 300C may be a conductor's baton.
The instrumentation object 300C includes a sensor 316 and may
include circuitry 318, such as a transceiver, a processor, memory,
or a system on a chip. The instrumentation object 300C may be used
to conduct an orchestra, either real or virtual. The real orchestra
may play music in response to movement of the instrumentation
object 300C or orchestral sound may be created in response to
movement of the instrumentation object 300C with a virtual
orchestra. A visual effect or audible effect may be created or
identified in response to movement of the instrumentation object
300C.
[0033] The instrumentation object 300C may be in communication with
a server. The server may include a processor to receive sensor data
from the sensor 316 of the instrumentation object 300C, the sensor
data may be based on movement of the instrumentation object 300C.
The processor (e.g., circuitry 318) may recognize a gesture from
the sensor data, such as an up-and-down or left-and-right movement,
a conducting cadence movement (e.g., based on a tempo of music
being played, such as 3/4, 4/4, 7/8, etc.), or the like. The
processor may determine, such as from the gesture, a visualization
effect corresponding to the gesture or an audio effect
corresponding to the gesture. The visualization effect may be
determined using a visualization engine. In an example, the
processor may cause the visualization effect or the audio effect to
be output in response to the determination.
[0034] In an example, causing the visualization effect to be output
may include sending the visualization effect to a virtual reality
headset of a user controlling the instrumentation object 300C, for
example, to be displayed on the VR headset. Causing the audio
effect to be output may include sending the audio effect to a
speaker to play the audio effect. The processor may send the
visualization effect to a display to be displayed in coordination
with the audio effect played by the speaker. The processor may
receive data from the sensor 316 indicating an initial position of
the instrumentation object 300C and recognize the gesture based on
a determined final position of the instrumentation object 300C. The
visualization effect or the audio effect may be determined based on
an orientation of the instrumentation object 300C identified in the
sensor data. The visualization effect may be based on one or more
(e.g., a series) of previously recognized gestures. The
visualization effect may include a lighting effect. Outputting the
visualization effect may include sending the lighting effect to a
plurality of wearable devices, such as within a predetermined
proximity of the instrumentation object 300C to be displayed, for
example at the plurality of wearable devices. Devices within the
predetermined proximity may include devices on a wi-fi network,
within range of a Bluetooth device or devices, within a specified
distance, within a room, etc. The gesture may include a linear
movement, a tapping movement, a sweeping movement, a minimum or
maximum acceleration, a maximum or minimum deceleration, or the
like. The audio effect may include a Multidimensional Polyphonic
Expression instruction for a MIDI player.
[0035] FIG. 4 illustrates a system 400 for providing effects
corresponding to movement of an instrumentation object in
accordance with some embodiments. Remote instrumentation devices
may include a drum stick 410, a violin bow 416, a guitar pick 422,
a conductor's baton 428, or the like. The instrumentation devices
410, 416, 422, and 428 may include respective sensors (e.g., 412,
418, 424, 430) and optionally respective transceivers or processors
(e.g., 414, 420, 426, 432).
[0036] The system 400 includes a server 401 in communication with
one or more remote instrumentation devices (e.g., 410, 416, 422,
428), or a wearable device 408. The server 401 includes a processor
402, memory 404, and a visualization engine 406. The processor 402
may receive sensor data from a sensor (412, 418, 424, or 430) of
one or more of the remote instrumentation devices (e.g., 410, 416,
422, 428), such as the drum stick sensor 412. The drum stick 410
may be paired with a second drum stick, and the pair may include a
parent and a child drum stick. For example, the child drum stick
may have limited communication capabilities (e.g., capable of
communicating with the parent drum stick, but may be incapable of
communicating with another remote device). The parent drum stick may
have the processor 414 or a transceiver, for example to communicate
with a mobile device, wearable device, or remote device. The pair
of drum sticks may be used together. The processor 402 may receive
the sensor data from one of the pair of drum sticks (e.g., a
parent) or both (e.g., individually, or via the parent). In an
example, the sensor data is based on movement of the drum stick
410.
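The following sketch illustrates how a parent drum stick might forward its own sample together with the child's sample to the server 401; the transport (UDP), the address, and the message fields are illustrative assumptions rather than a protocol defined by this application.

    import json
    import socket

    # Illustrative server address; 192.0.2.10 is a documentation-only IP.
    SERVER = ("192.0.2.10", 9000)

    def forward_samples(own_sample: dict, child_sample: dict) -> None:
        # The parent stick bundles its own and the child stick's sensor
        # samples into one message for the server to process.
        message = json.dumps({"parent": own_sample, "child": child_sample}).encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(message, SERVER)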
[0037] In an example, the processor 402 may recognize a gesture
from the sensor data. For example, the gesture may include a drum
strike, a violin playing movement, a conductor baton conducting
movement, a guitar strum, etc. The gesture may include one or more
of a linear movement, a tapping movement, a sweeping movement, a
minimum or maximum acceleration, a minimum or maximum deceleration,
or the like. The processor 402 may determine, for example using the
gesture, a visualization effect corresponding to the gesture. The
visualization effect may be determined using the visualization
engine 406. In an example, to determine the visualization effect,
the processor 402 is to determine the visualization effect based on
a series of previously recognized gestures. The visualization
effect may include a lighting effect, such as a flashing light or
light sequence on a screen, a virtual reality light effect, or a
light effect sent for display to a plurality of wearable devices
(e.g., the wearable device 408). The plurality of wearable devices
may be identified within a proximity of the remote instrumentation
devices (e.g., 410, 416, 422, 428). The processor 402 may receive
wearable sensor data from the plurality of wearable devices (e.g.,
the wearable device 408), which may be within a predetermined
proximity of the remote instrumentation devices (e.g., 410, 416,
422, 428). The visualization effect may be modified, for example,
based on the wearable sensor data. Devices within the predetermined
proximity may include devices on a wi-fi network, within range of a
Bluetooth device or devices, within a specified distance, within a
room, etc.
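A small sketch of a visualization engine that chooses an effect from a series of previously recognized gestures is shown below; the window size, gesture names, and effect names are assumptions made for illustration.

    from collections import deque

    class VisualizationEngine:
        """Selects a visualization from the recent gesture history."""

        def __init__(self, window=8):
            self.history = deque(maxlen=window)

        def determine(self, gesture: str) -> str:
            self.history.append(gesture)
            # A run of taps within the window escalates the effect;
            # otherwise a single flash is returned.
            if list(self.history).count("tap") >= 4:
                return "strobe_sequence"
            return "single_flash"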
[0038] The processor 402 may determine an audio effect
corresponding to the gesture including, for example, a drum sound,
a violin sound (or other stringed instrument sound), a guitar
sound, an orchestral sound (e.g., a combination of sounds from a
plurality of instruments), or the like. In an example, the
visualization effect and the audio effect are determined based on
an orientation of the remote instrumentation devices (e.g., 410,
416, 422, 428) identified in the sensor data. In an example, the
audio effect may include Multidimensional Polyphonic Expression
instructions for a musical instrument digital interface (MIDI)
player.
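As a sketch of the per-note control that Multidimensional Polyphonic Expression adds on top of MIDI, the example below places a note on its own member channel and sends channel pressure for that note using the mido package; the channel assignment and the use of channel aftertouch for expression are illustrative assumptions.

    import mido

    def send_mpe_note(port, note, velocity, pressure, member_channel):
        # Each sounding note is assigned its own member channel so its
        # expression (pressure here; pitch bend could be added similarly)
        # can be adjusted independently of other notes.
        port.send(mido.Message("note_on", channel=member_channel,
                               note=note, velocity=velocity))
        port.send(mido.Message("aftertouch", channel=member_channel, value=pressure))

    # Example use with a default output port (assumes one is available):
    # with mido.open_output() as port:
    #     send_mpe_note(port, note=60, velocity=90, pressure=64, member_channel=1)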
[0039] The processor 402 may cause the visualization effect or the
audio effect to be output, such as in response to the
determination. In an example, the visualization effect may be
output to a virtual reality headset 434, which may display the
visualization effect using a virtual reality display. The virtual
reality headset 434 may be on a user who is controlling a remote
instrumentation device (e.g., 410, 416, 422, 428). The
visualization effect may be displayed in coordination with the
audio effect played, for example, by a speaker. The speaker may be
used to play the audio effect.
[0040] In an example, the processor 402 may receive data from the
sensor indicating an initial position of the remote instrumentation
devices (e.g., 410, 416, 422, 428). For example, the processor 402
may determine a final position of the remote instrumentation
devices (e.g., 410, 416, 422, 428), such as a drum stick 410. In an
example, the drum stick 410 may be used to generate the drum sound
(e.g., without striking a drum), which may be determined based on
the initial position and the final position.
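One way the drum sound could be selected from the initial and final stick positions is to map the displacement between them onto a drum zone; the zone boundaries and the coordinate frame in the sketch below are assumptions made for illustration.

    # Illustrative zone boundaries in centimeters of lateral displacement
    # relative to the calibrated start pose.
    ZONES = [
        ((-40, 0), "hi_hat"),
        ((0, 25), "snare"),
        ((25, 60), "floor_tom"),
    ]

    def drum_for_strike(initial_x: float, final_x: float) -> str:
        # The displacement between the initial and final stick position
        # selects which virtual drum the strike is assigned to.
        displacement = final_x - initial_x
        for (low, high), drum in ZONES:
            if low <= displacement < high:
                return drum
        return "crash_cymbal"  # anything outside the mapped range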
[0041] In an example, the processor 402 may receive additional
sensor data from a second sensor attached to an ankle or a foot of
a user who is controlling the drum stick. A second gesture may be
recognized from the additional sensor data. The processor 402 may
determine from the second gesture, a second audio effect or a
second visualization effect. The second audio effect may include a
second drum sound corresponding to the second gesture. The
processor 402 may cause the second audio effect or the second
visualization effect to be output with the visualization effect or
the audio effect.
[0042] FIG. 5 illustrates a flowchart showing a technique 500 for
providing effects corresponding to movement of an instrumentation
object in accordance with some embodiments. The technique 500
includes an operation 502 to receive sensor data from a sensor of
at least one drum stick of a pair of drum sticks. The sensor data
may be based on movement of the at least one drum stick. The sensor
data may include data from sensors of both the pair of drum sticks.
The Gesture may include movement of the pair of drum sticks in
coordination with each other.
[0043] The technique 500 includes an operation 504 to recognize a
gesture. The gesture may include a linear movement, a tapping
movement, a sweeping movement, a minimum or maximum acceleration, a
minimum or maximum deceleration, or the like. The technique 500
includes an operation 506 to determine a visualization effect and
an audio effect, both corresponding to the gesture. In an
example, determining the visualization effect includes using a
visualization engine. In an example, determining the visualization
effect includes determining the visualization effect based on one
or more (e.g., a series) of previously recognized gestures. The
audio effect may include Multidimensional Polyphonic Expression
instructions for a MIDI player.
[0044] The technique 500 includes an operation 508 to output the
visualization effect and the audio effect, such as in response to
the determination. Outputting the visualization effect may include
sending the visualization effect to a virtual reality headset of a
user controlling the pair of drum sticks to be displayed on the
virtual reality headset. Outputting the audio effect may include
sending the audio effect to a speaker to play the audio effect. The
visualization effect and the audio effect may be displayed and
played, respectively, in coordination.
[0045] In an example, the technique 500 may include receiving data
from the sensor indicating an initial position of the drum stick,
and recognizing the gesture may include determining a final
position of the drum stick. The drum sound may be determined based
on the initial position or the final position. The visualization
effect corresponding to the gesture or the audio effect including
the drum sound corresponding to the gesture may be determined
based on an orientation of the drum stick identified in the sensor
data.
[0046] The technique 500 may include receiving additional sensor
data from a second sensor attached to an ankle or a foot of a user
controlling the drum stick, recognizing a second gesture from the
additional sensor data, determining, from the second gesture, a
second audio effect including a second drum sound corresponding to
the second gesture, and causing the second audio effect to be
output with the visualization effect and the audio effect. The
technique 500 may include receiving wearable sensor data from a
plurality of wearable devices, such as within a predetermined
proximity of the drum stick. The visualization effect may be
modified based on the wearable sensor data. The visualization
effect may include a lighting effect, which may be sent to a
plurality of wearable devices within a predetermined proximity of
the drum stick, such as to be displayed at the plurality of
wearable devices. Devices within the predetermined proximity may
include devices on a wi-fi network, within range of a Bluetooth
device or devices, within a specified distance, within a room,
etc.
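A sketch of sending a lighting effect only to wearable devices within the predetermined proximity is shown below; the wearable interface (a position attribute and a show method) and the distance threshold are illustrative assumptions.

    import math

    def broadcast_lighting(effect, wearables, stick_position, max_distance=10.0):
        """Send a lighting effect to wearables within range of the drum stick."""
        sx, sy = stick_position
        for device in wearables:
            dx, dy = device.position
            if math.hypot(dx - sx, dy - sy) <= max_distance:
                device.show(effect)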
[0047] FIG. 6 illustrates generally an example of a block diagram
of a machine 600 upon which any one or more of the techniques
(e.g., methodologies) discussed herein may perform in accordance
with some embodiments. In alternative embodiments, the machine 600
may operate as a standalone device or may be connected (e.g.,
networked) to other machines. In a networked deployment, the
machine 600 may operate in the capacity of a server machine, a
client machine, or both in server-client network environments. In
an example, the machine 600 may act as a peer machine in
a peer-to-peer (P2P) (or other distributed) network environment. The
machine 600 may be a personal computer (PC), a tablet PC, a set-top
box (STB), a personal digital assistant (PDA), a mobile telephone,
a web appliance, a network router, switch or bridge, or any machine
capable of executing instructions (sequential or otherwise) that
specify actions to be taken by that machine. Further, while only a
single machine is illustrated, the term "machine" shall also be
taken to include any collection of machines that individually or
jointly execute a set (or multiple sets) of instructions to perform
any one or more of the methodologies discussed herein, such as
cloud computing, software as a service (SaaS), other computer
cluster configurations.
[0048] Examples, as described herein, may include, or may operate
on, logic or a number of components, modules, or mechanisms.
Modules are tangible entities (e.g., hardware) capable of
performing specified operations when operating. A module includes
hardware. In an example, the hardware may be specifically
configured to carry out a specific operation (e.g., hardwired). In
an example, the hardware may include configurable execution units
(e.g., transistors, circuits, etc.) and a computer readable medium
containing instructions, where the instructions configure the
execution units to carry out a specific operation when in
operation. The configuring may occur under the direction of the
execution units or a loading mechanism. Accordingly, the execution
units are communicatively coupled to the computer readable medium
when the device is operating. In this example, the execution units
may be a member of more than one module. For example, under
operation, the execution units may be configured by a first set of
instructions to implement a first module at one point in time and
reconfigured by a second set of instructions to implement a second
module.
[0049] Machine (e.g., computer system) 600 may include a hardware
processor 602 (e.g., a central processing unit (CPU), a graphics
processing unit (GPU), a hardware processor core, or any
combination thereof), a main memory 604 and a static memory 606,
some or all of which may communicate with each other via an
interlink (e.g., bus) 608. The machine 600 may further include a
display unit 610, an alphanumeric input device 612 (e.g., a
keyboard), and a user interface (UI) navigation device 614 (e.g., a
mouse). In an example, the display unit 610, alphanumeric input
device 612 and UI navigation device 614 may be a touch screen
display. The machine 600 may additionally include a storage device
(e.g., drive unit) 616, a signal generation device 618 (e.g., a
speaker), a network interface device 620, and one or more sensors
621, such as a global positioning system (GPS) sensor, compass,
accelerometer, or other sensor. The machine 600 may include an
output controller 628, such as a serial (e.g., universal serial bus
(USB), parallel, or other wired or wireless (e.g., infrared (IR),
near field communication (NFC), etc.) connection to communicate or
control one or more peripheral devices (e.g., a printer, card
reader, etc.).
[0050] The storage device 616 may include a non-transitory machine
readable medium 622 on which is stored one or more sets of
data structures or instructions 624 (e.g., software) embodying or
utilized by any one or more of the techniques or functions
described herein. The instructions 624 may also reside, completely
or at least partially, within the main memory 604, within static
memory 606, or within the hardware processor 602 during execution
thereof by the machine 600. In an example, one or any combination
of the hardware processor 602, the main memory 604, the static
memory 606, or the storage device 616 may constitute machine
readable media.
[0051] While the machine readable medium 622 is illustrated as a
single medium, the term "machine readable medium" may include a
single medium or multiple media (e.g., a centralized or distributed
database, or associated caches and servers) configured to store the
one or more instructions 624.
[0052] The term "machine readable medium" may include any medium
that is capable of storing, encoding, or carrying instructions for
execution by the machine 600 and that cause the machine 600 to
perform any one or more of the techniques of the present
disclosure, or that is capable of storing, encoding or carrying
data structures used by or associated with such instructions.
Non-limiting machine readable medium examples may include
solid-state memories, and optical and magnetic media. Specific
examples of machine readable media may include: non-volatile
memory, such as semiconductor memory devices (e.g., Electrically
Programmable Read-Only Memory (EPROM), Electrically Erasable
Programmable Read-Only Memory (EEPROM)) and flash memory devices;
magnetic disks, such as internal hard disks and removable disks;
magneto-optical disks; and CD-ROM and DVD-ROM disks.
[0053] The instructions 624 may further be transmitted or received
over a communications network 626 using a transmission medium via
the network interface device 620 utilizing any one of a number of
transfer protocols (e.g., frame relay, Internet protocol (IP),
transmission control protocol (TCP), user datagram protocol (UDP),
hypertext transfer protocol (HTTP), etc.). Example communication
networks may include a local area network (LAN), a wide area
network (WAN), a packet data network (e.g., the Internet), mobile
telephone networks (e.g., cellular networks), Plain Old Telephone
(POTS) networks, and wireless data networks (e.g., Institute of
Electrical and Electronics Engineers (IEEE) 802.11 family of
standards known as Wi-Fi.RTM., IEEE 802.16 family of standards
known as WiMax.RTM.), IEEE 802.15.4 family of standards,
peer-to-peer (P2P) networks, among others. In an example, the
network interface device 620 may include one or more physical jacks
(e.g., Ethernet, coaxial, or phone jacks) or one or more antennas
to connect to the communications network 626. In an example, the
network interface device 620 may include a plurality of antennas to
wirelessly communicate using at least one of single-input
multiple-output (SIMO), multiple-input multiple-output (MIMO), or
multiple-input single-output (MISO) techniques. The term
"transmission medium" shall be taken to include any intangible
medium that is capable of storing, encoding or carrying
instructions for execution by the machine 600, and includes digital
or analog communications signals or other intangible medium to
facilitate communication of such software.
VARIOUS NOTES & EXAMPLES
[0054] Each of these non-limiting examples may stand on its own, or
may be combined in various permutations or combinations with one or
more of the other examples.
[0055] Example 1 is a server in communication with a pair of drum
sticks, the server comprising: a processor to: receive sensor data
from a sensor of at least one drum stick of the pair of drum
sticks, the sensor data based on movement of the at least one drum
stick; recognize a gesture from the sensor data; determine, from
the gesture, a visualization effect corresponding to the gesture
and an audio effect including a drum sound corresponding to the
gesture; and cause the visualization effect and the audio effect to
be output in response to the determination.
[0056] In Example 2, the subject matter of Example 1 optionally
includes wherein to determine the visualization effect, the
processor is to use a visualization engine.
[0057] In Example 3, the subject matter of any one or more of
Examples 1-2 optionally include wherein the sensor data includes
data from sensors of both of the pair of drum sticks and wherein
the gesture includes movement of the pair of drum sticks in
coordination with each other.
[0058] In Example 4, the subject matter of any one or more of
Examples 1-3 optionally include wherein to cause the visualization
effect to be output, the processor is to send the visualization
effect to a virtual reality headset of a user controlling the pair
of drum sticks to be displayed on the virtual reality headset.
[0059] In Example 5, the subject matter of any one or more of
Examples 1-4 optionally include wherein to cause the audio effect
to be output, the processor is to send the audio effect to a
speaker to play the audio effect.
[0060] In Example 6, the subject matter of Example 5 optionally
includes wherein to cause the visualization effect to be output,
the processor is to send the visualization effect to a display to
be displayed in coordination with the audio effect played by the
speaker.
[0061] In Example 7, the subject matter of any one or more of
Examples 1-6 optionally include wherein the processor is further to
receive data from the sensor indicating an initial position of the
drum stick, and wherein to recognize the gesture, the processor is
to determine a final position of the drum stick.
[0062] In Example 8, the subject matter of Example 7 optionally
includes wherein the drum sound is determined based on the initial
position and the final position.
[0063] In Example 9, the subject matter of any one or more of
Examples 1-8 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the drum stick identified in the sensor data.
[0064] In Example 10, the subject matter of any one or more of
Examples 1-9 optionally include wherein to determine the
visualization effect, the processor is to determine the
visualization effect based on a series of previously recognized
gestures.
[0065] In Example 11, the subject matter of any one or more of
Examples 1-10 optionally include wherein the processor is further
to: receive additional sensor data from a second sensor attached to
an ankle or a foot of a user controlling the drum stick; recognize
a second gesture from the additional sensor data; determine, from
the second gesture, a second audio effect including a second drum
sound corresponding to the second gesture; and cause the second
audio effect to be output with the visualization effect and the
audio effect.
[0066] In Example 12, the subject matter of any one or more of
Examples 1-11 optionally include wherein the processor is further
to: receive wearable sensor data from a plurality of wearable
devices within a predetermined proximity of the drum stick; and
modify the visualization effect based on the wearable sensor
data.
[0067] In Example 13, the subject matter of any one or more of
Examples 1-12 optionally include wherein the visualization effect
includes a lighting effect, and wherein to cause the visualization
effect to be output, the processor is to send the lighting effect
to a plurality of wearable devices within a predetermined proximity
of the drum stick to be displayed at the plurality of wearable
devices.
[0068] In Example 14, the subject matter of any one or more of
Examples 1-13 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0069] In Example 15, the subject matter of any one or more of
Examples 1-14 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0070] Example 16 is a method for providing effects corresponding
to movement of drum sticks, the method comprising: receiving sensor
data from a sensor of at least one drum stick of a pair of drum
sticks, the sensor data based on movement of the at least one drum
stick; recognizing a gesture from the sensor data; determining,
from the gesture, a visualization effect corresponding to the
gesture and an audio effect including a drum sound corresponding to
the gesture; and causing the visualization effect and the audio
effect to be output in response to the determination.
[0071] In Example 17, the subject matter of Example 16 optionally
includes wherein determining the visualization effect includes
using a visualization engine.
[0072] In Example 18, the subject matter of any one or more of
Examples 16-17 optionally include wherein the sensor data includes
data from sensors of both of the pair of drum sticks and wherein
the gesture includes movement of the pair of drum sticks in
coordination with each other.
[0073] In Example 19, the subject matter of any one or more of
Examples 16-18 optionally include wherein causing the visualization
effect to be output includes sending the visualization effect to a
virtual reality headset of a user controlling the pair of drum
sticks to be displayed on the virtual reality headset.
[0074] In Example 20, the subject matter of any one or more of
Examples 16-19 optionally include wherein causing the audio effect
to be output includes sending the audio effect to a speaker to play
the audio effect.
[0075] In Example 21, the subject matter of Example 20 optionally
includes wherein causing the visualization effect to be output
includes sending the visualization effect to a display to be
displayed in coordination with the audio effect played by the
speaker.
[0076] In Example 22, the subject matter of any one or more of
Examples 16-21 optionally include receiving data from the sensor
indicating an initial position of the drum stick, and wherein
recognizing the gesture includes determining a final position of
the drum stick.
[0077] In Example 23, the subject matter of Example 22 optionally
includes wherein the drum sound is determined based on the initial
position and the final position.
[0078] In Example 24, the subject matter of any one or more of
Examples 16-23 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the drum stick identified in the sensor data.
[0079] In Example 25, the subject matter of any one or more of
Examples 16-24 optionally include wherein determining the
visualization effect includes determining the visualization effect
based on a series of previously recognized gestures.
[0080] In Example 26, the subject matter of any one or more of
Examples 16-25 optionally include receiving additional sensor data
from a second sensor attached to an ankle or a foot of a user
controlling the drum stick; recognizing a second gesture from the
additional sensor data; determining, from the second gesture, a
second audio effect including a second drum sound corresponding to
the second gesture; and causing the second audio effect to be
output with the visualization effect and the audio effect.
[0081] In Example 27, the subject matter of any one or more of
Examples 16-26 optionally include receiving wearable sensor data
from a plurality of wearable devices within a predetermined
proximity of the drum stick; and modifying the visualization effect
based on the wearable sensor data.
[0082] In Example 28, the subject matter of any one or more of
Examples 16-27 optionally include wherein the visualization effect
includes a lighting effect, and wherein causing the visualization
effect to be output includes sending the lighting effect to a
plurality of wearable devices within a predetermined proximity of
the drum stick to be displayed at the plurality of wearable
devices.
[0083] In Example 29, the subject matter of any one or more of
Examples 16-28 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0084] In Example 30, the subject matter of any one or more of
Examples 16-29 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0085] Example 31 is at least one machine-readable medium including
instructions for operation of a computing system, which when
executed by a machine, cause the machine to perform operations of
any of the methods of Examples 16-30.
[0086] Example 32 is an apparatus comprising means for performing
any of the methods of Examples 16-30.
[0087] Example 33 is at least one machine-readable medium including
instructions for providing effects corresponding to movement of
drum sticks, which when executed by a machine, cause the machine
to: receive sensor data from a sensor of at least one drum stick of
a pair of drum sticks, the sensor data based on movement of the at
least one drum stick; recognize a gesture from the sensor data;
determine, from the gesture, a visualization effect corresponding
to the gesture and an audio effect including a drum sound
corresponding to the gesture; and cause the visualization effect
and the audio effect to be output in response to the
determination.
[0088] In Example 34, the subject matter of Example 33 optionally
includes wherein the instructions to determine the visualization
effect include instructions to use a visualization engine.
[0089] In Example 35, the subject matter of any one or more of
Examples 33-34 optionally include wherein the sensor data includes
data from sensors of both of the pair of drum sticks and wherein
the gesture includes movement of the pair of drum sticks in
coordination with each other.
[0090] In Example 36, the subject matter of any one or more of
Examples 33-35 optionally include wherein the instructions to cause
the visualization effect to be output include instructions to send
the visualization effect to a virtual reality headset of a user
controlling the pair of drum sticks to be displayed on the virtual
reality headset.
[0091] In Example 37, the subject matter of any one or more of
Examples 33-36 optionally include wherein the instructions to cause
the audio effect to be output include instructions to send the
audio effect to a speaker to play the audio effect.
[0092] In Example 38, the subject matter of Example 37 optionally
includes wherein the instructions to cause the visualization effect
to be output include instructions to send the visualization effect
to a display to be displayed in coordination with the audio effect
played by the speaker.
[0093] In Example 39, the subject matter of any one or more of
Examples 33-38 optionally include instructions to receive data from
the sensor indicating an initial position of the drum stick, and
wherein the instructions to recognize the gesture include
instructions to determine a final position of the drum stick.
[0094] In Example 40, the subject matter of Example 39 optionally
includes wherein the drum sound is determined based on the initial
position and the final position.
[0095] In Example 41, the subject matter of any one or more of
Examples 33-40 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the drum stick identified in the sensor data.
[0096] In Example 42, the subject matter of any one or more of
Examples 33-41 optionally include wherein the instructions to
determine the visualization effect include instructions to
determine the visualization effect based on a series of previously
recognized gestures.
[0097] In Example 43, the subject matter of any one or more of
Examples 33-42 optionally include instructions to: receive
additional sensor data from a second sensor attached to an ankle or
a foot of a user controlling the drum stick; recognize a second
gesture from the additional sensor data; determine, from the second
gesture, a second audio effect including a second drum sound
corresponding to the second gesture; and cause the second audio
effect to be output with the visualization effect and the audio
effect.
[0098] In Example 44, the subject matter of any one or more of
Examples 33-43 optionally include instructions to: receive wearable
sensor data from a plurality of wearable devices within a
predetermined proximity of the drum stick; and modify the
visualization effect based on the wearable sensor data.
[0099] In Example 45, the subject matter of any one or more of
Examples 33-44 optionally include wherein the visualization effect
includes a lighting effect, and wherein the instructions to cause
the visualization effect to be output include instructions to send
the lighting effect to a plurality of wearable devices within a
predetermined proximity of the drum stick to be displayed at the
plurality of wearable devices.
[0100] In Example 46, the subject matter of any one or more of
Examples 33-45 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0101] In Example 47, the subject matter of any one or more of
Examples 33-46 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0102] Example 48 is an apparatus for providing effects
corresponding to movement of drum sticks, the apparatus comprising:
means for receiving sensor data from a sensor of at least one drum
stick of a pair of drum sticks, the sensor data based on movement
of the at least one drum stick; means for recognizing a gesture
from the sensor data; means for determining, from the gesture, a
visualization effect corresponding to the gesture and an audio
effect including a drum sound corresponding to the gesture; and
means for causing the visualization effect and the audio effect to
be output in response to the determination.
[0103] In Example 49, the subject matter of Example 48 optionally
includes wherein the means for determining the visualization effect
include means for using a visualization engine.
[0104] In Example 50, the subject matter of any one or more of
Examples 48-49 optionally include wherein the sensor data includes
data from sensors of both of the pair of drum sticks and wherein
the gesture includes movement of the pair of drum sticks in
coordination with each other.
[0105] In Example 51, the subject matter of any one or more of
Examples 48-50 optionally include wherein the means for causing the
visualization effect to be output include means for sending the
visualization effect to a virtual reality headset of a user
controlling the pair of drum sticks to be displayed on the virtual
reality headset.
[0106] In Example 52, the subject matter of any one or more of
Examples 48-51 optionally include wherein the means for causing the
audio effect to be output include means for sending the audio
effect to a speaker to play the audio effect.
[0107] In Example 53, the subject matter of Example 52 optionally
includes wherein the means for causing the visualization effect to
be output include means for sending the visualization effect to a
display to be displayed in coordination with the audio effect
played by the speaker.
[0108] In Example 54, the subject matter of any one or more of
Examples 48-53 optionally include means for receiving data from the
sensor indicating an initial position of the drum stick, and
wherein the means for recognizing the gesture include means for
determining a final position of the drum stick.
[0109] In Example 55, the subject matter of Example 54 optionally
includes wherein the drum sound is determined based on the initial
position and the final position.
[0110] In Example 56, the subject matter of any one or more of
Examples 48-55 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the drum stick identified in the sensor data.
[0111] In Example 57, the subject matter of any one or more of
Examples 48-56 optionally include wherein the means for determining
the visualization effect include means for determining the
visualization effect based on a series of previously recognized
gestures.
[0112] In Example 58, the subject matter of any one or more of
Examples 48-57 optionally include means for receiving additional
sensor data from a second sensor attached to an ankle or a foot of
a user controlling the drum stick; means for recognizing a second
gesture from the additional sensor data; means for determining,
from the second gesture, a second audio effect including a second
drum sound corresponding to the second gesture; and means for
causing the second audio effect to be output with the visualization
effect and the audio effect.
[0113] In Example 59, the subject matter of any one or more of
Examples 48-58 optionally include means for receiving wearable
sensor data from a plurality of wearable devices within a
predetermined proximity of the drum stick; and means for modifying
the visualization effect based on the wearable sensor data.
[0114] In Example 60, the subject matter of any one or more of
Examples 48-59 optionally include wherein the visualization effect
includes a lighting effect, and wherein the means for causing the
visualization effect to be output include means for sending the
lighting effect to a plurality of wearable devices within a
predetermined proximity of the drum stick to be displayed at the
plurality of wearable devices.
[0115] In Example 61, the subject matter of any one or more of
Examples 48-60 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0116] In Example 62, the subject matter of any one or more of
Examples 48-61 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0117] Example 63 is a virtual drum set system comprising: a pair
of drum sticks each including: a sensor to provide data based on
movement of the drum stick; and a transceiver to transmit the
sensor data; a device including a processor to: recognize a gesture
from the sensor data; and determine, from the gesture, a
visualization effect corresponding to the gesture and an audio
effect including a drum sound corresponding to the gesture; and a
display device to display the visualization effect; and a speaker
to play the audio effect.
[0118] In Example 64, the subject matter of Example 63 optionally
includes wherein the device is a mobile device.
[0119] In Example 65, the subject matter of any one or more of
Examples 63-64 optionally include wherein the device further
includes a device transceiver to receive the sensor data.
[0120] In Example 66, the subject matter of any one or more of
Examples 63-65 optionally include wherein the display device is a
virtual reality headset and the visualization effect includes a
virtual drum set.
[0121] In Example 67, the subject matter of any one or more of
Examples 63-66 optionally include wherein the sensor includes a
nine-axis sensor including a magnetometer, an accelerometer, and a
gyroscope.
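For the nine-axis sensor of Example 67, one illustrative, non-limiting way to obtain a stick orientation estimate is a complementary filter over the gyroscope and accelerometer readings, as sketched below; the filter constant is an assumption and the magnetometer term is omitted for brevity.

```python
import math

# Hypothetical sketch: fuse gyroscope and accelerometer readings into a
# pitch estimate for the drum stick with a simple complementary filter.

def update_pitch(prev_pitch_deg, gyro_rate_dps, accel_xyz, dt, alpha=0.98):
    """prev_pitch_deg: last estimate; gyro_rate_dps: pitch rate (deg/s);
    accel_xyz: (ax, ay, az) in m/s^2; dt: seconds since the last sample."""
    ax, ay, az = accel_xyz
    accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

pitch = 0.0
for _ in range(100):   # 1 s of samples at 100 Hz
    pitch = update_pitch(pitch, gyro_rate_dps=45.0,
                         accel_xyz=(0.0, 0.0, 9.8), dt=0.01)
print(round(pitch, 1))  # ~19.1: gyro-driven rise, tempered by the level
                        # accelerometer estimate
```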
[0122] In Example 68, the subject matter of any one or more of
Examples 63-67 optionally include wherein the speaker includes
headphones.
[0123] In Example 69, the subject matter of any one or more of
Examples 63-68 optionally include wherein one of the pair of drum
sticks is a parent drum stick and the transceiver of the parent
drum stick is configured to receive child sensor data from the
other of the pair of drum sticks and wherein the transceiver of the
parent drum stick is to send combined sensor data to the
device.
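A minimal, non-limiting sketch of the parent/child arrangement of Example 69 follows: the parent stick tags its own readings and the child stick's readings and forwards one combined packet to the device. The packet field names are assumptions.

```python
# Hypothetical sketch: the parent drum stick merges its readings with the
# child stick's readings into one message for the device.

def combine_sensor_packets(parent_packet, child_packet):
    """Merge the two sticks' readings into one combined packet."""
    return {
        "timestamp": max(parent_packet["timestamp"], child_packet["timestamp"]),
        "sticks": {
            "parent": parent_packet["readings"],
            "child": child_packet["readings"],
        },
    }

combined = combine_sensor_packets(
    parent_packet={"timestamp": 1001, "readings": {"accel": [0, 0, 12]}},
    child_packet={"timestamp": 1000, "readings": {"accel": [0, 9, 1]}},
)
print(combined["sticks"].keys())  # dict_keys(['parent', 'child'])
```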
[0124] In Example 70, the subject matter of any one or more of
Examples 63-69 optionally include wherein the sensor data includes
data from sensors of both of the pair of drum sticks and wherein
the gesture includes movement of the pair of drum sticks in
coordination with each other.
[0125] In Example 71, the subject matter of any one or more of
Examples 63-70 optionally include wherein the processor is to send
timing information to the display device and the speaker to
coordinate displaying the visualization effect and playing the
audio effect.
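The timing coordination of Example 71 could, in one illustrative and non-limiting sketch, take the form of a shared presentation timestamp sent to both outputs; the fixed latency budget and message fields below are assumptions.

```python
import time

# Hypothetical sketch: stamp each effect with a shared presentation time so
# the display device and the speaker render the visualization and the drum
# sound together.

LATENCY_BUDGET_S = 0.020   # give both outputs 20 ms to prepare (assumed)

def dispatch_effects(visualization, audio, send_to_display, send_to_speaker):
    present_at = time.monotonic() + LATENCY_BUDGET_S
    send_to_display({"effect": visualization, "present_at": present_at})
    send_to_speaker({"effect": audio, "present_at": present_at})

# Each output waits until the shared deadline before rendering:
def render_when_due(message, render):
    delay = message["present_at"] - time.monotonic()
    if delay > 0:
        time.sleep(delay)
    render(message["effect"])

dispatch_effects("flash_snare", "snare",
                 send_to_display=lambda m: render_when_due(m, print),
                 send_to_speaker=lambda m: render_when_due(m, print))
```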
[0126] In Example 72, the subject matter of any one or more of
Examples 63-71 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect including the
drum sound corresponding to the gesture are determined based on an
orientation of the drum stick identified in the sensor data.
[0127] In Example 73, the subject matter of any one or more of
Examples 63-72 optionally include wherein to determine the
visualization effect, the processor is to determine the
visualization effect based on a series of previously recognized
gestures.
[0128] In Example 74, the subject matter of any one or more of
Examples 63-73 optionally include wherein the system further
comprises an additional sensor attached to an ankle or a foot of a
user controlling the drum stick; and wherein the processor is
further to determine, from the additional sensor data of the
additional sensor, a second audio effect including a second drum
sound corresponding to a second gesture recognized from the
additional sensor data.
[0129] In Example 75, the subject matter of Example 74 optionally
includes wherein the speaker is to play the second audio
effect.
[0130] In Example 76, the subject matter of any one or more of
Examples 63-75 optionally include wherein the display device
includes a plurality of wearable devices within a predetermined
proximity of the drum stick, the visualization effect to be
displayed at the plurality of wearable devices.
[0131] In Example 77, the subject matter of any one or more of
Examples 63-76 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0132] In Example 78, the subject matter of any one or more of
Examples 63-77 optionally include wherein the speaker is controlled
by a musical instrument digital interface (MIDI) player and wherein
the audio effect includes Multidimensional Polyphonic Expression
instructions for use by the MIDI player.
[0133] Example 79 is a server in communication with a violin bow,
the server comprising: a processor to: receive sensor data from a
sensor of the violin bow, the sensor data based on movement of the
violin bow; recognize a gesture from the sensor data; determine,
from the gesture, a visualization effect corresponding to the
gesture and an audio effect corresponding to the gesture; and cause
the visualization effect and the audio effect to be output in
response to the determination, the audio effect including a natural
sound caused by the movement of the violin bow.
[0134] In Example 80, the subject matter of Example 79 optionally
includes wherein to determine the visualization effect, the
processor is to use a visualization engine.
[0135] In Example 81, the subject matter of any one or more of
Examples 79-80 optionally include wherein to cause the
visualization effect to be output, the processor is to send the
visualization effect to a virtual reality headset of a user
controlling the violin bow to be displayed on the virtual reality
headset.
[0136] In Example 82, the subject matter of any one or more of
Examples 79-81 optionally include wherein to cause the audio effect
to be output, the processor is to send the audio effect to a
speaker to play the audio effect.
[0137] In Example 83, the subject matter of Example 82 optionally
includes wherein to cause the visualization effect to be output,
the processor is to send the visualization effect to a display to
be displayed in coordination with the audio effect played by the
speaker.
[0138] In Example 84, the subject matter of any one or more of
Examples 79-83 optionally include wherein the processor is further
to receive data from the sensor indicating an initial position of
the violin bow, and wherein to recognize the gesture, the processor
is to determine a final position of the violin bow.
[0139] In Example 85, the subject matter of any one or more of
Examples 79-84 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect corresponding to
the gesture are determined based on an orientation of the violin
bow identified in the sensor data.
[0140] In Example 86, the subject matter of any one or more of
Examples 79-85 optionally include wherein to determine the
visualization effect, the processor is to determine the
visualization effect based on a series of previously recognized
gestures.
[0141] In Example 87, the subject matter of any one or more of
Examples 79-86 optionally include wherein the visualization effect
includes a lighting effect, and wherein to cause the visualization
effect to be output, the processor is to send the lighting effect
to a plurality of wearable devices within a predetermined proximity
of the violin bow to be displayed at the plurality of wearable
devices.
[0142] In Example 88, the subject matter of any one or more of
Examples 79-87 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0143] In Example 89, the subject matter of any one or more of
Examples 79-88 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0144] Example 90 is a server in communication with a guitar pick,
the server comprising: a processor to: receive sensor data from a
sensor of the guitar pick, the sensor data based on movement of the
guitar pick; recognize a gesture from the sensor data; determine,
from the gesture, a visualization effect corresponding to the
gesture and an audio effect corresponding to the gesture; and cause
the visualization effect and the audio effect to be output in
response to the determination, the audio effect including a natural
sound caused by the movement of the guitar pick.
[0145] In Example 91, the subject matter of Example 90 optionally
includes wherein to determine the visualization effect, the
processor is to use a visualization engine.
[0146] In Example 92, the subject matter of any one or more of
Examples 90-91 optionally include wherein to cause the
visualization effect to be output, the processor is to send the
visualization effect to a virtual reality headset of a user
controlling the guitar pick to be displayed on the virtual reality
headset.
[0147] In Example 93, the subject matter of any one or more of
Examples 90-92 optionally include wherein to cause the audio effect
to be output, the processor is to send the audio effect to a
speaker to play the audio effect.
[0148] In Example 94, the subject matter of Example 93 optionally
includes wherein to cause the visualization effect to be output,
the processor is to send the visualization effect to a display to
be displayed in coordination with the audio effect played by the
speaker.
[0149] In Example 95, the subject matter of any one or more of
Examples 90-94 optionally include wherein the processor is further
to receive data from the sensor indicating an initial position of
the guitar pick, and wherein to recognize the gesture, the
processor is to determine a final position of the guitar pick.
[0150] In Example 96, the subject matter of any one or more of
Examples 90-95 optionally include wherein the visualization effect
corresponding to the gesture and the audio effect corresponding to
the gesture are determined based on an orientation of the guitar
pick identified in the sensor data.
[0151] In Example 97, the subject matter of any one or more of
Examples 90-96 optionally include wherein to determine the
visualization effect, the processor is to determine the
visualization effect based on a series of previously recognized
gestures.
[0152] In Example 98, the subject matter of any one or more of
Examples 90-97 optionally include wherein the visualization effect
includes a lighting effect, and wherein to cause the visualization
effect to be output, the processor is to send the lighting effect
to a plurality of wearable devices within a predetermined proximity
of the guitar pick to be displayed at the plurality of wearable
devices.
[0153] In Example 99, the subject matter of any one or more of
Examples 90-98 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0154] In Example 100, the subject matter of any one or more of
Examples 90-99 optionally include wherein the audio effect includes
Multidimensional Polyphonic Expression instructions for a musical
instrument digital interface (MIDI) player.
[0155] Example 101 is a server in communication with a conductor
baton, the server comprising: a processor to: receive sensor data
from a sensor of the conductor baton, the sensor data based on
movement of the conductor baton; recognize a gesture from the
sensor data; determine, from the gesture, a visualization effect
corresponding to the gesture and an audio effect corresponding to
the gesture; and cause the visualization effect and the audio
effect to be output in response to the determination, the audio
effect to be played with corresponding natural sounds directed by
the movement of the conductor baton.
[0156] In Example 102, the subject matter of Example 101 optionally
includes wherein to determine the visualization effect, the
processor is to use a visualization engine.
[0157] In Example 103, the subject matter of any one or more of
Examples 101-102 optionally include wherein to cause the
visualization effect to be output, the processor is to send the
visualization effect to a virtual reality headset of a user
controlling the conductor baton to be displayed on the virtual
reality headset.
[0158] In Example 104, the subject matter of any one or more of
Examples 101-103 optionally include wherein to cause the audio
effect to be output, the processor is to send the audio effect to a
speaker to play the audio effect.
[0159] In Example 105, the subject matter of Example 104 optionally
includes wherein to cause the visualization effect to be output,
the processor is to send the visualization effect to a display to
be displayed in coordination with the audio effect played by the
speaker.
[0160] In Example 106, the subject matter of any one or more of
Examples 101-105 optionally include wherein the processor is
further to receive data from the sensor indicating an initial
position of the conductor baton, and wherein to recognize the
gesture, the processor is to determine a final position of the
conductor baton.
[0161] In Example 107, the subject matter of any one or more of
Examples 101-106 optionally include wherein the visualization
effect corresponding to the gesture and the audio effect
corresponding to the gesture are determined based on an orientation
of the conductor baton identified in the sensor data.
[0162] In Example 108, the subject matter of any one or more of
Examples 101-107 optionally include wherein to determine the
visualization effect, the processor is to determine the
visualization effect based on a series of previously recognized
gestures.
[0163] In Example 109, the subject matter of any one or more of
Examples 101-108 optionally include wherein the visualization
effect includes a lighting effect, and wherein to cause the
visualization effect to be output, the processor is to send the
lighting effect to a plurality of wearable devices within a
predetermined proximity of the conductor baton to be displayed at
the plurality of wearable devices.
[0164] In Example 110, the subject matter of any one or more of
Examples 101-109 optionally include wherein the gesture includes at
least one of a linear movement, a tapping movement, a sweeping
movement, a minimum acceleration, or a minimum deceleration.
[0165] In Example 111, the subject matter of any one or more of
Examples 101-110 optionally include wherein the audio effect
includes Multidimensional Polyphonic Expression instructions for a
musical instrument digital interface (MIDI) player.
[0166] Example 112 is at least one machine-readable medium
including instructions, which when executed by a machine, cause the
machine to perform any of the operations of Examples
1-111.
[0167] Example 113 is an apparatus comprising means for performing
any of the operations of Examples 1-111.
[0168] Example 114 is a system to perform the operations of any of
the Examples 1-111.
[0169] Example 115 is a method to perform the operations of any of
the Examples 1-111.
[0170] Method examples described herein may be machine or
computer-implemented at least in part. Some examples may include a
computer-readable medium or machine-readable medium encoded with
instructions operable to configure an electronic device to perform
methods as described in the above examples. An implementation of
such methods may include code, such as microcode, assembly language
code, a higher-level language code, or the like. Such code may
include computer readable instructions for performing various
methods. The code may form portions of computer program products.
Further, in an example, the code may be tangibly stored on one or
more volatile, non-transitory, or non-volatile tangible
computer-readable media, such as during execution or at other
times. Examples of these tangible computer-readable media may
include, but are not limited to, hard disks, removable magnetic
disks, removable optical disks (e.g., compact disks and digital
video disks), magnetic cassettes, memory cards or sticks, random
access memories (RAMs), read-only memories (ROMs), and the
like.
* * * * *