U.S. patent application number 12/956646 was filed with the patent office on 2011-06-23 for multi-player augmented reality combat.
This patent application is currently assigned to EXENT TECHNOLOGIES, LTD. The invention is credited to Itay Nave.
Application Number | 20110151955 12/956646
Family ID | 44151853
Filed Date | 2011-06-23

United States Patent Application | 20110151955
Kind Code | A1
Nave; Itay | June 23, 2011
MULTI-PLAYER AUGMENTED REALITY COMBAT
Abstract
Techniques are described herein for performing multi-player
augmented reality combat. Each player wears (or is otherwise
associated with) an indicator (e.g., an object that has a
designated pattern, a visual tag, etc.) that identifies the player
or a team thereof. Each player has a mobile communication device
that is capable of identifying the players' indicators. A user may
point a camera of the user's mobile communication device at another
player (e.g., an opponent). The image that is captured by the
camera includes the indicator that is associated with the opponent.
The user may choose to fire a virtual bullet at the opponent using
an audio and/or tactile command, which is processed by the user's
mobile communication device. The mobile communication device
determines the time that the virtual bullet takes to travel from
the mobile communication device to the opponent based on the
distance between the user and the opponent.
Inventors: Nave; Itay (Kfar Hess, IL)
Assignee: EXENT TECHNOLOGIES, LTD. (Petach-Tikva, IL)
Family ID: 44151853
Appl. No.: 12/956646
Filed: November 30, 2010
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61289881 | Dec 23, 2009 |
Current U.S. Class: 463/2; 463/42
Current CPC Class: A63F 13/837 20140902; A63F 2300/8023 20130101; A63F 13/79 20140902; A63F 2300/205 20130101; A63F 13/215 20140902; A63F 13/795 20140902; A63F 13/537 20140902; A63F 2300/8076 20130101; A63F 13/213 20140902; A63F 2300/6045 20130101; A63F 13/655 20140902; A63F 13/92 20140902; A63F 2300/204 20130101
Class at Publication: 463/2; 463/42
International Class: A63F 9/24 20060101 A63F009/24
Claims
1. A mobile communication device comprising: a camera configured to
capture an image; an image recognition module configured to
identify a player indicator in the image, the player indicator
corresponding to a player; and an outgoing attack module configured
to transmit an outgoing attack indicator in response to a
user-initiated attack command that is received in response to
identification of the player indicator, the outgoing attack
indicator specifying that a virtual bullet is fired at the
player.
2. The mobile communication device of claim 1, further comprising:
a location module configured to determine a location of the mobile
communication device; a distance determination module configured to
determine a distance between the location of the mobile
communication device and a location of the player; and a time
determination module configured to determine an estimated duration
of a time period for the virtual bullet to travel from the mobile
communication device to the player based on the determined
distance.
3. The mobile communication device of claim 2, wherein the time
determination module is configured to determine the estimated
duration of the time period based on a virtual environmental
condition.
4. The mobile communication device of claim 2, wherein the time
determination module is configured to determine the estimated
duration of the time period based on a difference between an
altitude of the mobile communication device and an altitude of the
player indicator.
5. The mobile communication device of claim 2, wherein the time
determination module is configured to determine the estimated
duration of the time period based on at least one of an attribute
of the virtual bullet or an attribute of a virtual weapon that is
used to fire the virtual bullet.
6. The mobile communication device of claim 1, further comprising:
an environment module configured to modify the image to include a
virtual environmental condition; and a display configured to
display the modified image.
7. The mobile communication device of claim 6, further comprising:
an environment control module configured to control the virtual
environmental condition in response to a user-initiated environment
control command.
8. The mobile communication device of claim 1, further comprising:
an identification module configured to modify the image to include
identification information regarding the player; and a display
configured to display the modified image.
9. The mobile communication device of claim 1, further comprising:
an incoming attack module configured to determine when a virtual
bullet is directed at a user of the mobile communication device;
and a sensory signal module configured to provide a sensory signal
to the user in response to determination that a virtual bullet is
directed at the user.
10. The mobile communication device of claim 9, further comprising:
a speed control module configured to control a speed of the virtual
bullet that is directed at the user in response to a user-initiated
speed control command.
11. A method of performing multi-player augmented reality combat
with respect to a user of a mobile communication device,
comprising: capturing an image; identifying a player indicator in
the image, the player indicator corresponding to a player;
receiving a user-initiated attack command in response to
identifying the player indicator; and transmitting an attack
indicator in response to receiving the user-initiated attack
command, the attack indicator specifying that a virtual bullet is
fired at the player.
12. The method of claim 11, further comprising: determining a
location of the mobile communication device; receiving a player
location indicator that specifies a location of the player;
determining a distance between the location of the mobile
communication device and the location of the player; and
determining an estimated duration of a time period for the virtual
bullet to travel from the mobile communication device to the player
based on the determined distance.
13. The method of claim 12, wherein determining the estimated
duration of the time period comprises: determining the estimated
duration of the time period based on a virtual environmental
condition.
14. The method of claim 12, wherein determining the estimated
duration of the time period comprises: determining the estimated
duration of the time period based on a difference between an
altitude of the mobile communication device and an altitude of the
player indicator.
15. The method of claim 12, wherein determining the estimated
duration of the time period comprises: determining the estimated
duration of the time period based on at least one of an attribute
of the virtual bullet or an attribute of a virtual weapon that is
used to fire the virtual bullet.
16. The method of claim 11, further comprising: modifying the image
to include a virtual environmental condition; and displaying the
modified image.
17. The method of claim 16, further comprising: controlling the
virtual environmental condition in response to a user-initiated
environment control command.
18. The method of claim 11, further comprising: modifying the image
to include identification information regarding the player; and
displaying the modified image.
19. The method of claim 11, further comprising: receiving an
incoming attack indicator that specifies that a virtual bullet is
directed at a user of the mobile communication device; and
providing a sensory signal to the user in response to receiving the
incoming attack indicator.
20. The method of claim 19, further comprising: controlling a speed
of the virtual bullet that is directed at the user in response to a
user-initiated speed control command.
21. The method of claim 11, further comprising: controlling a rate
at which the user recovers from a virtual injury based on a
distance between the location of the mobile communication device
and a designated location.
22. A computer program product comprising a computer-readable
medium having computer program logic recorded thereon for enabling
a processor-based system to perform multi-player augmented reality
combat with respect to a user of a mobile communication device, the
computer program product comprising: a first program logic module
for enabling the processor-based system to capture an image; a
second program logic module for enabling the processor-based system
to identify a player indicator in the image, the player indicator
corresponding to a player; and a third program logic module for
enabling the processor-based system to transmit an outgoing attack
indicator in accordance with a mobile communication protocol in
response to a user-initiated attack command that is received in
response to identification of the player indicator, the outgoing
attack indicator specifying that a virtual bullet is fired at the
player.
23. The computer program product of claim 22, further comprising: a
fourth program logic module for enabling the processor-based system
to determine a location of the mobile communication device; a fifth
program logic module for enabling the processor-based system to
determine a distance between the location of the mobile
communication device and a location of the player; and a sixth
program logic module for enabling the processor-based system to
determine an estimated duration of a time period for the virtual
bullet to travel from the mobile communication device to the player
based on the determined distance.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional
Application No. 61/289,881, filed on Dec. 23, 2009, which is
incorporated by reference herein in its entirety.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to techniques for performing
multi-player augmented reality combat.
[0004] 2. Background
[0005] Augmented reality is a representation of a physical
real-world environment that is combined with (i.e., augmented by)
virtual (i.e., computer-generated) imagery. The augmentation of the
physical real-world environment is usually performed in real-time.
For example, the augmented reality may be displayed to a user via a
live-video stream. The user may view the live-video stream using
any suitable type of display, such as a head-mounted display, a
handheld display, a virtual retinal display, etc.
[0006] Augmented reality systems commonly include hand-held devices
each of which has network capabilities, a camera, and a display.
The cameras capture the physical real-world environment, and the
displays display the physical real-world environment in combination
with virtual objects. Virtual objects may include text, images, or
any other computer-generated object.
[0007] Augmented reality may be used in a variety of applications.
For example, augmented reality may be used to create a virtual
object in a museum, an exhibition, or a theme park attraction. In
another example, labels or text (e.g., operating instructions) may
be superimposed on an object or parts thereof, such as surgical
instruments or aircraft controls. In yet another example, virtual
imagery of a digital mock-up may be compared side-by-side to a
physical mock-up to determine discrepancies therebetween.
[0008] Another application in which augmented reality may be used
is gaming. In conventional augmented reality games, each user's
hand-held device includes a camera and a display. A user's camera
captures images of physical real-world objects that have predefined
patterns. The user's display displays the images of the physical
real-world objects in combination with virtual objects. For
example, the virtual objects may be superimposed on an image of the
user's physical real-world environment. The user is often able to
interact with the virtual objects. For instance, the user may use a
virtual weapon to fire virtual bullets at the virtual objects. The
virtual weapon may be associated with a virtual targeting axis, for
example, that points to a location at which the virtual bullets are
to be fired. For instance, the display may display the virtual
targeting axis in the augmented reality. However, the virtual
objects with which the user interacts typically do not correspond
to physical objects in the physical real-world environment.
Accordingly, users of conventional augmented reality games
traditionally are not able to perform augmented reality combat with
another person.
[0009] Thus, systems, methods, and computer program products are
needed that are capable of performing multi-player augmented
reality combat.
BRIEF SUMMARY OF THE INVENTION
[0010] Various approaches are described herein for, among other
things, performing multi-player augmented reality combat.
Multi-player augmented reality combat is combat that is performed
between multiple players using augmented reality. Each player wears
(or is otherwise associated with) an indicator (e.g., an object
that has a designated pattern, a visual tag, etc.) that identifies
the player or a team in which the player is included. Each player
has a mobile communication device, which executes software that
enables the mobile communication device to identify the indicators
of the players. For example, a user of a mobile communication
device may point a camera of the mobile communication device at
another player (e.g., an opponent). The image that is captured by
the camera includes the indicator that is associated with the
opponent. The user may choose to fire a virtual bullet (e.g.,
spear, slug, cannon ball, dart, flames, buckshot, etc.) at the
opponent by providing an audio and/or tactile command to the user's
mobile communication device. The mobile communication device
determines an estimated duration of a time period for the virtual
bullet to travel from the mobile communication device to the
opponent based on the distance between the user and the opponent.
For example, the distance between the player and the opponent can
be calculated according to their positions, which may be determined
by respective location modules.
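By way of illustration, the travel-time determination described above may be sketched as follows. The application does not specify an implementation; the great-circle distance formula, the coordinate format, and the bullet speed value are assumptions made solely for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def travel_time_s(user_pos, opponent_pos, bullet_speed_mps=300.0):
    """Estimated duration for the virtual bullet to reach the opponent.

    bullet_speed_mps is an illustrative default, not a value from the
    application; it could instead be derived from the virtual weapon's
    attributes as described in the claims.
    """
    distance = haversine_m(*user_pos, *opponent_pos)
    return distance / bullet_speed_mps
```

In this sketch the duration scales linearly with distance; the claimed variations (virtual environmental conditions, altitude differences, weapon attributes) would adjust either the distance or the effective speed.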
[0011] An example mobile communication device is described. The
mobile communication device includes a camera, an image recognition
module, a display, and an outgoing attack module. The camera is
configured to capture an image. The image recognition module is
configured to identify a player indicator in the image, the player
indicator corresponding to an opponent player. The display is
configured to display the image. The outgoing attack module is
configured to transmit an outgoing attack indicator in accordance
with a mobile communication protocol in response to a
user-initiated attack command that is received in response to
identification of the player indicator. The outgoing attack
indicator specifies that a virtual bullet is fired at the opponent
player.
[0012] The example mobile communication device may further include
a location module, a distance determination module, and a time
determination module. The location module is configured to
determine a location of the mobile communication device. For
example, the location of the mobile communication device may be
associated with a player indicator of the user of the mobile
communication device. The location of the mobile communication
device is available to mobile communication devices of other
players for distance calculation via a central server, P2P
communication, or any other communication technique. The distance
determination module is configured to determine a distance between
the location of the user's mobile communication device and a
location of the opponent player whose player indicator is
identified by the image recognition module. The time determination
module is configured to determine an estimated duration of a time
period for the virtual bullet to travel from the mobile
communication device to the player based on the determined
distance.
[0013] An example method for performing multi-player augmented
reality combat with respect to a user of a mobile communication
device is also described. In accordance with this method, an image
is captured and displayed. A player indicator is identified in the
image. The player indicator corresponds to an opponent player. A
user-initiated attack command is received in response to
identifying the player indicator. An attack indicator is
transmitted in accordance with a mobile communication protocol in
response to receiving the user-initiated attack command. The attack
indicator specifies that a virtual bullet is fired at the opponent
player.
[0014] In some aspects, a location of the mobile communication
device is determined. A player location indicator that specifies a
location of the opponent player is received. A distance between the
location of the mobile communication device and the location of the
opponent player is determined. An estimated duration of a time
period for the virtual bullet to travel from the mobile
communication device to the opponent player is determined based on
the determined distance.
[0015] A computer program product is also described. The computer
program product includes a computer-readable medium having computer
program logic recorded thereon for enabling a processor-based
system to perform multi-player augmented reality combat with
respect to a user of a mobile communication device. The computer
program logic includes first, second, and third program logic
modules. The first program logic module is for enabling the
processor-based system to capture an image. The second program
logic module is for enabling the processor-based system to identify
a player indicator in the image. The player indicator corresponds
to a player. The third program logic module is for enabling the
processor-based system to transmit an outgoing attack indicator in
accordance with a mobile communication protocol in response to a
user-initiated attack command that is received in response to
identification of the player indicator. The outgoing attack
indicator specifies that a virtual bullet is fired at the
player.
[0016] The computer program logic may further include fourth,
fifth, and sixth program logic modules. The fourth program logic
module is for enabling the processor-based system to determine a
location of the mobile communication device. The fifth program
logic module is for enabling the processor-based system to
determine a distance between the location of the mobile
communication device and a location of the player. The sixth
program logic module is for enabling the processor-based system to
determine an estimated duration of a time period for the virtual
bullet to travel from the mobile communication device to the player
based on the determined distance.
[0017] Further features and advantages of the disclosed
technologies, as well as the structure and operation of various
embodiments, are described in detail below with reference to the
accompanying drawings. It is noted that the invention is not
limited to the specific embodiments described herein. Such
embodiments are presented herein for illustrative purposes only.
Additional embodiments will be apparent to persons skilled in the
relevant art(s) based on the teachings contained herein.
BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
[0018] The accompanying drawings, which are incorporated herein and
form part of the specification, illustrate embodiments of the
present invention and, together with the description, further serve
to explain the principles involved and to enable a person skilled
in the relevant art(s) to make and use the disclosed
technologies.
[0019] FIGS. 1 and 2 are block diagrams of example augmented
reality combat systems in accordance with embodiments described
herein.
[0020] FIGS. 3, 9, and 11 depict flowcharts of methods for
performing multi-player augmented reality combat with respect to a
user of a mobile communication device in accordance with
embodiments described herein.
[0021] FIGS. 4, 10, and 12 are block diagrams of example
implementations of a mobile communication device shown in FIG. 1 in
accordance with embodiments described herein.
[0022] FIGS. 5-8 show mobile communication devices that display
example views of an augmented reality environment in accordance
with embodiments described herein.
[0023] FIG. 13 is a block diagram of a computer in which
embodiments may be implemented.
[0024] FIG. 14 illustrates a technique for determining a difference
between an altitude of a mobile communication device and an
altitude of a player in accordance with an embodiment described
herein.
[0025] The features and advantages of the disclosed technologies
will become more apparent from the detailed description set forth
below when taken in conjunction with the drawings, in which like
reference characters identify corresponding elements throughout. In
the drawings, like reference numbers generally indicate identical,
functionally similar, and/or structurally similar elements. The
drawing in which an element first appears is indicated by the
leftmost digit(s) in the corresponding reference number.
DETAILED DESCRIPTION OF THE INVENTION
I. Introduction
[0026] The following detailed description refers to the
accompanying drawings that illustrate exemplary embodiments of the
present invention. However, the scope of the present invention is
not limited to these embodiments, but is instead defined by the
appended claims. Thus, embodiments beyond those shown in the
accompanying drawings, such as modified versions of the illustrated
embodiments, may nevertheless be encompassed by the present
invention.
[0027] References in the specification to "one embodiment," "an
embodiment," "an example embodiment," or the like, indicate that
the embodiment described may include a particular feature,
structure, or characteristic, but every embodiment may not
necessarily include the particular feature, structure, or
characteristic. Moreover, such phrases are not necessarily
referring to the same embodiment. Furthermore, when a particular
feature, structure, or characteristic is described in connection
with an embodiment, it is submitted that it is within the knowledge
of one skilled in the art to implement such feature, structure, or
characteristic in connection with other embodiments whether or not
explicitly described.
[0028] Example embodiments are capable of performing multi-player
augmented reality combat. Multi-player augmented reality combat is
combat that is performed between multiple players using augmented
reality. In some example embodiments, each player wears (or is
otherwise associated with) an indicator (e.g., an object that has a
designated pattern, a visual tag, etc.) that identifies the player
or a team in which the player is included. Each player has a mobile
communication device that is capable of identifying the indicators
of the players. For example, a user of a mobile communication
device may point a camera of the mobile communication device at
another player (e.g., an opponent). The image that is captured by
the camera includes the indicator that is associated with the
opponent. The user may choose to fire a virtual bullet (e.g.,
spear, slug, cannon ball, dart, flames, buckshot, etc.) at the
opponent when the camera is pointed at the opponent by providing an
audio and/or tactile command to the user's mobile communication
device. The mobile communication device determines an estimated
duration of a time period for the virtual bullet to travel from the
mobile communication device to the opponent based on the distance
between the user and the opponent.
[0029] In other example embodiments, locations of the players are
determined using location signals, such as global positioning
system (GPS) signals, without the need for each player to be
associated with an indicator. For instance, a user's mobile
communication device may be capable of determining that a camera of
the mobile communication device is pointed at an opponent based on
a location of the user, a location of the opponent, and an
orientation of the mobile communication device.
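The indicator-free determination described in this paragraph may be sketched as a bearing comparison: compute the compass bearing from the user's location to the opponent's location and check whether the device's heading falls within a tolerance of that bearing. The tolerance value is an assumption for illustration, not a parameter from the application.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees (0 = north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360

def is_pointed_at(user_pos, opponent_pos, device_heading_deg, tolerance_deg=10.0):
    """True if the camera's heading lies within the tolerance of the bearing
    to the opponent, wrapping correctly around 360 degrees."""
    diff = abs(bearing_deg(*user_pos, *opponent_pos) - device_heading_deg) % 360
    return min(diff, 360 - diff) <= tolerance_deg
```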
[0030] Techniques described herein for performing multi-player
augmented reality combat have a variety of benefits as compared to
conventional augmented reality gaming techniques. For example, the
techniques described herein enable players to interact with other
players who exist in the physical real-world environment. Virtual
environmental conditions may be introduced into the augmented
reality that is perceived by the players. Such environmental
conditions may affect the time that a virtual bullet takes to
travel from a user's mobile communication device to an opponent
whose indicator is identified by the user's mobile communication
device. The environmental conditions may be controlled with respect
to a particular user's augmented reality based on environmental
control commands that are initiated by the user. The speed of a
virtual bullet that is directed at a user may be controlled based
on a speed control command that is initiated by the user.
II. Example Embodiments For Performing Multi-Player Augmented
Reality Combat
[0031] FIG. 1 is a block diagram of an example augmented reality
combat system 100 in accordance with an embodiment described
herein. Generally speaking, augmented reality combat system 100
operates to perform multi-player augmented reality combat with
respect to users of mobile communication devices based on commands
that the users provide via the mobile communication devices. In
accordance with example embodiments, when users (i.e., players)
provide commands to fire virtual bullets at other players,
augmented reality combat system 100 operates to determine the time
that each virtual bullet takes to reach the respective targeted
player.
[0032] As shown in FIG. 1, augmented reality combat system 100
includes a plurality of mobile communication devices 102A-102N, a
network 104, a device location system 106, and server(s) 108.
Device location system 106 provides location signals to mobile
communication devices 102A-102N via respective links 112A-112N in
accordance with a wireless protocol, such as a mobile communication
protocol, a global positioning system (GPS) protocol, or any other
suitable protocol over a wireless network. Communications among
mobile communication devices 102A-102N and server(s) 108 are
carried out over network 104 using well-known network communication
protocols. Network 104 may be a wide-area network (e.g., the
Internet), a local area network (LAN), another type of network, or
a combination thereof. Communications between network 104 and mobile
communication devices 102A-102N are provided wirelessly via
respective wireless links 110A-110N.
[0033] Mobile communication devices 102A-102N are processing
systems that are capable of communicating with server(s) 108. An
example of a processing system is a system that includes at least
one processor that is capable of manipulating data in accordance
with a set of instructions. For instance, a processing system may
be a computer, a personal digital assistant, etc. Mobile
communication devices 102A-102N process data that is received from
server(s) 108 via network 104 to display an augmented
representation of the physical real-world environment to users of
the mobile communication devices 102A-102N. The augmented
representation of the physical real-world environment is referred
to herein as an augmented reality environment. For instance, the
mobile communication devices 102A-102N may combine virtual imagery,
text, and/or any other type of data with images of the physical
real-world environment to generate the augmented reality
environment.
[0034] To initialize augmented reality combat system 100, users
register their identities (e.g., user names), player identifiers,
device addresses, etc. using mobile communication devices
102A-102N, so that their identifiers may be associated with their
identities. Mobile communication devices 102A-102N are capable of
interpreting signals that are received from device location system
106 to determine their respective locations in the augmented
reality environment. Each mobile communication device 102A-102N may
be configured to provide information regarding its location to
server(s) 108 via network 104 in response to the mobile
communication device being moved and/or periodically in accordance
with a designated schedule.
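The registration and location-reporting behavior described in this paragraph may be sketched as follows. The record fields and the reporting interval are illustrative assumptions; the application states only that a device reports when moved and/or on a designated schedule.

```python
import time

class DeviceRegistration:
    """Illustrative per-device state: identity, player identifier, address,
    and the information needed to decide when to report a location."""

    def __init__(self, user_name, player_identifier, device_address):
        self.user_name = user_name
        self.player_identifier = player_identifier
        self.device_address = device_address
        self.last_location = None
        self.last_report = 0.0

    def should_report(self, location, interval_s=5.0, now=None):
        """Report when the device has moved or the schedule interval elapsed."""
        now = time.monotonic() if now is None else now
        moved = location != self.last_location
        due = (now - self.last_report) >= interval_s
        return moved or due

    def report(self, location, now=None):
        """Record the report and build the message sent to server(s) 108."""
        self.last_location = location
        self.last_report = time.monotonic() if now is None else now
        return {"player": self.player_identifier, "location": location}
```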
[0035] Mobile communication devices 102A-102N are capable of
interpreting commands that are received from users of the mobile
communication devices 102A-102N to perform virtual actions in the
augmented reality environment. For instance, mobile communication
devices 102A-102N are capable of firing virtual bullets at users of
other mobile communication devices in the augmented reality
environment in response to user-initiated attack commands. Mobile
communication devices 102A-102N are capable of identifying the
players based on player identifiers that correspond to the players.
Multiple players can use the same identity and/or player identifier
(e.g., to indicate that they belong to the same group). For
example, each of mobile communication devices 102A-102N may store a
list that cross-references the players and the player indicators.
In another example, the list is stored on (or otherwise accessible
to) server(s) 108, and each of mobile communication devices
102A-102N is configured to access the list from server(s) 108.
Techniques for performing multi-player augmented reality combat are
described in further detail in the following discussion.
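The cross-reference list described in this paragraph may be sketched as a simple mapping from player indicators to identities, where a shared team assignment reflects the note that multiple players can use the same identifier. All names and values below are illustrative.

```python
# Illustrative cross-reference list; in the described system this may be
# stored on each device or fetched from server(s) 108.
PLAYER_INDICATORS = {
    "pattern-01": {"identity": "alice", "team": "red"},
    "pattern-02": {"identity": "bob", "team": "blue"},
    "pattern-03": {"identity": "carol", "team": "red"},  # same team as alice
}

def identify_player(indicator_id):
    """Resolve a recognized indicator to the registered identity, if any."""
    return PLAYER_INDICATORS.get(indicator_id)
```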
[0036] Server(s) 108 is a processing system that is capable of
communicating with mobile communication devices 102A-102N.
Server(s) 108 provides data to mobile communication devices
102A-102N that are to be combined with images of the physical
real-world environment to provide an augmented reality environment.
Server(s) 108 processes commands that are received from mobile
communication devices 102A-102N, specifying actions to be taken
with respect to objects in the augmented reality environment. For
example, if a user of a first communication device 102A provides a
command to fire a virtual bullet at a user of a second
communication device 102B, server(s) 108 may provide an indicator to
second communication device 102B that specifies that a virtual
bullet is directed at the user of the second mobile communication
device 102B. In accordance with this example, server(s) 108 may
update the virtual imagery of the augmented reality environment to
show the virtual bullet travelling toward the user of the second
communication device 102B. For instance, when users aim cameras of
their mobile communication devices at an area where the virtual
bullet virtually exists, the mobile communication devices may draw
the virtual bullet so that the users can see the virtual bullet on
displays of their mobile communication devices.
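The server-side relay described in this paragraph may be sketched as follows: when one device reports an attack, the server queues an incoming-attack indicator for the targeted device. The message fields and the in-memory inbox model are assumptions for illustration.

```python
class CombatServer:
    """Illustrative stand-in for server(s) 108: routes attack indicators
    between registered devices."""

    def __init__(self):
        self.inboxes = {}  # device_id -> list of pending indicators

    def register(self, device_id):
        self.inboxes[device_id] = []

    def handle_attack(self, shooter_id, target_id, travel_time_s):
        """Queue an incoming-attack indicator for the targeted device."""
        self.inboxes[target_id].append({
            "type": "incoming_attack",
            "from": shooter_id,
            "arrives_in_s": travel_time_s,
        })

    def poll(self, device_id):
        """Deliver and clear pending indicators for a device."""
        pending, self.inboxes[device_id] = self.inboxes[device_id], []
        return pending
```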
[0037] In an example embodiment, server(s) 108 is capable of
modifying the augmented reality environment to include virtual
environmental conditions. In accordance with this example
embodiment, server(s) 108 may control such environmental conditions
with respect to one or more of the users in response to receiving
user-initiated environmental control commands. For example,
server(s) 108 may introduce or eliminate a virtual environmental
condition or reduce or increase the intensity of the virtual
environmental condition with respect to a user upon receiving a
user-initiated environmental control command regarding the
environmental condition from a mobile communication device of the
user. In another example, server(s) 108 may introduce or eliminate
the virtual environmental condition or reduce or increase the
intensity of the virtual environmental condition with respect to
users other than the user who initiated the environmental control
command upon receiving the environmental control command.
[0038] In another example embodiment, mobile communication devices
102A-102N are capable of modifying the augmented reality
environment to include environmental conditions. For instance,
server(s) 108 may store attributes of users that include
environmental control capabilities. Each user's mobile
communication device may store a copy of that user's attributes for
permitting the user to utilize environmental control commands that
are associated with the user's environmental control capabilities.
The user may initiate an environmental control command using a
button or touch screen of the user's mobile communication device,
an audible command, or any other suitable technique. Upon
initiation of the environmental control command, the user's mobile
communication device may change virtual environmental parameters,
such that the virtual environmental conditions that are associated
with those parameters are incorporated into the versions of the
augmented reality environment that are displayed to the other
users. For instance, the environmental condition
parameters may affect virtual objects, virtual attributes (e.g.,
health, visibility, etc.) of the user and/or other players, virtual
bullet speed and/or direction, a hit effect that is associated with
a virtual bullet, firing accuracy, range of explosion, or any other
virtual characteristic of the augmented reality combat.
[0039] Device location system 106 is a processing system that is
configured to provide location signals to mobile communication
devices 102A-102N via respective links 112A-112N. Links 112A-112N
may be wireless links, GPS links, or any other suitable type of
links. For example, the location signals may specify the locations
of the respective mobile communication devices 102A-102N. In
another example, the location signals may include information that
may be used by the mobile communication devices 102A-102N to
determine their respective locations.
[0040] Links 112A-112N are shown in FIG. 1 to be unidirectional for
illustrative purposes and are not intended to be limiting. It will
be recognized that links 112A-112N may be bidirectional. For
example, device location system 106 may provide ping signals to
mobile communication devices 102A-102N for determining locations of
the respective mobile communication devices 102A-102N. In
accordance with this example, device location system 106 may
receive response signals from the respective mobile communication
devices 102A-102N in response to the respective ping signals.
Device location system 106 may determine a location of each mobile
communication device 102A-102N based on the time that elapses
between providing the respective ping signal and receiving the
respective response signal.
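The elapsed-time calculation in the preceding paragraph can be sketched as follows. This is a minimal illustration, not part of the disclosure: the function name is invented, and it assumes the ping and response signals propagate at the speed of light, with any processing delay at the device known and subtracted out.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_ping(ping_sent_s: float, response_received_s: float,
                       processing_delay_s: float = 0.0) -> float:
    """Estimate the distance to a device from a ping round trip.

    The round-trip time, minus any known processing delay at the
    device, covers the distance twice (outbound and return), so the
    one-way distance is speed * round_trip / 2.
    """
    round_trip_s = response_received_s - ping_sent_s - processing_delay_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0
```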
[0041] In accordance with an example embodiment, each of the mobile
communication devices 102A-102N reports its location to server 108
via network 104 and accesses server 108 to determine the locations
of the other mobile communication devices. A mobile communication
device may compare its location to a location of another mobile
communication device to calculate a distance therebetween. The
locations of the respective mobile communication devices 102A-102N,
as indicated by the location signals that are provided by device
location system 106, may be estimated locations. Accordingly, the
calculated distances between the mobile communication devices
102A-102N may be estimated distances.
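When the location signals carry latitude/longitude coordinates, the device-to-device distance calculation described above could, for example, use the standard haversine (great-circle) formula. The sketch below is illustrative only; the function name and the assumption of a spherical Earth are not from the disclosure.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, meters

def haversine_distance_m(lat1: float, lon1: float,
                         lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return EARTH_RADIUS_M * 2 * math.asin(math.sqrt(a))
```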
[0042] Device location system 106 may be capable of providing a
positioning accuracy that is greater than the positioning accuracy
that is allowed by government regulations and/or laws. For
instance, the full positioning accuracy capabilities of device
location system 106 may be reserved for military applications. If
restrictions regarding the positioning accuracy of device location
system 106 are not imposed, and/or GPS (or other positioning
technique) allows for accurate positioning of approximately one
meter or less, device location system 106 may provide substantially
greater positioning accuracy. For example, the location of each
user may be determined using merely the capabilities of device
location system 106, without using image recognition
techniques.
[0043] FIG. 2 is a block diagram of another example augmented
reality combat system 200 in accordance with an embodiment
described herein. Augmented reality combat system 200 is similar to
the augmented reality combat system 100 shown in FIG. 1, except
that augmented reality combat system 200 does not include network
104 or server(s) 108, and mobile communication devices 202A-202N
are processing systems that are capable of communicating with each
other. Thus, communications between mobile communication devices
202A-202N are provided wirelessly using well-known communication
protocols that do not require a server. For instance, first mobile
communication device 202A and second mobile communication device
202B communicate via wireless link 204; second mobile communication
device 202B and nth mobile communication device 202N communicate
via wireless link 206; nth mobile communication device 202N and
first mobile communication device 202A communicate via wireless
link 208, and so on.
[0044] Accordingly, in augmented reality combat system 200, a
mobile communication device that performs an action with respect to
the augmented reality environment may provide an indicator that
specifies the action to each of the other mobile communication
devices. Each mobile communication device 202A-202N may provide
information regarding its location to the other communication
devices in response to that mobile communication device being moved
and/or periodically in accordance with a designated schedule. The
mobile communication devices that receive such indicators and/or
information may update their respective displays of the augmented
reality environment based on the indicators and/or the
information.
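The location reporting described in the preceding paragraph might serialize each update as a small message that is broadcast to the other devices. The JSON layout and field names below are illustrative assumptions; the disclosure does not specify a wire format.

```python
import json
import time

def make_location_update(device_id: str, lat: float, lon: float) -> bytes:
    """Serialize a location update for broadcast to peer devices.

    A timestamp is included so that receivers can discard stale
    updates that arrive out of order.
    """
    return json.dumps({"device": device_id, "lat": lat, "lon": lon,
                       "timestamp": time.time()}).encode("utf-8")
```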
[0045] In accordance with some example embodiments, augmented
reality combat system 200 includes a network and a server. For
instance, attributes that are associated with the players may be
stored on (or otherwise accessible to) the server. Examples of
attributes include but are not limited to identities of the
players, network addresses of the mobile communication devices of
the players, etc. Each mobile communication device may access the
attributes that are stored on (or otherwise accessible to) the
server via the network. In a first example implementation of
augmented reality combat system 200, a connection is established
directly between mobile communication devices for communication
therebetween. In a second example implementation, each mobile
communication device communicates with the server, and the server
is responsible for transferring communications from the originating
mobile communication devices to the recipient mobile communication
devices.
[0046] Augmented reality combat systems 100 and 200 are provided
for illustrative purposes and are not intended to be limiting. It
will be recognized that any of the mobile communication devices
described herein may communicate with each other directly and/or
via server(s).
[0047] FIG. 3 depicts a flowchart 300 of a method for performing
multi-player augmented reality combat with respect to a user of a
mobile communication device in accordance with an embodiment
described herein. Flowchart 300 is described from the perspective
of a mobile communication device. Flowchart 300 may be performed by
any of mobile communication devices 102A-102N of augmented reality
combat system 100 shown in FIG. 1, for example. For illustrative
purposes, flowchart 300 is described with respect to a mobile
communication device 400 shown in FIG. 4, which is an example of a
mobile communication device 102, according to an embodiment.
[0048] As shown in FIG. 4, mobile communication device 400 includes
a camera 402, an image recognition module 404, a command receipt
module 406, an outgoing attack module 408, a location module 410, a
location indicator receipt module 412, a distance determination
module 414, a time determination module 416, an environment module
418, an environment control module 420, an identification module
422, a display module 424, and an orientation determination module
426. Further structural and operational embodiments will be
apparent to persons skilled in the relevant art(s) based on the
discussion regarding flowchart 300. Flowchart 300 is described as
follows.
[0049] As shown in FIG. 3, the method of flowchart 300 begins at
step 302. In step 302, an image of a physical real-world
environment is captured. In an example implementation, camera 402
captures the image. For instance, display module 424 may render the
image for viewing by a user of mobile communication device 400. The
image may be augmented to include virtual objects, virtual
environmental conditions, etc. before it is rendered, though the
scope of the example embodiments is not limited in this
respect.
[0050] At step 304, a player indicator is identified in the image.
The player indicator corresponds to a player in the physical
real-world environment. For example, the player indicator may be a
designated pattern that is provided on an article of the player's
clothing or on another object that is associated with the player.
In another example, the player indicator may be a visual tag that
is associated with the player. The player indicator may be
identified in substantially real-time as the image of the physical
real-world environment is captured, though the scope of the example
embodiments is not limited in this respect. In an example
implementation, image recognition module 404 identifies the player
indicator.
[0051] At step 306, a user-initiated attack command is received in
response to identifying the player indicator. For instance, the
user of the mobile communication device may identify another player
in the real world and aim the device camera at that player, so that
image recognition module 404 identifies the targeted player. The
user may then aim the center of the camera at the targeted player
and initiate the attack command by saying a word or phrase that is
associated with the attack command, pressing a button on the mobile
communication device that is associated with the attack command,
touching a touch screen of the mobile communication device at the
position on the screen where the targeted player is located in a
manner that is associated with the attack command (e.g., moving the
user's finger up, down, right, left, or diagonally on the touch
screen; touching the touch screen in a designated location that is
associated with the attack command; etc.), shaking the mobile
communication device, or using any other suitable technique. In an
example implementation, command receipt module 406 receives the
user-initiated attack command.
[0052] It will be recognized that a user may initiate an attack
command at any time, not only in response to identifying a player
indicator. For example, a user may initiate an attack command to
fire a virtual bullet at a virtual object. In another example, a
user may use a virtual weapon (e.g., a cannon) that has a
substantially wide area of hit to fire at players without the need
for identifying the players. In accordance with this example, the
players may be fired upon even if the players are hiding (i.e., not
in view of the user). For instance, location indicators that are
associated with the players may be relied upon for determining the
location of those players. Accordingly, a user-initiated attack
command may be received at any time.
[0053] At step 308, an attack indicator is transmitted in response
to receiving the user-initiated attack command. The attack
indicator specifies that a virtual bullet is fired at the player.
For instance, the attack indicator may be wirelessly transmitted in
accordance with a mobile communication protocol. In an example
implementation, outgoing attack module 408 transmits the attack
indicator to a central server (e.g., server(s) 108) or directly to
the mobile communication device of the targeted player.
[0054] At step 310, a location of the mobile communication device
is determined. In an example implementation, location module 410
determines the location of the mobile communication device.
[0055] In accordance with an example embodiment, the location of
the mobile communication device is determined based on location
signals (e.g., global positioning system (GPS) signals, wireless
signals received from a base station, etc.). For instance, each
location signal may include a location indicator that specifies a
location of its source and a time indicator that specifies a time
at which the location signal was transmitted by its source.
Location module 410 may combine the time indicators that are
included in the respective location signals and the times at which
the mobile communication device received the respective location
signals to determine transmit times of the respective location
signals. A transmit time is a duration of time for a location
signal to travel from its source to the mobile communication
device. Location module 410 may determine distances between the
mobile communication device and the sources of the respective
location signals based on the transmit times of the respective
location signals.
[0056] Location module 410 may combine the distances between the
mobile communication device and the sources of respective location
signals with the locations of the respective sources to determine
the location of the mobile communication device. For instance,
location module 410 may use a trilateration technique to combine
the distances with the sources' locations. Trilateration is a
technique for determining intersections of the surfaces of three
spheres based on the centers and radii of the spheres. In
accordance with this example embodiment, the centers of the spheres
correspond to the locations of the respective sources, and the
radii correspond to the distances between the mobile communication
device and the respective sources.
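The trilateration step described above can be illustrated in two dimensions. Subtracting the circle equations pairwise eliminates the squared unknowns and leaves a 2x2 linear system, as in this sketch (which assumes exact, noise-free distances and non-collinear sources; the function is illustrative and not part of the disclosure):

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Locate a point (x, y) from three known centers and distances.

    Each (x - xi)^2 + (y - yi)^2 = ri^2; subtracting the first
    equation from the other two cancels x^2 and y^2, leaving two
    linear equations solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the centers are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```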
[0057] In accordance with another example embodiment, the location
of the mobile communication device is determined based on a
location indicator that specifies the location of the mobile
communication device. For example, one or more servers (e.g.,
server(s) 108) may determine the transmit times of the respective
location signals, determine the distances between the mobile
communication device and the sources of the respective location
signals, and combine the distances between the mobile communication
device and the sources of the respective location signals with the
locations of the respective sources to determine the location of
the mobile communication device. In accordance with this example,
location module 410 receives a location indicator from the
server(s) that specifies the location of the mobile communication
device. Location module 410 interprets the location indicator to
determine the location of the mobile communication device.
[0058] In another example, sources (e.g., base stations) may
provide request signals to the mobile communication device. The
mobile communication device may send response signals to the
respective sources in response to the request signals. Each source
may determine a distance between the mobile communication device
and the source based on a duration of a time period between
transmission of the respective request signal and receipt of the
corresponding response signal. The wireless communication system
may combine the distances between the mobile communication device
and the respective sources with the locations of the respective
sources to determine the location of the mobile communication
device. The wireless communication system may then provide a
location indicator that specifies the location of the mobile
communication device to the mobile communication device, enabling
the mobile communication device to determine its location based on
the location indicator.
[0059] In accordance with another example embodiment, the location
of the mobile communication device is determined based on the
strengths of signals that the mobile communication device receives
from respective sources. For instance, a trilateration technique
may be used to determine the location of the mobile communication
device based on the signal strengths.
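For the signal-strength variant, a common way to convert a received signal strength to a distance (which could then feed the trilateration described above) is the log-distance path-loss model. The reference values below (signal strength at one meter, path-loss exponent) are illustrative assumptions only:

```python
def distance_from_rssi(rssi_dbm: float, rssi_at_1m_dbm: float = -40.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in meters from received signal strength.

    Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10*n*log10(d),
    solved here for d. The exponent n is ~2 in free space and larger
    in cluttered environments.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))
```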
[0060] At step 312, a player location indicator that specifies a
location of the player is received. For example, the player
location indicator may be wirelessly received. For instance, the
player location indicator may specify GPS coordinates that indicate
the location of the player. In an example implementation, location
indicator receipt module 412 receives the player location
indicator.
[0061] For example, each mobile communication device may send a
player location indicator that specifies the location of the player
that corresponds to that mobile communication device to a central
server, so that other mobile communication devices may access the
indicators on the server. In another example, each mobile
communication device may send a player location indicator that
specifies the location of the player that corresponds to that
mobile communication device to the other mobile communication
devices without routing the indicators through a server.
[0062] At step 314, a distance between the location of the mobile
communication device and the location of the player is determined.
In an example implementation, distance determination module 414
determines the distance between the location of the mobile
communication device and the location of the player.
[0063] At step 316, an estimated duration of a time period for the
virtual bullet to travel from the mobile communication device to
the player is determined based on the determined distance. In an
example implementation, time determination module 416 determines
the estimated duration of the time period for the virtual bullet to
travel from the mobile communication device to the player.
[0064] In accordance with an example embodiment, the estimated
duration of the time period is determined based on a virtual
environmental condition. Examples of a virtual environmental
condition include but are not limited to wind, sunlight, moonlight,
darkness, rain, snow, hail, fog, a sand storm, etc. For instance,
some environmental conditions (e.g., wind speed, wind direction,
rain, snow, etc.) may affect a flight time of the virtual bullet.
Virtual environmental conditions are discussed in further detail
below with reference to environment module 418 and environment
control module 420.
[0065] In accordance with another example embodiment, the estimated
duration of the time period is determined based on a difference
between an altitude of the mobile communication device and an
altitude of the player indicator. For example, distance
determination module 414 may provide a vector representation of the
distance between the location of the mobile communication device
and the location of the player that specifies a horizontal distance
and a vertical distance between the location of the mobile
communication device and the location of the player. In accordance
with this example, time determination module 416 may determine the
estimated duration of the time period based on the vector
representation of the distance. Further description of an example
technique for determining a difference between an altitude of the
mobile communication device and an altitude of the player indicator
is provided below with reference to orientation determination
module 426 and FIG. 14.
[0066] In accordance with yet another example embodiment, the
estimated duration of the time period is determined based on an
attribute of a virtual weapon that is used to fire the virtual
bullet. Examples of such an attribute include but are not limited
to a size and/or weight of the virtual bullet that is fired by the
virtual weapon, a force with which the virtual weapon fires the
virtual bullet, a condition of the virtual weapon, a type of the
virtual weapon, etc.
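Combining the distance of step 314 with the factors described in the three preceding embodiments, the estimated flight time of step 316 might be computed roughly as follows. The disclosure does not fix a formula; the base weapon speed and the single multiplicative environment factor are hypothetical simplifications:

```python
def estimate_travel_time_s(distance_m: float,
                           weapon_speed_m_per_s: float,
                           environment_factor: float = 1.0) -> float:
    """Estimate how long a virtual bullet takes to reach the player.

    environment_factor > 1.0 lengthens the flight time (e.g., rain,
    snow); environment_factor < 1.0 shortens it (e.g., mitigated
    weather). A heavier bullet or weaker weapon would be reflected
    in a lower weapon_speed_m_per_s.
    """
    return distance_m * environment_factor / weapon_speed_m_per_s
```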
[0067] In some example embodiments, one or more of steps 302, 304,
306, 308, 310, 312, 314, and/or 316 of flowchart 300 may not be
performed. Moreover, steps in addition to or in lieu of steps 302,
304, 306, 308, 310, 312, 314, and/or 316 may be performed.
[0068] Environment module 418 is configured to modify the image of
the physical real-world environment to include one or more virtual
environmental conditions. The image of the physical real-world
environment may include virtual objects and/or other information in
addition to the virtual environmental condition(s), though the
scope of the example embodiments is not limited in this respect.
For example, some virtual environmental conditions (e.g., sunlight
or mitigation of rain, snow, fog, etc.) may enhance or change
visibility with respect to physical and/or virtual objects in the
image. Other virtual environmental conditions (e.g., darkness,
rain, snow, fog, etc.) may inhibit visibility with respect to
physical and/or virtual objects in the image. In another example,
some virtual environmental conditions (e.g., mitigation of rain,
snow, fog, etc.) may increase a speed of the virtual bullet. Other
virtual environmental conditions (e.g., rain, snow, fog, etc.) may
decrease the speed of the virtual bullet. In yet another example,
some virtual environmental conditions (e.g., wind, sand storm,
hail, etc.) may change a direction of the virtual bullet as it
travels from the mobile communication device to the player.
[0069] Environment control module 420 is configured to control
virtual environmental conditions that are incorporated into the
image of the physical real-world environment in response to
user-initiated environmental control commands. For instance, a user
may acquire a virtual power that enables the user to initiate
commands for controlling one or more of the virtual environmental
conditions. Environment control module 420 may mitigate or
intensify an environmental condition with respect to the user's
view of the augmented reality environment (or the views of the
other players) upon receipt of a user-initiated environmental
control command from the user. The environmental condition may be
changed for a designated time period or indefinitely in response to
the user's environmental control command.
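The per-view mitigation and intensification performed by environment control module 420 can be sketched as a mapping from each player's view to condition intensities. The class name and the 0.0-1.0 intensity scale are illustrative assumptions, not part of the disclosure:

```python
class EnvironmentControl:
    """Track the intensity of each virtual condition per player's view."""

    def __init__(self, player_ids):
        # intensity 0.0 = condition absent, 1.0 = full intensity
        self.views = {pid: {} for pid in player_ids}

    def set_condition(self, condition, intensity, targets):
        """Apply a condition at a given intensity to the targeted views only."""
        for pid in targets:
            self.views[pid][condition] = intensity

    def mitigate(self, condition, amount, targets):
        """Reduce a condition's intensity in the targeted views (not below 0)."""
        for pid in targets:
            current = self.views[pid].get(condition, 0.0)
            self.views[pid][condition] = max(0.0, current - amount)
```

For instance, virtual rain could start at full intensity in every view and then be mitigated in only the requesting user's view, leaving the other players' views unchanged.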
[0070] For example, the user may request that virtual sunlight be
provided with respect to the user's view of the augmented reality
environment to enhance the user's visibility. Environment control
module 420 may provide the virtual sunlight with respect to the
user's view of the augmented reality environment, but not with
respect to the views of the other users. In another example, the
user may request that the intensity of a virtual snowstorm in the
augmented reality environment be mitigated (or that the virtual
snowstorm be terminated) with respect to the user's view of the
augmented reality environment. Environment control module 420 may
mitigate (or terminate) the virtual snowstorm with respect to the
user's view of the augmented reality environment, but not with
respect to the views of the other users.
[0071] In yet another example, the user may request that virtual
rain be provided with respect to the other players' views of the
augmented reality environment to reduce visibility of the other
players. Environment control module 420 may provide the virtual
rain with respect to the other players' views, but not with respect
to the view of the user who requested the rain. In still another
example, the user may request that virtual sunlight be removed from
the other players' views of the augmented reality environment.
Environment control module 420 may remove the virtual sunlight from
the views of the other players, but not from the view of the user
who initiated the request.
[0072] The environment control examples described above are
provided for illustrative purposes and are not intended to be
limiting. It will be recognized that environment control module 420
may mitigate or intensify an environmental condition with respect
to all players' views of the augmented reality environment in
response to a user's environment control command.
[0073] Identification module 422 is configured to modify the image
to include identification information regarding a player whose
player indicator is identified in the image. Examples of
identification information include but are not limited to a
player's name, photograph, team affiliation, rank, experience
level, virtual attack success rate, Twitter® account address,
instant message address, score in the game, virtual shield type,
virtual shield strength, health condition, virtual weapons
available to the player and/or virtual weapon currently being used
by the player, etc. Identification module 422 may modify the image
to selectively include designated identification information
regarding players in accordance with instructions received from the
user of mobile communication device 400. For instance,
identification module 422 may modify the image to include all of
the identification information regarding the players when mobile
communication device 400 is pointed at the players (e.g., when
image recognition module 404 recognizes a player and/or when camera
402 is pointed toward an area having coordinates that correspond to
a location of the player).
[0074] Display module 424 is configured to render images that are
captured by camera 402 and/or modified by environment module 418
and/or identification module 422.
[0075] Orientation determination module 426 is configured to
determine an orientation (e.g., tilt) of mobile communication
device 400. For instance, if camera 402 is pointed at a player,
orientation determination module 426 is capable of determining a
difference between an altitude of mobile communication device 400
and an altitude of the player. For example, orientation
determination module 426 may include an accelerometer for
determining the orientation of mobile communication device 400. In
accordance with an embodiment, distance determination module 414
determines the distance between the location of mobile
communication device 400 and the location of the player based on
the altitude difference that is determined by orientation
determination module 426.
[0076] FIG. 14 illustrates a technique for determining a difference
(labeled as "A") between an altitude of a mobile communication
device and an altitude of a player in accordance with an embodiment
described herein. As shown in FIG. 14, a mobile communication
device 1402 is pointed at a player 1404. For instance, a camera of
mobile communication device 1402 may be pointed at player 1404.
Element 1406 represents a two-dimensional (e.g., GPS) location of
mobile communication device 1402. Element 1408 represents a
two-dimensional (e.g., GPS) location of player 1404. Accordingly,
the distance "B" represents a two-dimensional distance between
mobile communication device 1402 and player 1404.
[0077] Element 1410 represents a three-dimensional location of
mobile communication device 1402. Accordingly, the distance "C"
represents a three-dimensional distance between mobile
communication device 1402 and player 1404. The altitude of mobile
communication device 1402 is shown with reference to the altitude
of player 1404 for ease of discussion. It will be recognized that
mobile communication device 1402 and player 1404 may have any
respective altitudes. The three-dimensional distance "C" between
mobile communication device 1402 and player 1404 may be determined
in accordance with the following equation:
C = B/sin(α),
where B is the two-dimensional distance "B" between mobile
communication device 1402 and player 1404, and α is the angle
between lines A and C. Because B is the side opposite the angle α
in the right triangle formed by lines A, B, and C, sin(α) = B/C,
which rearranges to the equation above.
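As an illustrative numeric check of this relationship: in the right triangle formed by lines A, B, and C, the horizontal distance B is the side opposite the angle α, so sin(α) = B/C and tan(α) = B/A. The function names below are invented for illustration and are not part of the disclosure:

```python
import math

def three_dimensional_distance(b_horizontal_m: float,
                               alpha_rad: float) -> float:
    """3-D distance C from the 2-D distance B and the angle alpha
    between the vertical line A and the line of sight C."""
    return b_horizontal_m / math.sin(alpha_rad)

def altitude_difference(b_horizontal_m: float, alpha_rad: float) -> float:
    """Altitude difference A between device and player, from
    tan(alpha) = B / A."""
    return b_horizontal_m / math.tan(alpha_rad)
```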
[0078] It will be recognized that mobile communication device 400
of FIG. 4 may not include one or more of camera 402, image
recognition module 404, command receipt module 406, outgoing attack
module 408, location module 410, location indicator receipt module
412, distance determination module 414, time determination module
416, environment module 418, environment control module 420,
identification module 422, display module 424, and/or orientation
determination module 426. Furthermore, mobile communication device
400 may include modules in addition to or in lieu of camera 402,
image recognition module 404, command receipt module 406, outgoing
attack module 408, location module 410, location indicator receipt
module 412, distance determination module 414, time determination
module 416, environment module 418, environment control module 420,
identification module 422, display module 424, and/or orientation
determination module 426.
[0079] The functionality of environment module 418, environment
control module 420, and identification module 422 is described in
further detail below with reference to FIGS. 5-8. FIGS. 5-8 show
mobile communication devices 500, 600, 700, and 800 that display
example views of an augmented reality environment in accordance
with embodiments described herein. As shown in FIG. 5, mobile
communication device 500 includes a display 502 that displays an
image of a physical real-world environment. The image shows a
player 504 who has a player indicator 506 affixed to his shirt. It
will be recognized that player indicator 506 may be associated with
player 504 in any suitable manner and need not necessarily be
affixed to the player's person. Mobile communication device 500 is
configured to identify player indicator 506.
[0080] Upon identifying player indicator 506, mobile communication
device 500 may provide a sensory signal to a user of mobile
communication device 500 to indicate that player indicator 506 has
been identified, though the scope of the example embodiments is not
limited in this respect. A sensory signal is a signal that is
perceptible by a human. For instance, the sensory signal may be an
audio signal having a frequency in the audible spectrum (e.g., in a
range between 20 hertz (Hz) and 20 kilohertz (kHz)), a visual
signal having a frequency in the visible spectrum (e.g., in a range
between 400 terahertz (THz) and 790 THz), a tactile signal, or any
other signal that is human-perceptible. A tactile signal is a
signal that a human is capable of perceiving using the sense of
touch. For example, a tactile signal may be provided using a
vibration mechanism of mobile communication device 500.
[0081] As shown in FIG. 6, mobile communication device 600 displays
an image of an augmented reality environment that includes the
physical real-world environment as shown in FIG. 5 with the
addition of a virtual environmental condition. The virtual
environmental condition in this example is rain 602. It will be
recognized that rain 602 may reduce the visibility of a user of
mobile communication device 600.
[0082] FIG. 7 illustrates that a user may control a virtual
environmental condition with respect to a view of the augmented
reality environment that is displayed to the user. As shown in FIG.
7, mobile communication device 700 displays the augmented reality
environment as shown in FIG. 6, except that rain 702 in FIG. 7 is a
mitigated version of rain 602 that is shown in FIG. 6. FIG. 7
illustrates that a user of mobile communication device 700 moves
her finger 706 downward on a touch screen of mobile communication
device 700, as depicted by arrow 704. The downward motion is
interpreted by mobile communication device 700 to be an environmental control
command, in response to which mobile communication device 700
mitigates the intensity of virtual rain 602 to provide rain 702. It
will be recognized that mitigation of the environmental condition
in this example increases visibility with respect to the augmented
reality environment.
[0083] As shown in FIG. 8, mobile communication device 800 displays
an image of an augmented reality environment that includes the
physical real-world environment as shown in FIG. 5 with the
addition of identification information 802 regarding player 504.
Identification information 802 is shown to include a name of player
504 and a team affiliation of player 504 for illustrative purposes
and is not intended to be limiting. Identification information 802
may include any suitable information regarding player 504.
[0084] FIG. 9 depicts a flowchart 900 of a method for performing
multi-player augmented reality combat with respect to a user of a
mobile communication device in accordance with an embodiment
described herein. For illustrative purposes, flowchart 900 is
described with respect to a mobile communication device 1000 shown
in FIG. 10, which is an example of a mobile communication device
102, according to an embodiment.
[0085] As shown in FIG. 10, mobile communication device 1000
includes an incoming attack module 1002, a sensory signal module
1004, an authorization determination module 1006, a command
determination module 1008, a speed control module 1010, and a
stationary determination module 1012. Further structural and
operational embodiments will be apparent to persons skilled in the
relevant art(s) based on the discussion regarding flowchart 900.
Flowchart 900 is described as follows.
[0086] As shown in FIG. 9, the method of flowchart 900 begins at
step 902. In step 902, an incoming attack indicator that specifies
that a virtual bullet is directed at a user of a mobile
communication device is received. For instance, the incoming attack
indicator may be wirelessly received. In an example implementation,
incoming attack module 1002 receives the incoming attack
indicator.
[0087] At step 904, a sensory signal is provided to the user in
response to receiving the incoming attack indicator. For instance,
the sensory signal may be an audio signal having a frequency in the
audible spectrum (e.g., in a range between 20 hertz (Hz) and 20
kilohertz (kHz)), a visual signal having a frequency in the visible
spectrum (e.g., in a range between 400 terahertz (THz) and 790
THz), a tactile signal, or any other signal that is
human-perceptible. In an example implementation, sensory signal
module 1004 provides the sensory signal.
[0088] At step 906, a determination is made whether the user is
authorized to provide a speed control command for controlling a
speed of the virtual bullet. The determination may be based on
attributes of the user, attributes of a player who fired the
virtual bullet at the user, and/or attributes of the game. In an
example implementation, authorization determination module 1006
determines whether the user is authorized to provide a speed
control command. For example, the determination may be based on
whether the user has acquired a power that authorizes
the user to provide a speed control command. In accordance with
this example, if the user has acquired the power, authorization
determination module 1006 determines that the user is authorized to
provide a speed control command. In further accordance with this
example, if the user has not acquired the power, authorization
determination module 1006 determines that the user is not
authorized to provide a speed control command. If the user is
authorized to provide a speed control command, flow continues to
step 908. Otherwise, flowchart 900 ends.
[0089] At step 908, a determination is made whether a
user-initiated speed control command is received. In an example
implementation, command determination module 1008 determines
whether a user-initiated speed control command is received. If a
user-initiated speed control command is received, flow continues to
step 910. Otherwise, flow continues to step 912.
[0090] In accordance with an example embodiment, players are
capable of having a power-blocking power that blocks another player's
ability to utilize a power. For instance, a player who has a
power-blocking power may block use of the speed control command
described in step 908, thereby preventing the user-initiated speed
control command from being received.
[0091] At step 910, the speed of the virtual bullet is controlled
in response to the user-initiated speed control command. For
example, the speed of the virtual bullet may be reduced in response
to the user-initiated speed control command. In accordance with
this example, the virtual bullet's reduced speed may provide the
user more time to react to the virtual bullet. For instance, the
user may view the virtual bullet on a display of the mobile
communication device and take action in the physical real-world
avoid being hit by the virtual bullet in the augmented reality
environment. In an example implementation, speed control module
1010 controls the speed of the virtual bullet. Upon completion of
step 910, flowchart 900 ends.
[0092] The user may utilize any of a variety of powers in an
attempt to avoid being hit by the virtual bullet and/or mitigate an
effect of being hit by the virtual bullet. For example, the user
may increase the strength of the user's virtual shield for a
specified duration or indefinitely, increase a speed with which the
user is capable of moving for a specified duration or indefinitely,
etc.
[0093] At step 912, a determination is made whether the virtual
bullet is stationary. For instance, if the virtual bullet is no
longer in motion, it may be non-productive to continue to determine
whether a user-initiated speed control command is received. In an
example implementation, stationary determination module 1012
determines whether the virtual bullet is stationary. If the virtual
bullet is stationary, flowchart 900 ends. Otherwise, flow returns
to step 908.
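The control flow of flowchart 900 (steps 902 through 912) can be summarized in code. This is a minimal sketch assuming simple object attributes; the names `can_control_speed`, `stationary`, and the polling callback are illustrative and do not appear in the application.

```python
def handle_incoming_attack(bullet, user, get_speed_command, alert):
    """Sketch of flowchart 900: react to an incoming virtual bullet.

    `bullet` exposes .speed and .stationary; `get_speed_command` polls
    for a user-initiated speed control command, returning a slowdown
    factor or None; `alert` emits the sensory signal of step 904.
    """
    alert(user)                       # step 904: audio/visual/tactile signal
    if not user.can_control_speed:    # step 906: authorization check
        return bullet.speed           # not authorized: flowchart ends
    while not bullet.stationary:      # step 912: stop once bullet halts
        factor = get_speed_command()  # step 908: poll for a user command
        if factor is not None:
            bullet.speed *= factor    # step 910: e.g., slow the bullet
            return bullet.speed
    return bullet.speed
```

Note that the loop between steps 908 and 912 continues polling for a speed control command for as long as the virtual bullet remains in motion.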
[0094] In some example embodiments, one or more steps 902, 904,
906, 908, 910, and/or 912 of flowchart 900 may not be performed.
Moreover, steps in addition to or in lieu of steps 902, 904, 906,
908, 910, and/or 912 may be performed.
[0095] It will be recognized that mobile communication device 1000
may not include one or more of incoming attack module 1002, sensory
signal module 1004, authorization determination module 1006,
command determination module 1008, speed control module 1010,
and/or stationary determination module 1012. Furthermore, mobile
communication device 1000 may include modules in addition to or in
lieu of incoming attack module 1002, sensory signal module 1004,
authorization determination module 1006, command determination
module 1008, speed control module 1010, and/or stationary
determination module 1012.
[0096] FIG. 11 depicts a flowchart 1100 of a method for performing
multi-player augmented reality combat with respect to a user of a
mobile communication device in accordance with an embodiment
described herein. For illustrative purposes, flowchart 1100 is
described with respect to a mobile communication device 1200 shown
in FIG. 12, which is an example of a mobile communication device
102, according to an embodiment.
[0097] As shown in FIG. 12, mobile communication device 1200
includes an injury determination module 1202, a distance
determination module 1204, and a recovery control module 1206.
Further structural and operational embodiments will be apparent to
persons skilled in the relevant art(s) based on the discussion
regarding flowchart 1100. Flowchart 1100 is described as
follows.
[0098] As shown in FIG. 11, the method of flowchart 1100 begins at
step 1102. In step 1102, a determination is made that a user has
incurred a virtual injury. In an example implementation, injury
determination module 1202 determines that the user has incurred the
virtual injury.
[0099] At step 1104, a distance between a location of a mobile
communication device of the user and a designated location is
determined. In an example implementation, distance determination
module 1204 determines the distance between the location of the
mobile communication device of the user and the designated
location.
[0100] At step 1106, a rate at which the user recovers from the
virtual injury is controlled based on the determined distance. For
example, if the designated location is a location of a physical or
virtual hospital, the rate at which the user recovers from the
virtual injury may be inversely proportional to the distance
between the location of the mobile communication device and the
hospital. For instance, the user may recover more quickly if the
user is closer to the hospital. The user may recover more slowly if
the user is farther from the hospital. In another example, if the
designated location is a location of a toxic dump site, the rate at
which the user recovers from the virtual injury may be directly
proportional to the distance between the location of the mobile
communication device and the toxic dump site. For instance, the
user may recover more quickly if the user is farther from the toxic
dump site. The user may recover more slowly if the user is closer
to the toxic dump site. In an example implementation, recovery
control module 1206 controls the rate at which the user recovers
from the virtual injury.
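The two proportionality relationships in step 1106 can be sketched as follows. The base rate, the minimum-distance clamp, and the linear models are assumptions for illustration; the application does not fix a particular formula.

```python
def recovery_rate(distance_m, base_rate=1.0, beneficial=True, min_distance=1.0):
    """Rate of recovery from a virtual injury as a function of distance.

    For a beneficial designated location (e.g., a hospital) the rate is
    inversely proportional to the distance; for a harmful one (e.g., a
    toxic dump site) it is directly proportional. Constants are
    illustrative.
    """
    d = max(distance_m, min_distance)  # clamp to avoid division by zero
    return base_rate / d if beneficial else base_rate * d
```

Under this sketch a user ten meters from a hospital recovers at one tenth of the base rate, while a user ten meters from a toxic dump site recovers ten times faster than one standing at the site.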
[0101] When a player is hit with a virtual bullet, that player's
view of the augmented reality environment may be changed to
simulate the hit. Each hit may be graded according to physical
and/or virtual factors, including but not limited to the type of
virtual bullet, the type of weapon that fired the virtual bullet,
the distance traveled by the virtual bullet before it hit the
player, collected tools that each player is using, etc. A higher
hit grade corresponds to a higher extent of injury, and a lower hit
grade corresponds to a lower extent of injury. For instance, each
player may use a virtual shield that offers protection from some
weapons and/or bullets. Each shield may be more (or less) effective
against weapons and/or bullets in designated virtual environmental
conditions.
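One possible grading function combining the factors of paragraph [0101] is sketched below. The linear distance falloff and the multiplicative shield attenuation are illustrative assumptions; the application only requires that the grade reflect factors such as bullet type, weapon range, distance traveled, and collected tools.

```python
def hit_grade(bullet_damage, distance_m, weapon_range_m, shield_strength):
    """Grade a hit from physical and/or virtual factors.

    Damage falls off linearly with distance out to the weapon's range
    and is attenuated by the target's virtual shield (0.0 = no shield,
    1.0 = full protection). Both models are illustrative.
    """
    if distance_m >= weapon_range_m:
        return 0.0                     # out of range: no injury
    falloff = 1.0 - distance_m / weapon_range_m
    return bullet_damage * falloff * (1.0 - shield_strength)
```

A shield that is more (or less) effective in designated virtual environmental conditions could be modeled by varying `shield_strength` with the active condition.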
[0102] A hit grade may have a designated effect on a virtual
condition of a player. For instance, the player's view of the
augmented reality environment may be changed such that aiming a
virtual weapon is more difficult for the player. Other ways in
which the player's view may be changed include but are not limited
to showing fog, showing blood on the display, covering the player's
view of the augmented reality environment (or a portion thereof),
causing the player's view of the augmented reality environment to
be unstable, out of focus, zoomed out, zoomed in, etc.
[0103] In accordance with an example embodiment, a player may
recover from a virtual injury as time passes or by collecting
and/or using virtual curing objects. As the player recovers, the
player's hit grade decreases. Each curing object, each virtual
shield, and each unit of time (e.g., second, minute, etc.) may have
a respective designated curing effect. In accordance with another
example embodiment, some players may have virtual healing tools
that they may use to heal other players. For example, a player who
possesses a healing tool may stand near a virtually wounded player
in the physical real-world environment to assist the recovery of
the wounded player in the augmented reality environment. In another
example, a player who possesses a virtual healing weapon may fire a
virtual healing bullet at a wounded player to assist the recovery
of the wounded player. When a player's hit grade reaches an upper
threshold, the player may be considered as virtually dead. Any
suitable technique may be used to revive the player from the
deceased state.
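The injury-and-recovery bookkeeping of paragraph [0103] can be sketched as a small state holder. The threshold value and clamping behavior are illustrative assumptions.

```python
class PlayerHealth:
    """Track a player's cumulative hit grade.

    Hits raise the grade; curing objects, virtual shields, and elapsed
    time lower it by their respective designated curing effects. A
    player whose grade reaches DEATH_THRESHOLD is virtually dead. The
    threshold constant is illustrative.
    """
    DEATH_THRESHOLD = 100.0

    def __init__(self):
        self.grade = 0.0

    def take_hit(self, grade):
        """Apply a graded hit, capped at the death threshold."""
        self.grade = min(self.DEATH_THRESHOLD, self.grade + grade)

    def cure(self, curing_effect):
        """Apply a curing effect (object, shield, or unit of time)."""
        self.grade = max(0.0, self.grade - curing_effect)

    @property
    def dead(self):
        return self.grade >= self.DEATH_THRESHOLD
```

A virtual healing bullet fired by another player could simply invoke `cure` on the wounded player's state.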
[0104] In accordance with another example embodiment, each type of
virtual weapon and/or virtual bullet may have respective
characteristics. One type of characteristic is a hit effect. A hit
effect is an effect that results when a virtual bullet that is
fired by a virtual weapon hits an object. For example, a virtual
shotgun may have a wider diameter of hit, but a lower range, as
compared to a virtual sniper rifle. A virtual cannon may have a
substantially wide area of hit. For instance, using a virtual
cannon, a group of players may be targeted based solely on player
location indicator(s) associated with the players, without the need
for identifying the players using an image recognition technique.
Targeting players in this manner may be useful if the players are
hiding in such a way that a camera cannot be used to capture an
image of the players. In another example, a virtual sniper rifle
may be used to fire a relatively fast and accurate virtual bullet
that is less affected by virtual environmental conditions and/or
user-initiated speed control commands.
[0105] Any suitable type of virtual weapons, virtual bullets,
and/or virtual tools having user-defined characteristics may be
created in the augmented reality environment. Such creations may be
acquired by players and used in the augmented reality environment
to affect targeting, shooting, and hitting simulation in the
augmented reality environment.
[0106] Virtual items may be located throughout the augmented
reality environment. The virtual items may represent points,
virtual weapons, virtual tools, and/or any other suitable virtual
items. Players may collect a virtual item in any of a variety of
ways. For example, a player may collect a virtual item by moving in
the physical real-world environment such that the player moves
closer to a virtual location of the virtual item in the augmented
reality environment. Each virtual item has a location in the real
world, and when a device location system (e.g., device location
system 106) determines that the user is close enough to the virtual
item's location, the user collects the virtual item. In another
example, the player may fire a virtual bullet that hits the virtual
item. In yet another example, the player may collect a virtual item
by trading points for the virtual item. In still another example,
the player may collect a virtual item by purchasing the virtual
item using virtual money, real money, or a combination thereof.
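Collection by proximity, as described above, can be sketched as a distance check against positions reported by a device location system. The pickup radius and the two-dimensional coordinate model are illustrative assumptions.

```python
import math

def collect_nearby_items(player_pos, items, pickup_radius_m=5.0):
    """Collect virtual items whose real-world locations are close enough.

    `items` maps an item name to an (x, y) position in meters, as a
    device location system might report. Items within the pickup radius
    of `player_pos` are collected and removed from the world. Names and
    the radius are illustrative.
    """
    collected = []
    for name, (x, y) in list(items.items()):
        if math.hypot(x - player_pos[0], y - player_pos[1]) <= pickup_radius_m:
            collected.append(name)
            del items[name]  # the item leaves the world once collected
    return collected
```

Collection by firing a virtual bullet at the item, trading points, or purchasing could be layered on the same item registry.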
[0107] Players may acquire points by completing tasks; firing
virtual bullets that hit other players; collecting virtual points;
purchasing the points using virtual money, real money, or a
combination thereof; etc. The points may be traded for virtual
items, virtual money, real money, or a combination thereof.
[0108] Players may view a map showing some or all of the locations
of the other players in the augmented reality environment. For
instance, players may see the locations of their team members.
Players may leave virtual markers, notes, voice messages,
recordings, and/or virtual items on the map for other players to
pick up. In another example, players can leave traps for other
players, such as virtual mines, virtual grenades, etc., that may be
triggered when those players come within a designated proximity of
the traps. Players may communicate using audio and/or video
conferencing. A variety of other features that are known in the
relevant art(s) (e.g., the computer gaming art) may be incorporated
into the multi-player augmented reality combat techniques described
herein.
[0109] In accordance with another example embodiment, the direction
of a virtual bullet may be controlled by a user who fires the
virtual bullet. For example, after firing the bullet, the user may
move the camera of the user's mobile communication device to cause
the virtual bullet to shift direction toward the new camera
direction. The ability to control the bullet may be a property of
the bullet, the virtual weapon that is used to fire the virtual
bullet, and/or a power that is associated with a virtual item that
is collected by the user. Controlling the virtual bullet direction
is similar to controlling a guided missile that can be used to
target players as the players move (e.g., to get away from the
virtual bullet) or to target players who are hiding behind
shelters.
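The guided-bullet behavior described above can be sketched as a per-update steering blend. The application does not specify a steering model; the linear blend, the turn rate, and the two-dimensional vectors here are all illustrative assumptions.

```python
import math

def steer_bullet(velocity, camera_dir, turn_rate=0.2):
    """Steer an in-flight guided virtual bullet toward the camera direction.

    Blends the bullet's current unit direction toward the camera's unit
    direction by `turn_rate` per update, preserving the bullet's speed.
    A simple linear-blend model chosen for illustration.
    """
    speed = math.hypot(*velocity)
    dx = (1 - turn_rate) * velocity[0] / speed + turn_rate * camera_dir[0]
    dy = (1 - turn_rate) * velocity[1] / speed + turn_rate * camera_dir[1]
    norm = math.hypot(dx, dy)
    return (speed * dx / norm, speed * dy / norm)
```

Calling this each frame with the latest camera direction would make the bullet curve toward wherever the firing user points the device.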
[0110] In accordance with another example embodiment, a user may
initiate virtual mobile controlled objects that may be controlled
by the user. Examples of virtual mobile controlled objects include
but are not limited to virtual aircraft (e.g., helicopters,
airplanes, gliders, steerable balloons, etc.), virtual vessels
(e.g., ships, submarines, etc.), virtual land vehicles, etc. Each
virtual mobile controlled object may have a corresponding speed, a
duration of availability, and a pre-defined source (i.e., virtual
base). The user may control the navigation of the virtual mobile
controlled objects. A virtual mobile controlled object may view
location indicators of other players at a designated range
according to the location of each user's mobile communication
device and the type of the virtual mobile controlled object.
[0111] For example, a virtual airplane may carry virtual bombs that
the user can release on top of other players. The other players'
mobile communication devices may have object indicators that are
capable of indicating that a virtual mobile controlled object is
nearby. For instance, the object indicators may provide a sound,
cause the virtual mobile controlled object to be rendered on a
screen of a mobile communication device when a camera of the mobile
communication device is pointed at the virtual mobile controlled
object, etc. The players may fire virtual bullets at the virtual
mobile controlled object in order to hit it. Other types of virtual
mobile controlled objects may be virtual soldiers or any other type
of virtual mobile controlled object that a user may control. If a
virtual mobile controlled object does not return to its
virtual base within a specified time, the virtual mobile
controlled object may be disabled. It is possible for a user to
collect virtual mobile controlled objects like any other virtual
item described herein.
[0112] In accordance with yet another example embodiment, players
may trigger virtual weapons, such as artillery, remotely. Each such
virtual weapon has a specified range from its source position. A
user may define the source position of a remotely triggered virtual
weapon with respect to the user's location. When the user fires the
remotely triggered virtual weapon, the mobile communication devices
of the players who are in the area of the expected hit may generate
a sensory signal, such as a sound, vibration of the players'
devices, etc., as described in previous examples.
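Determining which players' devices should emit the sensory signal reduces to a range check around the weapon's source position. The circular blast area and two-dimensional coordinates are illustrative assumptions.

```python
import math

def players_in_blast(source, blast_radius_m, player_positions):
    """Find players whose devices should signal an expected remote hit.

    `source` is the source position of the remotely triggered virtual
    weapon, and `player_positions` maps a player identifier to an
    (x, y) position in meters. All names are illustrative.
    """
    return [pid for pid, (x, y) in player_positions.items()
            if math.hypot(x - source[0], y - source[1]) <= blast_radius_m]
```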
[0113] In still another example embodiment, an augmented reality
combat system may include a station controlled by a communication
device that is connected to the network. The station
may act as a command center that displays the positions of the
players on a map, statuses of the players, etc. The command center
may use a relatively large screen and/or a relatively more powerful
computer system that can consume and process substantial data. A
user who uses the command center can assist the other players to
perform and act as a team. The command center may collect
information about the positions of opponents whose locations are
being provided by their mobile communication devices or by virtual
mobile controlled objects. The command center may display prior
locations of the players, so that the user who uses the command
center may have a better understanding of the movement patterns of
the opponent players.
[0114] In another example embodiment, players may be capable of
using camera zoom capabilities to facilitate identification of
player indicators. The ability to use the camera zoom capabilities
may be based on possession of specified attributes, use of
specified weapons, or any other suitable criteria. For example,
although a virtual cannon may not use zoom, a virtual sniper rifle
may.
[0115] Each mobile communication device may be mounted on or
incorporated in a physical device that is shaped like a weapon.
Displays of the mobile communication devices may be displayed using
glasses. For instance, cameras of the mobile communication devices
may be attached to the glasses, so that the players may point by
looking in a direction. Any of the devices and/or components
thereof may be carried on a player's person or in accessories that
are available to the player.
III. Example Computer Implementation
[0116] The embodiments described herein, including systems,
methods/processes, and/or apparatuses, may be implemented using
well known computers, such as computer 1300 shown in FIG. 13. For
example, elements of example augmented reality combat systems 100
and 200, including server(s) 108 depicted in FIG. 1, device
location system 106 depicted in FIGS. 1 and 2, any of the mobile
communication devices 102A-102N depicted in FIGS. 1 and 2, any of
mobile communication devices 400, 500, 600, 700, 800, 1000, and
1200 depicted in respective FIGS. 4, 5, 6, 7, 8, 10, and 12 and
elements thereof, and each of the steps of flowcharts 300, 900, and
1100 depicted in respective FIGS. 3, 9, and 11 can each be
implemented using one or more computers 1300.
[0117] Computer 1300 can be any commercially available and well
known computer capable of performing the functions described
herein, such as computers available from Apple, Dell, Gateway, HP,
International Business Machines, Sony, etc. Computer 1300 may be
any type of computer, including a desktop computer, a server,
etc.
[0118] As shown in FIG. 13, computer 1300 includes one or more
processors (e.g., central processing units (CPUs)), such as
processor 1306. Processor 1306 may include camera 402, image
recognition module 404, command receipt module 406, outgoing attack
module 408, location module 410, location indicator receipt module
412, distance determination module 414, time determination module
416, environment module 418, environment control module 420,
identification module 422, display module 424, and/or orientation
determination module 426 of FIG. 4; incoming attack module 1002,
sensory signal module 1004, authorization determination module
1006, command determination module 1008, speed control module 1010,
and/or stationary determination module 1012 of FIG. 10; injury
determination module 1202, distance determination module 1204,
and/or recovery control module 1206 of FIG. 12; or any portion or
combination thereof, for example, though the scope of the
embodiments is not limited in this respect. Processor 1306 is
connected to a communication infrastructure 1302, such as a
communication bus. In some embodiments, processor 1306 can
simultaneously operate multiple computing threads.
[0119] Computer 1300 also includes a primary or main memory 1308,
such as a random access memory (RAM). Main memory 1308 has stored
therein control logic 1324A (computer software), and data.
[0120] Computer 1300 also includes one or more secondary storage
devices 1310. Secondary storage devices 1310 include, for example,
a hard disk drive 1312 and/or a removable storage device or drive
1314, as well as other types of storage devices, such as memory
cards and memory sticks. For instance, computer 1300 may include an
industry standard interface, such as a universal serial bus (USB)
interface for interfacing with devices such as a memory stick.
Removable storage drive 1314 represents a floppy disk drive, a
magnetic tape drive, a compact disk drive, an optical storage
device, tape backup, etc.
[0121] Removable storage drive 1314 interacts with a removable
storage unit 1316. Removable storage unit 1316 includes a computer
useable or readable storage medium 1318 having stored therein
computer software 1324B (control logic) and/or data. Removable
storage unit 1316 represents a floppy disk, magnetic tape, compact
disc (CD), digital versatile disc (DVD), Blu-ray disc, optical
storage disk, memory stick, memory card, or any other computer data
storage device. Removable storage drive 1314 reads from and/or
writes to removable storage unit 1316 in a well known manner.
[0122] Computer 1300 also includes input/output/display devices
1304, such as monitors, keyboards, pointing devices, etc.
[0123] Computer 1300 further includes a communication or network
interface 1320. Communication interface 1320 enables computer 1300
to communicate with remote devices. For example, communication
interface 1320 allows computer 1300 to communicate over
communication networks or mediums 1322 (representing a form of a
computer useable or readable medium), such as local area networks
(LANs), wide area networks (WANs), the Internet, etc. Network
interface 1320 may interface with remote sites or networks via
wired or wireless connections. Examples of communication interface
1320 include but are not limited to a modem, a network interface
card (e.g., an Ethernet card), a communication port, a Personal
Computer Memory Card International Association (PCMCIA) card,
etc.
[0124] Control logic 1324C may be transmitted to and from computer
1300 via the communication medium 1322.
[0125] Any apparatus or manufacture comprising a computer useable
or readable medium having control logic (software) stored therein
is referred to herein as a computer program product or program
storage device. This includes, but is not limited to, computer
1300, main memory 1308, secondary storage devices 1310, and
removable storage unit 1316. Such computer program products, having
control logic stored therein that, when executed by one or more
data processing devices, cause such data processing devices to
operate as described herein, represent embodiments of the
invention.
[0126] For example, each of the elements of example mobile
communication device 400 depicted in FIG. 4, including camera 402,
image recognition module 404, command receipt module 406, outgoing
attack module 408, location module 410, location indicator receipt
module 412, distance determination module 414, time determination
module 416, environment module 418, environment control module 420,
identification module 422, display module 424, and orientation
determination module 426; each of the elements of example mobile
communication device 1000 depicted in FIG. 10, including incoming
attack module 1002, sensory signal module 1004, authorization
determination module 1006, command determination module 1008, speed
control module 1010, and stationary determination module 1012; each
of the elements of example mobile communication device 1200
depicted in FIG. 12, including injury determination module 1202,
distance determination module 1204, and recovery control module
1206; and each of the steps of flowcharts 300, 900, and 1100
depicted in respective FIGS. 3, 9, and 11 can be implemented as
control logic that may be stored on a computer useable medium or
computer readable medium, which can be executed by one or more
processors to operate as described herein.
[0127] The invention can be put into practice using software,
hardware, and/or operating system implementations other than those
described herein. Any software, hardware, and operating system
implementations suitable for performing the functions described
herein can be used.
IV. Conclusion
[0128] While various embodiments have been described above, it
should be understood that they have been presented by way of
example only, and not limitation. It will be apparent to persons
skilled in the relevant art(s) that various changes in form and
details can be made therein without departing from the spirit and
scope of the invention. Thus, the breadth and scope of the present
invention should not be limited by any of the above-described
exemplary embodiments, but should be defined only in accordance
with the following claims and their equivalents.
* * * * *