Augmented Reality Mission Generators

Lee; Brian Elan; et al.

Patent Application Summary

U.S. patent application number 13/414491 was filed with the patent office on 2012-09-13 for augmented reality mission generators. This patent application is currently assigned to Fourth Wall Studios, Inc. The invention is credited to Brian Elan Lee, Michael Sean Stewart, and James Stewartson.

Application Number: 20120231887 / 13/414491
Family ID: 45976510
Filed Date: 2012-09-13

United States Patent Application 20120231887
Kind Code A1
Lee; Brian Elan; et al. September 13, 2012

Augmented Reality Mission Generators

Abstract

Augmented reality (AR) mission generators are described that generate missions based on environmental data separate from a user's location. The environmental data can be obtained from a user's mobile device or using other sensors or third-party information. The missions can be generated from an AR mission template stored in a mission database, and presented to the user on the user's mobile device.


Inventors: Lee; Brian Elan (Venice, CA); Stewart; Michael Sean (Davis, CA); Stewartson; James (Manhattan Beach, CA)
Assignee: Fourth Wall Studios, Inc. (Culver City, CA)

Family ID: 45976510
Appl. No.: 13/414491
Filed: March 7, 2012

Related U.S. Patent Documents

Application Number: 61/450,052    Filing Date: Mar 7, 2011

Current U.S. Class: 463/39
Current CPC Class: A63F 2300/209 20130101; A63F 13/217 20140902; A63F 2300/69 20130101; A63F 13/79 20140902; A63F 13/216 20140902; A63F 2300/807 20130101; A63F 13/92 20140902; A63F 13/65 20140902; A63F 2300/406 20130101; A63F 13/822 20140902; A63F 2300/8082 20130101; A63F 13/213 20140902; A63F 13/332 20140902
Class at Publication: 463/39
International Class: A63F 9/24 20060101 A63F009/24

Claims



1. An augmented reality mission generator comprising: a mission database storing augmented reality (AR) mission templates; an AR mission generator engine coupled with the mission database and with a mobile device capable of transmitting a location of the mobile device and ambient environmental data separate from the mobile device's location, the AR mission generator engine configured to: obtain the environmental data from the mobile device, generate a mission based on at least one mission template and the environmental data, and configure the mobile device to present the mission.

2. The generator of claim 1, wherein the AR mission templates comprise a mission defined based on a grammar.

3. The generator of claim 2, wherein the grammar comprises verbs relating to AR objects.

4. The generator of claim 1, wherein the AR mission templates comprise AR mission template objects.

5. The generator of claim 4, wherein the AR mission generator engine is further configured to generate an AR mission object by populating an AR mission template object based on the environmental data.

6. The generator of claim 5, wherein the AR mission object comprises a mission objective.

7. The generator of claim 5, wherein the AR mission object comprises a reward object.

8. The generator of claim 7, wherein the reward object comprises at least one of the following: an award point, a currency, a virtual object, a real-world object, a relationship, and a promotion.

9. The generator of claim 1, wherein the AR mission generator engine is further configured to select a mission template based on the environmental data.

10. The generator of claim 1, wherein the AR mission templates comprise a dynamic mission template.

11. The generator of claim 1, wherein the AR mission templates comprise a chain mission template.

12. The generator of claim 1, wherein the mobile device comprises at least one of the following: a vehicle, a phone, a sensor, a gaming platform, a portable computer, and a media player.

13. The generator of claim 1, wherein the AR mission templates comprise a multi-player mission template.

14. The generator of claim 13, wherein the multi-player mission template comprises cooperative objectives.

15. The generator of claim 13, wherein the multi-player mission template comprises counter objectives.

16. The generator of claim 1, wherein the AR mission templates comprise an exercise program.

17. The generator of claim 1, further comprising an analysis engine configured to establish correlations between player demographics and mission objectives.

18. The generator of claim 1, wherein the environmental data comprises a digital representation of a scene.

19. The generator of claim 18, wherein the digital representation comprises data from multiple sensors.

20. The generator of claim 18, wherein the digital representation comprises at least one of the following types of data: image data, audio data, haptic data, weather data, location data, movement data, biometric data, and orientation data.
Description



[0001] This application claims the benefit of priority to U.S. provisional application having Ser. No. 61/450,052 filed on Mar. 7, 2011. This and all other extrinsic materials discussed herein are incorporated by reference in their entirety. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.

FIELD OF THE INVENTION

[0002] The field of the invention is mixed reality technologies.

BACKGROUND

[0003] With the popularity of on-line virtual world games like World of Warcraft.RTM. and advances in mobile device processing capabilities, it is quite a wonder that no viable, marketable union of the two has yet been achieved. One likely reason is that the static nature of virtual worlds offers very limited pliability in the real world. Another reason might be the failure to integrate real-world aspects into a game so that the game has broader appeal. Ideally, an augmented reality environment would combine the goals of a game with the real world.

[0004] To some degree, U.S. pat. publ. no. 2004/0041788 to Ternullo (publ. March 2004) provides some techniques suitable for use in an augmented reality system. However, Ternullo simply allows for a virtual walk-through of a home.

[0005] Others have put forth some effort in combining virtual and real-world gaming systems. For example, U.S. pat. no. 6951515 to Oshima et al. and U.S. pat. no. 6972734, also to Oshima et al., both describe integrating virtual objects with the real world. Unfortunately, the Oshima approaches require bulky support equipment and fail to appreciate that the world itself could be a platform for a mixed reality environment.

[0006] More recently, U.S. pat. no. 7564469 to Cohen and U.S. pat. publ. no. 2007/0104348, also to Cohen (publ. May 2007), both provide additional details regarding interacting with virtual objects in the real world. Still, these citations merely focus on interactions between virtual objects and the real world as opposed to game play.

[0007] U.S. pat. publ. no. 2006/0223635 to Rosenberg (publ. October 2006) takes simulated gaming a step further by combining simulated gaming objects and events with the real world, where simulated objects can be presented on a display. However, even Rosenberg fails to appreciate the dynamic nature of the real world and that each game player can have their own game play experience.

[0008] U.S. pat. publ. no. 2007/0281765 to Mullen (publ. December 2007) discusses systems and methods for location based games. Although Mullen contemplates using the physical location of the user to correspond to a virtual location of a virtual character, Mullen fails to contemplate the use of ambient environmental information apart from location information when generating the game. U.S. pat. publ. no. 2011/0081973 to Hall (publ. April 2011) discusses a different location based game, but also fails to contemplate the use of ambient environmental information apart from location information when generating the game.

[0009] U.S. pat. publ. no. 2011/0319148 to Kinnebrew, et al. (publ. December 2011) advances location-based gaming by combining real world and virtual elements to influence game play. However, Kinnebrew also fails to contemplate the use of ambient environmental information apart from location information when generating the game, which limits the influence of a player's real-world environment on the game play.

[0010] Unless the context dictates the contrary, all ranges set forth herein should be interpreted as being inclusive of their endpoints, and open-ended ranges should be interpreted to include commercially practical values. Similarly, all lists of values should be considered as inclusive of intermediate values unless the context indicates the contrary.

[0011] It has yet to be appreciated that an augmented reality platform can be constructed to generate augmented reality missions for users. Rather than being bound to a static game play, a mission can be generated, possibly from a template, based on a user's environment or data collected about the user's environment. Mission objects can have their attributes populated based on the environment data. For example, all red cars local to the user can become mission objects. As the missions are based on a user's environment, two users could experience quite different missions even though the missions are generated from the same template.

[0012] Thus, there is still a need for augmented reality mission generators that utilize ambient environmental data to generate a mission.

SUMMARY OF THE INVENTION

[0013] The inventive subject matter provides apparatus, systems and methods in which one can provide augmented or mixed reality experiences to users. One of the many aspects of the inventive subject matter includes an augmented reality (AR) gaming system capable of generating one or more AR missions. An AR mission can be presented to a user via a mobile device (e.g., portable computer, media player, cell phone, vehicle, game system, sensor, etc.) where the user can interact with the mission via the mobile device, or other interactive devices.

[0014] AR missions can be generated via an AR mission generator that includes a mission database storing one or more AR mission templates and an AR mission engine coupled with the database. The AR mission engine can obtain environmental data apart from location information (e.g., GPS coordinates) from one or more remote sensing devices, including the user's mobile device, where the environmental data comprises a digital representation of a scene. The AR mission engine can combine information derived from the digital representation of the scene with an AR mission template to construct a quest (i.e., an instantiated mission) for the user. For example, the AR mission engine can select a mission template from the database based on the environmental data and the location of the user's mobile device, and then populate the mission template with AR objects (e.g., objectives, rewards, goals, etc.) to flesh out the mission. One should note that the attributes of the AR objects can also be populated based on the environmental data. When the user is presented with the mission, it is contemplated that one or more of the AR objects can be superimposed on a real-world view of a scene. Using the inventive subject matter discussed herein, users can conceivably convert the entire planet into a usable game space.
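
By way of a non-limiting sketch, the select-then-populate flow described above might be organized as follows. All names, fields, and the simple matching policy are hypothetical illustrations, not a description of any actual implementation.

```python
# Hypothetical sketch of the AR mission generator flow: select a template
# whose requirements match the scene, then populate it with AR objects.
from dataclasses import dataclass, field

@dataclass
class ARMissionTemplate:
    name: str
    requirements: dict      # scene traits the template needs, e.g. {"venue": "mall"}
    objective: str          # objective text with placeholders

@dataclass
class Mission:
    name: str
    objective: str
    ar_objects: list = field(default_factory=list)

def generate_mission(templates, scene):
    """Select a matching template, then flesh it out from the scene data."""
    for template in templates:
        if all(scene.get(k) == v for k, v in template.requirements.items()):
            ar_objects = list(scene.get("recognized_objects", []))
            return Mission(template.name,
                           template.objective.format(count=len(ar_objects)),
                           ar_objects)
    return None

# Scene characteristics as might be derived from ambient environmental data.
scene = {"venue": "mall", "recognized_objects": ["mannequin"] * 5}
templates = [ARMissionTemplate("collection", {"venue": "mall"},
                               "Photograph {count} mannequins")]
print(generate_mission(templates, scene))
```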

[0015] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWING

[0016] FIG. 1 is a schematic of an augmented reality system having an augmented reality mission generator.

DETAILED DESCRIPTION

[0017] It should be noted that while the following description is drawn to a computer/server based augmented reality generator, various alternative configurations are also deemed suitable and may employ various computing devices including servers, interfaces, systems, databases, engines, agents, controllers, or other types of computing devices operating individually or collectively. One should appreciate that the computing devices comprise a processor configured to execute software instructions stored on a tangible, non-transitory computer readable storage medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.). The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus. In especially preferred embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on SMS, MMS, HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods. Data exchanges preferably are conducted over the Internet, a LAN, WAN, VPN, PAN, or other type of packet-switched network.

[0018] One should appreciate that the disclosed techniques provide many advantageous technical effects including providing an augmented reality infrastructure capable of configuring one or more mobile devices to present a mixed reality interactive environment to users. One should also appreciate that the mixed reality environment, and accompanying missions, can be constructed from external data obtained from sensors that are external to the infrastructure. For example, a mission can be populated with information obtained from satellites, Google.RTM. StreetView.TM., third party mapping information, security cameras, kiosks, televisions or television stations, set top boxes, weather stations, radios or radio stations, web sites, cellular towers, or other data sources.

[0019] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously.

[0020] The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.

[0021] FIG. 1 presents an overview of one embodiment of an augmented or mixed reality environment 100 where a user can obtain one or more missions from an AR mission generator 110. In the embodiment shown, each user can utilize a mobile device 102 to obtain sensor data from one or more sensors 104 related to a scene 120 or the user's environment, where the sensor data is separate from the user's location information. Upon proper registration, authentication, or authorization, a user's mobile device 102 can exchange the collected environmental data or a digital representation of the scene 120 with the AR mission generator 110. Data exchanges preferably are conducted over a network 130, which could include, for example, cell networks, mesh networks, the Internet, LANs, WANs, VPNs, PANs, or other types of networks or combinations thereof. In some embodiments, the AR mission generator 110 can generate one or more missions for the user, at least in part based on the obtained environmental data. The mobile device 102 could also transmit location information such as GPS coordinates and/or cellular triangulation information to the AR mission generator 110.

[0022] The mobile device 102 is presented as a smart phone, which represents one of many different types of devices that can integrate into the overall AR environment 100. Mobile devices can include, for example, smart phones and other wireless telephones, laptops, netbooks, tablet PCs, and other mobile computers, vehicles, sensors (e.g., a camera), personal digital assistants, MP3 or other media players, watches, and gaming platforms. Other types of devices can include electronic picture frames, desktop computers, appliances (e.g., STBs, kitchen appliances, etc.), kiosks, non-mobile sensors, media players, game consoles, televisions, or other types of devices. Preferred devices have a communication link and offer a presentation system (e.g., display, speakers, vibrators, etc.) for presenting AR data to the user.

[0023] Environmental data or a digital representation of the scene 120 can include data from multiple sources or sensors. In the embodiment shown, a sensor 122 (e.g., a camera) collects data from a lamppost while the mobile device 102 also collects data via at least one sensor 104. Contemplated sensors can include, for example, microphones, magnetometers, accelerometers, biosensors, still and video cameras, weather sensors, optical sensors, or other types of sensors. Furthermore, the types of data used to form a digital representation of the scene can cover a wide range of modalities including image data, audio data, haptic data, or other modalities. Even further, additional data can include weather data, location data, orientation data, movement data, biometrics data, or other types of data.
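
As a purely illustrative sketch, a multi-modal digital representation of a scene could be structured along the following lines; the field names are assumptions for illustration rather than anything prescribed by the application.

```python
# Hypothetical container for multi-sensor scene data (cf. claim 20).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SceneRepresentation:
    image: Optional[bytes] = None         # still or video frame from a camera
    audio: Optional[bytes] = None         # microphone capture
    weather: Optional[dict] = None        # e.g., {"temp_c": 21, "condition": "rain"}
    orientation: Optional[tuple] = None   # magnetometer/accelerometer reading
    biometrics: Optional[dict] = None     # e.g., {"heart_rate": 72}
    extra_sensors: dict = field(default_factory=dict)  # e.g., lamppost sensor 122

scene = SceneRepresentation(weather={"condition": "rain"},
                            orientation=(0.0, 1.5, 92.0))
print(scene)
```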

[0024] The AR mission generator 110 can include one or more modules or components configured to support the roles or responsibilities of the AR mission generator 110. As shown in FIG. 1, the AR mission generator 110 can include an AR mission template database 112 and an AR mission engine 114. Although the AR mission template database 112 and AR mission engine 114 are shown as local to the AR mission generator 110, it is contemplated that one or both of the AR mission template database 112 and AR mission engine 114 can be separate from, and located locally or remotely with respect to, the AR mission generator 110. The AR mission template database 112 can store a plurality of AR mission template objects where each mission template object comprises attributes or metadata describing characteristics of a mission. In some embodiments, the mission template objects can be stored as an XML file or other serialized format. A mission template object can include a wide spectrum of information including, for example, the name/ID of the mission, a type of mission (e.g., dynamic, chain, etc.), goals, supporting objects, rewards, narratives, digital assets (e.g., video, audio, etc.), mission requirements (e.g., required weapons, achievements, user level, number of players, etc.), location requirements (e.g., indoors or outdoors), conditions, programmatic instructions, links to other missions, or other information that can be used to instantiate a mission.
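
As a non-limiting illustration of the XML serialization contemplated above, a mission template object might be built and serialized as follows; every element and attribute name here is hypothetical.

```python
# Sketch of serializing a hypothetical mission template object to XML.
import xml.etree.ElementTree as ET

template = ET.Element("missionTemplate", id="mall-collection-01", type="dynamic")
ET.SubElement(template, "name").text = "Mannequin Hunt"
goal = ET.SubElement(template, "goal", verb="collect", count="5")
goal.text = "mannequin"
ET.SubElement(template, "reward", kind="points").text = "100"
ET.SubElement(template, "requirements", players="1", location="indoors")
ET.SubElement(template, "link", nextMission="mall-collection-02")  # chain link

print(ET.tostring(template, encoding="unicode"))
```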

[0025] The AR mission generator 110 is illustrated as being remote relative to the scene 120 or mobile device 102. However, it is specifically contemplated that some or all of the features of the mission generator 110, AR mission engine 114 and/or AR mission template database 112, for example, can be integrated into the mobile device 102. In such embodiments, information can be exchanged through an application program interface (API) or other suitable interface. In other embodiments, the AR mission engine 114 or other components can comprise a distal computing server, a distributed computing platform, or even an AR computing platform.

[0026] The AR mission engine 114 is preferably configured to obtain environmental data from the user's mobile device 102 about the scene 120 proximate to the mobile device 102. Based on the environmental data, the AR mission engine 114 can determine the characteristics of the scene 120 and generate one or more missions (i.e., instantiated missions) from an AR mission template object from the mission template database 112. Scene characteristics can include user identification and capabilities of the mobile device 102 including, for example, available sensors 104, screen size, processor speed, available memory, and the presence of a camera or other imaging sensor. Scene characteristics can also include weather conditions, visual images, location information, orientation, captured audio, presence and type of real-world objects, or other types of characteristics. The AR mission engine 114 can compare the characteristics to the requirements, attributes, or conditions associated with the stored AR mission template objects to select a mission template. Once selected or otherwise obtained, the AR mission engine 114 can instantiate a mission for the user from the selected mission template object. It is contemplated that the AR mission generator 110 can configure the mobile device 102 to present the generated mission.
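
One possible selection policy, offered purely as a sketch, is to rank the stored templates by how many of their conditions the observed scene characteristics satisfy; the dictionary-based representation below is an assumption for illustration.

```python
# Hypothetical template selection: score each template by satisfied conditions.
def score(conditions, characteristics):
    return sum(1 for key, wanted in conditions.items()
               if characteristics.get(key) == wanted)

def select_template(templates, characteristics):
    best = max(templates, key=lambda t: score(t["conditions"], characteristics))
    return best if score(best["conditions"], characteristics) > 0 else None

scene = {"has_camera": True, "weather": "rain", "location_type": "outdoors"}
templates = [
    {"name": "night-patrol", "conditions": {"location_type": "outdoors"}},
    {"name": "photo-hunt",   "conditions": {"has_camera": True, "weather": "rain"}},
]
print(select_template(templates, scene)["name"])   # -> photo-hunt
```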

[0027] It is also contemplated that missions can be generated through numerous methods. In preferred embodiments, a mission template object includes a defined grammar having verbs that define user actions with respect to one or more AR objects associated with a mission. For example, an AR mission template object might have several verbs that define a mission with respect to the user's actions. Contemplated verbs include, for example, read, view, deliver, fire (e.g., a weapon, etc.), upgrade, collect, converse, travel, or other actions. By defining AR mission templates based at least in part on a grammar, mission template development can be greatly streamlined, and mission complexity can be significantly reduced for users.
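
A minimal sketch of such a verb-based grammar follows; the verb set and the "verb count object" step format are assumptions made for illustration only.

```python
# Hypothetical verb grammar for mission steps.
VERBS = {"read", "view", "deliver", "fire", "upgrade", "collect",
         "converse", "travel"}

def parse_mission_step(step):
    """Parse a step like 'collect 5 mannequin' into (verb, count, ar_object)."""
    verb, count, ar_object = step.split(maxsplit=2)
    if verb not in VERBS:
        raise ValueError(f"unknown verb: {verb}")
    return verb, int(count), ar_object

for step in ["travel 1 food-court", "collect 5 mannequin"]:
    print(parse_mission_step(step))
```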

[0028] The AR objects associated with a mission template can also be stored as a template, or rather as AR object templates. When a mission is generated, the selected AR mission template object can be populated based on the environmental data. As an example, a user could be in a shopping mall and log in to the AR mission generator 110 via their mobile phone to obtain a mission. The AR mission engine 114 recognizes from the user's location (e.g., based on GPS coordinates) that the user is in a mall, and selects a mission that requires the user to collect objects. With the knowledge that the user is in a mall, the AR mission engine 114 instantiates AR objects as mannequins, and the mission requires that the user travel around the mall photographing mannequins (e.g., collecting the AR objects) to complete the mission. One should note that the mobile device 102 could be configured to identify the mannequins, or other objects of interest, by their associated features, such as by using image recognition software.

[0029] Populating attributes or features of a mission or associated AR objects can also be achieved through object recognition. As a user collects data associated with the scene 120, such as through still images or video from a camera on the mobile device 102 for example, the AR mission engine 114, perhaps in the mobile device 102, can recognize real-world objects in the scene 120 and use the objects' attributes to populate attributes of the one or more AR objects 124. The attributes can be simply observed or looked up from a database based on object recognition algorithms (e.g., SIFT, vSLAM, Viper, etc.).

[0030] Thus, for example, a user may capture a picture of a scene having a plurality of trees. The trees can be recognized by the AR mission engine, and AR objects can be generated based upon the trees' attributes (e.g., size, leaf color, distance from the mobile device, etc.).
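
A brief sketch of this attribute-population step is shown below with the recognition pipeline stubbed out; the object and attribute names are hypothetical.

```python
# Hypothetical population of AR object attributes from recognized objects.
from dataclasses import dataclass

@dataclass
class RecognizedObject:      # what a recognizer (e.g., SIFT, vSLAM) might emit
    kind: str
    attributes: dict

@dataclass
class ARObject:
    kind: str
    size: float
    color: str

def recognize(image_bytes):
    """Stub standing in for a real object-recognition pipeline."""
    return [RecognizedObject("tree", {"size": 4.2, "leaf_color": "green"}),
            RecognizedObject("tree", {"size": 6.0, "leaf_color": "red"})]

def populate_ar_objects(image_bytes):
    return [ARObject("spirit-" + obj.kind,
                     obj.attributes["size"],
                     obj.attributes["leaf_color"])
            for obj in recognize(image_bytes)]

print(populate_ar_objects(b""))
```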

[0031] The AR objects associated with a mission can range across a full spectrum of objects from completely real-world objects through completely virtual objects. Exemplary AR objects can include, for example, a mission objective, a reward, an award point, a currency, a relationship, a virtual object, a real-world object, a promotion, a coupon, or other types of objects. The AR objects can be integrated into the real world via the mobile device 102. For example, as the user pans and tilts their mobile device 102, the AR objects associated with the mission could be superimposed (overlaid) on the captured scene 120 while also maintaining their proper location and orientation with respect to real-world objects within the scene 120. Superimposing images of AR objects on a real-world image can be accomplished through many techniques; suitable techniques that could be adapted for use with the inventive subject matter include those found in U.S. pat. no. 6771294 to Pulli et al. One should appreciate that superimposing AR objects on a digital representation of a real-world scene is considered to include other modalities beyond visual data, including audio, haptic, kinesthetic, temperature, or other types of modal data.
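
As a deliberately simplified sketch of keeping an overlaid AR object anchored while the device pans, one can map the object's compass bearing into a horizontal pixel position given the device heading and the camera's field of view; real systems would rely on full pose estimation rather than this one-axis approximation.

```python
# Hypothetical one-axis overlay: compass bearing -> horizontal pixel position.
def overlay_x(object_bearing_deg, device_heading_deg, fov_deg=60, width_px=1080):
    # Normalize the angular offset to [-180, 180) degrees.
    offset = (object_bearing_deg - device_heading_deg + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None                      # object is outside the current view
    return int(width_px * (0.5 + offset / fov_deg))

print(overlay_x(100, 90))   # object ~10 degrees right of center -> x = 720
```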

[0032] Using the inventive subject matter discussed herein, many different types of missions are possible, especially given that a mission can be customized for a specific user based on the user's specific environment. Still, the missions can be efficiently based on just a few types of mission templates. One especially interesting type of mission is a dynamic mission that can be fully customizable for the user. A dynamic mission can be a single, one-off mission constructed in real-time, if desired, based on the obtained environmental data. While completion of a dynamic mission may not advance a story, users may obtain rewards for completing the mission including, for example, points, levels, currency, weapons, and experience. Examples of dynamic missions include shooting ten boars, collecting five coins, going on a night patrol, finding a treasure, and so forth.

[0033] Another interesting type of mission is a chain mission that can be linked with preceding or succeeding missions to form a story arc. Chain missions can be constructed with more forethought to create a greater level of immersion for the user.

[0034] Up to this point, missions have been presented as a single player platform. However, one should appreciate that missions can also comprise multi-player missions requiring two or more users. When multiple users are involved, new types of interactions can occur. Some multi-player missions might require cooperative objectives, while other multi-player missions might comprise counter objectives for the players where the players oppose or compete against each other. Because of the AR nature of the missions, it is contemplated that players could be in a variety of disparate locations while interacting with one another. An exemplary mission having counter objectives could be to infiltrate an enemy's base or to defend a fort.

[0035] In more preferred embodiments, missions are associated with game play. Still, missions can bridge across many markets beyond game play. Other types of missions can be constructed as an exercise program, an advertising campaign, or even following an alternative navigation route home. By constructing various types of missions for a user, the user can be enticed to discover new businesses or opportunities, possibly commercial opportunities.

[0036] Interestingly, missions constructed around commercial opportunities can target a wide variety of player demographics, psychographics, or other player attributes. Contemplated AR systems can include an analysis engine that correlates player attributes against mission objectives. Collecting and tracking of such information can be advantageous to businesses when targeting promotions or missions to players or other individuals.
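
A minimal sketch of such an analysis engine appears below, assuming a simple (demographic, objective, completed) record format; the record layout is an assumption for illustration.

```python
# Hypothetical correlation of player demographics with mission objectives:
# completion rate per (demographic, objective) pair.
from collections import defaultdict

def correlate(records):
    totals = defaultdict(lambda: [0, 0])   # (demographic, objective) -> [done, seen]
    for demographic, objective, completed in records:
        cell = totals[(demographic, objective)]
        cell[0] += int(completed)
        cell[1] += 1
    return {key: done / seen for key, (done, seen) in totals.items()}

records = [("18-24", "collect", True), ("18-24", "collect", False),
           ("25-34", "deliver", True)]
print(correlate(records))   # {('18-24', 'collect'): 0.5, ('25-34', 'deliver'): 1.0}
```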

[0037] Methods for generating AR missions are also contemplated. In some contemplated embodiments, ambient environmental data separate from the mobile device's location can be received. An AR mission generator can select an AR mission template from a mission database coupled to the AR mission generator. It is contemplated that the AR mission template can be selected based at least in part upon the ambient environmental data.

[0038] A mission can be generated using the AR mission generator and the selected AR mission template, where the mission is based on at least a portion of the ambient environmental data. A mobile device can be configured via the AR mission generator to present the generated mission to a user.
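
Tying the contemplated method steps together, the following end-to-end sketch is purely hypothetical; every helper is a stub whose behavior is an assumption rather than a disclosed implementation.

```python
# Hypothetical end-to-end method: receive ambient data, select a template,
# generate the mission, and configure the device to present it.
def receive_ambient_data(mobile_device):
    return mobile_device.get("sensor_payload", {})       # apart from location

def select_template(mission_database, ambient_data):
    return next((t for t in mission_database
                 if t["needs"] in ambient_data.get("tags", [])), None)

def generate(template, ambient_data):
    return {"mission": template["name"], "scene_tags": ambient_data["tags"]}

def configure_device(mobile_device, mission):
    mobile_device["active_mission"] = mission            # push mission for display
    return mobile_device

device = {"sensor_payload": {"tags": ["rain", "outdoors"]}}
database = [{"name": "night-patrol", "needs": "outdoors"}]
data = receive_ambient_data(device)
mission = generate(select_template(database, data), data)
print(configure_device(device, mission))
```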

[0039] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the scope of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C . . . and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.

* * * * *

