U.S. patent application number 09/790157 was published by the patent office on 2001-11-08 for behavior modeling in a gaming environment with contextual accuracy.
Invention is credited to Brown, Geoffrey Parker.
Application Number | 20010039203 09/790157 |
Document ID | / |
Family ID | 26880040 |
Filed Date | 2001-02-21 |
United States Patent
Application |
20010039203 |
Kind Code |
A1 |
Brown, Geoffrey Parker |
November 8, 2001 |
Behavior modeling in a gaming environment with contextual
accuracy
Abstract
An interactive gaming system is described in which characters
are generated and controlled to model human behavior in an accurate
manner. The behaviors of the characters are contextually accurate
because the behaviors adapt to accurately reflect the attitudes of
the characters toward the user. A database stores relationship data
representing the attitudes of the characters. A gaming engine
executing on a computer is coupled to the database and receives
input representing the user's interaction with the character and
updates the relationship data. The gaming engine generates media
for display to the user based on the relationship data. The media
may include text-based dialogue, digital photographs of the
character, video, audio and the like. The database stores data
defining one or more behavior patterns by which the gaming engine
dynamically generates the gaming environment and controls the
character so as to model human nature.
Inventors: |
Brown, Geoffrey Parker;
(Deephaven, MN) |
Correspondence
Address: |
Shumaker & Sieffert, P.A.
150 Gateway Corporate Center I
576 Bielenberg Drive
St. Paul
MN
55125
US
|
Family ID: |
26880040 |
Appl. No.: |
09/790157 |
Filed: |
February 21, 2001 |
Related U.S. Patent Documents
Application Number | Filing Date | Patent Number |
60184341 | Feb 23, 2000 | |
Current U.S.
Class: |
463/16 |
Current CPC
Class: |
A63F 13/12 20130101;
A63F 13/63 20140902; A63F 2300/807 20130101; A63F 2300/6018
20130101; A63F 13/10 20130101 |
Class at
Publication: |
463/16 |
International
Class: |
A63F 009/24 |
Claims
1. A method comprising: maintaining relationship data representing
perception of at least one attribute of a user by a character in a
gaming environment; selecting one of a plurality of behavior
patterns for the character based on the relationship data; and
generating media for the gaming environment based on the selected
behavior pattern.
2. The method of claim 1 further comprising receiving input
representing interaction of a user with the character within the
gaming environment.
3. The method of claim 1 further comprising communicating the media
to the user via a computer network.
4. The method of claim 3, wherein generating media comprises
generating text-based dialogue.
5. The method of claim 3, wherein generating media comprises
generating graphics.
6. The method of claim 5, wherein each behavior pattern is
associated with a set of frames and generating media comprises
assembling character-specific media and situation-specific media
based on a current frame of the selected behavior pattern.
7. The method of claim 6, wherein assembling the character-specific
media and the situation-specific media comprises selecting
character-specific media and situation-specific media based on
media tags specified by the current frame.
8. The method of claim 1 further comprising selecting the character
from a plurality of characters, wherein each character is
associated with character-specific media.
9. The method of claim 8 further comprising maintaining a set of
character tag types; maintaining a set of character tags for each
of the characters based on the set of character tag types; and
associating the character-specific media with the corresponding
character via the character tags.
10. The method of claim 3, wherein generating media comprises
selecting a digital photograph of the character based on the
selected behavior pattern.
11. The method of claim 1 further comprising selecting one of a
plurality of situations, wherein each situation is associated with
situation-specific media, and generating media includes generating
one of the situation-specific media based on the selected behavior
pattern and the selected situation.
12. A method comprising: selecting one of a plurality of
situations, wherein each situation is associated with media;
selecting one of a plurality of characters, wherein each character
is associated with media; selecting one of a plurality of behavior
patterns for the character; and forming a gaming environment from
the media associated with the selected situation and the media
associated with the selected character based on the selected
behavior pattern.
13. The method of claim 12, further comprising: receiving
input from the user representing interactions of the user with the
characters; and updating a set of relationship variables for each
character based on the received input.
14. The method of claim 13, further comprising selecting a second
behavior pattern as a function of the updated relationship
variables.
15. The method of claim 13, further comprising selecting a
photograph for the character based on the relationship
variables.
16. The method of claim 13, further comprising selecting a second
situation as a function of the updated relationship variables.
17. A computer-readable medium having instructions for causing a
programmable processor to: maintain relationship data representing
attitude of a character of a game toward a user; and select one of
a plurality of behavior patterns based on the relationship
data.
18. The computer-readable medium of claim 17, further comprising
instructions for causing the processor to: receive input
representing interaction of the user with the character; and update
relationship data based on the input, wherein the relationship data
represents attitude of the character toward the user.
19. The computer-readable medium of claim 17 wherein each behavior
pattern has a set of frames and the instructions cause the
processor to assemble character-specific media and
situation-specific media based on a current frame of the selected
behavior pattern.
20. The computer-readable medium of claim 17, further comprising
instructions for causing the processor to control the character
within the gaming environment based on the relationship data.
21. The computer-readable medium of claim 17 further comprising
instructions for causing the processor to: generate media based on
the relationship data; and communicate the media to the user.
22. The computer-readable medium of claim 21 further comprising
instructions for causing the processor to generate text-based
dialogue.
23. The computer-readable medium of claim 21 further comprising
instructions for causing the processor to select character-specific
media and situation-specific media based on media tags specified by
the current frame.
24. The computer-readable medium of claim 17 further comprising
instructions for causing the processor to select the character from
a plurality of characters, wherein each character is associated
with character-specific media.
25. The computer-readable medium of claim 24 further comprising
instructions for causing the processor to: maintain a set of
character tag types; maintain a set of character tags for each
of the characters based on the set of character tag types; and
associate the character-specific media with the corresponding
character via the character tags.
26. The computer-readable medium of claim 17 further comprising
instructions for causing the processor to select a digital
photograph of the character based on the relationship data.
27. The computer-readable medium of claim 19 further comprising
instructions for causing the processor to select one of a plurality
of situations, wherein each situation is associated with
situation-specific media.
28. A computer-readable medium having a data structure stored
thereon, the data structure comprising: a set of data fields to
store behavior patterns for a character of an online game; and a
set of data fields to store relationship data for the character,
wherein the relationship data represents perception by the
character of a user.
29. The computer-readable medium of claim 28 further comprising a
third set of data fields to store range data mapping the behavior
patterns to ranges of the relationship data.
30. The computer-readable medium of claim 28 further comprising: a
set of data fields to store situations for the online game; and a
set of data fields to store range data mapping the situations to
ranges of the relationship data.
31. The computer-readable medium of claim 28 further comprising a
set of data fields to store initialization data for the
relationship data, wherein the initialization data represents one
or more predispositions of the character.
32. A computer-readable medium having a data structure stored
thereon comprising: a data field to store a likeability
relationship skill variable representing a degree to which a
character likes a user; a data field to store a trust relationship
skill variable representing a degree to which the character trusts
the user; and a third data field to store a power relationship
skill variable representing a level of power that the character
perceives the user having.
33. The computer-readable medium of claim 32 further including: a
fourth data field to store an intelligence relationship skill
variable representing a level of intelligence that the character
perceives the user having; and a fifth data field to store an
attraction relationship skill variable representing a level of
attraction that the character has for the user.
34. A system comprising: a database to store relationship data
representing an attitude of a character toward a user of an online
game, and behavioral data defining one or more behavior patterns
for the character; and a gaming engine executing on a computer
coupled to the database to select one of the behavior patterns
based on the relationship data.
35. The system of claim 34, wherein the gaming engine generates
media for display to the user based on the relationship data.
36. The system of claim 35, wherein the media comprises text-based
dialogue.
37. The system of claim 35, wherein the media comprises a digital
photograph of the character.
38. The system of claim 35, wherein the media comprises video and
audio.
39. The system of claim 34, wherein the gaming engine receives
input representing interaction of the user with the character and
updates the relationship data.
40. The system of claim 34, wherein the database stores a set of
frames for each behavior pattern and the gaming engine generates
the media for the online game by assembling character-specific
media and situation-specific media specified by the current frame
of the selected behavior pattern.
Description
[0001] This application claims priority from U.S. provisional
application Ser. No. 60/184,341, filed Feb. 23, 2000, the contents
being incorporated herein by reference.
TECHNICAL FIELD
[0002] The invention relates to computer software and, more
particularly, to techniques for modeling human behavior with
contextual accuracy in a gaming environment.
BACKGROUND
[0003] The computer gaming industry has seen tremendous growth and
now includes a wide variety of platforms including hand-held games,
software games executing on a desktop computer, dedicated gaming
machines such as Nintendo and the Sony PlayStation, and online
games provided over computer networks such as the World Wide Web
(WWW). The games range from simple action-oriented games
to test the user's reflexes to interactive, role-playing games
where the user interacts with characters in a two-dimensional or
three-dimensional gaming environment.
[0004] With the migration of computer games to the WWW, more and
more people are participating in interactive games in which players
interact with each other or with a number of predefined characters.
Some games may even have predefined modular stories that can be
reused to control the settings for the game.
SUMMARY
[0005] In general, the invention provides an interactive gaming
environment in which fictional characters are generated and
controlled to accurately model human behavior. The behaviors of the
characters adapt to accurately reflect the characters' behavioral
attitude toward individual users (players) and, therefore, are
contextually accurate as the user interacts with the various
characters. In this manner, the invention provides realistic
characters.
[0006] Relationship skill variables are maintained for the various
characters of the gaming environment to track how the users treat
the characters and how the characters perceive the players. The
relationship skill variables influence the gaming environment and
how the characters treat the users in future encounters. For
example, if a user is belligerent or unfriendly toward a character,
the character may treat the user badly in the future. If the user
is helpful to the character, the character may be helpful
later.
[0007] In one embodiment, the invention is directed to a system in
which a database stores relationship data representing the attitude
of a character toward a user. A gaming engine executing on a
computer coupled to the database receives input representing
interaction of the user with the character and updates the
relationship data. The gaming engine generates media for display to
the user based on the relationship data. The media may include
text-based dialogue, digital photographs of the character, video,
audio and the like. The database stores data defining one or more
behavior patterns by which the gaming engine controls the character
and models human nature. The gaming engine selects one of the
behavior patterns based on the relationship data.
[0008] In another embodiment, the invention is directed to a method
in which input is received representing interaction of a user with
a character within a gaming environment. Relationship data
representing the attitude of a character toward the user is
maintained based on the input. The character and the gaming
environment are controlled based on the relationship data. For
example, media, such as dialogue, graphics, audio and video can be
generated based on the relationship data.
[0009] In another embodiment the invention is directed to a method
in which one of a plurality of situations defined within a database
is selected, each situation being associated with media. One of a
plurality of characters is selected, each character being
associated with media. One of a plurality of behavior patterns is
selected for the character. A gaming environment is formed from the
media associated with the selected situation, the media associated
with the selected character and the selected behavior pattern. The
gaming environment is then presented to the user.
[0010] In another embodiment, the invention is directed to a
computer-readable medium having instructions for causing a
programmable processor to receive input representing a user's
interaction with a character within a gaming environment and
maintain relationship data based on the input. The relationship
data represents the attitude of the character toward the user.
[0011] In yet another embodiment, the invention is directed to a
computer-readable medium having a data structure stored thereon
comprising a set of data fields to store behavior patterns for a
character of an online game and a set of data fields to store
relationship data for the character, where the relationship data
represents the character's perception of a user. The
computer-readable medium may also comprise a set of data fields to
store range data mapping the behavior patterns to ranges of the
relationship data.
[0012] The invention provides many advantages. For example, as
described above, characters behave more naturally because they
follow the behavior patterns that are consistent with their
perceptions of the users. Production costs for the online gaming
environment may be dramatically reduced because any behavior
pattern can merge with any situation to dynamically create
media.
[0013] The details of one or more embodiments of the present
invention are set forth in the accompanying drawings and the
description below. Other features, objects, and advantages of the
present invention will be apparent from the description, drawings
and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] FIG. 1 is a block diagram illustrating a system for
providing an interactive gaming environment according to the
invention.
[0015] FIG. 2 is a block diagram illustrating the system in further
detail.
[0016] FIG. 3 illustrates an example schema for a database
configured to store a variety of data for the gaming
environment.
[0017] FIG. 4 is a flow diagram illustrating a typical mode of
operation for a gaming engine while starting a new game.
[0018] FIG. 5 is a flow diagram illustrating a typical mode of
operation for the gaming engine while the user proceeds through a
behavior pattern.
[0019] FIG. 6 is a flow diagram illustrating a typical mode of
operation for the gaming engine while transitioning from one
behavior pattern to another.
[0020] FIG. 7 illustrates an example user interface presented by a
client computing device by which a system administrator controls
and configures the online gaming environment.
[0021] FIG. 8 illustrates the user interface when the system
administrator has elected to configure an existing situation.
[0022] FIG. 9 illustrates a window presented by the user interface
for preventing a situation from being selected.
[0023] FIG. 10 illustrates an example window in which the system
administrator interacts with the gaming engine to exclude behavior
patterns based on situation.
[0024] FIG. 11 illustrates an example window of the user interface
presented by the gaming engine by which the system administrator
can manage situation tags for assembling dialogue based on the
current situation.
[0025] FIG. 12 illustrates an example window of the user interface
by which the system administrator can define new locations and
modify existing locations.
[0026] FIG. 13 illustrates a window by which the system
administrator can define the probabilities that a character will
appear at the various locations.
[0027] FIG. 14 illustrates an example window by which the user can
define a new behavior pattern or modify existing behavior
patterns.
[0028] FIG. 15 illustrates an example window by which the system
administrator can create new situation tags.
[0029] FIG. 16 illustrates an example window presented by the user
interface that displays the behavior pattern frames of the
currently defined behavior patterns.
[0030] FIG. 17 illustrates an example window for creating frames
(lines) for behavior patterns.
[0031] FIG. 18 illustrates an example window for creating a new
character.
[0032] FIG. 19 illustrates an example window presented by which the
system administrator can create new character tags.
[0033] FIG. 20 illustrates an example window presented by the
gaming engine by which the system administrator can view and modify
the photographs of the gaming environment.
[0034] FIG. 21 illustrates an example window for creating and
modifying character tag types.
[0035] FIG. 22 illustrates an example window of the user interface
for setting the game sensitivity.
DETAILED DESCRIPTION
[0036] FIG. 1 is a block diagram illustrating a system 2 for
providing an interactive gaming environment according to an
embodiment of the invention. One or more users 4 access online
gaming environment 6 via network 8 and interact with one or more
fictional characters. As explained in detail below, online gaming
environment 6 models human behavior in a natural and accurate
manner to provide realistic characters. The behaviors of the
characters adapt to accurately reflect their current attitudes
toward users 4 based on past interaction.
[0037] Online gaming environment 6 integrates behavior patterns,
situations, locations and characters to create interactive media,
such as text dialogue and graphics. In this manner, gaming
environment 6 reflects the characters' behavioral attitude toward
the individual user and, therefore, is contextually accurate as
users 4 interact with the various characters. In particular, for a
given situation, the characters may exhibit different behavioral
attitudes toward different users based on past interaction with the
users.
[0038] Each user 4 typically interacts with a computing device
suitable for accessing online gaming environment 6 via network 8.
For example, a user 4 may interact with a personal computer, laptop
computer, or even a personal digital assistant (PDA) such as a
Palm.TM. organizer from Palm Inc. of Santa Clara, Calif. The
computing device executes communication software, typically a
web browser such as Internet Explorer.TM. from Microsoft
Corporation of Redmond, Wash., in order to communicate with gaming
environment 6. Network 8 represents any communication link suitable
for communicating digital data, such as a wide-area network, local
area network, or a global computer network like the World Wide
Web.
[0039] FIG. 2 is a block diagram illustrating gaming environment 6
in further detail. Web servers 12 provide an interface for
communicating with users 4 via network 8. In one configuration, web
servers 12 execute web server software, such as Internet
Information Server.TM. from Microsoft Corporation, of Redmond,
Wash. In another configuration, web servers 12 execute Websphere
Application Server.TM. on a Domino.TM. Server from International
Business Machines Corporation (IBM) of Armonk, N.Y. As such, web
servers 12 provide an environment for interacting with users 4
according to gaming engine 14, which may comprise a variety of
software modules including Active Server Pages, Java scripts, Java
Applets, Lotus scripts, web pages written in hypertext markup
language (HTML) or dynamic HTML, Active X modules, Shockwave.TM.,
Flash.TM. and other suitable software modules.
[0040] Gaming engine 14 generates the gaming environment including
assembling content, such as text, video, audio and graphic
elements. Gaming engine 14 accesses database 16 to store and
retrieve information describing the possible characters, locations,
situations and character behavior patterns. Database 16 can be
implemented in many forms including a relational database, such as
Microsoft Access or SQL Server from Microsoft Corporation, running
on one or more database servers. Photograph repository 18 stores a
set of photographs for each of the characters and locations of
gaming environment 6 illustrating a variety of expressions and
reactions.
[0041] FIG. 3 illustrates an example schema for database 16
configured to store a variety of data used by gaming engine 14 to
generate and control the gaming environment. Generally, gaming
engine 14 maintains data in a number of tables including behavior
pattern table 22, situation table 24, character table 26 and
location table 28. Gaming engine 14 extracts data from the various
tables of database 16 and assembles the data to form content for
the game. For example, gaming engine 14 may integrate a current
behavior pattern with a situation, a location and one or more
characters to create interactive media, such as text dialogue and
graphics, for the game. In this manner, the media can be assembled
in a modular manner, taking attributes from the tables in database
16. The assembled media reflects the character's current behavioral
attitude toward the user and, as explained below, is contextually
accurate as the user interacts with the various characters.
[0042] Behavior pattern table 22 stores one or more behavior
patterns that are modular branching scripts or patterns with finite
ends. Each pattern provides a model by which the character
interacts with the user according to a consistent attitude. For
example, a behavior pattern may define a series of interactions
between the user and character in which the character flirts with
the user. A second behavior pattern may define a series of
interactions in which the character criticizes the user. Other
examples of behavior patterns for characters include self-serving,
discussing, disclosing and repulsing.
[0043] An individual behavior pattern has a set of frames that, as
described below, are linked so as to control the flow of the
character's interaction with the user during a particular behavior.
Each frame includes a sequence of behavior tags for selecting and
incorporating media, such as text-based dialogue, from situation
table 24, location table 28 and character table 26.
[0044] Notably, each behavior pattern has a limited number of
frames, one or more starting frames and one or more ending frames.
In this manner, behavior patterns can readily be used with a
variety of different situations, characters and locations to
efficiently generate dialogue without appearing to the user as
redundant.
[0045] In one embodiment, each frame of a behavior pattern
comprises (1) media defining a number of choices for the user; (2)
a set of pointers to other destination frames within the behavior
pattern, each pointer corresponding to one of the choices; (3)
presentation information to be displayed when the frame is pointed
to by another frame; and (4) a set of modification values for
adjusting the current character's attitudes toward the user when
the user causes the character to arrive at the particular frame of
a behavior pattern.
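The four-part frame structure described above can be sketched as a simple record. This is an illustrative reconstruction, not taken from the application; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a behavior-pattern frame with the four parts
# described above: choice media, destination pointers, presentation
# information, and attitude-modification values.
@dataclass
class Frame:
    frame_id: int
    choices: list            # (1) media presenting choices to the user
    destinations: list       # (2) one destination frame id per choice
    presentation: str        # (3) shown when another frame points here
    modifications: dict = field(default_factory=dict)
    # (4) adjustments applied to the character's attitude variables when
    # the user's choice causes the character to arrive at this frame,
    # e.g. {"trust": 2, "likeability": -1}
    is_ending: bool = False  # set on frames that end the pattern
```

Ending frames would carry `is_ending=True`, corresponding to the frame attribute mentioned below for ends of behavior patterns.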
[0046] To display a current frame within a behavior pattern, gaming
engine 14 locates the destination frames pointed to by the current
frame and retrieves the corresponding presentation information.
Ending frames have a frame attribute set indicating the frame is an
end of a behavior pattern. Typically, the frames within a behavior
pattern are written with minimal static media; most of the media
is pulled from situation table 24 and character table 26.
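As a minimal sketch of that display step, assuming frames are kept in a dictionary keyed by frame id (the names are hypothetical):

```python
# Hypothetical sketch: to display the current frame, look up the frames it
# points to and collect their presentation information, as described above.
def render_frame(frames: dict, frame_id: int) -> list:
    current = frames[frame_id]
    if current["is_ending"]:  # ending frames terminate the behavior pattern
        return []
    return [frames[dest]["presentation"] for dest in current["destinations"]]
```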
[0047] Situation table 24 stores one or more situations that
describe the context in which the user interacts with the
character. In an office setting, for example, the character may be
accusing the user of stealing papers from the office, giving the
user a project for completion or discussing a recent hiring
decision by the company.
[0048] Each situation defines a number of situation tags that can
be used in the frames of the behavior patterns to identify
situation-specific media. The tags can be created, modified and
incorporated into behavior patterns by a system administrator or
other author. Generally, a tag can be written either from the
user's perspective or from the character's perspective. The
following table illustrates example situation tags for generating
media:
TABLE II
AdOneTask, AdOneTaskName, AdOneTaskThing, AdThreeTask, AdThreeTaskName,
AdThreeTaskThing, AdTwoTask, AdTwoTaskName, AdTwoTaskThing,
ConsequencesIfDone, ConsequencesIfDoneUSER, ConsequencesIfUndone,
ConsequencesIfUndoneUSER, CounterExcuse1USER, CounterExcuse2,
CounterExcuse2USER, CounterExcuse3, CounterExcuse3USER, CounterExecuse1,
CounterPossibleSolution1, CounterPossibleSolution1USER,
CounterPossibleSolution2, CounterPossibleSolution2USER,
CounterSmartSolution, CounterSmartSolutionUSER, CounterStupidSolution,
CounterStupidSolutionUSER, CounterUnlikelySolution,
CounterUnlikelySolutionUSER, Excuse1, Excuse1USER, Excuse2, Excuse2USER,
Excuse2USERR, Excuse3, Excuse3USER, GameClueOne, NeutralFact1,
NeutralFact1USER, NeutralFact2, NeutralFact2USER, OpeningStatusSentence,
PossibleSolution1, PossibleSolution1USER, PossibleSolution2,
PossibleSolution2USER, ReasonForTask, ReasonForTaskUSER, SmartSolution,
SmartSolutionUSER, StationName, StupidSolution, StupidSolutionUSER,
SubTask1, SubTask1USER, SubTask1Reason, SubTask1ReasonUSER, SubTask2,
SubTask2USER, Task, TaskName, TaskNameUSER, TaskOverseer,
TaskOverseerUSER, TaskUSER, TaskThing, TaskThingUSER, TestType,
UnlikelySolution, UnlikelySolutionUSER, Weather, WeatherUSER
[0049] Character table 26 defines one or more characters for
interacting with the user. The characters are configured to model
real life in that each character has his or her own expressions and
patterns of talking. These patterns and characteristics change
according to the character's interaction with the user. Each record
of character table 26 stores a set of relationship skill variables
that reflect how a particular character perceives the user's
actions, such as the user's responses to choices. Gaming engine 14
updates the relationship data of the characters as the user
proceeds through the game and interacts with the characters. As the
user moves through various locations, situations and behavior
patterns, the characters act and react according to previous
interactions with the user, i.e., the characters' behaviors reflect
the changes in the relationship data.
[0050] In one embodiment, each character record stores the
relationship data in five relationship skill variables representing
the character's opinion of the user's trustworthiness, likeability,
intelligence, power and attraction. Other possible relationship
skill variables include perceptiveness, clarity and
knowledgeability. However, it may be preferable to limit the number
of variables to save production costs, storage space and processing
time, and so as not to dilute the character's reaction to the user.
Each character's relationship skill variables are valid over a
predefined range, such as -50 to +50, and are initially preset to
reflect any predispositions of the character. Each character record
may also define a set of sensitivity values representing the
character's sensitivity for each skill variable.
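A character record of this kind might be sketched as follows. This is an illustrative reconstruction under the stated assumptions (five variables, a -50 to +50 range, per-variable sensitivity values); the class and method names are hypothetical.

```python
from dataclasses import dataclass, field

SKILLS = ("trust", "likeability", "intelligence", "power", "attraction")

@dataclass
class CharacterRecord:
    name: str
    # initially preset to reflect the character's predispositions
    skills: dict = field(default_factory=lambda: {s: 0 for s in SKILLS})
    # per-variable sensitivity multipliers
    sensitivity: dict = field(default_factory=lambda: {s: 1.0 for s in SKILLS})

    def adjust(self, skill: str, delta: int) -> None:
        """Apply an interaction's effect, scaled by the character's
        sensitivity and clamped to the valid -50..+50 range."""
        value = self.skills[skill] + round(delta * self.sensitivity[skill])
        self.skills[skill] = max(-50, min(50, value))
```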
[0051] Gaming engine 14 selects the behavior patterns for the
various characters based on the present value of the relationship
skill variables for the current character. Each behavior pattern is
mapped to a predefined range of the skill variables. For example,
the system administrator may define a "disclose" behavior pattern,
in which the character discloses secrets to the user, when
likeability is greater than +30, trust is greater than +45,
intelligence is greater than +15, power is greater than +10 and
attractiveness is greater than +40.
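That threshold mapping can be sketched as a lookup. The "disclose" thresholds below come from the example in the preceding paragraph; the table layout, the "discuss" fallback and the function name are hypothetical.

```python
# Each behavior pattern is mapped to a range of the relationship skill
# variables; "disclose" uses the thresholds from the example above, and
# "discuss" serves as a hypothetical fallback with no requirements.
PATTERN_THRESHOLDS = {
    "disclose": {"likeability": 30, "trust": 45, "intelligence": 15,
                 "power": 10, "attraction": 40},
    "discuss": {},
}

def select_pattern(skills: dict) -> str:
    """Return the first pattern whose thresholds the skills all exceed."""
    for pattern, thresholds in PATTERN_THRESHOLDS.items():
        if all(skills[k] > v for k, v in thresholds.items()):
            return pattern
    return "discuss"
```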
[0052] In addition to the skill relationship variables, each
character record stored in character table 26 corresponds to a set
of photographs within photograph repository 18. For example, a
character record may correspond to 50 or 100 close-up photographs
of a person illustrating a wide variety of emotional expressions.
The photos may be provided without a background to facilitate their
use in different locations. Furthermore, each photo is mapped to a
range of one of the skill variables. Based on the current value of
a character's skill variables, gaming engine 14 selects a
photograph and overlays the selected photograph of the character
with the photograph of the current location, discussed below, to
form media for the present frame of the behavior pattern. This
allows the character's expressions to readily change independently
of location. Other media, including video and audio, may also be
used to portray the character's expressions.
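Mapping each photo to a range of one skill variable might look like the following sketch; the filenames and ranges are invented for illustration.

```python
# Hypothetical photo table: each entry maps a photograph to a range of one
# relationship skill variable, as described above.
PHOTOS = [
    ("scowl.jpg", "likeability", -50, -20),
    ("neutral.jpg", "likeability", -19, 19),
    ("smile.jpg", "likeability", 20, 50),
]

def select_photo(skills: dict, variable: str = "likeability") -> str:
    """Pick the photograph whose range covers the current variable value."""
    for filename, var, low, high in PHOTOS:
        if var == variable and low <= skills[variable] <= high:
            return filename
    return "neutral.jpg"  # fallback if no range matches
```

The engine would then overlay the selected photograph on the background photograph of the current location.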
[0053] The character records of character table 26 may further
define probabilities of the characters appearing at each location.
This reflects the real-life fact that people tend to hang out at
certain locations. For example, a character may have a 25%
likelihood of being in the kitchen but a 75% chance of being in the
laboratory.
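The per-location probabilities lend themselves to a weighted random draw; a minimal sketch using the kitchen/laboratory figures above (names hypothetical):

```python
import random

# Hypothetical appearance probabilities, echoing the 25%/75% example above.
LOCATION_PROBABILITIES = {"kitchen": 0.25, "laboratory": 0.75}

def pick_location(probabilities: dict, rng=random) -> str:
    """Draw a location for the character, weighted by its probabilities."""
    locations = list(probabilities)
    weights = [probabilities[loc] for loc in locations]
    return rng.choices(locations, weights=weights, k=1)[0]
```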
[0054] Like records of situation table 24, each record of character
table 26 defines a number of tags that can be used in the frames of
the behavior patterns to identify character-specific media for use
in the behavior frames. The following table illustrates example
character tags for generating media:
TABLE III
AdOneTask, AdOneTaskName, AdOneTaskThing, AdThreeTask, AdThreeTaskName,
AdThreeTaskThing, AdTwoTask, AdTwoTaskName, AdTwoTaskThing,
AmusementNoun, AmusementTask, CharacterName, CharGameClueOne, Curse,
CurseAdjective, CurseAdverb, CurseNoun, Exclamation, HobbyNoun,
HobbySentence, HouseholdTask, Joke1, Joke2, Joke3, Joke4,
OfficeVictory1, Philosophy, SelfSecret1, SelfSecret2, StationName,
TestType, UnpleasantAlternative, WarStory
[0055] Location table 28 stores one or more location records that
define the gaming environment. Gaming engine 14 accesses location
table 28 to retrieve media, such as text, graphics, audio and video,
representing the current setting in which the character and the
user interact. For example, each record in the location table 28
may map to one or more background photographs stored in photograph
repository 18.
[0056] The user may navigate through the gaming environment in a
number of ways. For example, the user may interact with gaming
engine 14 to move a corresponding icon within a 2D or 3D space.
Alternatively, the user may simply choose a location from a list of
possible locations available in a pull-down menu.
[0057] FIG. 4 is a flow diagram illustrating a typical mode of
operation for gaming engine 14 when starting a new game. First,
gaming engine 14 receives information from a user 4 accessing the
gaming environment 6 via network 8 (32). For example, gaming
engine 14 may receive a variety of information from the user,
including a name, preferences and characteristics. Next, gaming
engine 14 randomly selects a situation from one of the defined
situations of situation table 24 within database 16 (34). Based on
the selected situation, gaming engine 14 selects a location (36),
as not all locations are necessarily valid for all
situations.
[0058] Next, gaming engine 14 accesses database 16 and identifies
all valid characters based on the selected situation and location.
The characters have probabilities of appearing at the various
locations. Based on the characters' probabilities, gaming engine 14
selects one of the characters (38). After selecting the character,
gaming engine 14 initializes the character's relationship skill
variables. Each character may have a predisposition toward the
user, such as a propensity to like or dislike the user, that
influences how the character's skill variables should be
initialized. In one embodiment, the character has a set of
predefined initial settings for the character's skill
variables.
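The selection and initialization steps described above can be sketched in Python as follows. This is a minimal illustration only; the record fields, character names, weights and initial values are hypothetical and not taken from the application.

```python
import random

# Hypothetical character records: per-location appearance weights and
# predefined initial settings for the relationship skill variables.
characters = [
    {"name": "Annie", "weights": {"kitchen": 25, "laboratory": 75},
     "initial_skills": {"likeability": -5, "trust": 0}},
    {"name": "Bob", "weights": {"kitchen": 60, "laboratory": 40},
     "initial_skills": {"likeability": 10, "trust": 5}},
]

def select_character(valid_characters, location):
    """Pick one character, weighted by its probability of appearing
    at the selected location."""
    weights = [c["weights"].get(location, 0) for c in valid_characters]
    return random.choices(valid_characters, weights=weights, k=1)[0]

def initialize_skills(character):
    """Copy the character's predefined settings, reflecting any
    predisposition toward the user."""
    return dict(character["initial_skills"])

character = select_character(characters, "laboratory")
skills = initialize_skills(character)
```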
[0059] After selecting the character, gaming engine 14 selects a
character's photograph (39). Each character has a set of
photographs that are mapped to ranges of one or more of the
character's skill variables. For example, in one embodiment each
character may have 50-100 photographs offering a range of different
expressions. Gaming engine 14 identifies the set of valid
photographs for the character based on the current value of the
character's skill variables. Next, gaming engine 14 randomly
selects a photograph from the identified photographs. In one
embodiment, gaming engine 14 automatically selects a character's
photograph based on the current value of the character's
"likeability" skill variable.
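The photograph-selection step can be sketched as follows, assuming hypothetical photograph records each mapped to a range of the "likeability" skill variable; the file names and range boundaries are illustrative.

```python
import random

# Hypothetical photograph records, each valid over a range of the
# "likeability" skill variable.
photographs = [
    {"file": "annie_scowl.jpg",   "range": (-50, -20)},
    {"file": "annie_neutral.jpg", "range": (-19, 19)},
    {"file": "annie_smile.jpg",   "range": (20, 50)},
]

def select_photograph(photos, likeability):
    """Identify the photos whose range covers the current skill value,
    then randomly pick one of them."""
    valid = [p for p in photos
             if p["range"][0] <= likeability <= p["range"][1]]
    return random.choice(valid)
```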
[0060] After selecting the photograph, gaming engine 14 identifies
the valid behavior patterns based on the current settings of the
character's skill variables. For example, if the character's
"likeability" setting is extremely low, i.e., the character does
not like the user, the character is unlikely to enter a "flirt"
behavior pattern. Next, gaming engine 14 removes any behavior
patterns that are excluded based on the situation. For example, a
character may not act according to a flirt behavior pattern if the
selected situation is a search of the character's desk for stolen
items. Gaming engine 14 then randomly selects one of the behavior
patterns from the remaining set of valid behavior patterns
(40).
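The two-stage filtering described in this paragraph, by skill-variable ranges and then by situation exclusions, can be sketched as follows; the pattern names, ranges and exclusions are hypothetical examples in the spirit of the application.

```python
import random

# Hypothetical behavior patterns, each valid only within certain
# skill-variable ranges, plus situation-based exclusions.
patterns = {
    "Flirt":        {"likeability": (10, 50)},
    "Disclose Two": {"likeability": (-10, 50)},
    "Brush Off":    {"likeability": (-50, 9)},
}
excluded_by_situation = {"Desk Search": {"Flirt", "Disclose Two"}}

def select_pattern(skills, situation):
    """Keep patterns whose ranges cover the current skill variables,
    remove any excluded by the situation, then pick one at random."""
    valid = [name for name, ranges in patterns.items()
             if all(lo <= skills[var] <= hi
                    for var, (lo, hi) in ranges.items())]
    valid = [n for n in valid
             if n not in excluded_by_situation.get(situation, set())]
    return random.choice(valid)
```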
[0061] After selecting a behavior pattern, gaming engine 14
identifies and randomly selects one of the possible starting frames
within the selected behavior pattern (42). Finally, gaming engine
14 presents the selected frame to the user and begins the gaming
(44).
[0062] FIG. 5 is a flow diagram illustrating a typical mode of
operation for gaming engine 14 while the character acts according
to a behavior pattern. Based on the current frame within the
behavior pattern, gaming engine 14 assembles dialogue and other
media and presents the media to the user as described above (46).
Typically, the media elicits interaction from the user (48). For
example, the media may include dialogue and a number of possible
responses for selection by the user.
[0063] After gaming engine 14 receives the user's interaction,
gaming engine 14 determines the next frame of the behavior pattern
(50). In one embodiment, each possible response presented to the
user in the current frame is mapped to a next frame within the
behavior pattern. Based on the response, gaming engine 14 accesses
database 16 and retrieves data for the next frame. Based on the
data, gaming engine 14 updates the character's skill variables that
track the character's perceptions of the user (52).
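The response-to-frame mapping of this embodiment can be sketched as follows; the frame identifiers, dialogue and choice keys are hypothetical and serve only to illustrate the lookup.

```python
# Hypothetical frame records: each possible user response maps to the
# next frame within the behavior pattern.
frames = {
    "flirt_01": {"dialogue": "Nice tie.",
                 "responses": {"C1": "flirt_02", "C2": "flirt_03",
                               "C3": "brushoff_01"}},
    "flirt_02": {"dialogue": "Thanks!", "responses": {}},
}

def next_frame(current_frame_id, choice):
    """Follow the current frame's mapping from the user's selected
    response to the next frame of the behavior pattern."""
    return frames[current_frame_id]["responses"][choice]
```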
[0064] In one embodiment, each frame of a behavior pattern contains
a set of modification values for adjusting corresponding skill
variables when the character's behavior reaches the frame. For
example, a frame may define a +2 modification value for the
"likeability" skill variable. In addition, each character may have
a set of sensitivity values representing the character's
sensitivity for each skill variable. Upon reaching a new behavior
frame, gaming engine 14 multiplies each modification value by the
character's sensitivity for the corresponding skill variable. The
result may be additionally multiplied by a game sensitivity in
order to increase or decrease the speed of the game, thereby
adjusting the playing time. Finally, gaming engine 14 adds the
resultant modification values to the current corresponding skill
variables. The skill variables are then checked to ensure they fall
within a valid range, such as -50 to +50.
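The update arithmetic of this paragraph, modification value times character sensitivity times game sensitivity, clamped to the valid range, can be sketched as follows; the variable names and sensitivity values are illustrative.

```python
def update_skills(skills, modifications, sensitivities,
                  game_sensitivity=1.0, lo=-50, hi=50):
    """Apply the frame's modification values, scaled by per-character
    and game-wide sensitivities, clamping each result to [lo, hi]."""
    updated = dict(skills)
    for var, mod in modifications.items():
        delta = mod * sensitivities.get(var, 1.0) * game_sensitivity
        updated[var] = max(lo, min(hi, updated[var] + delta))
    return updated

# A frame defining a +2 modification for "likeability", as in the text,
# applied to a character with a hypothetical sensitivity of 1.5:
skills = update_skills({"likeability": 48}, {"likeability": 2},
                       {"likeability": 1.5})
```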
[0065] After updating the skill variables, gaming engine 14
determines whether an outcome trigger has fired, i.e., whether an
outcome condition has been satisfied, based on the character's
perceptions of the user (54). Gaming engine 14 determines whether
the skill variables have reached any pre-defined thresholds. For
example, thresholds for the behavior variables may be set that,
when met, indicate the user's relationship skills have reached
levels dictating that he or she should be promoted, fired, seduced,
etc., by the character. In the event an outcome has been reached,
gaming engine 14 jumps out of the current behavior pattern and
displays an outcome page (58). If a trigger has not fired, gaming
engine 14 displays the next behavior frame (56).
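The outcome-trigger check can be sketched as follows. The outcome names and thresholds are hypothetical, and the sign convention (a negative threshold fires at or below that value, a positive one at or above) is an assumption for illustration, not stated in the application.

```python
# Hypothetical outcome triggers: each fires when every listed skill
# variable meets its threshold.
outcomes = {
    "promoted": {"likeability": 30, "trust": 45, "power": 10},
    "fired":    {"likeability": -40},
}

def check_outcome(skills):
    """Return the first outcome whose thresholds are all met, else
    None (in which case the next behavior frame is displayed)."""
    for name, thresholds in outcomes.items():
        if all((skills[var] >= t if t >= 0 else skills[var] <= t)
               for var, t in thresholds.items()):
            return name
    return None
```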
[0066] FIG. 6 is a flow diagram illustrating a typical mode of
operation for gaming engine 14 after the user transitions from one
location to another. Upon concluding a behavior pattern, the user
interacts with gaming engine 14 (60). For example, the user may
interact with gaming engine 14 to move a corresponding icon within
a two or three-dimensional space. Alternatively, the user may
simply choose a location from a list of possible locations. Based
on the user's input, gaming engine 14 selects a location (62).
Based on the selected location, gaming engine 14 selects a
situation from a set of valid situations, and may exclude recent
situations that have been selected (64).
[0067] As described above, situations define ranges for acceptable
skill variables. Gaming engine 14 identifies all valid characters
having current skill variables within the acceptable ranges that
are defined by the selected situation. The characters' skill
variables are persistent throughout the entire gaming session, and
possibly across sessions. In other words, characters remember how
they feel toward the user. Gaming engine 14 may exclude some
characters based on the situation and the defined acceptable ranges
for skill variables. Gaming engine 14 chooses one of the valid
characters based on the characters' probabilities to be at the
selected location (66).
[0068] After selecting the character, gaming engine 14 identifies
the set of valid photographs for the character based on the current
value of the character's skill variables and selects one of the
valid photographs (67). Next, gaming engine 14 identifies a set of
valid behavior patterns based on the current settings of the
character's skill variables and the selected situation. Gaming
engine 14 then randomly selects one of the valid behavior patterns
(68). After selecting a behavior pattern, gaming engine 14
identifies and randomly selects one of the possible starting frames
within the selected behavior pattern (70) and presents the selected
starting frame to the user and begins the game (72).
[0069] FIG. 7 illustrates an example user interface 74 presented by
a client computing device by which a system administrator controls
and configures online gaming environment 6. For example, user
interface 74 allows the system administrator to configure the
various components of database 16, gaming engine 14 and photograph
repository 18. As illustrated by user interface 74, the system
administrator can configure the situation table 24, the location
table 28, the behavior pattern table 22 and the character table 26
by selecting the corresponding input area within user interface
74.
[0070] FIG. 8 illustrates user interface 74 when the system
administrator has elected to configure an existing situation. Here,
user interface 74 has displayed a window 76 illustrating an example
situation involving interaction at the office coffee pot. By
interacting with window 76, the system administrator can define one
or more situation tags 78 to be displayed to the user upon
interacting with gaming engine 14. For example, gaming engine 14
may select a number of the situation tags to assemble the media
displayed to the user. For instance, the ConsequencesIfDone
situation tag maps to the corresponding dialogue "everybody will be
so buzzed to get their coffee going again." In addition, the system
administrator may define ranges for the relationship skill
variables for which this situation is valid. More specifically,
ranges 80 limit the possibilities of the coffee pot situation being
selected by gaming engine 14. Here, the system administrator has
defined the coffee pot situation to be valid when the likeability
skills relationship variable is between -15 and +15, the
trust variable is between -50 and +50, the smart variable is
between -15 and +15, the power variable is between -50 and +50 and
the attraction variable is between -15 and +15.
[0071] FIG. 9 illustrates a window 82 presented by gaming engine 14
for preventing a situation from being selected. More specifically,
window 82 allows the system administrator to exclude certain
situations based on the current location of the user within gaming
environment 6. For example, a number of exclusions are displayed
via list 84 including the coffee pot situation, which could be
excluded from an outdoor patio location.
[0072] FIG. 10 illustrates an example window 86 presented by the
user interface for excluding behavior patterns based on situation.
Window 86 displays a list 88 of a number of behavior patterns that
are excluded based on situation. For example, the behavior pattern
Disclose Two is excluded if the situation is Desk Search. In this
case, disclosing information to a user would be inconsistent with
executing a search of his desk for stolen items.
[0073] FIG. 11 illustrates an example window 90 presented by gaming
engine 14 by which the system administrator can manage situation
tag types for assembling situation-specific content. For example,
window 90 includes a list 92 of currently defined situation tag
types for defining situation-specific media.
[0074] FIG. 12 illustrates an example window 94 of the user
interface for defining new locations and modifying existing
locations. The system administrator can define a name for the
location as well as a background photograph stored within
photograph repository 18 for display when the location is selected
by gaming engine 14. For example, window 94 illustrates the system
administrator defining a location entitled "Boss Office."
[0075] FIG. 13 illustrates a window 96 for controlling the
probabilities that a character will appear at the various
locations. For example, window 96 displays a list 98 of the various
multipliers for controlling the likelihood of the current
characters appearing in the various defined locations. A multiplier
of 50, for example, means that the character is fifty times more
likely to appear at the corresponding location than a character
having a multiplier of one.
[0076] FIG. 14 illustrates an example window 100 by which the system
administrator can define a new behavior pattern or modify existing
patterns. For example, window 100 illustrates a system
administrator defining a Flirt behavior pattern. For each behavior
pattern, the system administrator can interact with window 100 and
define various ranges 102 for the skills relationship
variables.
[0077] FIG. 15 illustrates an example window 104 presented by
gaming engine 14 by which the system administrator can create new
situation tags for assembling situation-specific content. For
example, the system administrator may define a new situation tag
"AdOneTaskName" for the situation "Copier Seminar."
[0078] FIG. 16 illustrates an example window 106 presented by the
user interface of gaming engine 14 for creating and managing the
frames for the various behavior patterns. Window 106 presents all
of the behavior pattern frames (lines) of the currently defined
behavior patterns. For each behavior pattern frame, window 106
lists a corresponding behavior pattern identifier 107A to which the
behavior pattern line is associated. Identifiers 107B list three
possible choices (C1, C2, and C3) for user selection. The three
choices 107B point to three destination behavior pattern lines from
which to select media to present to the user as choices. Notably,
the destination behavior pattern lines can be associated with
different behavior patterns. Media 107E defines media to present to
the user when the frame is the current frame of the behavior model.
Presentation media 107C contains text-based dialogue for use when
the behavior pattern frame is identified as a possible destination
by a current behavior pattern frame. Modification values 107D
define values with which to update the characters' relationship
data upon reaching the particular behavior pattern frame.
[0079] FIG. 17 further illustrates the process of defining a frame
of a behavior pattern. By interacting with window 108, the system
administrator can create new behavior pattern lines. For each new
behavior pattern, the system administrator can define text 111 to
display to the user when the new behavior pattern is involved in
the current frame. In addition, the system administrator can define
choice text 113 to display when the behavior pattern line is a
destination frame identified by the current behavior pattern line.
In order to assemble the various text boxes 111, 113 and 115, the
system administrator can select character tags 110 and situation
tags 112 and paste them into the corresponding input fields.
Pointers 114 can be set to point at up to three possible
destination frames. In addition, the system administrator can
define modification values 116 for updating the characters'
relationship skill variables when the user reaches the new behavior
pattern line.
[0080] FIG. 18 illustrates an example window 118 for creating a new
character record within character table 26. By interacting with
window 118, the system administrator can define a number of
character tags 120 for selection and incorporation by gaming engine
14 in order to generate dialogue and other content for gaming
environment 6. The system administrator can define a number of
sensitivity values 122 used as multipliers when modifying the
corresponding relationship skill variables. Furthermore, the system
administrator can define a set of initial values 124 for
initializing the relationship skill variables, thereby reflecting
any predisposition of the character.
[0081] FIG. 19 illustrates an example window 126 presented by
gaming engine 14 by which the system administrator can create new
character tags for the various characters of gaming environment 6.
For example, as illustrated by window 126, the system administrator
can associate media, such as dialogue-based text, with a character
tag such as "Joke1." For each tag, the system administrator can
define corresponding media such as text, video, audio or other
media.
[0082] FIG. 20 illustrates an example window 128 presented by
gaming engine 14, by which the system administrator can define and
modify links to photographs stored within photograph repository 18
as referenced by character table 26 of database 16. For example, as
illustrated in window 128, the character Annie has a number of
photographs 130 associated with her. A first photograph 132 is
valid when the attractiveness relationship skill variable is
within the range of -50 to -41, and is associated with a particular
image file 133 stored within photograph repository 18. Similarly,
photograph 134 is valid when the likeability relationship skill
variable for Annie falls within the range of -46 to -50.
[0083] FIG. 21 illustrates an example window 136 presented by
gaming engine 14 by which the system administrator can create and
modify character tag types. For example, window 136 lists a number
of current character tag types 138 for use when creating character
tags. Each character can, therefore, have character tags of the
various character tag types. For example, in the illustrated
embodiment, a character can have character tags of type Joke 1,
Joke 2, Joke 3 and Joke 4.
[0084] FIG. 22 illustrates an example window 140 for setting a
sensitivity for gaming engine 14. By adjusting sensitivity values
142, the system administrator can effectively increase or decrease
the speed at which the player navigates the gaming environment. As
described above, sensitivity values 142 can be used as multipliers
when updating the relationship skill variables maintained by the
various characters for interaction with the player.
[0085] A number of embodiments of the invention have been
described. Nevertheless, it will be understood that various
modifications may be made without departing from the spirit and
scope of the invention. Accordingly, these and other embodiments
are within the scope of the following claims.
* * * * *