United States Patent Application 20040063501
Kind Code: A1
Shimokawa, Hitoshi; et al.
April 1, 2004

Game device, image processing device and image processing method

U.S. patent application number 10/441031 was published under this title on 2004-04-01. The invention is credited to Kazutomo Sanbongi, Takenao Sata, Junji Shibasaki, Hitoshi Shimokawa, and Yukio Tsuji.
Abstract
The present invention is an image processing method for moving a
virtual camera located in a three-dimensional virtual space at a
predetermined speed, and changing a distance between the virtual
camera and a character defined in the virtual space, wherein a
moving speed of the virtual camera changes based on the distance
between the virtual camera and the character.
Inventors: Shimokawa, Hitoshi (Tokyo, JP); Tsuji, Yukio (Tokyo, JP); Sanbongi, Kazutomo (Tokyo, JP); Shibasaki, Junji (Tokyo, JP); Sata, Takenao (Tokyo, JP)
Correspondence Address: Finnegan, Henderson, Farabow, Garrett & Dunner, L.L.P., 1300 I Street, N.W., Washington, DC 20005-3315, US
Family ID: 29705737
Appl. No.: 10/441031
Filed: May 20, 2003
Current U.S. Class: 463/49
Current CPC Class: A63F 2300/64 20130101; A63F 13/10 20130101; A63F 2300/8076 20130101; A63F 2300/6684 20130101; A63F 13/5258 20140902; A63F 2300/6661 20130101; A63F 13/577 20140902; A63F 2300/643 20130101; A63F 13/837 20140902
Class at Publication: 463/049
International Class: A63F 009/24

Foreign Application Data: May 21, 2002 (JP) 2002-146900
Claims
I claim:
1. A game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character
caused by the player's shooting of it, based on a distance between
the virtual camera and the character and the distance between the
character and the center of an effective shooting radius that
changes in accordance with the distance between the virtual camera
and the character.
2. The game device according to claim 1, wherein the damage point
computing means computes a damage point of the character caused by
the player's shooting of it, by multiplying a damage value, which
is determined based on the distance between the virtual camera and
the character, by a damage rate that is determined based on the
distance between the character and the center of the effective
shooting radius that changes in accordance with the distance
between the virtual camera and the character.
3. The game device according to claim 1 or 2, wherein the damage
value is determined such that the further the distance between the
virtual camera and the character is, the smaller the damage value
is, the effective shooting radius is determined such that the
further the distance between the virtual camera and the character
is, the larger the effective shooting radius is, and the damage
rate is determined such that the further the distance between the
character and the center of the effective shooting radius is, the
smaller the proportion is.
4. A game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character
caused by the shooting in accordance with a proportion of an
overlapping area to the shooting radius, and in the overlapping
area an effective shooting radius and a collision sphere of the
enemy overlap.
5. A game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character
caused by the shooting in accordance with a proportion of an
overlapping area to the collision sphere of the enemy, and in the
overlapping area an effective shooting radius and a collision
sphere of the enemy overlap.
6. A controlling method of a game device for simulating a player's
shooting of a character defined in a three-dimensional virtual
space while moving a virtual camera located in the
three-dimensional virtual space, wherein a damage point of the
character caused by the player's shooting of it is computed based
on a distance between the virtual camera and the character and the
distance between the character and the center of an effective
shooting radius that changes in accordance with the distance
between the virtual camera and the character.
7. A controlling method of a game device for simulating a player's
shooting of a character defined in a three-dimensional virtual
space while moving a virtual camera located in the
three-dimensional virtual space, wherein a damage point of the
character caused by the shooting is computed in accordance with a
proportion of an overlapping area to the shooting radius, and in
the overlapping area an effective shooting radius and a collision
sphere of the enemy overlap.
8. A controlling method of a game device for simulating a player's
shooting of a character defined in a three-dimensional virtual
space while moving a virtual camera located in the
three-dimensional virtual space, wherein a damage point of the
character caused by the shooting is computed in accordance with a
proportion of an overlapping area to the collision sphere, and in
the overlapping area an effective shooting radius and a collision
sphere of the enemy overlap.
9. A game controlling method whereby a game device is controlled
such that it determines whether a hit decision area generated in a
virtual space in accordance with a player's manipulation overlaps
with an object located in the virtual space, and the character is
damaged when it is determined that the hit decision area overlaps
with the object, the method comprising: a step of obtaining first
positional information indicating a position of a character being
manipulated by the player in the virtual space; a step of obtaining
second positional information indicating a position of the object
in the virtual space; a step of computing a distance between the
character and the object based on the obtained first and second
positional information; a step of changing the size of the hit
decision area based on the obtained distance; and a step of
computing, when it is determined that the hit decision area and the
object overlap with each other, a damage amount for the object
based on the obtained distance, and realizing damage to the object
based on the obtained damage amount.
10. The game controlling method according to claim 9, wherein when
the obtained distance is shorter than a predetermined distance, the
hit decision area is set to be small and the damage amount is set to
be large.
11. The game controlling method according to claim 9 or 10, wherein
when the computed distance is longer than a predetermined distance,
the hit decision area is set to be large and the damage amount is
set to be small.
12. A game controlling method whereby a game device is controlled
such that it determines whether a hit decision area, generated in a
virtual space with a predetermined target point at the center in
accordance with a player's manipulation, overlaps with an object
located in the virtual space and the object is damaged when it is
determined that the hit decision area overlaps with the object, the
method comprising: a step of obtaining positional information
indicating a position of the object in the virtual space; a step of
computing an area wherein the hit decision area and the object
overlap with each other, based on the hit decision area and the
obtained positional information; and a step of generating data of a
damage amount to be attributed to the object, based on the obtained
area, and providing damage to the object based on the generated
damage amount data.
13. A shooting game controlling method, wherein the shooting game
is controlled such that it simulates a player's shooting of a
character defined in a three-dimensional virtual space, while a
virtual camera located in the three-dimensional virtual space is
moving at a predetermined speed, comprising: a step of changing the
distance between the character and the virtual camera; a step of
changing the moving speed of the virtual camera or the speed of
directing the virtual camera to the character, based on the
distance between the character and the virtual camera; a step of
changing the player's effective shooting radius based on the
distance between the virtual camera and the character; a step of
determining whether a bullet has hit the character, based on the
character's position and the location of the effective shooting
radius; and a step of computing, when it is determined that the
bullet has hit the character in the determination step, a damage
amount caused to the character by the shooting, based on both the
distance between the virtual camera and the character as well as
the distance between the character and the center of the effective
shooting radius.
14. An image processing method for moving a virtual camera located
in a three-dimensional virtual space at a predetermined speed and
changing the distance between the virtual camera and a character
defined in the three-dimensional virtual space, wherein the moving
speed of the virtual camera changes based on the distance between
the virtual camera and the character.
15. The image processing method according to claim 14, wherein a
first character defined in the three-dimensional virtual space and
a second character manipulated by a player, are displayed, and the
moving speed of the virtual camera changes based on the distance
between the first and second characters.
16. An image processing method for directing a virtual camera to a
character located in a three-dimensional virtual space, wherein a
fixation point of the virtual camera is set on the character in a
manner so that the speed of directing the virtual camera to the
character changes based on the distance between the virtual camera
and the character.
17. A game device composed such that a virtual camera located in a
three-dimensional virtual space moves at a predetermined speed and
the distance between the virtual camera and a character defined in
the virtual space changes, comprising: virtual camera controlling
means for changing a moving speed of the virtual camera based on
the distance between the virtual camera and the character.
18. The game device according to claim 17, wherein a first
character defined in the three-dimensional virtual space and a
second character manipulated by a player are displayed, and the
virtual camera controlling means changes the moving speed of the
virtual camera based on the distance between the first and second
characters.
19. The game device according to claim 17, wherein the virtual
camera controlling means controls the virtual camera moving speed
in a manner that the shorter the distance between the virtual
camera and the character becomes, the more the virtual camera
moving speed decreases.
20. The game device according to claim 17, wherein a plurality of
areas are provided in the three-dimensional virtual space with the
virtual camera at the center, and the virtual camera controlling
means determines in which area a character closest to the virtual
camera exists, and controls the virtual camera moving speed in
accordance with the determined area.
21. A game device for directing a virtual camera to a character
located in a three-dimensional virtual space, comprising: fixation
point setting means for setting a fixation point of the virtual
camera on the character such that the speed of directing the
virtual camera to the character changes in accordance with the
distance between the virtual camera and the character.
22. An image processing device composed to change a distance between
a character and a virtual camera both located in a
three-dimensional virtual space, comprising: means for computing
the distance between the virtual camera and the character; and
virtual camera controlling means for changing a moving speed of the
virtual camera in accordance with the computed distance.
23. An information processing program for making a computer execute
the game device controlling method according to any one of claims 6
to 16.
24. A computer-readable recording medium having an information
processing program stored therein that makes a computer execute the
game device controlling method according to any one of claims 6 to
16.
Description
BACKGROUND
[0001] The present invention relates to an image processing device,
particularly to a game device.
[0002] In recent years, numerous image processing devices of the
type called three-dimensional game devices have been proposed. An
image processing device defines various kinds of characters in a
virtual space formed by the computer and loads the manipulation
information of players into the computer through peripheral
equipment such as joysticks, thereby realizing image processing for
moving characters, etc. As a result of the image processing, images
which are viewed from a three-dimensional virtual space viewpoint,
called a virtual camera, are displayed on a TV monitor for the
players to see.
[0003] One example of the image processing device is a game device
in which players compete against each other over shooting
characters displayed on the screen (for example, "House of the
Dead" available from Sega Enterprises, Ltd.). This game device is
composed such that the virtual camera moves along a predetermined
course in the three-dimensional space, and a player moves while
shooting enemies (zombies). Each enemy has some weak points and
when the player accurately hits a weak point, the enemy incurs a
damage point, and when the total damage point exceeds a
predetermined value, the enemy is defeated. Further, a time limit
is predetermined for the player to defeat the enemies. Therefore,
if the player fails to defeat the enemies within the predetermined
time limit, the player is attacked by the enemies and the player
incurs damage points.
[0004] This game device is also composed such that, when the player
switches on a trigger of a gun pointing at the screen, the time
elapsed for the gun to detect a scanning line on the screen is
computed, and the coordinates of the location pointed at by the gun
is further computed, and thereby a decision is made as to whether
or not a bullet will hit the enemy character.
[0005] This type of image processing device, however, has the
following problems.
[0006] First of all, the conventional game device has
configurations unfavorable to the player, for example, a time to
fight with the enemies is predetermined and if the player cannot
defeat the enemies within the predetermined time limit, the player
is attacked by the enemies and his/her damage points increase.
However, in the case where the player does defeat the enemies
within the time limit, there is no arrangement favorable to the
player. For instance, when the player defeats the enemies just in
time, or even when the player defeats them with time to spare,
there are no positive effects on the future development of the game
or the player's game score. Accordingly, there is a problem that
the players, who are skilled in the shooting technique and can
defeat the enemies within the time limit, lose their fighting
spirits.
[0007] Secondly, the conventional game device places little emphasis
on realism when deciding whether a bullet hits an enemy. Therefore,
the player cannot develop a strategy by familiarizing
himself/herself with the types and characteristics of weapons.
Accordingly, the conventional game device does not present enough
entertaining characteristics for a gun shooting game.
SUMMARY
[0008] In order to achieve the above objects, the present invention
provides an image processing method for moving a virtual camera
located in a three-dimensional virtual space at a predetermined
speed and changing the distance between the virtual camera and a
character defined in the three-dimensional virtual space, wherein
the moving speed of the virtual camera changes based on the
distance between the virtual camera and the character.
[0009] According to the image processing method, a first character
defined in the three-dimensional virtual space and a second
character manipulated by the player, are displayed, and the moving
speed of the virtual camera changes based on the distance between
the first and second characters.
[0010] The present invention also provides an image processing
method for directing a virtual camera to a character located in a
three-dimensional virtual space, wherein a fixation point of the
virtual camera is set on the character in a manner so that the
speed of directing the virtual camera to the character changes
based on the distance between the virtual camera and the
character.
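The camera-directing control of paragraph [0010] can be roughly illustrated as follows. The turn-rate scaling, the clamp, and the constants are hypothetical assumptions for the sketch, not formulas disclosed in the specification:

```python
import math

def update_camera_yaw(camera_yaw, camera_pos, char_pos, base_rate=0.05):
    """Turn the camera's fixation point toward the character.

    The turn rate grows as the camera-to-character distance shrinks,
    so nearby characters are tracked quickly and distant ones slowly.
    The 10.0 scale factor and the clamp are illustrative assumptions.
    """
    dx = char_pos[0] - camera_pos[0]
    dz = char_pos[1] - camera_pos[1]
    target_yaw = math.atan2(dx, dz)              # angle toward the character
    distance = math.hypot(dx, dz)
    rate = min(1.0, base_rate * 10.0 / max(distance, 1.0))
    # Shortest signed angular difference, wrapped into (-pi, pi].
    diff = (target_yaw - camera_yaw + math.pi) % (2 * math.pi) - math.pi
    return camera_yaw + diff * rate
```

With these assumed constants, a character 5 units away pulls the camera around in a larger step per frame than the same character 50 units away.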
[0011] The present invention further provides a game device that is
composed such that a virtual camera located in a three-dimensional
virtual space moves at a predetermined speed and the distance
between the virtual camera and a character defined in the virtual
space changes, comprising: virtual camera controlling means for
changing a moving speed of the virtual camera based on the distance
between the virtual camera and the character.
[0012] It is desirable that the game device display a first
character defined in the three-dimensional virtual space and a
second character manipulated by the player, and the virtual camera
controlling means change the moving speed of the virtual camera
based on the distance between the first and second characters.
[0013] It is also desirable that the virtual camera controlling
means control the virtual camera moving speed in a manner that the
shorter the distance between the virtual camera and the character
becomes, the more the virtual camera moving speed decreases.
[0014] Regarding the game device, it is desirable that a plurality
of areas be provided in the three-dimensional virtual space with
the virtual camera at the center, and the virtual camera
controlling means determine in which area a character closest to
the virtual camera exists, and control the virtual camera moving
speed in accordance with the determined area.
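The area-based speed control of paragraph [0014] might be sketched as concentric distance bands around the camera; the band containing the enemy closest to the camera selects the moving speed. All radii and speed values below are illustrative assumptions:

```python
# Hypothetical distance bands around the virtual camera, as
# (outer radius, camera speed) pairs ordered from innermost outward.
SPEED_BANDS = [(10.0, 0.0), (30.0, 0.5), (60.0, 1.0)]
DEFAULT_SPEED = 2.0  # no enemy in any band: move at full course speed

def camera_speed(distances_to_enemies):
    """Pick the camera speed from the band holding the nearest enemy."""
    if not distances_to_enemies:
        return DEFAULT_SPEED
    nearest = min(distances_to_enemies)
    for outer_radius, speed in SPEED_BANDS:
        if nearest <= outer_radius:
            return speed
    return DEFAULT_SPEED
```

In this reading, the camera stops when an enemy is very close, creeps while enemies are at mid range, and travels its course at full speed otherwise.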
[0015] The present invention provides a game device for directing a
virtual camera to a character located in a three-dimensional
virtual space comprising: fixation point setting means for setting
a fixation point of the virtual camera on the character such that
the speed of directing the virtual camera to the character changes
in accordance with the distance between the virtual camera and the
character.
[0016] The present invention also provides a game device for
simulating a player's shooting of a character defined in a
three-dimensional virtual space while moving a virtual camera
located in the three-dimensional virtual space, comprising:
computing means for computing a damage point of the character
caused by the player's shooting of it, based on a distance between
the virtual camera and the character and the distance between the
character and the center of an effective shooting radius that
changes in accordance with the distance between the virtual camera
and the character.
[0017] It is desirable that the damage point computing means
compute a damage point of the character caused by the player's
shooting of it, by multiplying a damage value, which is determined
based on the distance between the virtual camera and the character,
by a damage rate that is determined based on the distance between
the character and the center of the effective shooting radius that
changes in accordance with the distance between the virtual camera
and the character.
[0018] It is desirable that: the damage value be determined such
that the further the distance between the virtual camera and the
character is, the smaller the damage value is; the effective
shooting radius be determined such that the further the distance
between the virtual camera and the character is, the larger the
effective shooting radius is; and the damage rate be determined
such that the further the distance between the character and the
center of the effective shooting radius is, the smaller the
proportion is.
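One way to read paragraphs [0017] and [0018] as code is the sketch below. The linear and reciprocal falloff shapes and all constants are assumptions for illustration, not formulas disclosed in the specification:

```python
def damage_value(camera_char_dist, max_damage=100.0, falloff=0.5):
    """Damage value shrinks as the camera-to-character distance grows."""
    return max_damage / (1.0 + falloff * camera_char_dist)

def effective_radius(camera_char_dist, base_radius=1.0, spread=0.1):
    """Effective shooting radius widens with distance (shot spread)."""
    return base_radius + spread * camera_char_dist

def damage_rate(offset_from_center, radius):
    """Rate falls off linearly from 1.0 at the center to 0.0 at the rim."""
    return max(0.0, 1.0 - offset_from_center / radius)

def damage_point(camera_char_dist, offset_from_center):
    """Damage point = damage value x damage rate, per paragraph [0017]."""
    r = effective_radius(camera_char_dist)
    return damage_value(camera_char_dist) * damage_rate(offset_from_center, r)
```

Under these assumed shapes, close-range shots through the center of the effective shooting radius score the most damage, while hits outside the radius score nothing.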
[0019] The present invention provides a game device for simulating
a player's shooting of a character defined in a three-dimensional
virtual space while moving a virtual camera located in the
three-dimensional virtual space, comprising: computing means for
computing a damage point of the character caused by the shooting in
accordance with a proportion of an overlapping area to the shooting
radius, and in the overlapping area an effective shooting radius
and a collision sphere of the enemy overlap.
[0020] The present invention provides a game device for simulating
a player's shooting of a character defined in a three-dimensional
virtual space while moving a virtual camera located in the
three-dimensional virtual space, comprising: computing means for
computing a damage point of the character caused by the shooting in
accordance with a proportion of an overlapping area to the
collision sphere of the enemy, and in the overlapping area an
effective shooting radius and a collision sphere of the enemy
overlap.
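Treating the effective shooting scope and the enemy's collision sphere as circles in the aiming plane (a simplifying assumption; the specification speaks of a collision sphere), the proportional-overlap damage of paragraphs [0019] and [0020] could be sketched with the standard circle-intersection formula:

```python
import math

def circle_overlap_area(r1, r2, d):
    """Area of intersection of two circles with radii r1, r2 and
    center distance d (standard lens-area formula)."""
    if d >= r1 + r2:
        return 0.0                        # disjoint: no overlap
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # one circle inside the other
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def overlap_damage(shoot_radius, sphere_radius, d, base_damage=100.0,
                   relative_to="shoot"):
    """Damage proportional to the overlap, relative either to the
    effective shooting area ([0019]) or the collision sphere ([0020])."""
    overlap = circle_overlap_area(shoot_radius, sphere_radius, d)
    ref = shoot_radius if relative_to == "shoot" else sphere_radius
    return base_damage * overlap / (math.pi * ref * ref)
```

A shot that fully envelops the collision sphere yields full damage relative to the sphere, while a graze yields a fraction.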
[0021] The present invention provides an image processing device
that is composed to change a distance between a character and a
virtual camera both located in a three-dimensional virtual space,
comprising: means for computing the distance between the virtual
camera and the character; and virtual camera controlling means for
changing a moving speed of the virtual camera in accordance with
the computed distance.
[0022] The present invention provides a controlling method of a
game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, wherein a
damage point of the character caused by the player's shooting of it
is computed on the basis of a distance between the virtual camera
and the character and the distance between the character and the
center of an effective shooting radius that changes in accordance
with the distance between the virtual camera and the character.
[0023] The present invention provides a controlling method of a
game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, wherein a
damage point of the character caused by the shooting is computed in
accordance with a proportion of an overlapping area to the shooting
radius, and in the overlapping area an effective shooting radius
and a collision sphere of the enemy overlap.
[0024] The present invention provides a controlling method of a
game device for simulating a player's shooting of a character
defined in a three-dimensional virtual space while moving a virtual
camera located in the three-dimensional virtual space, wherein a
damage point of the character caused by the shooting is computed in
accordance with a proportion of an overlapping area to the
collision sphere, and in the overlapping area an effective shooting
radius and a collision sphere of the enemy overlap.
[0025] The present invention provides a game controlling method
whereby a game device is controlled such that it determines whether
a hit decision area generated in a virtual space in accordance with
a player's manipulation overlaps with an object located in the
virtual space, and the character is damaged when it is determined
that the hit decision area overlaps with the object, the method
comprising: a step of obtaining first positional information
indicating a position of a character being manipulated by the
player in the virtual space; a step of obtaining second positional
information indicating a position of the object in the virtual
space; a step of computing a distance between the character and the
object based on the obtained first and second positional
information; a step of changing the size of the hit decision area
based on the obtained distance; and a step of computing, when it is
determined that the hit decision area and the object overlap with
each other, a damage amount for the object based on the obtained
distance, and realizing damage to the object based on the obtained
damage amount.
[0026] It is desirable that the hit decision area be set to be small
and the damage amount be set to be large when the obtained distance
is shorter than a predetermined distance.
[0027] It is desirable that the hit decision area be set to be large
and the damage amount be set to be small when the computed distance
is longer than a predetermined distance.
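A minimal sketch of the near/far trade-off in paragraphs [0026] and [0027], assuming a single hypothetical cutoff distance and illustrative values:

```python
# NEAR_CUTOFF and all numeric values are hypothetical; the source only
# states the qualitative rule: near -> small hit area, large damage;
# far -> large hit area, small damage.
NEAR_CUTOFF = 20.0

def hit_settings(distance):
    """Return the hit decision radius and damage amount for a distance."""
    if distance < NEAR_CUTOFF:
        return {"hit_radius": 0.5, "damage": 80}  # near: precise but strong
    return {"hit_radius": 2.0, "damage": 20}      # far: forgiving but weak
```

The design intent is a balance: distant targets are easier to hit but take less damage per hit, so closing in is rewarded.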
[0028] The present invention provides a game controlling method
whereby a game device is controlled such that it determines whether
a hit decision area, generated in a virtual space with a
predetermined target point at the center in accordance with a
player's manipulation, overlaps with an object located in the
virtual space and the object is damaged when it is determined that
the hit decision area overlaps with the object, the method
comprising: a step of obtaining positional information indicating a
position of the object in the virtual space; a step of computing an
area wherein the hit decision area and the object overlap with each
other, based on the hit decision area and the obtained positional
information; and a step of generating data of a damage amount to be
attributed to the object, based on the obtained area, and providing
damage to the object based on the generated damage amount data.
[0029] The present invention provides a shooting game controlling
method, wherein the shooting game is controlled such that it
simulates a player's shooting of a character defined in a
three-dimensional virtual space, while a virtual camera located in
the three-dimensional virtual space is moving at a predetermined
speed, comprising: a step of changing the distance between the
character and the virtual camera; a step of changing the moving
speed of the virtual camera or the speed of directing the virtual
camera to the character, based on the distance between the
character and the virtual camera; a step of changing the player's
effective shooting radius based on the distance between the virtual
camera and the character; a step of determining whether a bullet
has hit the character, based on the character's position and the
location of the effective shooting radius; and a step of computing,
when it is determined that the bullet has hit the character in the
determination step, a damage amount caused to the character by the
shooting, based on both the distance between the virtual camera and
the character as well as the distance between the character and the
center of the effective shooting radius.
DESCRIPTION OF DRAWINGS
[0030] FIG. 1 is a block diagram indicating the general structure
of a game device according to one embodiment of the present
invention.
[0031] FIG. 2 is a flow chart of the entire general process performed
by the CPU according to the embodiment.
[0032] FIG. 3 is a flow chart of the process executed in the game
mode.
[0033] FIG. 4 illustrates the relationship between virtual camera
movements and the enemies.
[0034] FIG. 5 is a flow chart of one example for the controlling
process of the virtual camera.
[0035] FIG. 6(A) illustrates the relationship between the enemy
sensing distance and the position d of the enemy closest to the
virtual camera.
[0036] FIG. 6(B) shows examples of formulas for computing
acceleration.
[0037] FIG. 7 is a diagram explaining a fixation point for the
virtual camera.
[0038] FIG. 8 is a diagram showing one example of the relationship
between the distance to the enemy and the bullet strength.
[0039] FIG. 9 is a diagram showing one example of the relationship
between an effective shooting scope and the bullet strength.
[0040] FIG. 10 is a diagram showing one example of an effective
shotgun radius.
[0041] FIG. 11 is a flow chart explaining the entire hit decision
process.
[0042] FIG. 12(A) is a flow chart of the hit decision process.
[0043] FIG. 12(B) is an example in which a collision cone of the
shotgun's bullet is divided into 16 sections.
[0044] FIG. 13(A) is a flow chart explaining the damage
process.
[0045] FIG. 13(B) is an example of a computation for damage
rate.
[0046] FIG. 14 shows an example of the configuration of the damage
chart.
[0047] FIG. 15 shows image examples of objects (enemies) being
shot.
[0048] FIG. 16 is a flow chart of the injury process.
[0049] FIG. 17 shows an example of the configuration for an injury
progression value (damage progression value) chart.
[0050] FIG. 18 is a diagram explaining a second damage process.
[0051] FIG. 19 is a diagram explaining a second damage process.
DETAILED DESCRIPTION
[0052] An embodiment of the present invention will be explained
with reference to the drawings. In this embodiment, explanations
are given for the case wherein the game device of the present
invention is applied to a gun shooting game of a so-called arcade
type game. Nevertheless, this invention is not limited to this type
and can also be applied to game software for home game devices.
[0053] [Block Diagram of Game Device]
[0054] FIG. 1 is a block diagram indicating one example of the game
device of an arcade type game for playing a gun shooting game,
according to the present invention. The basic components of this
game device include a game device main body 10, an input device 11,
a TV monitor 13, and a speaker 14.
[0055] The input device 11 is a weapon such as a gun, shotgun, or a
machine gun, for shooting enemies in the game. In this embodiment,
the weapon is a shotgun used by the game player. The shotgun includes
a photoreceptor for reading a scanning spot (a light spot of an
electron beam) to detect an impact point on the TV monitor, and a
trigger switch equivalent to the trigger of a regular shotgun.
Scanning spot detection signals and trigger signals are
transmitted to the interface 106, which will be described
hereinafter, via a connecting cord. The TV monitor 13 displays
images showing the status of the game development. The TV monitor
can be replaced by a projector.
[0056] The game device main body 10 comprises a central processing
unit (CPU) 101, a ROM 102, a RAM 103, a sound device 104, an
input/output interface 106, a scroll data processor 107, a
coprocessor (auxiliary processor) 108, a landform contour data ROM
109, a geometrizer 110, a form data ROM 111, a drawing device 112,
a texture data ROM 113, a texture map RAM 114, a frame buffer 115,
an image composition device 116, and a D/A converter 117. Examples
of a storage medium used in this invention as the ROM 102 may
include a hard disc, a cartridge-type ROM, a CD-ROM, other
well-known media, and communication media (the internet and other
personal computer communication networks).
[0057] The CPU 101 is connected through a bus-line to: the ROM 102
having a predetermined program stored therein; RAM 103 for storing
data; sound device 104; input/output interface 106; scroll data
processor 107; coprocessor 108; and geometrizer 110. The RAM 103 is
operated as a RAM buffer. Various commands (to display objects,
etc.) to the geometrizer 110 and matrices obtained by computing the
transformation matrix are written on the RAM 103.
[0058] The input device 11 (shotgun) is connected to the
input/output interface 106. CPU 101 checks whether the shotgun was
fired based on a scanning spot detection signal sent from the
shotgun 11 and a trigger signal indicating that the shotgun switch
was pulled, and identifies an impact point and the number of shots
fired in accordance with the current coordinates (X, Y) of the
location of the scanning electron beam on the TV monitor and a
location of a target. Then, CPU 101 sets various corresponding
flags at predetermined positions in the RAM 103.
[0059] The sound device 104 is connected to the speaker 14 through
a power amplifier 105, and audio signals generated by the sound
device 104 are amplified in electric power, and then transmitted to
the speaker 14.
[0060] In this embodiment, the CPU 101 reads, on the basis of a
program stored in the ROM 102, the game story development, the
landform data in ROM 109 or the form data (three-dimensional data
of "objects such as enemy characters" and "the game scenery
including landscape, buildings, interiors, and underground
passages") in the ROM 111, then the CPU 101 determines the situation
in the three-dimensional virtual space and executes the shooting
process in correspondence with the trigger signals sent from the
input device 11.
[0061] Regarding the various objects in the virtual game space,
their coordinate values in the three-dimensional space are
determined, then the transformation matrix for transforming the
coordinate values to a viewpoint coordinate system, and the form
data (buildings, landforms, interiors, laboratories, furniture,
etc.) are specified to the geometrizer 110. The coprocessor 108 is
connected to the landform data ROM 109, and accordingly, the
landform data for the predetermined movement course of the camera
is delivered to the coprocessor 108 (and the CPU 101). The
coprocessor 108 decides whether a bullet hits a target, computes
deviations of objects from the camera's line of sight, executes the
process for the movement of the line of sight, and takes on
computing floating-points upon such decision and computation.
Consequently, the results of the coprocessor's decision as to
whether a bullet hit an object and the process for the movement of
the line of sight which is movement relative to the position of the
objects, are transmitted to the CPU 101.
[0062] The geometrizer 110 is connected to the form data ROM 111
and the drawing device 112. Prestored on the form data ROM 111 is
the polygon form data (three-dimensional data of buildings, walls,
corridors, interiors, landforms, scenery, the main character,
characters on the main character's side, various kinds of enemies
(zombies), etc., all being composed of respective vertices). This
form data is delivered to the geometrizer 110. The geometrizer 110
executes the perspective transformation of the form data specified
by the transformation matrix sent from the CPU 101, and obtains the
form data in which the coordinate system of the three-dimensional
virtual space has been transformed to the coordinate system of a
visual field.
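The two-stage transformation described in this paragraph, from the three-dimensional virtual space to the visual-field coordinate system followed by the perspective transformation, can be sketched as follows. The function names, the yaw-only rotation, and the focal constant are illustrative assumptions, not details from the application:

```python
import math

def world_to_view(point, cam_pos, yaw):
    """Transform a world-space point into the camera (visual field)
    coordinate system: translate by the camera position, then rotate
    by the camera's yaw angle. (Illustrative; the geometrizer would
    use a full transformation matrix supplied by the CPU.)"""
    x, y, z = (point[i] - cam_pos[i] for i in range(3))
    c, s = math.cos(-yaw), math.sin(-yaw)
    return (c * x + s * z, y, -s * x + c * z)

def perspective(view_point, focal=1.0):
    """Perspective transformation: divide by depth to obtain screen
    coordinates (points behind the camera are rejected)."""
    x, y, z = view_point
    if z <= 0:
        return None
    return (focal * x / z, focal * y / z)
```

A full implementation would use 4x4 matrices so that rotation, translation, and projection compose into a single transform per vertex.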
[0063] The drawing device 112 pastes together the transformed form
data of the visual field coordinate system and the textures and
then outputs it to the frame buffer 115. For pasting together the
textures, the drawing device 112 is connected to the texture data
ROM 113, the texture map RAM 114, and the frame buffer 115. Polygon
data refers to a data group of the relative or absolute coordinates
of the plural vertices composing each polygon (mainly a triangle or
quadrangle). The polygon
data stored in the landform data ROM 109, is set relatively roughly
but enough to move the camera in the virtual space along the game
storyline. On the other hand, the polygon data stored in the form
data ROM 111, is set in more detail concerning the forms, such as
enemies and backgrounds, that compose the screen.
[0064] Scroll data processor 107 processes data such as letters on
a scroll screen. The scroll data processor 107 and the frame buffer
115 are connected to the TV monitor 13 via the image composition
device 116 and the D/A converter 117. Accordingly, the polygon
screen (a simulation result) for objects (enemies) and landforms
(backgrounds) stored temporarily in the frame buffer 115, and the
scroll screen for text information (for example, the player's
LifeCount value, damage points, etc.) are composed to generate
final frame-image data. This frame-image data is converted into
analog signals by the D/A converter 117 and transmitted to the TV
monitor 13, thereby, real-time images of the shooting game are
displayed.
[0065] [Entire Game Flow]
[0066] Now, the entire flow of the game will be explained with
reference to FIG. 2. FIG. 2 is a flow chart explaining the outline
of the game, and the process flow is broadly classified into a
movement mode and a game mode. In the movement mode (S10), the
virtual camera moves in the virtual game space created in the
computer system in accordance with the pre-programmed game story,
and also projects various game status updates onto the screen.
[0067] When the virtual camera moves to a point where a
preprogrammed enemy appears, an enemy is displayed on the screen
(S20) and the game switches to the game mode for developing the
shooting game (S30). During the game mode, the player can move
forward while shooting the enemies. When the player defeats the
enemies, the virtual camera again moves according to the
preprogrammed game story and enters another game status (S40; YES),
thereby further developing the game (S10 to S30).
[0068] If the player cannot defeat the enemies and loses, or if the
player clears the final game (S40; NO), the game is determined to
be over. The player can go back to the game mode of a different
status, or to the exact game mode in which the player lost, if, for
example, the main character's damage is not heavy (S50; NO).
Alternatively, the game is over (S50; YES) when the set time of
each section of the game runs out or when game parameters, such as
damage point, satisfy the game termination requirements.
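The overall flow of S10 to S50 can be illustrated with a minimal loop; the function and data shapes below are hypothetical stand-ins for the flow chart of FIG. 2, not code from the application:

```python
def run_game(sections):
    """Illustrative outer loop over the movement/game modes (S10-S50).

    `sections` is a list of booleans, one per fight scene, each True
    if the player survives that scene. Returns the number of scenes
    cleared before the game is over.
    """
    cleared = 0
    for survived in sections:
        # S10: movement mode - the camera advances along the story.
        # S20/S30: enemies appear and the shooting game mode runs.
        if not survived:      # S40/S50: the player lost and cannot continue
            break
        cleared += 1          # S40; YES: proceed to the next game status
    return cleared
```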
[0069] [Game Mode]
[0070] Now, the process flow in the game mode will be explained
with reference to FIG. 3. FIG. 3 is a flow chart explaining the
process in the game mode (S30). The virtual camera moves in the
virtual space and when an enemy appears, an enemy appearance means
executes the process to make enemies appear of the type and number
preprogrammed for the scene (S302). Regarding the process to make
enemies appear, well-known techniques such as the Japanese Patent
Laid-Open Publication No. Hei 10-185547 may be used.
[0071] Along with the appearance of an enemy, a virtual camera
controlling means changes the moving speed of the virtual camera in
accordance with the distance between the virtual camera (viewpoint)
and the enemy (S304). The process to control the moving speed of
the virtual camera will be explained hereinafter with reference to
FIGS. 4 to 6. The virtual camera controlling means also changes a
fixation point of the virtual camera in accordance with the
distance between the virtual camera (viewpoint) and the enemy
(S304). This process will be explained hereinafter with reference
to FIG. 4.
[0072] The player can shoot the enemies that appear on the screen.
When an enemy is shot, a shooting result determination means
determines a shooting result (S306). At first, the shooting result
determination means determines whether the bullet has hit the enemy
(hit decision), and if the bullet has hit the enemy, a hit flag is
set for the shot enemy, and a damage point and an injury
progression value incurred by the shot are computed. The hit
decision and the computation of a damage point are executed in
accordance with the shotgun properties, and the details of this
process will be hereinafter explained with reference to FIGS. 11 to
13.
[0073] When the enemy is shot and vanishes, an enemy moving means
executes the enemy moving process (S308) for moving another enemy
towards a clear space or the place where the previous enemy was
shot. As for the moving process of the enemies, well-known
techniques such as the above-cited Japanese Patent Laid-Open
Publication No. Hei 10-185547 may be used.
[0074] Subsequently, a determination is made as to whether the
fighting will continue. If the fighting is not yet finished (S310;
YES), such as in the case that some enemies still remain on the
screen, it is determined whether to make further enemies appear
based on the program of the game story (S312). If another enemy
should appear (S312; YES), the enemy is made to appear (S302). If
enemies should no longer appear (S312; NO), the process moves on to
the determination of the shooting results for the remaining enemies
(S306), and Steps 308 to 310 are repeated. When the fighting is
determined to be over (S310; NO), the process returns to the
above-mentioned Step 40. Then, it is determined whether to return
to the movement mode that leads to another fight scene (S40), or to
finish the game (S50).
[0075] [Virtual Camera Movement]
[0076] Now, improvements regarding the virtual camera movements by
a virtual camera controlling means will be explained. The virtual
camera is a viewpoint located in the three-dimensional virtual
space, and images seen from this viewpoint are presented to the
player through the monitor. In the conventional game, when the
virtual camera arrives at the preprogrammed enemy appearance point,
the virtual camera stops its movement. Accordingly, when an enemy
appears (i.e. when the virtual camera arrives at a predetermined
position), the player stops to shoot the enemy. Further, the moving
speed for the virtual camera of a conventional game is
constant.
[0077] On the contrary, in this embodiment, since the moving speed
of the camera changes in accordance with the distance between the
camera and the enemy, the player's speed of defeating the enemies
affects the progression of the game. For example, the moving speed
of the camera is changed such that the shorter the distance
becomes, the slower the moving speed of the camera becomes.
Accordingly, the faster the player defeats the approaching enemies
(the more the player defeats the enemies far off in the distance),
the faster the game progresses, and consequently, the player can
obtain a higher score. On the other hand, the more slowly the
player defeats the enemies, the more slowly the game progresses,
therefore, the player cannot obtain a high score. In short, in the
event that the player defeats the enemies within the time-limit,
the speed of defeating the enemies influences the game development
and the game results. Further, since the player finds and shoots
the enemies while moving toward a destination, the player feels a
sense of urgency and thus, an increased enjoyment of the game.
[0078] Now, the relationship between the virtual camera movement
and an enemy will be explained with reference to FIG. 4. As shown
in FIG. 4, the virtual camera moves along a predetermined track
with a predetermined speed and angle. There is a certain distance
from the virtual camera (the enemy sensing distance) within which
the player can sense an enemy, and the virtual camera moving speed
changes depending on the distance between the virtual camera and
the enemy. When an enemy comes within this enemy sensing
distance, the virtual camera moving speed decelerates. The closer
the enemy approaches the virtual camera, the more the virtual
camera moving speed decelerates, and when the enemy reaches a
certain proximity, the virtual camera stops its movement.
[0079] As FIG. 4 shows, the enemy sensing distance is divided into
three areas having the virtual camera at the center: a normal moving
speed area in which the virtual camera moves at a normal speed; a
low moving speed area in which the virtual camera moves at a low
speed; and a non-moving area in which the virtual camera stops its
movement. These areas are defined by the enemy's distance from the
virtual camera.
[0080] When an enemy is in the normal moving speed area of the
camera, the player feels that the enemy is quite far away, and the
moving speed of the virtual camera does not change and maintains
its normal speed. When the enemy enters the low moving speed area
of the camera, the player feels that he/she must defeat the enemy
and the virtual camera moving speed becomes slower than the normal
speed. When the enemy further enters the non-moving area of the
camera, the player feels danger that he/she might be defeated, and
the virtual camera stops its movement. Areas for determining the
virtual camera moving speed are not limited to these three areas.
Any area can be appropriately set according to the difficulty level
of a game.
[0081] Next, explanations will be given for the process flow to
change the virtual camera moving speed in accordance with the
distance between the camera and the enemy, with reference to FIGS.
5, 6(A) and 6(B). FIG. 5 is a flow chart explaining the process for
controlling the virtual camera (S304 in FIG. 3). FIG. 6(A) shows
the areas of the different moving speeds of camera. FIG. 6(B)
explains formulas for computing an acceleration for the camera.
[0082] At first, a position d of an enemy that is closest to the
virtual camera is determined (S304a). As shown in FIG. 6(A), the
non-moving area of the camera is within the 2.5-meter distance from
the virtual camera; the low moving speed area of the camera is
within the 10-meter distance from the virtual camera (excluding the
non-moving area of the camera); and the normal moving speed area of
the camera is the area further than the 10-meter distance from the
virtual camera.
[0083] Then, it is determined whether the enemy's position d is
within the non-moving area of the camera (S304b). If it is
determined that it is within the non-moving area (S304b; YES), an
acceleration in the non-moving area is computed (S304c). The
acceleration is obtained by the formula shown in FIG. 6(B).
[0084] If it is determined that the enemy's position d is not
within the non-moving area of the camera (S304b; NO), it is then
determined whether the position d is within the low moving speed
area of the camera (S304d). If it is determined that the position d
is within the low moving speed area of the camera (S304d; YES), an
acceleration in the low moving speed area of the camera is computed
(S304e). However, if it is determined that the enemy's position d is
not within the low moving speed area of the camera (S304d; NO), an
acceleration in the normal moving speed area of the camera is
computed (S304f).
[0085] The virtual camera moving speed s is computed based on the
above-obtained acceleration (S304g), and it is further determined
whether the obtained moving speed s is less than zero (S304h). If
the obtained moving speed s is less than zero (S304h; YES), the
moving speed s is set to zero (S304i). Conversely, when the
obtained moving speed s is not less than zero (S304h; NO), it is
determined whether it is more than one, and if it is more than one
(S304k; YES), the moving speed s is set to one.
[0086] To rephrase the above processing, first, the position d of
the enemy (the distance between the virtual camera and the enemy
character) is computed. Then, an area that includes the enemy's
position d is determined, the acceleration of the virtual camera is
computed based on the specified area, and the virtual camera moving
speed is computed on the basis of the obtained acceleration.
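The processing summarized above might be sketched as follows; the acceleration constants are illustrative assumptions, since the actual formulas appear only in FIG. 6(B):

```python
# Illustrative accelerations per area; the real formulas are those
# shown in FIG. 6(B) of the application, which is not reproduced here.
NON_MOVING_ACCEL = -0.2   # decelerate hard inside 2.5 m
LOW_SPEED_ACCEL = -0.05   # decelerate gently inside 10 m
NORMAL_ACCEL = 0.1        # accelerate back toward the normal speed

def update_camera_speed(speed, nearest_enemy_distance):
    """One step of S304a-S304k: pick the acceleration from the area
    containing the nearest enemy, apply it, and clamp speed to [0, 1]."""
    if nearest_enemy_distance <= 2.5:       # non-moving area (S304b/S304c)
        accel = NON_MOVING_ACCEL
    elif nearest_enemy_distance <= 10.0:    # low moving speed area (S304d/S304e)
        accel = LOW_SPEED_ACCEL
    else:                                   # normal moving speed area (S304f)
        accel = NORMAL_ACCEL
    speed += accel                          # S304g
    if speed < 0:                           # S304h/S304i
        speed = 0.0
    elif speed > 1:                         # clamp at the normal speed
        speed = 1.0
    return speed
```

Called once per frame, this yields the behavior described: the camera coasts at normal speed while enemies are distant and smoothly stops as one closes in.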
[0087] In the case that plural enemy characters are defined
(appear) in the virtual space and when all the enemies have been
defeated, the virtual camera moving speed may be set back to the
normal speed. Specifically, it is determined whether the enemy
characters in the virtual space have all been defeated. If so
determined, the virtual camera moving speed is set back to the
normal speed.
[0088] Further, the virtual camera moving direction and the game
story development may be changed (or may follow other preprogrammed
branches) in accordance with the time elapsed for enemy characters
to appear in the virtual space and to be annihilated. Specifically,
the time elapsed is measured, then a virtual camera moving
direction is selected and the game story is specified for how it
will develop in accordance with the time.
[0089] As described above, the virtual camera moving speed changes
in accordance with the distance between the virtual camera and the
enemies, and the shorter the distance becomes, the slower the
virtual camera moving speed becomes. Accordingly, if the player
shoots the enemies from far away, the virtual camera moving speed
does not decrease. In other words, the faster the player defeats
the enemies, the faster the game advances, thereby the player
obtains a higher score.
[0090] Furthermore, since the virtual camera moving speed changes in
accordance with the distance from the enemies, the game device can
provide a tense mood. For example, while a target enemy approaches
from the back of the screen, the player (the virtual camera) moves
toward a certain destination. The moving speed of the player does
not change as long as the enemy is far away, therefore, the player
feels as if he/she is advancing towards the destination on his/her
own. However, as time passes and the enemy approaches the player,
the player's moving speed decelerates and the player recognizes
that he/she will have a battle with the enemies and so he/she
becomes nervous. When the enemy finally reaches a certain range,
the player's character stops its movement and remains in that
position until the battle with the enemy is over. The player fights
with the enemies with a sense of urgency, fearing that he/she might
be defeated. Accordingly, the combination of the player's movements
and the enemies' movements can provide a real-life tense
atmosphere.
[0091] [Fixation Point of the Virtual Camera]
[0092] The virtual camera moves in the three-dimensional virtual
space according to the program. As shown in FIG. 7, the line of
sight of the camera is directed to a certain point (a fixation
point) in the virtual space and images are generated with the
fixation point at the center of the display screen. The fixation
point is controlled according to the enemy's situation which is
located in the direction of the virtual camera's line of sight.
Control of the fixation point is executed such that the speed with
which the fixation point follows the enemy changes based on the
distance between the virtual camera and the enemy. More
specifically, when the enemy comes within the enemy sensing
distance, the fixation point starts to follow the enemy, and the
shorter the distance becomes, the faster the fixation point
follows the enemy.
[0093] Explanations will be given for a case in which the fixation
point is controlled in accordance with the distance between the
virtual camera and the enemy, with reference to FIG. 4. The
fixation point of the virtual camera is predetermined according to
the program. When an enemy reaches within the enemy sensing
distance (enemy 1 in FIG. 4), the fixation point of the virtual
camera starts to follow the enemy, but since the enemy is still far
away from the virtual camera, the speed of the fixation point for
following the enemy is set to slow. However, when the enemy
approaches the virtual camera (enemy 2 in FIG. 4), the enemy
following speed of the fixation point is set to fast. Finally, when
the enemy arrives at a certain distance (enemy 3 in FIG. 4), the
speed of the fixation point for following the enemy will be at the
maximum setting.
[0094] To rephrase the above process, a fixation point setting
means sets a fixation point of the virtual camera. Next, it selects
an enemy on which the fixation point should be fixed. This enemy is
the one within the enemy sensing distance and closest to the
virtual camera. Then, the fixation point setting means determines
the speed to move the fixation point in accordance with the enemy's
position and moves the fixation point at the determined speed.
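A minimal sketch of this fixation-point control, assuming a linear relationship between distance and follow speed (the application does not specify the exact rates):

```python
def follow_rate(distance, sensing_distance=30.0):
    """Illustrative follow speed: 0 outside the enemy sensing distance,
    growing linearly to 1.0 (the maximum setting) as the enemy closes in.
    The sensing distance value is an assumption."""
    if distance >= sensing_distance:
        return 0.0
    return 1.0 - distance / sensing_distance

def move_fixation(fixation, enemy_pos, distance):
    """Move the fixation point toward the enemy at the determined rate
    (linear interpolation per frame)."""
    rate = follow_rate(distance)
    return tuple(f + rate * (e - f) for f, e in zip(fixation, enemy_pos))
```

With a rate of 1.0 the fixation point snaps onto the enemy in one step; smaller rates make the camera's gaze drift toward distant enemies gradually, matching enemies 1 to 3 of FIG. 4.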
[0095] [Shotgun Shooting Results Determination Process]
[0096] Now, explanations are given for improvements in the shooting
result determination process which is performed when the player
shoots the enemies. In this embodiment, the player's weapon is a
shotgun. Accordingly, it is desirable that the shooting results be
determined in a manner that effectively demonstrates the shotgun
property in which "bullets scatter in a wide radius". It is
understood that this characteristic of the shotgun means that, if
fired at an object closeby, bullets impact a small area with high
density, thereby demonstrating their greatest strength. On the
other hand, if fired at a distant object, bullets scatter and
impact a wide area with low density, thereby a deadly force cannot
be fully realized.
[0097] Accordingly, in this embodiment, damage to be suffered by an
enemy is determined based on the following points: the amount of
damage changes according to the distance to the enemy; an effective
shooting scope (bullet strength) will change in accordance with the
above distance: and the damage also changes in accordance with the
bullet's impact point within the effective radius.
[0098] FIG. 8 is a diagram showing one example of the relationship
between the distance to the enemy and the bullet strength. As shown
in FIG. 8, the bullet strength and the effective shooting scope are
determined to change according to the distance to the enemy. For
example, if the distance is 3 meters, the bullet strength is 100
points, but the bullet strength decreases to 60 points when the
distance is 5 meters, and to 30 points when the distance is 7
meters. Whereas, if the distance to the enemy is 3 meters, the
effective shooting scope is 20 centimeters, and it expands to 60
centimeters when the distance is 5 meters, and to 70 centimeters
when the distance is 7 meters.
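The FIG. 8 example values quoted above can be held in a small distance-keyed table; the nearest-threshold lookup and the behavior beyond 7 meters are assumptions:

```python
# (max distance in meters, bullet strength in points, effective
# shooting scope in cm), taken from the example values for FIG. 8.
STRENGTH_TABLE = [
    (3.0, 100, 20),
    (5.0, 60, 60),
    (7.0, 30, 70),
]

def bullet_strength_and_scope(distance):
    """Return (strength, scope) for the first row whose distance
    threshold covers the given distance; beyond the table, the
    strength is taken as 0 (assumption)."""
    for max_dist, strength, scope in STRENGTH_TABLE:
        if distance <= max_dist:
            return strength, scope
    return 0, 70
```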
[0099] FIG. 9 shows an example of the relationship between the
effective shooting scope and the bullet strength. As shown in FIG.
9, the bullet strength is set in a manner so that the further the
impact point is located from the target point of the player (the
center of the concentric circle), the more the bullet strength and
the damage suffered by the enemy decrease. FIG. 10 shows one example
of the relationship between the distance to the enemy and the
effective shooting scope.
[0100] The shooting results of the player based on the above
settings, will be explained with reference to FIGS. 8 and 9. If the
player shoots an enemy (for example, the head of the enemy) when it
is within three meters of the player, the player can cause damage
to the enemy worth 100 points. Whereas, if the player shoots the
enemy when it is within five meters of the player, the player can
cause damage worth only 60 points, even if aiming at the same head.
However, the effective shooting scope, which is 20 centimeters when
the distance is 3 meters, expands to 50 centimeters when the
distance is 5 meters, and accordingly, there is a high possibility
that the bullet will hit the head and other parts of the body
(chest and shoulders) at the same time, consequently, this may
cause more damage to the enemy. When aiming at the center of the
enemy's head, a bullet that hits the chest is out of the radius of
100% bullet strength, but in the radius of 80% bullet strength. On
the other hand, if aiming at the neck, the player can damage both
the head and the chest with 100% bullet strength.
[0101] Since an effective point to aim at varies in accordance with
the distance to the enemy, by learning the shotgun properties,
strategies to defeat the enemies become possible. Therefore, the
entertaining characteristics of the game are enhanced.
[0102] Now, the process flow executed when the player shoots an
enemy will be explained. FIG. 11 is a flow chart explaining the
process of the shooting results determination process (S306 in FIG.
3). At first, when the player shoots the enemy, the hit decision
means executes the hit decision process for determining whether the
bullet hits the enemy (S306a). This hit decision process will be
described hereinafter with reference to FIGS. 12(A) and 12(B). When
the hit decision means determines that the bullet has hit the
target in the hit decision process, a hit flag is set.
[0103] Then, for the enemies having hit flags set, a damage
computing means executes the damage process for computing a damage
point caused by the shooting (S306b). The damage process will be
described hereinafter with reference to FIGS. 13(A) and 13(B).
Further, injury severity computing means executes the injury
process for determining the injury severity of the enemies in
accordance with the shooting results and expressing the injury
visually (S306c). The injury process will be hereinafter described
with reference to FIG. 16.
[0104] [Hit Decision Process]
[0105] Now, the process flow of the hit decision executed when the
player shoots an enemy, will be explained with reference to FIGS.
12(A) and 12(B). FIG. 12(A) is a flow chart explaining the hit
determination process flow. When the player shoots an enemy
(S306a1; YES), the enemy's coordinates are converted to a
coordinate system in which the player's position is the origin
and the vector of the shooting direction is the Z-axis
(S306a2).
[0106] A radius DR, i.e., an effective shooting scope (extent of
the scatter shot) at the Z position of the enemy is computed
(S306a3) and a distance L between the enemy and the Z-axis is
computed (S306a4). Subsequently, a radius R of the enemy's
collision sphere is computed (S306a5) and it is determined whether
the bullet has hit the enemy based on the radius DR, the distance
L, and the radius R (S306a6). Specifically, when the sum of the
radius R and the radius DR is greater than or equal to the distance
L, it is considered that the bullet has hit the enemy (S306a6;
YES). Whereas, when the sum is less than the distance L (S306a6;
NO), it is considered that the bullet has missed the enemy
(S306a7).
[0107] When the bullet has hit the enemy, the cross section of the
collision cone of the shotgun pellets at the enemy's Z position is
divided into sections of a predetermined number (for example, 16
sections), and it is determined which sections cover the enemy
(S306a8). FIG. 12(B) shows the collision cone, which is divided into
sections 1 to 16.
[0108] After executing the process from S306a1 to S306a8 for all
the enemies that have appeared on the screen (S306a9), the enemies
are sorted according to their Z positions (S306a10), and the hit
sections of the collision cone are assigned to cover the enemies in
order of the shortest distance along the Z-axis from the shotgun
(S306a11).
[0109] It is determined whether the enemy is covered by any of
sections 1 to 16 (whether any section is filled with the enemies)
(S306a12), and if it is determined that the enemy is not covered in
any section, a hit flag is not set for the enemy (S306a13). On the
other hand, if it is determined that the enemy is covered by a
section, a hit flag for the enemy is set (S306a14). In short, if
all the hit sections are already completely covered by other
enemies closer to the shotgun, it is determined that the bullet
missed the enemy, and a hit flag is not set for the enemy.
[0110] The process from S306a1 to S306a14 is repeated until the hit
decision process is completed for all the enemies.
[0111] With this process, it is possible to execute the shotgun hit
decision many times without executing a separate hit decision of
each vector against the enemy's collision sphere.
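The core comparison of S306a3 to S306a6 can be sketched as follows; the cone spread angle used to compute the radius DR is an illustrative assumption:

```python
import math

def scatter_radius(z, spread=0.1):
    """DR: the effective shooting scope at depth z along the shooting
    axis, assuming the pellets spread in a cone with the given
    half-angle in radians (the angle is an assumption)."""
    return z * math.tan(spread)

def bullet_hits(enemy_x, enemy_y, enemy_z, collision_radius, spread=0.1):
    """S306a3-S306a6: the bullet hits when R + DR >= L, where L is the
    enemy's distance from the Z-axis (the shooting direction)."""
    dr = scatter_radius(enemy_z, spread)
    l = math.hypot(enemy_x, enemy_y)
    return collision_radius + dr >= l
```

In the coordinate system of S306a2 the player sits at the origin and the shot travels along +Z, so the enemy's (x, y) components directly give its deviation L from the trajectory.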
[0112] [First Damage Process]
[0113] Now, explanations will be given with reference to FIG. 13
for the flow of the damage process for computing a damage point
incurred by the enemies based on the player's shooting. FIG. 13 is
a flow chart explaining the damage process. The body of an enemy is
composed of predetermined body sections (for example, head, arms,
legs, chest, etc.), and each body section is composed of
predetermined body parts (for example, an "arm" has a "shoulder,"
"upper arm," "lower arm," and "hand"). The presence or absence of a
hit flag, which is set in the hit decision process explained by
FIG. 12(A), tells whether the bullet has hit any body section or
body sections of the enemy's body as well as which body section or
body sections were hit.
[0114] It is determined whether a hit flag is set for a
predetermined body section (S306b1). If the hit flag is set, the
part closest to the impact point is selected (S306b2). Then, the
effective shotgun radius at the impact point is specified (S306b3)
and a distance from the selected body part to the impact point is
computed (S306b4). Subsequently, a damage rate based on the
distance from the trajectory is calculated (S306b5).
[0115] The damage rate can be obtained by the formula shown in FIG.
13(B). In this formula, the minimum damage rate (MIN_DAMAGE_RATE)
is the bullet strength percentage at the furthest position from the
impact point within the shotgun radius, and is set as 0.1, for
example. The maximum damage radius rate (MAX_DAMAGE_RADIUS_RATE)
is a bullet strength percentage that determines a radius around the
impact point within which the same strength should be applied. In short, the
bullet strength at the impact point is maintained within a certain
radius from the impact point. The damage radius (SHOT_GUN_RADIUS)
is a radius wherein the bullet strength at the impact point is
effective, and it also represents a range in which bullets scatter
(i.e., hit decision area). A distance from the center of the
trajectory (HIT_LEN) is a distance between the center of the
trajectory and the enemy, and is obtained by subtracting the radius
of the enemy's collision sphere from the distance between the
center of the trajectory and the enemy.
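Since the formula itself appears only in FIG. 13(B), the sketch below assumes a linear falloff built from the quantities defined above; the radius constants, other than the 0.1 minimum rate quoted in the text, are assumptions:

```python
MIN_DAMAGE_RATE = 0.1          # strength fraction at the edge of the radius
MAX_DAMAGE_RADIUS_RATE = 0.25  # fraction of the radius at full strength (assumption)
SHOT_GUN_RADIUS = 0.7          # damage radius in meters (assumption)

def damage_rate(hit_len):
    """Sketch of S306b5: full strength near the impact point, falling
    linearly to MIN_DAMAGE_RATE at SHOT_GUN_RADIUS (assumed shape;
    hit_len is the distance from the center of the trajectory)."""
    full_radius = SHOT_GUN_RADIUS * MAX_DAMAGE_RADIUS_RATE
    if hit_len <= full_radius:
        return 1.0
    if hit_len >= SHOT_GUN_RADIUS:
        return MIN_DAMAGE_RATE
    t = (hit_len - full_radius) / (SHOT_GUN_RADIUS - full_radius)
    return 1.0 - t * (1.0 - MIN_DAMAGE_RATE)
```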
[0116] When the damage rate is obtained at S306b5, a damage value
is specified with reference to the damage chart shown in FIG. 14
(S306b6). The damage value is determined based on the distance to
the enemy and the body section to which the body part belongs
(S306b6).
[0117] FIG. 14 is one example of the configuration of the damage
chart. The damage chart stores the damage values that determine a
damage point of the enemies that have been shot. In FIG. 14, it is
assumed that the average physical power value of the enemies is set
at 200 points. As shown in FIG. 14, the damage values are set
according to the distance between the player and an enemy and a
body section that has been shot. The physical power value of an
enemy which has been hit is calculated by: at first, obtaining the
damage point of the enemy by multiplying the damage value by the
damage rate based on the distance to the impact point (center of
the trajectory); and then subtracting the computed damage point
from the physical power value which the enemy owned before it was
shot.
[0118] If the distance to the enemy is 3 meters or less and the body
part that has been shot is the arm, the damage value suffered by the
enemy is "30" points. A damage point of the body part is computed
by multiplying the damage rate by the damage value (S306b7).
[0119] If the damage points of all the body sections have not yet
been computed (S306b8; NO), the damage points of the remaining body
sections are computed. When the damage points of all the body sections have
been obtained, the damage points are summed up for the respective
body sections and the total damage point to the enemy is obtained
(S306b9). In other words, the sum of the damage points of the
respective body sections is the total damage point suffered by the
enemy, and this total damage point is subtracted from the physical
power value of the enemy. If after the subtraction, the physical
power value is less than a predetermined value, the enemy vanishes
from the screen.
[0120] FIG. 15 shows image examples of objects (enemies) being
shot. FIG. 15 shows two examples wherein the objects were shot at
the same impact point but from different distances, the effective
shooting scopes being shown with circles of dashed lines, and
damage being shown with .star. figures. If the enemy is shot at
short range as shown in FIG. 15(A), the bullets scatter around the
abdomen, and each body part will be heavily damaged even though
there are only a few points of damage. Whereas, if the enemy is
shot at long range as shown in FIG. 15(B), the bullets scatter in a
wide range throughout the whole body, but the damage to each body
part is small.
[0121] To summarize the above explanations, the distance between
the virtual camera and an enemy character affects not only the
virtual camera's moving speed, but also the amount of damage suffered by
the enemy character that was shot. Due to this fact, the player
will be conflicted since on the one hand, shooting at short range
demonstrates great bullet strength and enables the player to defeat
`one enemy` in a short time, but the player's moving speed will
become slow. On the other hand, if there are many enemies, it may
be better to shoot them at long range, even with reduced bullet
strength, because the player can damage enemies over a wide area,
thereby defeating them more quickly and moving forward in the
game. Thus, the entertaining characteristics of the
game are enhanced.
[0122] [Injury Process]
[0123] Now, explanations will be given for the flow of the injury
process for displaying an injury status of an enemy in accordance
with the shooting by the player. By this injury process, how much
the enemy is damaged is visually displayed. Every time the enemy is
shot, an injury progression value due to the shot is attributed to
the enemy, and the damage (injury status) is displayed in
correspondence with the accumulated injury progression values. The
injury progression value to be attributed to the enemy is set in
accordance with the distance to the enemy. Specifically, the
shorter the distance is, the larger the injury progression value is
set; the longer the distance is, the smaller it is set.
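Such a distance-dependent lookup can be sketched as follows. The 3-meter and 8-meter entries reproduce the worked examples in paragraphs [0129] and [0130]; the long-range default and all names are assumptions.

```python
# Hypothetical injury progression chart in the spirit of FIG. 17.
# The (3 m, 15) and (8 m, 7) entries follow paragraphs [0129]-[0130];
# the long-range default value is an assumption.
INJURY_CHART = [  # (maximum distance in meters, injury progression value)
    (3.0, 15),
    (8.0, 7),
]

def injury_progression_value(distance_m, long_range_default=3):
    """Shorter distance -> larger progression value; longer -> smaller."""
    for max_distance, value in INJURY_CHART:
        if distance_m <= max_distance:
            return value
    return long_range_default
```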
[0124] Each body part on the enemy is provided with damaged body
parts that express damage (injury status) corresponding to the
predetermined levels. For example, the body part, the "chest," of
an enemy A is provided with damaged body parts that correspond to
five levels (0→1→2→3→4) of damage. The
damaged body parts are composed such that the severity increases at
each level. For example, level 0 shows an image of the chest with no
damage; at level 1, a part of the chest is bleeding; at level 2, a
part of the chest is damaged; at level 3, the entire chest is
damaged; and at level 4, the chest is shattered. Damage levels and
their modes of expression may be set differently depending on the
enemy types.
[0125] FIG. 16 is a flow chart explaining the injury process. The
presence or absence of a hit flag, which is set in the hit decision
process explained with reference to FIG. 12, indicates whether a
bullet hit a body section of the enemy's body.
[0126] At first, it is determined whether a hit flag is set for a
predetermined body section (S306c1), and if the hit flag is set for
the body section (S306c1; YES), a body part closest to the impact
point within the present body section is selected (S306c2). Then,
the injury progression value chart (damage progression value chart)
in FIG. 17 is referred to, and an injury progression value is
specified based on the player's distance to the enemy (S306c3).
[0127] FIG. 17 shows one example of the configuration of the injury
progression value chart, wherein the injury progression values are
set in accordance with the distance to the enemy. As shown in FIG.
17, the injury progression values of a body part A of a certain
body section are set such that the shorter the player's distance to
the enemy is, the larger the value becomes; and the further the
distance is, the smaller the value becomes. In FIG. 17, only the
injury progression values of the body part A (upper part of an
arm) are indicated. Other body parts (for example, lower arms) and
other body sections (for example, the head) are omitted in FIG. 17,
but the injury progression values of those parts are similarly
set.
[0128] After the injury progression value is specified, it is added
to the injury progression value already stored in a predetermined
storage area (S306c4). In short, if the present body part has
previously been hit, the injury progression value of the present
impact is added to the injury progression value of the previous
impact, thereby increasing the total injury progression value.
Parameters of the damaged body parts are then referred to based on
the accumulated injury progression value, and the damaged body part
to be displayed as the shooting result is specified (S306c5).
[0129] Specifically, the damaged body part is specified by the
formula "display damaged body part = injury progression value of the
present body part ÷ 10 (fractions omitted)". For example, when
the enemy is shot from a distance of 8 meters, the injury
progression value of the body part is "7", and the formula gives
"7 ÷ 10 = 0 (fractions omitted)"; therefore, the damaged
body part of level "0" is displayed. When the enemy is shot again
from the same distance, the accumulated injury progression value is
"7+7=14", and the formula gives "14 ÷ 10 = 1 (fractions
omitted)"; thereby, the damaged body part of level "1" is
displayed.
[0130] Whereas, if the enemy is shot from a distance of 3 meters,
the injury progression value of the present body part is "15" and
the formula gives "15 ÷ 10 = 1 (fractions omitted)"; accordingly,
the damaged body part of level "1" is displayed. When
the enemy is shot again from the same distance, the accumulated
injury progression value is "15+15=30", and the formula gives
"30 ÷ 10 = 3 (fractions omitted)"; thereby, the damaged
body part of level "3" is displayed. In this way, when the distance
to the enemy at the time of impact is short, the severity of the
enemy's injury increases even if impacted only a few times.
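The level computation in paragraphs [0129] and [0130] amounts to a floor division by 10, as this sketch shows; the function name is illustrative.

```python
# Sketch of the level computation in paragraphs [0129]-[0130].
def damage_level(accumulated_progression_value):
    """Display level = accumulated injury progression value / 10
    (fractions omitted), i.e., integer floor division."""
    return accumulated_progression_value // 10

# Shots from 8 m (progression value 7): 7 -> level 0, 7+7=14 -> level 1.
# Shots from 3 m (progression value 15): 15 -> level 1, 15+15=30 -> level 3.
```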
[0131] [Second Damage Process]
[0132] Now, explanations will be given for the second damage
process of damage suffered by the enemy due to the shooting. A
proportion ("first overlapping proportion") of the overlapping area
(hit determined portion), in which the damage radius and the
enemy's collision sphere overlap, to the entire effective shooting
radius (damage radius) is computed. A damage point is then computed
based on the obtained proportion.
Specifically, the damage point is computed by multiplying a damage
value of the damage radius by the first overlapping proportion.
Details will be explained with reference to FIG. 18.
[0133] FIG. 18 explains the second damage process. In FIG. 18(A),
the proportion of the hit determined portion to the damage radius
is 100%. An enemy's damage point is obtained by the formula
"enemy's damage point = damage value × first overlapping
proportion (%)". Accordingly, if the damage value is set to 100,
the damage point of the enemy is 100 by calculating "damage value
(100) × first overlapping proportion (100%) = 100".
[0134] In FIG. 18(B), the proportion of the hit determined portion
to the damage radius is 50%. If the damage value is set to 100, the
enemy's damage point is 50 by the formula "damage value
(100) × first overlapping proportion (50%) = 50". The fact that
the proportion of the hit determined portion to the damage radius
is 50%, means that 50% of the damage radius overlaps with the
enemy's collision sphere.
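Both FIG. 18 formulas share the same shape, differing only in which area the proportion is taken against (the damage radius for the first, the collision sphere for the second). A minimal sketch, with an assumed function name:

```python
# damage point = damage value x overlapping proportion (%), as in FIG. 18.
def damage_point_from_proportion(damage_value, overlapping_proportion_percent):
    """The proportion is expressed in percent, as in the text."""
    return damage_value * overlapping_proportion_percent / 100
```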
[0135] Now, another example of the second damage process will be
explained. In this example, a proportion ("second overlapping
proportion") of the overlapping area (hit determined portion), in
which the damage radius and the enemy's collision sphere overlap,
to the entire collision sphere of the enemy is computed. A damage
point is then computed based on the obtained second overlapping
proportion. Specifically, the damage point is
obtained by multiplying the damage value of the damage radius by
the second overlapping proportion. Details will be explained with
reference to FIGS. 18(C) and 18(D).
[0136] In FIG. 18(C), the proportion of the hit determined portion
to the enemy's collision sphere is 100%. A damage point of the
enemy is obtained by calculating "enemy's damage point = damage
value × second overlapping proportion (%)". Accordingly, when
the damage value provided by the entire damage radius is set to
100, the enemy's damage point is 100 by the formula "damage value
(100) × second overlapping proportion (100%) = 100".
[0137] On the other hand, in FIG. 18(D), the proportion of the hit
determined portion to the enemy's collision sphere is 50%.
Accordingly, when the damage value provided by the entire damage
radius is set to 100, the enemy's damage point is 50 by calculating
"damage value (100) × second overlapping proportion (50%) = 50". The
fact that the proportion of the hit determined portion to the
enemy's collision sphere is 50%, means that 50% of the enemy's
collision sphere overlaps with the damage radius.
[0138] Now, explanations will be given, with reference to FIG.
18(E), as to how to compute an area ("hit determined portion")
wherein the damage radius and the enemy's collision sphere overlap
with each other. As shown in FIG. 18(E), the damage radius is
divided into lattices of a predetermined size. Then, the number of
lattices overlapping the collision sphere is counted. Finally, in
the case of the first overlapping proportion, the proportion of the
overlapped lattices to all of the damage radius lattices is
computed; whereas, in the case of the second overlapping
proportion, the proportion of the overlapped lattices to all of the
collision sphere lattices is computed.
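The lattice-counting step can be sketched in two dimensions as follows. Representing both areas as circles, the cell size, and every name are illustrative assumptions; the specification does not fix these details.

```python
def overlap_proportions(damage_center, damage_radius,
                        sphere_center, sphere_radius, cell=0.05):
    """Divide the plane into lattices of size `cell` (cf. FIG. 18(E)), count
    cells inside each circle and cells inside both, and return the first and
    second overlapping proportions."""
    def inside(x, y, cx, cy, r):
        return (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

    # Bounding box covering both circles.
    lo_x = min(damage_center[0] - damage_radius, sphere_center[0] - sphere_radius)
    hi_x = max(damage_center[0] + damage_radius, sphere_center[0] + sphere_radius)
    lo_y = min(damage_center[1] - damage_radius, sphere_center[1] - sphere_radius)
    hi_y = max(damage_center[1] + damage_radius, sphere_center[1] + sphere_radius)

    damage_cells = sphere_cells = overlap_cells = 0
    y = lo_y
    while y <= hi_y:
        x = lo_x
        while x <= hi_x:
            d = inside(x, y, damage_center[0], damage_center[1], damage_radius)
            s = inside(x, y, sphere_center[0], sphere_center[1], sphere_radius)
            damage_cells += d
            sphere_cells += s
            overlap_cells += d and s
            x += cell
        y += cell
    # first = overlapped cells / damage radius cells,
    # second = overlapped cells / collision sphere cells
    return overlap_cells / damage_cells, overlap_cells / sphere_cells
```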
[0139] In yet another example, the damage radius and the collision
sphere are projected onto a virtual image (not displayed) for the
hit decision, and the number of pixels in the overlapped portion of
the virtual image is counted.
[0140] The overlapped area and the value by which the damage is
multiplied do not necessarily have to correspond exactly. The value
may be separated into levels; for example, when the overlapped
portion is 1% or more but less than 10%, the value is 10%, and when
the overlapped portion is 10% or more but less than 30%, the value
is 30%.
[0141] The damage radius is not limited to being circular, but may
be oval or polygonal as appropriate. Further, the combination of
the damage process of the present invention with other damage
processes makes it possible to execute a more fractionalized damage
process.
[0142] Now, explanations will be given for the case in which an
enemy, a shooting object, has a human shape and a damage point of
the enemy is computed according to the second damage process. FIG.
19 explains the second damage process when an enemy is
human-shaped.
[0143] As shown in FIG. 19(A), when the damage radius is above the
waist, it is determined that approximately 80% of the entire damage
radius overlaps with the enemy's collision sphere (the proportion
of the hit determined portion to the damage radius is 80%). If the
damage value of the entire damage radius is set to 100, the enemy's
damage point will be 80 by calculating "damage value
(100) × first overlapping proportion (80%) = 80".
[0144] If the damage point computation is executed for each body
part, a proportion of an area for each body part to the damage
radius is computed and a damage point is also computed based on the
total obtained proportion. For example, when 10% of the damage
radius is taken up by each of the arms, 10% by the head, 20% by the
waist, and 30% by the chest, the summation of 20 for the arms, 10
for the head, 20 for the waist, and 30 for the chest, i.e.,
"20+10+20+30", will equal the total damage point of the enemy,
i.e., 80.
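The per-body-part summation can be sketched as follows; the proportions are the FIG. 19(A) example from the text, and the names are assumptions.

```python
# Sketch of the per-body-part computation in paragraph [0144].
def total_damage_point(damage_value, part_proportions):
    """Sum damage value x proportion over the body parts inside the
    damage radius."""
    return sum(damage_value * p for p in part_proportions.values())

# FIG. 19(A) example: 10% per arm, 10% head, 20% waist, 30% chest.
example_parts = {
    "left arm": 0.10, "right arm": 0.10,
    "head": 0.10, "waist": 0.20, "chest": 0.30,
}
# total_damage_point(100, example_parts) is 80 (= 20 + 10 + 20 + 30),
# up to floating-point rounding.
```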
[0145] As shown in FIG. 19(B), if the damage radius overlaps with
an arm, it is determined that approximately 10% of the entire
damage radius overlaps with the enemy's collision sphere. When the
damage value is set to 100, the enemy's damage point will be 10 by
calculating "damage value (100) × first overlapping proportion
(10%) = 10". If the damage point computation is executed for each
body part, the damage point of the arm is 10.
[0146] As shown in FIG. 19(C), if the damage radius overlaps with
both legs, it is determined that approximately 40% of the entire
damage radius overlaps with the enemy's collision sphere. When the
damage value is set to 100, the enemy's damage point will be 40 by
calculating "damage value (100) × first overlapping proportion
(40%) = 40". If the damage point computation is executed for each
body part, the damage points of both legs will be 40, which is 20
for each leg.
[0147] As shown in FIG. 19(D), if the damage radius overlaps with
two enemies, their damage points are computed separately. First,
regarding the enemy A, the damage radius overlaps with its arm;
therefore, it is determined that approximately 10% of the damage
radius overlaps with the collision sphere of the enemy A. As for
the enemy B, since the damage radius overlaps with its upper body,
it is determined that approximately 50% of the damage radius
overlaps with the collision sphere of the enemy B. When the damage
value is set to 100, a damage point of the enemy A is 10 by
calculating "damage value (100) × first overlapping proportion
(10%) = 10". A damage point of the enemy B will be 50 by calculating
"damage value (100) × first overlapping proportion (50%) = 50".
This enables the display of an image in which "even though the
enemy B is positioned slightly behind the enemy A, the enemy B
is severely damaged since it received more scattered bullets".
[0148] As described, because the overlapping areas of the damage
radius and the enemies' collision spheres correspond precisely to
the damage values, damage can be determined more realistically and
fairly. Further, the player can develop his/her skills for the
shooting game, for example, "aiming at a target in a manner so that
more enemies are included in the effective shooting scope" and
"shooting stronger enemies in a manner so that bullets scatter in a
wide range".
[0149] It is possible to combine the second damage process with the
first damage process.
[0150] [Other Embodiments]
[0151] In the explanations of the above embodiment, the present
invention is applied to a gun shooting game; however, the present
invention is not limited to this application, but can be applied to
other types of games. For example, explanations will be given for a
game wherein multiple characters are defined in a three-dimensional
virtual space, a first character (for example, an enemy character)
being manipulated under a predetermined program while a second
character (for example, a player's character) is manipulated in
accordance with the manipulation information from the player.
[0152] In the virtual camera controlling process of this game, a
position of the player's character is employed instead of the
position of the virtual camera and the moving speed of the virtual
camera is controlled by the distance between the enemy character
and the player's character. Further, in the moving speed changing
process of the fixation point of the virtual camera, the speed of
the fixation point to follow enemy characters may be controlled on
the basis of the distance between the player's character and the
enemy character.
[0153] Also, in the damage point computing process, an effective
attack range of the player is employed instead of the effective
shooting radius. In this case, damage points of the enemy
characters may be computed on the basis of: the distance between
the player's character and the enemy character; the effective
attack range that changes in accordance with the above distance;
and the distance between the enemy character and the center of the
effective attack range. The same computing manner is used when the
player's character (a character on the player's side) is being
attacked.
[0154] Furthermore, in the second damage point computing process, a
damage point may be computed based on the proportion of the
overlapping area to the entire effective attack range, and in the
overlapping area, the effective attack range and the enemy's
collision spheres overlap.
[0155] This damage point computing process may be applied to games
in which a player damages objects located in a virtual space.
Specifically, the positional information of a character manipulated
by the player and the positional information of objects are
obtained. Then the distance between the player's character and an
object is computed, and the size of the hit decision area is
determined based on the distance. When it is determined that the
hit decision area overlaps with an object, the damage amount of the
object is computed, and damage based on that amount is then
inflicted on the object.
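The generalized process of paragraph [0155] can be sketched as follows. The linear growth of the hit decision area with distance, the fixed damage amount, and all names are illustrative assumptions.

```python
import math

# Sketch of the generalized object-damage process in paragraph [0155].
def hit_decision_radius(distance, base=0.5, spread_per_meter=0.1):
    """Size the hit decision area from the player-to-object distance
    (linear growth is an assumed model)."""
    return base + spread_per_meter * distance

def object_damage(player_pos, aim_point, object_pos, object_radius,
                  damage_value=100):
    """Compute the distance, size the hit decision area from it, and return
    a damage amount if that area overlaps the object (0 otherwise)."""
    distance = math.dist(player_pos, object_pos)
    radius = hit_decision_radius(distance)
    if math.dist(aim_point, object_pos) <= radius + object_radius:
        return damage_value
    return 0
```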
[0156] According to the present invention, the faster the player
defeats the enemies, the more favorably the game develops for the
player, therefore, the player will have results that correspond to
his/her skills. In addition, in this invention, since the shooting
results are determined in correspondence with the characteristics
of the gun, the player can enjoy the game by developing fight
strategies using his/her knowledge of the gun properties.
Furthermore, with the game device of the present invention, the
shooting results and the enemies' damage point correspond to each
other precisely, and damage can be determined more realistically
and fairly.
[0157] In this specification, a product invention can be
interpreted as a method invention and vice versa. This invention
can also be implemented as a program or a recording medium that has
a program stored therein for making a computer implement
predetermined functions. Examples of the recording medium include,
for example, a hard disk (HD), a DVD-RAM, a floppy disk (FD), a
CD-ROM, and types of memory such as a RAM and a ROM. Examples of
the computer include a so-called microcomputer wherein a central
processing unit such as a CPU or an MPU interprets programs to
execute predetermined processes.
[0158] In this specification, a means does not simply imply a
physical means, but can also imply a function of the means
implemented by software or a hardware circuit. A function of one
means may be realized by two or more physical means and functions
of two or more means may be realized by one physical means.
[0159] Moreover, means in this specification can be implemented by
hardware or software, or the combination of both. Implementation by
the combination of the hardware and the software is, for example,
the implementation by a computer system having a predetermined
program therein. A function of one means may be realized by two or
more types of hardware or software, or by the combination of both,
while two or more functions of one means may also be realized by
one type of hardware or software, or by the combination of
both.
[0160] The entire disclosure of Japanese Patent Application No.
2002-146900 filed on May 21, 2002, including the specification,
claims, drawings, and summary, is incorporated herein by reference
in its entirety.
* * * * *