U.S. patent application number 10/572429 was published by the patent office on 2007-08-23 as publication number 20070197290 for a music game device, music game system, operation object, music game program, and music game method.
This patent application is currently assigned to SSD Company Limited. Invention is credited to Hiromu Ueshima.
Application Number | 10/572429
Publication Number | 20070197290
Family ID | 38428925
Publication Date | 2007-08-23

United States Patent Application 20070197290
Kind Code | A1
Ueshima; Hiromu | August 23, 2007

Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method
Abstract
When a cursor (105: FIG. 12) is correctly manipulated in
correspondence with guides ("G1" to "G4" and "g1" to "g5": FIG.
12), the display of the dance object (106: FIG. 12) and the
background (110: FIG. 12) is controlled in accordance with the
manipulation direction of the cursor. The position of the operation
article (150: FIG. 1) on a screen (91: FIG. 1) is obtained as the
coordinates of the cursor by intermittently irradiating the
operation article with a stroboscope and capturing its image with
an imaging unit (13: FIG. 1).
Inventors | Ueshima; Hiromu (Shiga, JP)
Correspondence Address | OSHA LIANG L.L.P., 1221 MCKINNEY STREET, SUITE 2800, HOUSTON, TX 77010, US
Assignee | SSD Company Limited (Shiga, JP 525-0054)
Family ID | 38428925
Appl. No. | 10/572429
Filed | September 17, 2004
PCT Filed | September 17, 2004
PCT No. | PCT/JP04/14025
371 Date | December 6, 2006
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
60515267 | Oct 29, 2003 |
Current U.S. Class | 463/36
Current CPC Class | A63F 13/5375 20140902; A63F 2300/63 20130101; G10H 1/0033 20130101; G10H 2220/455 20130101; A63F 13/245 20140902; A63F 2300/1087 20130101; G10H 1/0008 20130101; A63F 13/44 20140902; A63F 2300/10 20130101; G10H 2220/415 20130101; G10H 2220/201 20130101; A63F 13/213 20140902; A63F 13/10 20130101; A63F 13/814 20140902; A63F 13/02 20130101
Class at Publication | 463/036
International Class | A63F 9/24 20060101 A63F009/24

Foreign Application Data

Date | Code | Application Number
Sep 18, 2003 | JP | 2003-325381
Claims
1. A music game apparatus operable to automatically play music,
comprising: a stroboscope operable to irradiate an operation
article manipulated by a player with light in a predetermined
cycle; an imaging unit operable to generate a lighted image signal
and an unlighted image signal by capturing images of the operation
article respectively when said stroboscope is lighted and
unlighted; a differential signal generating unit operable to
generate a differential signal between the lighted image signal and
the unlighted image signal; a state information calculating unit
operable to calculate the state information of the operation
article on the basis of the differential signal; a guide control
unit operable to control the display of a guide for the
manipulation of a cursor, which moves in association with the
operation article, in a timing on the basis of the music; a cursor
control unit operable to control the display of the cursor on the
basis of the state information of the operation article; and a
follow-up image control unit operable to control the display of an
image in accordance with guidance by the guide when the cursor is
correctly manipulated by the operation article in correspondence
with the guide, wherein said follow-up image control unit
determines whether or not the cursor is correctly manipulated by
the operation article in correspondence with the guide, on the
basis of the state information of the operation article and the
information about the guide.
2. The music game apparatus as claimed in claim 1 wherein the guide
is operable to guide the cursor to a destination position in a
manipulation timing, and wherein said follow-up image control unit
is operable to control the display of the image in correspondence
with the direction of the destination position as guided by the
guide when the cursor is correctly manipulated by the operation
article in correspondence with the guide.
3. The music game apparatus as claimed in claim 2 wherein said
state information calculating unit is operable to calculate the
position of the operation article as the state information on the
basis of the differential signal, and wherein said follow-up image
control unit is operable to determine that the cursor, which moves
in association with the operation article, is correctly manipulated
in correspondence with the guide if the position of the operation
article as calculated by said state information calculating unit is
located in an area corresponding to the guidance by the guide
within a period corresponding to the guidance by the guide.
4. The music game apparatus as claimed in claim 1 wherein the guide
is operable to guide the moving path, moving direction and
manipulation timing of the cursor.
5. The music game apparatus as claimed in claim 4 wherein said
state information calculating unit is operable to calculate the
position of the operation article as the state information on the
basis of the differential signal, and wherein said follow-up image
control unit is operable to determine that the cursor, which moves
in association with the operation article, is correctly manipulated
in correspondence with the guide if the position of the operation
article as calculated by said state information calculating unit is
moved through a plurality of predetermined areas guided by the
guide in a predetermined order guided by the guide within a period
guided by the guide.
6. The music game apparatus as claimed in claim 1 wherein the guide
is displayed in each of a plurality of positions which is
determined in advance in a screen, and wherein the guide control
unit is operable to change the appearance of the guide in a timing
on the basis of the music.
7. The music game apparatus as claimed in claim 1 wherein the guide
is expressed in an image with which it is possible to visually
recognize the motion from a first predetermined position to a
second predetermined position on a screen, and wherein the guide
control unit is operable to control the display of the guide in a
timing on the basis of the music.
8. The music game apparatus as claimed in claim 7 wherein the guide
is expressed by the change in appearance of a plurality of objects
which are arranged in a path having a start point at the first
predetermined position and an end point at the second predetermined
position on the screen.
9. The music game apparatus as claimed in claim 7 wherein the guide
is expressed by an object moving from the first predetermined
position to the second predetermined position on the screen.
10. The music game apparatus as claimed in claim 7 wherein the
guide is expressed by the change in appearance of a path having a
start point at the first predetermined position and an end point at
the second predetermined position on the screen.
11. The music game apparatus as claimed in claim 1 wherein the
state information of the operation article as calculated by said
state information calculating unit is any one of or any combination
of two or more of speed information, moving direction information,
moving distance information, velocity vector information,
acceleration information, movement locus information, area
information, and positional information.
12. A music game system operable to automatically play music,
comprising: an operation article to be manipulated by a player; a
stroboscope operable to irradiate said operation article with light
in a predetermined cycle; an imaging unit operable to generate a
lighted image signal and an unlighted image signal by capturing
images of the operation article respectively when said stroboscope
is lighted and unlighted; a differential signal generating unit
operable to generate a differential signal between the lighted
image signal and the unlighted image signal; a state information
calculating unit operable to calculate the state information of the
operation article on the basis of the differential signal; a guide
control unit operable to control the display of a guide for the
manipulation of a cursor, which moves in association with the
operation article, in a timing on the basis of the music; a cursor
control unit operable to control the display of the cursor on the
basis of the state information of the operation article; and a
follow-up image control unit operable to control the display of an
image in accordance with guidance by the guide when the cursor is
correctly manipulated by the operation article in correspondence
with the guide, wherein said follow-up image control unit
determines whether or not the cursor is correctly manipulated by
the operation article in correspondence with the guide, on the
basis of the state information of the operation article and the
information about the guide.
13. The operation article manipulated by the player of the music
game apparatus as recited in claim 1, comprising: a stick-like grip
portion to be gripped by the player; and a reflecting portion
provided at one end of said grip portion and operable to
retroreflectively reflect incident light.
14. A music game program which makes a computer perform processing
comprising: automatically playing music; irradiating an operation
article manipulated by a player with light in a predetermined
cycle; generating a lighted image signal and an unlighted image
signal by capturing images of the operation article respectively
when the light is emitted and not emitted; generating a
differential signal between the lighted image signal and the
unlighted image signal; calculating the state information of the
operation article on the basis of the differential signal;
controlling the display of a guide for the manipulation of a
cursor, which moves in association with the operation article, in a
timing on the basis of the music; controlling the display of the
cursor on the basis of the state information of the operation
article; and determining whether or not the cursor is correctly
manipulated by the operation article in correspondence with the
guide, on the basis of the state information of the operation
article and the information about the guide, and controlling the
display of an image in accordance with guidance by the guide when
the cursor is correctly manipulated by the operation article in
correspondence with the guide.
15. A music game method comprising: irradiating an operation
article manipulated by a player with light in a predetermined
cycle; generating a lighted image signal and an unlighted image
signal by capturing images of the operation article respectively
when the light is emitted and not emitted; generating a
differential signal between the lighted image signal and the
unlighted image signal; calculating the state information of the
operation article on the basis of the differential signal;
controlling the display of a guide for the manipulation of a
cursor, which moves in association with the operation article, in a
timing on the basis of the music; controlling the display of the
cursor on the basis of the state information of the operation
article; and controlling the display of an image in accordance with
guidance by the guide when the cursor is correctly manipulated by
the operation article in correspondence with the guide, and wherein
in said step of controlling the display of the image, it is
determined whether or not the cursor is correctly manipulated by
the operation article in correspondence with the guide, on the
basis of the state information of the operation article and the
information about the guide.
Description
FIELD OF THE INVENTION
[0001] The present invention relates to a music game apparatus
which displays images following the motion of an operation article
and the related arts.
BACKGROUND ART
[0002] A music conducting game apparatus is disclosed in Patent
document 1 (Japanese Patent Published Application No. 2002-263360).
This music conducting game apparatus is provided with a
phototransmitter unit at the tip of a baton controller, and a
photoreceiver unit in a lower position of a monitor. The motion of
the baton controller is detected by such a configuration.
[0003] When a game is started, an operation guidance image is
displayed on the monitor in order to instruct the direction and
timing of swinging the baton controller while the sound of music
performance is output. This sound of performance is output
irrespective of the manipulation of the baton controller. On the
other hand, a baton responsive sound is output only when the baton
controller is manipulated in accordance with the direction and
timing as instructed. This baton responsive sound corresponds to
fragments into which a certain performance part is divided by a
predetermined length. As a result, each time the player manipulates
the baton controller in accordance with the direction and timing as
instructed, the baton responsive sound corresponding thereto is
output.
[0004] Patent document 2 (Japanese Patent Published Application No.
Hei 10-143151) discloses a conducting apparatus. In this conducting
apparatus, while a mouse is manipulated in the same manner as a
baton, music parameters such as a tempo, an accent and dynamics are
calculated with reference to the trajectory of the mouse. Then, the
music parameters as calculated are reflected in the music and image
as output. For example, in the case where the motion picture of a
steam train is displayed, the speed of the steam train is
controlled to follow the tempo as calculated, the variation of the
speed is controlled to follow the accent as calculated, and the
amount of smoke of the steam train is controlled to follow the
dynamics as calculated.
[0005] As explained above, the main purpose of the music conducting
game apparatus of Patent document 1 is apparently that the player
gives a music performance. Likewise, in the conducting apparatus of
Patent document 2, since the main purpose is that the player gives
a music performance, the moving information of the mouse is
converted into music parameters which are then reflected in the
music and image as output.
[0006] As has been discussed above, in the case of the conventional
apparatuses having the main purpose of playing music performance by
the player, the image which is displayed (a steam train in the case
of the above example) is not interesting enough, and little
importance is attached to such an image that can be enjoyed by the
player.
[0007] Furthermore, with respect to the baton controller and the
mouse, which are the operation articles manipulated by the player,
the following problems arise. The baton controller of Patent
document 1 is provided with the phototransmitter unit, which makes
an electronic circuit indispensable. Accordingly, the cost of the
baton controller rises, it can be a cause of failure, and its
manipulability is degraded. In particular, since the baton
controller is used by being swung, it is desirable to dispense with
an electronic circuit and simplify the configuration. In addition,
the mouse of Patent document 2 can be moved only on a plane
surface, so that there are substantial restrictions on its
manipulation; moreover, it suffers from the same problems as the
baton controller of Patent document 1.
SUMMARY OF THE INVENTION
[0008] Accordingly, it is an object of the present invention to
provide a music game apparatus, and related arts, with which the
player can enjoy images displayed in synchronization with the
manipulation of an operation article, together with music, by
manipulating an operation article having a simple structure, while
the music is automatically played regardless of the player's
operation.
[0009] In accordance with an aspect of the present invention, a
music game apparatus operable to automatically play music,
comprises: a stroboscope operable to irradiate an operation article
manipulated by a player with light in a predetermined cycle; an
imaging unit operable to generate a lighted image signal and an
unlighted image signal by capturing images of the operation article
respectively when said stroboscope is lighted and unlighted; a
differential signal generating unit operable to generate a
differential signal between the lighted image signal and the
unlighted image signal; a state information calculating unit
operable to calculate the state information of the operation
article on the basis of the differential signal; a guide control
unit operable to control the display of a guide for the
manipulation of a cursor, which moves in association with the
operation article, in a timing on the basis of the music; a cursor
control unit operable to control the display of the cursor on the
basis of the state information of the operation article; and a
follow-up image control unit operable to control the display of an
image in accordance with guidance by the guide when the cursor is
correctly manipulated by the operation article in correspondence
with the guide, wherein said follow-up image control unit
determines whether or not the cursor is correctly manipulated by
the operation article in correspondence with the guide, on the
basis of the state information of the operation article and the
information about the guide.
[0010] In accordance with this configuration, if the cursor is
correctly manipulated in correspondence with the guide, the display
of the image is controlled in accordance with the guidance by the
guide. In this case, since the cursor is manipulated in
correspondence with the guidance by the guide, the display of the
image is controlled in accordance with the manipulation of the
cursor. In other words, since the cursor moves in association with
the operation article, the display of the image is controlled in
accordance with the manipulation of the operation article. The
state information of the operation article is obtained by capturing
the image of the operation article, which is intermittently lighted
by the stroboscope. Because of this, no circuit which is driven by
a power supply need be provided within the operation article for
obtaining the state information of the operation article.
Furthermore, this music game apparatus serves to automatically play
music.
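The differential-signal scheme described in this paragraph can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the patent's implementation: the array shapes, the synthetic frames, and the threshold value are all assumptions.

```python
import numpy as np

def extract_marker(lit_frame: np.ndarray, unlit_frame: np.ndarray,
                   threshold: int = 50) -> np.ndarray:
    """Isolate the retroreflective operation article.

    Subtracting the unlit frame from the lit frame cancels steady
    ambient light; only the retroreflector, which returns the strobe
    light, remains bright in the difference image.
    """
    diff = lit_frame.astype(np.int16) - unlit_frame.astype(np.int16)
    return (diff > threshold).astype(np.uint8)  # binary marker mask

# Synthetic frames: a uniform ambient scene, plus a bright spot that
# appears only while the stroboscope is lighted.
ambient = np.full((32, 32), 80, dtype=np.uint8)
lit = ambient.copy()
lit[10:12, 10:12] = 255          # the retroreflector lights up
mask = extract_marker(lit, ambient)
print(mask.sum())                # -> 4 (number of marker pixels)
```

The signed intermediate (`int16`) avoids the wraparound that direct `uint8` subtraction would cause where the unlit frame happens to be brighter.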
[0011] As a result, while the music is automatically played
regardless of the player's operation, the player can enjoy,
together with the music, images which are displayed in
synchronization with the manipulation of the operation article, by
manipulating an operation article having a simple structure.
[0012] Also, since the guide is controlled in the timing on the
basis of music, the operation article is manipulated in
synchronization with music as long as the player manipulates the
cursor in correspondence with the guide. Accordingly, the player
can enjoy the manipulation of the operation article in
synchronization with music.
[0013] In this case, the "manipulation" of the operation article
means moving the operation article itself (for example, changing
the position thereof), but does not mean pressing a switch, moving
an analog stick, and so forth.
[0014] In the above music game apparatus, the guide is operable to
guide the cursor to a destination position in a manipulation
timing, and wherein said follow-up image control unit is operable
to control the display of the image in correspondence with the
direction of the destination position as guided by the guide when
the cursor is correctly manipulated by the operation article in
correspondence with the guide.
[0015] In accordance with this configuration, when the player
manipulates the operation article in order to move the cursor to
the destination position guided by the position guide in the
manipulation timing guided by the guide, the display of images is
controlled in correspondence with the direction toward the
destination position of the cursor guided by the guide. As a
result, it is possible to enjoy, together with music, the images
which are synchronized with the cursor which is moving in
association with the motion of the operation article.
[0016] In the above music game apparatus, said state information
calculating unit is operable to calculate the position of the
operation article as the state information on the basis of the
differential signal, and wherein said follow-up image control unit
is operable to determine that the cursor, which moves in
association with the operation article, is correctly manipulated in
correspondence with the guide if the position of the operation
article as calculated by said state information calculating unit is
located in an area corresponding to the guidance by the guide
within a period corresponding to the guidance by the guide.
[0017] In accordance with this configuration, it is possible to
determine the correctness of the manipulation of the cursor on the
basis of the position of the operation article which can be
calculated by a simple process.
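The area-and-period judgment described above amounts to a simple containment test. A minimal sketch follows; the function name, rectangular area representation, and coordinate conventions are illustrative assumptions, not taken from the patent.

```python
def cursor_follows_guide(pos, t, area, period):
    """Return True when the operation article's position `pos` lies
    inside the guided rectangular `area` during the guided time
    `period` -- the criterion for a correct manipulation.

    pos: (x, y); area: (x0, y0, x1, y1); period: (t_start, t_end).
    """
    x, y = pos
    x0, y0, x1, y1 = area
    t0, t1 = period
    return t0 <= t <= t1 and x0 <= x <= x1 and y0 <= y <= y1

# Inside the area during the period -> judged correct.
print(cursor_follows_guide((40, 40), 1.0, (30, 30, 50, 50), (0.5, 1.5)))  # True
# Same position, but after the period has elapsed -> judged incorrect.
print(cursor_follows_guide((40, 40), 2.0, (30, 30, 50, 50), (0.5, 1.5)))  # False
```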
[0018] In the above music game apparatus, the guide is operable to
guide the moving path, moving direction and manipulation timing of
the cursor.
[0019] In accordance with this configuration, when the player
manipulates the operation article in order to move the cursor in
the manipulation timing guided by the guide, in the moving
direction guided by the guide and along the moving path guided by
the guide, the display of images is controlled in correspondence
with the guide. As a result, it is possible to enjoy, together with
music, the images which are synchronized with the cursor which is
moving in association with the motion of the operation article.
[0020] In the above music game apparatus, said state information
calculating unit is operable to calculate the position of the
operation article as the state information on the basis of the
differential signal, and wherein said follow-up image control unit
is operable to determine that the cursor, which moves in
association with the operation article, is correctly manipulated in
correspondence with the guide if the position of the operation
article as calculated by said state information calculating unit is
moved through a plurality of predetermined areas guided by the
guide in a predetermined order guided by the guide within a period
guided by the guide.
[0021] In accordance with this configuration, it is possible to
determine the correctness of the manipulation of the cursor on the
basis of the position of the operation article which can be
calculated by a simple process.
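The ordered traversal through a plurality of guided areas can be judged by tracking which area the cursor must reach next. The sketch below is a hypothetical, lenient variant (readings outside the period or off the expected area are simply ignored); all names and data shapes are assumptions, not from the patent.

```python
def follows_path(samples, areas, period):
    """Return True when the sampled positions pass through every
    guided area in the guided order within the guided period.

    samples: time-ordered list of (t, (x, y)) target-point readings;
    areas:   list of (x0, y0, x1, y1) rectangles in the guided order;
    period:  (t_start, t_end).
    """
    t0, t1 = period
    expected = 0  # index of the next area the cursor must reach
    for t, (x, y) in samples:
        if not t0 <= t <= t1:
            continue  # reading outside the guided period
        x0, y0, x1, y1 = areas[expected]
        if x0 <= x <= x1 and y0 <= y <= y1:
            expected += 1
            if expected == len(areas):
                return True  # all areas visited, in order, in time
    return False

samples = [(0.2, (5, 5)), (0.6, (10, 10)), (0.9, (20, 20)), (1.2, (30, 30))]
areas = [(8, 8, 12, 12), (18, 18, 22, 22), (28, 28, 32, 32)]
print(follows_path(samples, areas, (0.5, 1.5)))        # True
print(follows_path(samples[::-1], areas, (0.5, 1.5)))  # False: wrong order
```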
[0022] In the above music game apparatus, the guide is displayed in
each of a plurality of positions which is determined in advance in
a screen, and wherein the guide control unit is operable to change
the appearance of the guide in a timing on the basis of the
music.
[0023] In accordance with this configuration, the player can easily
recognize the position and the direction to which the cursor is to
be moved with reference to the change of the position guide in
appearance.
[0024] In the present specification, the appearance of the guide is
related to either or both of the shape and color of the guide.
[0025] In the above music game apparatus the guide is expressed in
an image with which it is possible to visually recognize the motion
from a first predetermined position to a second predetermined
position on a screen, and wherein the guide control unit is
operable to control the display of the guide in a timing on the
basis of the music.
[0026] In accordance with this configuration, the player can
clearly recognize the direction and path of the cursor to be
moved.
[0027] For example, the guide is expressed by the change in
appearance of a plurality of objects which are arranged in a path
having a start point at the first predetermined position and an end
point at the second predetermined position on the screen.
[0028] In accordance with this configuration, the player can easily
recognize the direction and path of the cursor to be moved with
reference to the change in appearance of the plurality of
objects.
[0029] For example, the guide is expressed by an object moving from
the first predetermined position to the second predetermined
position on the screen.
[0030] In accordance with this configuration, the player can easily
recognize the direction and path of the cursor to be moved with
reference to the motion of the object.
[0031] For example, the guide is expressed by the change in
appearance of a path having a start point at the first
predetermined position and an end point at the second predetermined
position on the screen.
[0032] In accordance with this configuration, the player can easily
recognize the direction and path of the cursor to be moved with
reference to the change in appearance of the path.
[0033] In the above music game apparatus the state information of
the operation article as calculated by said state information
calculating unit is any one of or any combination of two or more of
speed information, moving direction information, moving distance
information, velocity vector information, acceleration information,
movement locus information, area information, and positional
information.
[0034] In accordance with this configuration, since a variety of
information can be used as the state information of the operation
article for determining whether or not the cursor is correctly
manipulated in correspondence with the guides, the possibility of
expression of guides is greatly expanded, and thereby the design
freedom of the game content is also greatly increased.
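Several of the listed kinds of state information can be derived from two successive target-point positions. The sketch below is illustrative only; the function name, field names, and sampling interval are assumptions, not from the patent.

```python
import math

def state_info(p_prev, p_curr, dt):
    """Derive moving-distance, speed, velocity-vector and
    moving-direction information from two successive target-point
    positions captured `dt` seconds apart."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    distance = math.hypot(dx, dy)        # moving distance information
    return {
        "distance": distance,
        "speed": distance / dt,          # speed information
        "velocity": (dx / dt, dy / dt),  # velocity vector information
        "direction": math.atan2(dy, dx), # moving direction (radians)
    }

info = state_info((0, 0), (3, 4), 1.0)
print(info["distance"], info["speed"])   # 5.0 5.0
```

Acceleration and movement-locus information would additionally require keeping more than two samples of history.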
[0035] The novel features of the invention are set forth in the
appended claims. The invention itself, however, as well as other
features and advantages thereof, will be best understood by reading
the detailed description of specific embodiments in conjunction
with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0036] FIG. 1 is a view showing the overall configuration of the
music game system in accordance with an embodiment of the present
invention.
[0037] FIG. 2 is a perspective view of the operation article of
FIG. 1.
[0038] FIG. 3(a) is a top view showing the reflection ball of FIG.
2. FIG. 3(b) is a side view for showing the reflection ball as seen
from arrow A of FIG. 3(a). FIG. 3(c) is a side view for showing the
reflection ball as seen from arrow B of FIG. 3(a).
[0039] FIG. 4 is a longitudinal section view of the reflection ball
of FIG. 2.
[0040] FIG. 5 is an explanatory schematic diagram for showing one
example of the imaging unit of FIG. 1.
[0041] FIG. 6 is a view showing the electric configuration of the
music game apparatus of FIG. 1.
[0042] FIG. 7 is a block diagram of the high speed processor of
FIG. 6.
[0043] FIG. 8 is a circuit diagram for showing an LED drive circuit
and the configuration for transferring pixel data from the image
sensor of FIG. 6 to a high speed processor.
[0044] FIG. 9(a) is a timing diagram of a frame status flag signal
FSF as output from the image sensor of FIG. 8. FIG. 9(b) is a
timing diagram of a pixel data strobe signal PDS as output from the
image sensor of FIG. 8. FIG. 9(c) is a timing diagram of the pixel
data D (X, Y) as output from the image sensor of FIG. 8. FIG. 9(d)
is a timing diagram of an LED control signal LEDC as output from
the image sensor of FIG. 8. FIG. 9(e) is a timing diagram showing
the lighting state of the infrared light emitting diodes of FIG. 8.
FIG. 9(f) is a timing diagram showing the exposure period of the
image sensor of FIG. 8.
[0045] FIG. 10(a) is an expanded view showing the frame status flag
signal FSF of FIG. 9. FIG. 10(b) is an expanded view showing the
pixel data strobe signal PDS of FIG. 9. FIG. 10(c) is an expanded
view showing the pixel data D (X, Y) of FIG. 9.
[0046] FIG. 11 is a view for showing an example of the game screen
as displayed on the screen of the television monitor of FIG. 1.
[0047] FIG. 12 is a view for showing another example of the game
screen as displayed on the screen of the television monitor of FIG.
1.
[0048] FIG. 13 is a view for showing a further example of the game
screen as displayed on the screen of the television monitor of FIG.
1.
[0049] FIG. 14 is a view for explaining the sprites forming an
object which is displayed on the screen of the television monitor
of FIG. 1.
[0050] FIG. 15 is an explanatory view for showing a background
screen to be displayed on the screen of the television monitor of
FIG. 1.
[0051] FIG. 16(a) is an explanatory view for showing the background
screen of FIG. 15 before scrolling it. FIG. 16(b) is an explanatory
view for showing the background screen after scrolling it.
[0052] FIG. 17 is a schematic representation of a program and data
stored in the ROM of FIG. 6.
[0053] FIG. 18 is a schematic representation of one example of the
first musical score data of FIG. 17.
[0054] FIG. 19 is a schematic representation of one example of the
second musical score data of FIG. 17.
[0055] FIG. 20(a) is a view showing the correspondence between note
numbers and the directions in which the cursor is guided. FIG.
20(b) is another view showing the correspondence between note
numbers and the directions in which the cursor is guided. FIG.
20(c) is a further view showing the correspondence between note
numbers and the directions in which the cursor is guided.
[0056] FIG. 21(a) is a view for showing an example of the image
which is captured by an ordinarily used image sensor without any
particular processing. FIG. 21(b) is a view for
showing an example of the image which is obtained by level
filtering the image signal of FIG. 21(a) by a certain threshold
value. FIG. 21(c) is a view for showing an example of the image
which is captured by the image sensor through the infrared filter
with the illumination and is level filtered by a certain threshold
value. FIG. 21(d) is a view for showing an example of the image
which is captured by the image sensor through the infrared filter
without the illumination and is level filtered by a certain
threshold value. FIG. 21(e) is a view for showing an example of the
differential signal between the image signal with the illumination
and the image signal without the illumination.
[0057] FIG. 22 is a view for explaining the process of calculating
the coordinates of the target point of the operation article of
FIG. 1.
[0058] FIG. 23(a) is a view for explaining the process of scanning
in the X-direction when the coordinates of the target point of the
operation article of FIG. 1 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value. FIG.
23(b) is a view for explaining the process of starting scanning in
the Y-direction when the coordinates of the target point of the
operation article of FIG. 1 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value. FIG.
23(c) is a view for explaining the process of scanning in the
Y-direction when the coordinates of the target point of the
operation article of FIG. 1 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value. FIG.
23(d) is an explanatory view for showing the result of the process
of calculating the coordinates of the target point of the operation
article on the basis of the coordinates of the pixel having the
maximum luminance value.
[0059] FIG. 24 is a view for explaining a target point existence
area determination process (1) performed by the CPU 201.
[0060] FIG. 25 is a view for explaining a target point existence
area determination process (2) performed by the CPU 201.
[0061] FIG. 26 is a view for explaining the registration process of
the animations of the direction guide "G", the position guide "g"
and the path guide "rg" in accordance with the present
embodiment.
[0062] FIG. 27 is a view for showing an example of the animation
table which is designated by the animation table storage location
information of FIG. 26.
[0063] FIG. 28 is a timing diagram for explaining the relationship
among the first musical score data, the second musical score data,
the position guide "G", the direction guide "g", the judgment of
manipulation and the dance animation in accordance with the present
embodiment.
[0064] FIG. 29 is a flow chart showing the entire process flow of
the music game apparatus 1 of FIG. 1.
[0065] FIG. 30 is a flow chart showing the process flow for the
initial settings of the system in step S1 of FIG. 29.
[0066] FIG. 31 is a flow chart showing the process flow for sensor
initial settings in step S14 of FIG. 30.
[0067] FIG. 32 is a flow chart showing the command transmission
process in step S21 of FIG. 31.
[0068] FIG. 33(a) is a timing diagram showing the register setting
clock CLK of FIG. 8.
[0069] FIG. 33(b) is a timing diagram showing register data of FIG.
8.
[0070] FIG. 34 is a flow chart showing the register setting process
in step S23 of FIG. 31.
[0071] FIG. 35 is a flow chart showing the process of calculating
the state information in step S2 of FIG. 29.
[0072] FIG. 36 is a flow chart showing the process flow of
acquiring a pixel data group in step S50 of FIG. 35.
[0073] FIG. 37 is a flow chart showing the process flow of
acquiring pixel data in step S61 of FIG. 36.
[0074] FIG. 38 is a flow chart showing the process flow of
extracting a target point in step S51 of FIG. 35.
[0075] FIG. 39 is a flow chart showing the process flow of
calculating the coordinates of a target point in step S85 of FIG.
38.
[0076] FIG. 40 is a flow chart showing the game process flow in
step S3 of FIG. 29.
[0077] FIG. 41 is a flow chart showing the interrupt process in
accordance with the present embodiment.
[0078] FIG. 42 is a flow chart showing the process flow of the
playback of music in step S150 of FIG. 41.
[0079] FIG. 43 is a flow chart showing the process flow of
registering guides in step S151 of FIG. 41.
[0080] FIG. 44 is a view for showing another example of the
direction guide in accordance with the present embodiment.
[0081] FIG. 45 is a view for showing a further example of the
direction guide in accordance with the present embodiment.
[0082] FIG. 46 is a view for showing a still further example of the
direction guide in accordance with the present embodiment.
BEST MODE FOR CARRYING OUT THE INVENTION
[0083] In what follows, an embodiment of the present invention will
be explained in conjunction with the accompanying drawings. Like
references indicate the same or functionally similar elements
throughout the drawings, and therefore redundant explanation is not
repeated.
[0084] FIG. 1 is a view showing the overall configuration of the
music game system in accordance with the embodiment of the present
invention. As shown in FIG. 1, this music game system includes a
music game apparatus 1, an operation article 150 and a television
monitor 90.
[0085] The housing 19 of the music game apparatus 1 includes an
imaging unit 13 therein. The imaging unit 13 includes four infrared
light emitting diodes 15 and an infrared filter 17. The light
emission units of the infrared light emitting diodes 15 are exposed
from the infrared filter 17.
[0086] The music game apparatus 1 is supplied with a DC power
voltage from an AC adapter 92. Alternatively, a battery cell (not
shown in the figure) can be used to supply the DC power voltage in
place of the AC adapter 92.
[0087] The television monitor 90 includes a screen 91 at the front
side thereof. The television monitor 90 and the music game
apparatus 1 are connected by an AV cable 93. Incidentally, as
illustrated in FIG. 1, the music game apparatus 1 is placed for
example on the upper surface of the television monitor 90.
[0088] When the player 94 turns on the power switch (not shown in
the figure) which is provided on the back side of the music game
apparatus 1, a game screen is displayed on the screen 91. The
player 94 manipulates the operation article 150 in accordance with
the guidance of a game screen to run a game. In the present
specification, the "manipulation" of the operation article 150
means moving the operation article itself (for example, changing
the position thereof), but does not mean pressing a switch, moving
an analog stick, and so forth.
[0089] The infrared light emitting diodes 15 of the imaging unit 13
intermittently emit infrared light. The infrared light emitted from
the infrared light emitting diodes 15 is reflected by the
reflection sheet (to be described below) attached to this operation
article 150, and input to the imaging device (to be described
below) located inside the infrared filter 17. In this way, the
image of the operation article 150 is intermittently captured.
Accordingly, the music game apparatus 1 can intermittently acquire
an image signal of the operation article 150 which is moved by the
player 94. The music game apparatus 1 analyzes the image signals
and reflects the analysis result in the game. The reflection sheet
which is used in the present embodiment is for example a
retroreflective sheet.
[0090] FIG. 2 is a perspective view for showing the operation
article 150 of FIG. 1. As shown in FIG. 2, the operation article
150 comprises the reflection ball 151 fixed to the tip of a stick
152. The infrared light from the infrared light emitting diodes 15
is reflected by this reflection ball 151. The details of the
reflection ball 151 will be explained below.
[0091] FIG. 3(a) is a top view showing the reflection ball 151 of
FIG. 2, FIG. 3(b) is a side view for showing the reflection ball
151 as seen from arrow A of FIG. 3(a), and FIG. 3(c) is a side view
for showing the reflection ball 151 as seen from arrow B of FIG.
3(a).
[0092] As illustrated in FIG. 3(a) through FIG. 3(c), the
reflection ball 151 comprises a spherical inner shell 154 which is
fixedly located inside a spherical outer shell 153 of a transparent
color (inclusive of a semi-transparent, a colored-transparent and
colorless transparent). The spherical inner shell 154 is provided
with a reflection sheet 155 attached thereto. This reflection sheet
155 serves to reflect infrared light from the infrared light
emitting diodes 15.
[0093] FIG. 4 is a longitudinal section view taken through the
reflection ball 151 of FIG. 2. As illustrated in FIG. 4, the
spherical outer shell 153 comprises two semispherical outer shells
which are fixed together with bosses 156 and screws (not shown in
the figure). The spherical inner shell 154 comprises two
semispherical inner shells which are fixed inside the spherical
outer shell 153 with bosses 157. In addition, the stick 152 is
fixed to the reflection ball 151 by inserting it thereinto. More
specifically speaking, the stick 152 is fixed to the reflection
ball 151 by placing the stick 152 between the two semispherical
outer shells forming the spherical outer shell 153 and the two
semispherical inner shells forming the spherical inner shell 154,
fixing together the two semispherical outer shells with the bosses
156 and the screws, and fixing together the two semispherical inner
shells with the bosses 157.
[0094] FIG. 5 is an explanatory schematic diagram for showing one
example of the imaging unit 13 of FIG. 1. As illustrated in FIG. 5,
the imaging unit 13 includes a unit base 35 which is molded for
example from a plastic material, and a supporting cylinder 36 is
attached to the inside of this unit base 35. The supporting
cylinder 36 is provided with a horn-shaped opening 41 formed in its
upper surface, whose inner surface is shaped in the form of an
inverted cone, and with an optical system located in a cylindrical
portion below the opening 41 and including a concave lens 39 and a
convex lens 37, each of which is molded for example from a plastic
material. An image sensor 43 as an imaging device is fixed below
the convex lens 37. Accordingly, the image sensor 43 can capture an
image in accordance with light which enters through the opening 41
via the lenses 39 and 37.
[0095] The image sensor 43 is a low resolution CMOS image sensor
(for example, 32 pixels × 32 pixels: gray scale). However, this
image sensor 43 may be an image sensor having a larger number of
pixels, a CCD or the like. In the following explanation, it is
assumed that the image sensor 43 comprises 32 pixels × 32
pixels.
[0096] In addition, a plurality (four in this embodiment) of the
infrared light emitting diodes 15 are attached to the unit base 35
such that their light output directions are respectively set to the
upward direction. Infrared light is emitted to an area over the
imaging unit 13 by these infrared light emitting diodes 15. In
addition, the infrared filter 17 (a filter capable of passing only
infrared light therethrough) is attached to the upper portion of
the unit base 35 so as to cover the above opening 41. Then, the
infrared light emitting diodes 15 are repeatedly turned on and off
in a continuous manner, as will be described below, so that they
serve as a stroboscope. Here, the "stroboscope" is a generic term
used to refer to a device serving to intermittently irradiate a
moving object. Accordingly, the above
image sensor 43 serves to capture an image of an object, which is
moving in the scope of imaging, i.e., the operation article 150 in
the case of the embodiment. Incidentally, as illustrated in FIG. 8
to be described below, the stroboscope is composed mainly of the
infrared light emitting diodes 15, an LED drive circuit 75 and a
high speed processor 200.
[0097] In this case, the imaging unit 13 is incorporated in the
housing 19 in order that the light receiving surface of the image
sensor 43 is inclined from the horizontal surface at a
predetermined angle (for example, 90 degrees). Also, the scope of
imaging of the image sensor 43 is for example within 60 degrees as
determined by the concave lens 39 and the convex lens 37.
[0098] FIG. 6 is a view showing the electric configuration of the
music game apparatus 1 of FIG. 1. As shown in FIG. 6, the music
game apparatus 1 includes the image sensor 43, the infrared light
emitting diodes 15, a video signal output terminal 47, an audio
signal output terminal 49, the high speed processor 200, a ROM
(read only memory) 51, and a bus 53.
[0099] The high speed processor 200 is connected to the bus 53.
Furthermore, the ROM 51 is connected to the bus 53. Accordingly,
the high speed processor 200 can access the ROM 51 through the bus
53 to read and execute a game program as stored in the ROM 51, and
read and process image data and music data as stored in the ROM 51
in order to generate a video signal and an audio signal, which are
then output through the video signal output terminal 47 and the
audio signal output terminal 49 respectively.
[0100] The operation article 150 is irradiated with infrared light
emitted from the infrared light emitting diodes 15, and reflects
the infrared light by the reflection sheet 155. The image sensor 43
detects the reflected light from this retroreflective sheet 155,
and outputs an image signal which includes an image of the
retroreflective sheet 155. The analog image signal output from the
image sensor 43 is converted into digital data by an A/D converter
(to be described below) incorporated in the high speed processor
200. This process is performed also in the periods without infrared
light. The high speed processor 200 analyzes this digital data, and
reflects the analysis result in the game processing.
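The processing described in this paragraph, in which pixel data is converted both with and without infrared illumination and the difference is analyzed (compare FIG. 21(e)), can be sketched as follows. The threshold value and all names are assumptions for illustration; steady ambient light appears in both frames and cancels, leaving only the retroreflective sheet 155.

```python
# Hedged sketch of the differential processing: the frame captured
# without infrared light is subtracted from the frame captured with
# it, so that ambient light sources cancel and only the reflected
# light from the retroreflective sheet 155 remains. The threshold
# value is an assumption, not one taken from the application.

THRESHOLD = 100  # hypothetical luminance cutoff

def differential_frame(lit, unlit, threshold=THRESHOLD):
    """Per-pixel difference of two equal-size gray-scale frames,
    binarized by the threshold (1 = reflected light detected)."""
    return [
        [1 if (p_lit - p_unlit) > threshold else 0
         for p_lit, p_unlit in zip(row_lit, row_unlit)]
        for row_lit, row_unlit in zip(lit, unlit)
    ]

# An ambient lamp at (0, 0) appears in both frames and cancels; the
# reflection sheet at (1, 2) appears only in the lit frame.
lit   = [[200, 0, 0], [0, 0, 220]]
unlit = [[200, 0, 0], [0, 0,  10]]
print(differential_frame(lit, unlit))  # -> [[0, 0, 0], [0, 0, 1]]
```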
[0101] FIG. 7 is a block diagram of the high speed processor 200 of
FIG. 6. As illustrated in FIG. 7, this high speed processor 200
includes a central processing unit (CPU) 201, a graphics processor
202, a sound processor 203, a DMA (direct memory access) controller
204, a first bus arbiter circuit 205, a second bus arbiter circuit
206, an internal memory 207, an A/D converter (ADC: analog to
digital converter) 208, an input/output control circuit 209, a
timer circuit 210, a DRAM (dynamic random access memory) refresh
control circuit 211, an external memory interface circuit 212, a
clock driver 213, a PLL (phase-locked loop) circuit 214, a low
voltage detection circuit 215, a first bus 218, and a second bus
219.
[0102] The CPU 201 takes control of the entire system and performs
various types of arithmetic operations in accordance with the
program stored in the memory (the internal memory 207, or the ROM
51). The CPU 201 is a bus master of the first bus 218 and the
second bus 219, and can access the resources connected to the
respective buses.
[0103] The graphics processor 202 is also a bus master of the first
bus 218 and the second bus 219, and generates a video signal on the
basis of the data as stored in the internal memory 207 or the ROM
51, and outputs the video signal through the video signal output
terminal 47. The graphics processor 202 is controlled by the CPU
201 through the first bus 218. Also, the graphics processor 202 has
the functionality of outputting an interrupt request signal 220 to
the CPU 201.
[0104] The sound processor 203 is also a bus master of the first
bus 218 and the second bus 219, and generates an audio signal on
the basis of the data as stored in the internal memory 207 or the
ROM 51, and outputs the audio signal through the audio signal output
terminal 49. The sound processor 203 is controlled by the CPU 201
through the first bus 218. Also, the sound processor 203 has the
functionality of outputting an interrupt request signal 220 to the
CPU 201.
[0105] The DMA controller 204 serves to transfer data from the ROM
51 to the internal memory 207. Also, the DMA controller 204 has the
functionality of outputting, to the CPU 201, an interrupt request
signal 220 indicative of the completion of the data transfer. The
DMA controller 204 is also a bus master of the first bus 218 and
the second bus 219. The DMA controller 204 is controlled by the CPU
201 through the first bus 218.
[0106] The internal memory 207 may be implemented with one or any
necessary combination of a mask ROM, an SRAM (static random access
memory) and a DRAM in accordance with the system requirements. A
battery 217 is provided if an SRAM has to be powered by the battery
for maintaining the data contained therein. In the case where a
DRAM is used, a so-called refresh cycle is periodically performed
to maintain the data contained therein.
[0107] The first bus arbiter circuit 205 accepts a first bus use
request signal from the respective bus masters of the first bus
218, performs bus arbitration among the requests for the first bus
218, and issues a first bus use permission signal to one of the
respective bus masters. Each bus master is permitted to access the
first bus 218 after receiving the first bus use permission signal.
Here, the first bus use request signal and the first bus use
permission signal are shown as the first bus arbitration signals
222 in FIG. 7.
[0108] The second bus arbiter circuit 206 accepts a second bus use
request signal from the respective bus masters of the second bus
219, performs bus arbitration among the requests for the second bus
219, and issues a second bus use permission signal to one of the
respective bus masters. Each bus master is permitted to access the
second bus 219 after receiving the second bus use permission
signal. Here, the second bus use request signal and the second bus
use permission signal are shown as the second bus arbitration
signals 223 in FIG. 7.
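The request/grant handshake of paragraphs [0107] and [0108] can be modeled in a few lines. The application does not state which master wins when several request the bus simultaneously, so the fixed priority ordering below is purely an assumption.

```python
# Illustrative model of the bus use request / bus use permission
# protocol. Fixed priority (CPU first) is an assumption for this
# sketch; the application does not specify the arbitration policy.

PRIORITY = ["CPU", "graphics", "sound", "DMA"]  # assumed ordering

def arbitrate(requests):
    """Given the set of bus masters asserting a bus use request,
    return the single master that receives the bus use permission
    signal, or None when no request is pending."""
    for master in PRIORITY:
        if master in requests:
            return master
    return None

print(arbitrate({"sound", "DMA"}))   # -> sound
print(arbitrate(set()))              # -> None
```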
[0109] The input/output control circuit 209 serves to perform the
communication with an external input/output device(s) and/or an
external semiconductor device(s) through input/output signals. The
read and write operations of the input/output signals are performed
by the CPU 201 through the first bus 218. Also, the input/output
control circuit 209 has the functionality of outputting an
interrupt request signal 220 to the CPU 201.
[0110] This input/output control circuit 209 outputs an LED control
signal LEDC for controlling the infrared light emitting diodes
15.
[0111] The timer circuit 210 has the functionality of periodically
outputting an interrupt request signal 220 to the CPU 201 on the
basis of a time interval as preset. The settings such as the time
interval are performed by the CPU 201 through the first bus
218.
[0112] The ADC 208 converts analog input signals into digital
signals. The digital signals are read by the CPU 201 through the
first bus 218. Also, the ADC 208 has the functionality of
outputting an interrupt request signal 220 to the CPU 201.
[0113] This ADC 208 receives pixel data (analog) from the image
sensor 43 and converts it into digital data.
[0114] The PLL circuit 214 generates a high frequency clock signal
by multiplication of the sinusoidal signal as obtained from a
quartz oscillator 216.
[0115] The clock driver 213 amplifies the high frequency clock
signal as received from the PLL circuit 214 to a sufficient signal
level to supply the respective blocks with the clock signal
225.
[0116] The low voltage detection circuit 215 monitors the power
potential Vcc and issues the reset signal 226 of the PLL circuit
214 and the reset signal 227 to the other circuit elements of the
entire system when the power potential Vcc falls below a certain
voltage. Also, in the case where the internal memory 207 is
implemented with an SRAM requiring the power supply from the
battery 217 for maintaining data, the low voltage detection circuit
215 serves to issue a battery backup control signal 224 when the
power potential Vcc falls below the certain voltage.
[0117] The external memory interface circuit 212 has the
functionality of connecting the second bus 219 to the external bus
53 and the functionality of controlling the bus cycle length of the
second bus by issuing a cycle end signal 228.
[0118] The DRAM refresh control circuit 211 periodically and
unconditionally gets the ownership of the first bus 218 to perform
the refresh cycle of the DRAM at a certain interval. Needless to
say, the DRAM refresh control circuit 211 is provided in the
case where the internal memory 207 includes a DRAM.
[0119] In what follows, with reference to FIG. 8 and FIG. 10, the
configuration of transferring pixel data from the image sensor 43
to the high speed processor 200 will be explained in detail.
[0120] FIG. 8 is a circuit diagram for showing the LED drive
circuit and the configuration of transferring pixel data from the
image sensor 43 of FIG. 6 to the high speed processor 200. FIG. 9
is a timing diagram showing the operation of the high speed
processor 200 which receives pixel data from the image sensor 43 of
FIG. 6. FIG. 10 is an expanded timing diagram showing part of FIG.
9.
[0121] Referring to FIG. 8, since the image sensor 43 is a sensor
which outputs pixel data D (X, Y) as an analog signal, this pixel
data D (X, Y) is input to an analog input port of the high speed
processor 200. The analog input port is connected to the ADC 208 of
this high speed processor 200, and therefore the high speed
processor 200 acquires therein pixel data converted into digital
data from the ADC 208.
[0122] The middle point of the analog pixel data D (X, Y) as
described above is determined by a reference voltage given to a
reference voltage terminal Vref of the image sensor 43. For this
reason, in association with the image sensor 43, for example, a
reference voltage generation circuit 59 made of a resistance
voltage divider is provided in order to supply a reference voltage
which is always kept at a certain level to the reference voltage
terminal Vref.
[0123] The respective digital signals for controlling the image
sensor 43 are input to and output from the high speed processor 200
through the I/O ports thereof. These I/O ports are digital ports
capable of controlling input and output operations and connected to
the input/output control circuit 209 inside of this high speed
processor 200.
[0124] More specifically speaking, a reset signal "reset" is output
to the image sensor 43 from the I/O port of the high speed
processor 200 for resetting the image sensor 43. In addition, a
pixel data strobe signal PDS and a frame status flag signal FSF are
output from the image sensor 43, and supplied to the input ports of
the high speed processor 200.
[0125] The pixel data strobe signal PDS is a strobe signal as shown
in FIG. 9(b) which is used to read the pixel signal D (X, Y) as
described above. The frame status flag signal FSF is a flag signal
which indicates the state of the image sensor 43 and is used for
defining the exposure period of this image sensor 43 as illustrated
in FIG. 9(a). In other words, while the exposure period is defined
by the low level period of the frame status flag signal FSF as
illustrated in FIG. 9(a), the non-exposure period is defined by the
high level period of the frame status flag signal FSF as
illustrated in FIG. 9(a).
[0126] Also, the high speed processor 200 outputs, from the I/O
ports, a command (or command associated with data) to be set in a
control register (not shown in the figure) of the image sensor 43,
outputs a register setting clock CLK which periodically and
alternately takes high and low levels, and supplies the register
setting clock CLK to the image sensor 43.
[0127] Incidentally, the infrared light emitting diodes 15 as used
are four infrared light emitting diodes 15a, 15b, 15c and 15d which
are connected in parallel with each other as illustrated in FIG. 8.
These four infrared light emitting diodes 15a to 15d are arranged
so as to surround the image sensor 43, as explained above, in order
to irradiate the operation article 150 with infrared light emitted in
the same direction as the viewpoint of the image sensor 43 is
directed. However, the individual infrared light emitting diodes
15a to 15d are referred to simply as the infrared light emitting
diodes 15 unless it is necessary to distinguish them.
[0128] These infrared light emitting diodes 15 are turned on and
off by the LED drive circuit 75. The LED drive circuit 75
receives the frame status flag signal FSF as described above from
the image sensor 43, and this frame status flag signal FSF is
passed through a differentiating circuit 67, which is made up of a
resistor 69 and a capacitor 71, and given to the base of the PNP
transistor 77. The base of this PNP transistor 77 is connected
further to a pull-up resistor 79 which usually pulls up the base of
the PNP transistor 77 to a high level. Then, when the frame status
flag signal FSF is pulled down to a low level, the low level signal
is input to the base through the differentiating circuit 67 so that
the PNP transistor 77 is turned on only for the low level period of
the frame status flag signal FSF.
[0129] The emitter of the PNP transistor 77 is grounded through
resistors 73 and 65. On the other hand, the connecting point
between the emitter resistors 73 and 65 is connected to the base of
an NPN transistor 81. The collector of this NPN transistor 81 is
connected commonly to the anodes of the respective infrared light
emitting diodes 15a to 15d. The emitter of the NPN transistor 81 is
connected directly to the base of another NPN transistor 61. The
collector of the NPN transistor 61 is connected commonly to the
cathodes of the respective infrared light emitting diodes 15a to
15d, while the emitter of the NPN transistor 61 is grounded.
[0130] This LED drive circuit 75 turns on the infrared light
emitting diodes 15a to 15d only within the period when the LED
control signal LEDC which is output from the I/O port of the high
speed processor 200 is activated (in a high level) while the frame
status flag signal FSF which is output from the image sensor 43 is
in a low level.
[0131] When the frame status flag signal FSF is pulled down to the
low level as shown in FIG. 9(a), the PNP transistor 77 is turned on
for the low level period (in practice, there is a delay time
corresponding to the time constant of the differentiating circuit
67). Accordingly, when the LED control signal LEDC shown in FIG.
9(d) is output from the high speed processor 200 as a high level
signal, the base of the NPN transistor 81 is pulled up to a high
level and turned on. When the transistor 81 is turned on, the
transistor 61 is also turned on. Accordingly, a current flows from
the power supply (indicated by a small open circle in FIG. 8)
through the respective infrared light emitting diodes 15a to 15d
and the transistor 61, and in response to this, the respective
infrared light emitting diodes 15a to 15d are lighted as shown in
FIG. 9(e).
[0132] The LED drive circuit 75 turns on the infrared light
emitting diodes 15 only in the period when the LED control signal LEDC
is activated as shown in FIG. 9(d) while the frame status flag
signal FSF is in a low level as shown in FIG. 9(a), and therefore
the infrared light emitting diodes 15 are turned on only in the
exposure period (refer to FIG. 9(f)) of the image sensor 43.
[0133] Accordingly, useless power consumption can be restricted.
Furthermore, since the frame status flag signal FSF is given also
to the coupling capacitor 71, the transistor 77 is necessarily
turned off after a certain period even when the flag signal FSF is
fixed at a low level due to the runaway of the image sensor 43 or
the like, so that the infrared light emitting diodes 15 are also
necessarily turned off after the certain period.
[0134] It is therefore possible to arbitrarily and freely change
the exposure period of the image sensor 43 by adjusting the mark
duration of the frame status flag signal FSF.
[0135] Furthermore, the lighting period, non-lighting period,
cycles of lighting/non-lighting period and so forth of the infrared
light emitting diodes 15, i.e., of the stroboscope can be
arbitrarily and freely set and changed by adjusting the mark
durations and the frequencies of the frame status flag signal FSF
and LED control signal LEDC.
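The lighting condition of paragraphs [0130] to [0132] reduces to one rule: the diodes are lit only while the LED control signal LEDC is active (high) and the frame status flag signal FSF is low. A sketch over sampled waveforms (an illustrative model of the timing of FIG. 9, not a circuit-accurate one):

```python
# Boolean model of the LED drive condition: the infrared light
# emitting diodes 15 are lit exactly when LEDC is high while FSF is
# low, i.e. during the exposure period of the image sensor 43.
# Waveforms are sampled per time step here (True = high level), an
# assumption made purely for illustration.

def led_waveform(fsf, ledc):
    """Return the lit/unlit state of the diodes for each time step,
    given the sampled FSF and LEDC waveforms."""
    return [l and not f for f, l in zip(fsf, ledc)]

fsf  = [True, False, False, True, False]
ledc = [True, True,  True,  True, False]
print(led_waveform(fsf, ledc))  # -> [False, True, True, False, False]
```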
[0136] As has been discussed above, when the operation article 150
is irradiated with the infrared light emitted from the infrared
light emitting diodes 15, the image sensor 43 is exposed to the
light reflected from the operation article 150. In response to
this, the above pixel data D (X, Y) is output from the image sensor
43. More specifically speaking, during the period in which the
frame status flag signal FSF as shown in FIG. 9(a) is in a high
level (the infrared light emitting diodes 15 are not turned on), the
image sensor 43 outputs analog pixel data D (X, Y) as shown in FIG.
9(c) in synchronism with the pixel data strobe signal PDS as shown
in FIG. 9(b).
[0137] The high speed processor 200 acquires digital pixel data
through the ADC 208 while monitoring the frame status flag signal
FSF and the pixel data strobe signal PDS.
[0138] In this case, the pixel data is sequentially output as the
zeroth line, the first line, . . . and the thirty-first line as
illustrated in FIG. 10(c). However, as explained below, the first
pixel of each line is associated with dummy data.
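The transfer described in paragraph [0138] can be sketched as follows. The exact line format is an assumption here: each line is taken to carry one leading dummy value followed by the 32 valid pixels, with the dummy simply discarded when the frame is reassembled.

```python
# Sketch of reassembling the serially transferred pixel stream. The
# line layout (one leading dummy value, then 32 valid pixels) is an
# assumption for illustration, not a detail from the application.

LINE_PIXELS = 32   # valid pixels per line (32 x 32 sensor)
LINES = 32

def rebuild_frame(stream):
    """Split the flat stream into lines of 1 dummy + 32 pixels and
    drop the dummy value from each line."""
    step = LINE_PIXELS + 1
    return [stream[i * step + 1:(i + 1) * step] for i in range(LINES)]

# Tiny demonstration with a recognisable per-line pattern:
stream = []
for line in range(LINES):
    stream.append(-1)                        # dummy first value
    stream.extend(range(line, line + LINE_PIXELS))
frame = rebuild_frame(stream)
print(len(frame), len(frame[0]), frame[0][0])  # -> 32 32 0
```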
[0139] Next, the details of the game played with the music game
apparatus 1 will be explained with specific examples.
[0140] FIG. 11 is a view for showing an example of the game screen
as displayed on the screen 91 of the television monitor 90 of FIG.
1. The game screen shown in FIG. 11 is a game start screen. As
shown in FIG. 11, the game start screen displayed on the screen 91
includes a background 110, position guides "G1" to "G4", evaluation
objects 107 to 109, a cursor 105, a dance object 106 and masks 141
and 142. Then, automatic playback of music is started.
[0141] Incidentally, in the case of the present embodiment, the
position guides "G1" to "G4" are displayed in the form of blooms,
the evaluation objects 107 to 109 are displayed in the form of
heart-shaped objects, the dance object 106 is displayed in the form
of a male-female pair, and the cursor 105 is displayed in the form
of the operation article 150. In the following description, the
term "position guides G" is used to generally represent the
position guides "G1" to "G4".
[0142] The cursor 105 serves to indicate the position of the
operation article 150 on the screen 91, and moves on the screen 91
to follow the motion of the operation article 150. Accordingly, as
seen from the player 94, the manipulation of the operation article
150 is equivalent to the manipulation of the cursor 105. The
position guide "G" serves to guide the manipulation timing and
destination position of the cursor 105 (the operation article 150)
in terms of the timings relative to the music which is
automatically played. Direction guides "g1" to "g5", which will be
described below, serve to guide the manipulation timing and moving
direction of the cursor 105 (the operation article 150) in terms of
the timings relative to the music which is automatically played.
Path guides "rg1" to "rg10", which will be described below, serve
to guide the manipulation timing, the moving direction and the
moving path of the cursor 105 (the operation article 150) in terms
of the timings relative to the music which is automatically played.
The evaluation objects 107 to 109 serve to visually indicate the
evaluation of the manipulation of the cursor 105 (the operation
article 150) by the player 94. In the following description, the
term "direction guides g" is used to generally represent the
direction guides "g1" to "g5". In the same manner, the term "path
guides rg" is used to generally represent the path guides "rg1" to
"rg10".
[0143] FIG. 12 is a view for showing another example of the game
screen as displayed on the screen 91 of the television monitor 90
of FIG. 1. As shown in FIG. 12, the animation of the position guide
"G" in which a bloom gradually opens indicates the position to
which the cursor 105 is to be moved. By this guidance, the player
94 is instructed to move the cursor 105 to the area in which the
opening-bloom animation of the position guide "G1" is displayed.
The player 94 moves the operation article 150 in order to move the
cursor 105 to the area in which the opening-bloom animation of the
position guide "G" is displayed. After the animation in which the
bloom opens, the position guide "G" is displayed with an animation
in which the bloom closes. Furthermore, the direction in which the
cursor 105 is to be moved is indicated by the direction toward the
opening-bloom animation of the position guide "G". By this
guidance, the player 94 is instructed to move the cursor 105 in the
direction toward the opening-bloom animation of the position guide
"G".
[0144] In addition to this, the direction in which the cursor 105
is to be moved is guided also by the direction guides "g1" to "g5".
Particularly, the direction guides "g1" to "g5" sequentially appear
in the order that the direction guide "g1" appears first, the
direction guide "g2" appears second, the direction guide "g3"
appears third, the direction guide "g4" appears fourth, and then
the direction guide "g5" appears fifth. Accordingly, the direction
in which the cursor 105 is to be moved is guided by the direction
in which the direction guides "g1" to "g5" appear in sequence. In
this case, while each of the direction guides "g1" to "g5" is
displayed as a graphic form representing a small sphere just after
it appears, the sphere gradually increases in size as time passes,
and when the size is maximized an animation is performed as if the
sphere shatters into fragments. Accordingly, the direction toward
the graphic form of the sphere which appears is the direction in
which the cursor 105 is to be moved.
[0145] The player 94 has to move the cursor 105 to the area in
which the position guide "G1" is displayed within a predetermined
period in which the bloom serving as the position guide "G" is
opened. In other words, the position guide "G" serves to guide the
manipulation timing of the cursor 105 by the animation that the
bloom is opened. Also, the player 94 has to move the cursor 105 to
the area in which the position guide "G" is displayed as an opening
bloom within a predetermined period after the last direction guide
"g" appears as the graphic form of the sphere. In other words, the
manipulation timing of the cursor 105 is guided also by the
direction guide "g".
[0146] In addition to this, the position guide "G" serves also to
indicate in advance the manipulation direction of the cursor 105.
That is to say, when the bud of the bloom serving as the position
guide "G" begins to open, it enables the player 94 to know the
direction in which the cursor 105 is to be moved next. Furthermore,
the direction guide "g" serves also to indicate in advance the
manipulation direction of the cursor 105. Namely, since the
direction guide "g" appears in advance of the manipulation timing,
the player 94 can know the direction in which the cursor 105 is to
be moved next also by the direction guide "g".
[0147] This will be explained with reference to a specific example.
In the case of the example as illustrated in FIG. 12, the position
to which the cursor 105 is to be moved is indicated by the
animation of the position guide "G2" as a gradually opening bloom.
By this guidance, the player 94 is instructed to move the cursor
105 to the area in which the position guide "G2" is displayed with
the animation of the opening bloom. Also, the direction toward the
position guide "G2" displayed with the animation of the opening
bloom is the direction in which the cursor 105 is to be moved. By
this process, the player 94 is instructed to move the cursor 105 in
the direction toward the position guide "G2" displayed with the
animation of the opening bloom. In addition to this, the graphic
forms of the spheres as the direction guides "g1" to "g5"
sequentially appear from the position guide "G1" to the position guide "G2". As
described above, also by the direction guides "g1" to "g5", the
motion of the cursor 105 is guided from the position guide "G1" to
the position guide "G2".
[0148] The player 94 has to move the cursor 105 to the area in
which the position guide "G2" is displayed within a predetermined
period in which the bloom serving as the position guide "G2" is
opened. Also, the player 94 has to move the cursor 105 to the area
in which the position guide "G2" is displayed as an opening bloom
within a predetermined period after the last direction guide "g5"
appears as the graphic form of the sphere. In other words, the
manipulation timing of the cursor 105 is guided also by the
direction guide "g".
[0149] The player 94 appropriately manipulates the operation
article 150 in accordance with the instruction by the position
guide "G2" and the direction guides "g1" to "g5" in order to move
the cursor 105 from the position of the position guide "G1" to the
position of the position guide "G2". As a result, animation is
performed such that the evaluation objects flash. Specifically, if
the cursor 105 is manipulated at the most appropriate timing,
animation is performed such that all the evaluation objects 107 to
109 flash, and if the cursor 105 is manipulated at a timing which
is not the most appropriate but within an acceptable range,
animation is performed such that only the evaluation object 108
flashes. Meanwhile, each of the position
guides "G1", "G3" and "G4" is displayed in the form of the bud of
the bloom because the current time is out of the time slot for
guiding the manipulation timing and destination position of the
cursor 105. Also, the direction guide "g" does not appear between
the position guide "G2" and the position guide "G4", between the
position guide "G4" and the position guide "G3" and between the
position guide "G3" and the position guide "G1", because the
current time is out of the time slot for guiding the manipulation
timing and destination position of the cursor 105.
[0150] When the player 94 appropriately manipulates the cursor 105
in accordance with the guidance given by the position guide "G2"
and the direction guides "g1" to "g5", the animation of dance is
performed in the direction corresponding to the moving direction of
the cursor 105 (the direction from the position guide "G1" to the
position guide "G2", i.e., the right direction as seen toward the
screen 91). For example, the animation of the dance object 106
turning in the counter-clockwise direction is performed, while the
background 110 is scrolled in the left direction as seen toward the
screen 91. By this process, although the dance object 106 is
positioned in the center of the screen 91, it appears that the
dance object 106 is turning in the counter clockwise direction
while moving in the right direction.
[0151] FIG. 13 is a view for showing a further example of the game
screen as displayed on the screen 91 of the television monitor 90
of FIG. 1. As shown in FIG. 13, animation is performed such that
the position guides "G1" to "G4" open as blooms at the same time.
At this moment, the player 94 is guided to move
the cursor 105 in the direction and along the path in accordance
with the path guides "rg1" to "rg10". In this case, the appearance
positions of the path guides "rg1" to "rg10" indicate the guide
path of the cursor 105. Also, the path guides "rg1" to "rg10"
appear in the order that the path guide "rg1" appears first, the
path guide "rg2" appears second, the path guide "rg3" appears
third, the path guide "rg4" appears fourth, . . . , and the path
guide "rg10" finally appears. Accordingly, the direction in which
the cursor 105 is to be moved is guided by the direction in which
the path guides "rg1" to "rg10" appear in sequence. In this case,
while each of the path guides "rg1" to "rg10" is displayed as a
graphic form representing a small sphere just after it appears, the
sphere gradually increases in size as time passes, and when the
size is maximized an animation is performed as if the sphere
shatters into fragments. In FIG. 13, the player 94 is instructed to move the
cursor 105 in the counter clockwise direction from a start point in
the vicinity of the position guide "G3" along the path guides "rg1"
to "rg10".
[0152] When the player 94 manipulates the operation article 150 in
accordance with the position guides "G1" to "G4" and the path
guides "rg1" to "rg10" in order to move the cursor 105 in an
appropriate manner, the animation of the dance object 106 (for
example, an animation of turning widely in the counter clockwise
direction) is performed in correspondence with the path
guides "rg1" to "rg10".
[0153] Meanwhile, as discussed above, the object illustrated in
each of FIG. 12 and FIG. 13 such as the dance object 106 is an
image corresponding to a certain picture for an animation. For
example, a series of the dance objects 106 are prepared for dance
animation. Also, for example, a series of object images in the
graphic forms of blooms are prepared for the animation of the
position guide "G". Furthermore, for example, a series of object
images in the graphic forms of spheres are prepared for the
animation of the direction guide "g" and the path guide "rg".
[0154] In this case, each of the dance object 106, the position
guide "G", the evaluation objects 107 to 109, the cursor 105, the
direction guide "g" and the path guide "rg" in the game screens as
illustrated in FIG. 11 to FIG. 13 is composed of a single or a
plurality of sprites. A sprite comprises a rectangular pixel set
and can be arranged in an arbitrary position of the screen 91.
Incidentally, a generic term "object" (or "object image") is
sometimes used to generally refer to the position guide "G", the
evaluation objects 107 to 109, the cursor 105, the direction guide
"g" and the path guide "rg".
[0155] FIG. 14 is a view for explaining the sprites forming an
object which is displayed on the screen 91. As illustrated in FIG.
14, the dance object 106 of FIG. 11 is composed, for example, of 12
sprites SP0 to SP11. Each of the sprites SP0 to SP11 consists, for
example, of 16 pixels.times.16 pixels. When the dance object 106 is
arranged on the screen 91, for example, the coordinates at which
the center of the upper left corner sprite SP0 is to be located are
designated. Then, the coordinates at which the centers of the
respective sprites SP1 to SP11 are to be located are calculated on
the basis of the coordinates as designated and the size of the
sprites SP0 to SP11.
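The placement computation described above can be sketched as follows. This is an illustrative Python sketch; the grid shape of 3 columns by 4 rows for the sprites SP0 to SP11 is an assumption, since the paragraph does not state the exact arrangement.

```python
# Illustrative sketch of sprite placement. Assumption: the 12 sprites
# SP0..SP11 of the dance object 106 form a grid of 3 columns x 4 rows.
SPRITE_W = SPRITE_H = 16  # each sprite is 16 pixels x 16 pixels
COLS, ROWS = 3, 4         # assumed arrangement of SP0..SP11

def sprite_centers(x0, y0):
    """Given the designated center (x0, y0) of the upper-left sprite SP0,
    compute the centers of all 12 sprites SP0..SP11, row by row, from the
    designated coordinates and the sprite size."""
    centers = []
    for row in range(ROWS):
        for col in range(COLS):
            centers.append((x0 + col * SPRITE_W, y0 + row * SPRITE_H))
    return centers
```

For example, with SP0 centered at (100, 50), SP1 lies 16 pixels to the right and SP3 (the first sprite of the second row, under this assumed grid) lies 16 pixels below.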
[0156] Next, the scrolling of the background 110 will be explained.
First, the background screen will be explained.
[0157] FIG. 15 is an explanatory view for showing the background
screen to be displayed on the screen 91 of the television monitor
90 of FIG. 1. As illustrated in FIG. 15, the background screen 140
is composed, for example, of 32.times.32 blocks "0" to "1023". Each
of the block "0" to the block "1023" is composed, for example, of a
rectangular element comprising 8 pixels.times.8 pixels. An array
element PA[0] to an array element PA[1023] and an array element
CA[0] to an array element CA[1023] are prepared in correspondence
respectively with the block "0" to the block "1023".
[0158] In this description, in the case where the block "0" to the
block "1023" are generally referred to, they are referred to simply
as the "block"; in the case where the array element PA[0] to the
array element PA[1023] are generally referred to, they are referred
to as the "array element PA"; and in the case where the array
element CA[0] to the array element CA[1023] are generally referred
to, they are referred to as the "array element CA".
[0159] Incidentally, data (pixel pattern data) for designating the
pixel pattern of the corresponding block is assigned to the array
element PA. This pixel pattern data consists of the color
information of the respective pixels of the 8 pixels.times.8 pixels
for making up a block. On the other hand, the information for
designating the color palette and the depth value for use in the
corresponding block is assigned to the array element CA. A color
palette consists of a predetermined number of color information
entries. The depth value indicates the depth position of the
pixels, and if a plurality of pixels overlap each other in the same
position only the pixel having the largest depth value is
displayed.
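The block layout and the two per-block arrays described above can be sketched as follows; this is an illustrative Python sketch, and the actual data resides in the memory of the apparatus.

```python
# Illustrative sketch of the background screen 140: 32 x 32 blocks,
# each block an 8 x 8 pixel rectangular element.
BLOCKS_PER_ROW = 32
BLOCK_SIZE = 8

def block_origin(block_index):
    """Top-left pixel coordinates of block "0"..."1023" on the
    background screen 140 (blocks numbered row by row)."""
    row, col = divmod(block_index, BLOCKS_PER_ROW)
    return (col * BLOCK_SIZE, row * BLOCK_SIZE)

# One array element PA (pixel pattern data) and one array element CA
# (color palette designation and depth value) per block.
PA = [None] * (BLOCKS_PER_ROW * BLOCKS_PER_ROW)
CA = [None] * (BLOCKS_PER_ROW * BLOCKS_PER_ROW)
```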
[0160] FIG. 16(a) is an explanatory view for showing the background
screen 140 in advance of scrolling it, and FIG. 16(b) is an
explanatory view for showing the background screen 140 after
scrolling it. As illustrated in FIG. 16(a), since the size of the
screen 91 of the television monitor 90 is 256 pixels.times.224
pixels, an area of 256 pixels.times.224 pixels in the background
screen 140 is displayed on the screen 91. It is considered here that
the background screen 140 is scrolled to shift the center position
thereof to the left by "k" pixels. In this case, since the width of
the background screen 140 in the lateral direction (the horizontal
direction) is equal to the width of the screen 91 in the lateral
direction, the portion thereof (hatched portion) scrolled out of
the screen 91 is displayed at the right edge as illustrated in FIG.
16(b). In other words, when scrolling in the lateral direction,
conceptually, it can be thought that the same background screen 140
is repeatedly arranged in the lateral direction.
[0161] For example, if it is assumed that the portion thereof
(hatched portion) scrolled out of the screen 91 consists of the
block "64", the block "96", . . . , the block "896" and the block
"928" of FIG. 15, the image displayed near the right edge of the
screen 91 is defined by the array elements PA[64], . . . , and
PA[928] and the array elements CA[64], . . . , and CA[928]
corresponding to these blocks. From this fact, in order to make the
background coherent while scrolling the background screen 140 in
the left direction, it is necessary to update the data assigned to the
array elements PA and the array elements CA corresponding to the
blocks included in the portion thereof (hatched portion) scrolled
out of the screen 91. By this process, the image defined by the
array elements PA and the array elements CA which are updated is
displayed in the right edge of the screen 91.
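The wrap-around and the block-update step described in this paragraph can be sketched as follows. The visible row range passed to `blocks_scrolled_out` is an assumption chosen to reproduce the hatched-portion example above (blocks "64", "96", . . . , "928", i.e. column 0 of rows 2 to 29 of FIG. 15).

```python
# Illustrative sketch of the horizontal wrap-around: the background
# screen 140 is 256 pixels wide, equal to the screen width, so a column
# scrolled off the left edge reappears at the right edge.
BG_WIDTH = 256

def wrapped_x(x, scroll_left_pixels):
    """Screen x coordinate of background column x after scrolling the
    background screen left by scroll_left_pixels pixels."""
    return (x - scroll_left_pixels) % BG_WIDTH

def blocks_scrolled_out(k_cols, visible_rows):
    """Indices of the blocks in the leftmost k_cols columns of the given
    visible rows, i.e. the hatched portion whose array elements PA and
    CA must be updated before they reappear at the right edge."""
    return [row * 32 + col for row in visible_rows for col in range(k_cols)]
```

For instance, `blocks_scrolled_out(1, range(2, 30))` yields the sequence 64, 96, . . . , 928 of the example.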
[0162] In order to make the background look smooth and
continuous, it is necessary to update the relevant array elements
PA and the relevant array elements CA before displaying them at
the right edge of the screen 91. In this case, the relevant array
elements PA and the relevant array elements CA must be updated
while they are being displayed at the left edge of the screen 91,
and thereby the image near the left edge of the screen 91 becomes
incoherent. Accordingly, as illustrated in FIG. 11 to FIG. 13, the
mask 141 is provided at the left edge of the screen 91 in order to
avoid such shortcomings. For the same reason, there is the mask 142
provided at the right edge.
[0163] Incidentally, the scroll process in the rightward direction
is performed in the same manner as the scroll process in the
leftward direction. Also, in the case of the present embodiment,
since the range of scrolling is limited within .+-.16 pixels in the
longitudinal direction (vertical direction) of the background
screen 140, there is no mask at the top and bottom edges of the
screen 91.
[0164] As has been discussed above, the background 110 is scrolled
by scrolling the background screen 140.
[0165] Next, the details of the game process by the music game
apparatus 1 will be explained. FIG. 17 is a schematic
representation of a program and data stored in the ROM 51 of FIG.
6. As shown in FIG. 17, the ROM 51 is used to store a game program
300, image data 301 and music data 304. The image data 301 includes
object image data (inclusive of image data such as the position
guide "G", the direction guide "g", the path guide "rg", the
evaluation objects 107 to 109 and the cursor 105) and background
image data. The music data 304 includes first musical score data
305, second musical score data 306 and sound source data (wave
data) 307.
[0166] The first musical score data 305 shown in FIG. 17 is the
data in which music control information is arranged in a time
series.
[0167] FIG. 18 is a schematic representation of one example of the
first musical score data 305 of FIG. 17. As shown in FIG. 18, the
music control information contains a command, a note number/a
waiting time information item, an instrument designation
information item, a velocity value and a gate time.
[0168] "Note On" is a command to output sound, and "Wait" is a
command to set a waiting time. The waiting time is the time period
to elapse before reading the next command (the time period between
one musical note and the next musical note). The note number is
information for designating the frequency of sound vibration
(pitch). The waiting time information item is information for
designating a waiting time to be set. The instrument designation
information item is information for designating a musical
instrument whose tone quality is to be used. The velocity value is
information for designating the magnitude of sound, i.e., a sound
volume. The gate time is information for designating a period for
which a musical note is to be continuously output.
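A minimal sketch of how such music control information might be read in sequence is given below. The tuple layout is illustrative only and is not the actual ROM format of the first musical score data 305.

```python
# Illustrative score: "Wait" sets the waiting time before the next
# command is read; "Note On" outputs a sound with a note number,
# instrument designation, velocity value, and gate time.
score = [
    ("Wait", 48),
    ("NoteOn", 60, "piano", 100, 24),  # note, instrument, velocity, gate
    ("Wait", 24),
    ("NoteOn", 64, "piano", 90, 24),
]

def play(score):
    """Walk the score, accumulating waiting times, and collect
    (start_time, note, instrument, velocity, gate_time) events."""
    events = []
    t = 0
    for entry in score:
        if entry[0] == "Wait":
            t += entry[1]
        elif entry[0] == "NoteOn":
            _, note, inst, vel, gate = entry
            events.append((t, note, inst, vel, gate))
    return events
```

In this sketch the second note starts 24 time units after the first, as dictated by the intervening "Wait" command.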
[0169] Returning to FIG. 17, the second musical score data 306 is
the data in which guide control information is arranged in a time
series. This second musical score data 306 is used when guides (the
position guide "G", the direction guide "g" and the path guide
"rg") are displayed on the screen 91. In other words, while the
first musical score data 305 is the musical score data for
automatically playing music, the second musical score data 306 is
the musical score data for displaying the guides in synchronization
with the music.
[0170] FIG. 19 is a schematic representation of one example of the
second musical score data 306 of FIG. 17. As shown in FIG. 19, the
guide control information contains a command, a note number/a
waiting time information item, and an instrument designation
information item.
[0171] The instrument designation information item of the second
musical score data 306 is the number indicating that the second
musical score data 306 is the musical score data for displaying
guides (the position guide "G", the direction guide "g" and the
path guide "rg") rather than the number indicating the instrument
(tone quality) corresponding to the instrument of which sound is to
be output.
[0172] Accordingly, "Note On" is not a command to output sound but
a command to designate starting the animation of the position guide
"G" or designate starting the display of the direction guide "g"
and the path guide "rg". Similarly, the note number is not
information for designating the frequency of sound vibration (pitch) but
information used to designate which of the animations of the
position guides "G" is to be started and designate where the
direction guide "g" and the path guide "rg" are displayed. This
point will be explained in detail.
[0173] FIG. 20(a) through FIG. 20(c) are views showing the
correspondence between note numbers and the directions in which the
cursor 105 is guided. As illustrated in FIG. 20(a) through FIG.
20(c), the direction of each arrow indicates the direction in which
the cursor 105 is guided, the start point of each arrow indicates
the position of the position guide "G" which previously guided the
cursor 105, and the end point of each arrow indicates the position
of the position guide "G" which currently guides the cursor 105.
For example, as illustrated in FIG. 20(a), the note number "55"
is used to direct the cursor 105 from the position guide "G1" to
the position guide "G2", and when the note number indicated by the
musical score data pointer is "55" the position guide "G" and the
direction guide "g" are displayed as illustrated in FIG. 12. Also,
for example, as illustrated in FIG. 20(c), the note number "57" is
used to direct the cursor 105 so that it turns in the counter
clockwise direction from the position guide "G3" as the start
point, and when the note number indicated by the musical score data
pointer is "57" the position guide "G1" and the path guide "rg" are
displayed as illustrated in FIG. 13.
[0174] Meanwhile, for example, the note number "81" is dummy data
placed at the top of the second musical score data 306 (refer to
FIG. 19) and is not information which is used to control the
display of guidance. By this configuration, the top positions of
the first musical score data 305 and the second musical score data
306 are aligned with each other. Furthermore, for example, the note
number "79" is data indicative of the end of music, and is arranged
at the end of the second musical score data 306 (refer to FIG. 19).
Incidentally, the note number "79" is not information which is used
to control the display of guidance.
[0175] Next is the explanation of the main process performed by the
high speed processor 200.
[0176] [Pixel Data Group Acquisition Process] The CPU 201 acquires
digital pixel data by converting analog pixel data which is output
from the image sensor 43, and assigns it to the array element
P[X][Y]. Meanwhile, it is assumed that the horizontal axis (in the
lateral direction or the row direction) of the image sensor 43 is
the X-axis and the vertical axis (in the longitudinal direction or
the column direction) is the Y-axis.
[0177] [Differential Data Calculation Process] The CPU 201
calculates the differential data between the pixel data P[X][Y]
acquired when the infrared light emitting diodes 15 are turned on
and the pixel data P[X][Y] acquired when the infrared light
emitting diodes 15 are turned off, and the differential data is
assigned to the array element Dif[X][Y]. In what follows, the
advantages of obtaining the differential data will be explained
with reference to drawings. In this case, the pixel data represents
the luminance value. Accordingly, the differential data also
represents the luminance value.
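The calculation itself is a pixel-wise subtraction, which can be sketched as follows (an illustrative Python sketch; the actual process runs on the CPU 201 over the sensor's pixel array).

```python
# Illustrative sketch of the differential-data calculation: subtracting
# the luminance values captured with the infrared LEDs off from those
# captured with the LEDs on leaves only the light reflected by the
# operation article 150.
def differential(p_on, p_off):
    """p_on, p_off: 2-D lists of luminance values from the image sensor.
    Returns the per-pixel difference Dif[X][Y] = on - off."""
    return [[on - off for on, off in zip(row_on, row_off)]
            for row_on, row_off in zip(p_on, p_off)]
```

A static background pixel has the same luminance in both frames and so cancels to zero, while a pixel lit by the reflected stroboscope light retains a large positive value.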
[0178] FIG. 21(a) is a view for showing an example of the image
which is captured by the use of an ordinarily used image sensor and
is not processed by any particular treatment, FIG. 21(b) is a view
for showing an example of the image which is obtained by level
filtering the image signal of FIG. 21(a) by a certain threshold
value, FIG. 21(c) is a view for showing an example of the image
which is captured by the image sensor 43 through the infrared
filter 17 with the illumination and is level filtered by a certain
threshold value, FIG. 21(d) is a view for showing an example of the
image which is captured by the image sensor 43 through the infrared
filter 17 without the illumination and is level filtered by a
certain threshold value, and FIG. 21(e) is a view for showing an
example of the differential signal between the image signal with
the illumination and the image signal without the illumination.
[0179] As has been discussed above, the operation article 150 is
irradiated with infrared light in order to capture an image by the
reflected infrared light which is incident on the image sensor 43
through the infrared filter 17. In the case where an image of the
operation article 150 is stroboscopically captured by the use of an
ordinary light source in an ordinary indoor environment, an
ordinary image sensor (corresponding to the image sensor 43 of FIG.
5) captures an image which includes not only light sources such as
a fluorescent light source, an incandescent light source and a
solar light source (window) but any other objects located inside of
the room in addition to an image of the operation article 150 as
illustrated in FIG. 21(a). Accordingly, a computer or a processor
having a substantially high-speed processing capability is needed
in order to extract only the image of the operation article 150 by
processing the image of FIG. 21(a). However, such a
high-performance computer cannot be used in a device which must be
manufactured at a low cost. Thus, it is conceivable to lessen the
load by the use of a variety of processing techniques.
[0180] Incidentally, although the image of FIG. 21(a) would have to
be drawn as a gray-scale image, the illustration is omitted. Also,
in each of FIG. 21(a) through FIG. 21(e), an image is captured of
the reflection sheet 155 of the operation article 150.
[0181] Next, FIG. 21(b) shows an image signal obtained by level
filtering the image signal of FIG. 21(a) by a certain threshold
value. While such a level filtering process can be performed by a
dedicated hardware circuit or by software control, it is possible
to remove images having low luminance values other than the
operation article 150 and the light sources by performing the level
filtering process, which cuts off pixel data whose luminance value
is no higher than a certain level. In the case of the image signal
of FIG. 21(b), the images other than the operation article 150 and
the light sources can be eliminated so as to lessen the load on the
computer; however, since high-luminance light source images still
remain, it is difficult to discriminate between the operation
article 150 and the other light sources.
[0182] Because of this, the infrared filter 17 is used as
illustrated in FIG. 5 in order that the image sensor 43 does not
capture the images other than the image of the infrared light. By
this process, as illustrated in FIG. 21(c), it is possible to
remove the fluorescent light source which emits little infrared
light. However, there are the solar light source and the
incandescent light source included in the image signal.
Accordingly, the load is lessened by calculating the difference
between the pixel data when the infrared light stroboscope is
turned on and the pixel data when the infrared light stroboscope is
turned off.
[0183] For this purpose, the difference is calculated between the
pixel data of the image signal with the illumination as shown in
FIG. 21(c) and the pixel data of the image signal without the
illumination as shown in FIG. 21(d). Then, as illustrated in FIG.
21(e), only the image corresponding to the difference can be
acquired. The image corresponding to the difference includes only
the image corresponding to the operation article 150 as apparent
from the comparison with FIG. 21(a). Accordingly, while lessening
the processing load, it is possible to acquire the state
information on the operation article 150. The state information is
any one of or any combination of two or more of speed information,
moving direction information, moving distance information, velocity
vector information, acceleration information, movement locus
information, area information, and positional information.
[0184] For the reason as described above, the CPU 201 acquires
differential data by calculating the difference between the pixel
data acquired when the infrared light emitting diodes 15 are turned
on and the pixel data acquired when the infrared light emitting
diodes 15 are turned off.
[0185] [Target Point Extraction Process] The CPU 201 obtains the
coordinates of the target point of the operation article 150 on the
basis of the differential data Dif[X][Y] as calculated. This will
be explained in detail.
[0186] FIG. 22 is a view for explaining the calculation process of
the target point of the operation article 150. Incidentally, it is
assumed that the image sensor 43 shown in FIG. 22 is an image
sensor of 32 pixels.times.32 pixels.
[0187] As illustrated in FIG. 22, the CPU 201 scans the
differential data row by row: it scans the differential data
through 32 pixels in the X-direction (the horizontal direction, the
lateral direction or the row direction), then increments the
Y-coordinate, scans the differential data through 32 pixels in the
X-direction again, and so on.
[0188] In this case, the CPU 201 finds the differential data of the
maximum luminance value from the differential data of 32
pixels.times.32 pixels as scanned, and compares the maximum
luminance value to a predetermined threshold value "Th". Then, if
the maximum luminance value is larger than the predetermined
threshold value "Th", the CPU 201 calculates the coordinates of the
target point of the operation article 150 on the basis of the
coordinates of the pixel having the maximum luminance value. This
point will be explained in detail.
[0189] FIG. 23(a) is a view for explaining the process of scanning
in the X-direction when the coordinates of the target point of the
operation article 150 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value, FIG.
23(b) is a view for explaining the process of starting scanning in
the Y-direction when the coordinates of the target point of the
operation article 150 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value, FIG.
23(c) is a view for explaining the process of scanning in the
Y-direction when the coordinates of the target point of the
operation article 150 are calculated on the basis of the
coordinates of the pixel having the maximum luminance value, and
FIG. 23(d) is an explanatory view for showing the result of the
process of calculating the coordinates of the target point of the
operation article 150 on the basis of the coordinates of the pixel
having the maximum luminance value.
[0190] As illustrated in FIG. 23(a), the CPU 201 scans the
differential data in the X-direction, with the coordinates of the
pixel having the maximum luminance value as the center, in order
to detect pixels whose luminance values are larger than the
predetermined threshold value "Th". In the case of the example of
FIG. 23(a), the pixels corresponding to X=11 to 15 are pixels whose
luminance values are larger than the predetermined threshold value
"Th".
[0191] Next, as illustrated in FIG. 23(b), the CPU 201 obtains the
center of X (=11 to 15). Then, it is determined that Xc=13 as the
X-coordinate of the center.
[0192] Next, as illustrated in FIG. 23(c), the CPU 201 scans the
differential data in the Y-direction along the center X-coordinate
(=13) as obtained in FIG. 23(b), and detects pixels whose luminance
values are larger than the predetermined threshold value "Th". In
the case of the example of
FIG. 23(c), the pixels corresponding to Y=5 to 10 are pixels whose
luminance values are larger than the predetermined threshold value
"Th".
[0193] Next, as illustrated in FIG. 23(d), the CPU 201 obtains the
center of Y (=5 to 10). Then, it is determined that Yc=7 as the
Y-coordinate of the center.
[0194] The CPU 201 converts the coordinates (Xc, Yc) (=(13, 7)) of
the target point which is calculated as described above into the
coordinates (xc, yc) in the screen 91. The CPU 201 performs the
process of calculating the coordinates (xc, yc) of the target point
as described above each time the frame is updated. Then, the CPU
201 assigns "xc" and "yc" respectively to the array elements Px[M]
and Py[M]. Meanwhile, "M" is an integer and incremented by one each
time the frame displayed on the screen 91 is updated.
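The extraction steps of FIG. 22 and FIG. 23 can be sketched as follows. This illustrative sketch reproduces the worked example (X=11 to 15 giving Xc=13, Y=5 to 10 giving Yc=7); it simply collects all above-threshold pixels in the relevant row and column, whereas the actual scan proceeds outward from the maximum-luminance pixel.

```python
# Illustrative sketch of the target point extraction.
def target_point(dif, th):
    """dif: 2-D list of differential luminance values; th: threshold "Th".
    Returns (Xc, Yc) of the target point, or None if no pixel exceeds th."""
    h, w = len(dif), len(dif[0])
    # Find the pixel with the maximum luminance value.
    ymax, xmax = max(((y, x) for y in range(h) for x in range(w)),
                     key=lambda p: dif[p[0]][p[1]])
    if dif[ymax][xmax] <= th:
        return None  # no target detected in this frame
    # Scan in the X-direction: center of the above-threshold run.
    xs = [x for x in range(w) if dif[ymax][x] > th]
    xc = (min(xs) + max(xs)) // 2
    # Scan in the Y-direction along Xc: center of the above-threshold run.
    ys = [y for y in range(h) if dif[y][xc] > th]
    yc = (min(ys) + max(ys)) // 2
    return xc, yc
```

With the values of the worked example (an above-threshold run at X=11 to 15 in the brightest row and Y=5 to 10 in column 13), the sketch returns (13, 7), matching FIG. 23(d).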
[0195] [Target Point Existence Area Determination Process (1)] The
CPU 201 determines which of areas a1 to a4 includes the target
point of the operation article 150 on the screen 91. This point
will be explained in detail.
[0196] FIG. 24 is a view for explaining the target point existence
area determination process (1) performed by the CPU 201. As
illustrated in FIG. 24, a predetermined area a1 including the
position guide "G1", a predetermined area a2 including the position
guide "G2", a predetermined area a3 including the position guide
"G3" and a predetermined area a4 including the position guide
"G4" are defined on the screen 91. The CPU 201 determines, from
among the predetermined areas a1 to a4, the area in which the
target point (xc, yc) of the operation article 150 is located and
stores the result of determination in the array element J1[M]. The
CPU 201 performs the determination process as described above each
time the frame displayed on the screen 91 is updated.
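An illustrative sketch of determination process (1) follows. The rectangle coordinates for the areas a1 to a4 are assumptions, since the specification only states that each predetermined area includes its position guide.

```python
# Illustrative sketch of determination process (1): each position guide
# "G1" to "G4" has a predetermined rectangular area around it. The
# rectangles below are assumed placeholders, not the actual areas.
AREAS = {
    "a1": (16, 96, 80, 160),    # (x_min, y_min, x_max, y_max) around G1
    "a2": (176, 96, 240, 160),  # around G2
    "a3": (96, 16, 160, 80),    # around G3
    "a4": (96, 144, 160, 208),  # around G4
}

def area_of(xc, yc):
    """Area a1..a4 containing the target point (xc, yc), or None.
    The result would be stored in J1[M] for the current frame M."""
    for name, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= xc <= x1 and y0 <= yc <= y1:
            return name
    return None
```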
[0197] [Target Point Existence Area Determination Process (2)] The
CPU 201 determines which of areas A1 to A4 includes the target
point of the operation article 150 on the screen 91. This point
will be explained in detail.
[0198] FIG. 25 is a view for explaining the target point existence
area determination process (2) performed by the CPU 201. As
illustrated in FIG. 25, the areas A1 to A4 are defined by dividing
the screen 91 into four. The CPU 201 determines, from among the
areas A1 to A4, the area in which the target point (xc, yc) of the
operation article 150 is located and stores the result of
determination in the array element J2[M]. The CPU 201 performs the
determination process as described above each time the frame
displayed on the screen 91 is updated.
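Determination process (2) can be sketched as a quadrant test; the assignment of the labels A1 to A4 to particular quadrants is an assumption, since the specification only states that the screen is divided into four.

```python
# Illustrative sketch of determination process (2): the screen 91
# (256 x 224 pixels) is divided into four areas. The labeling of the
# quadrants is an assumed convention.
def quadrant(xc, yc, width=256, height=224):
    """Area A1..A4 containing the target point (xc, yc).
    The result would be stored in J2[M] for the current frame M."""
    left = xc < width // 2
    top = yc < height // 2
    if top and left:
        return "A1"
    if top and not left:
        return "A2"
    if not top and left:
        return "A3"
    return "A4"
```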
[0199] [Cursor Control Process] The CPU 201 registers (stores in
the internal memory 207) the coordinates (xc, yc) of the current
target point of the operation article 150 as the coordinates of the
cursor 105 to be displayed in the next frame.
[0200] [Guide Type Registration Process] The CPU 201 assigns a note
number (refer to FIG. 19 and FIG. 20(a) through FIG. 20(c)), which
is read from the second musical score data 306 in accordance with
the musical score data pointer for guidance, to an array element
NN[0] or an array element NN[1]. The number of elements of the
array is two as described above because the guidance of a certain
position guide "G" and the guidance of another position guide "G"
are started at different timings but can overlap each other for a
certain period. Incidentally, the musical score data pointer
for guidance is a pointer pointing to the position of the second
musical score data 306 from which data is read.
[0201] [Guide Control Process] The CPU 201 registers the animation
information of the position guide "G", the animation information
of the direction guide "g" and the animation information of the path
guide "rg" with reference to the array element NN[J] (guide display
number "J"=0, 1) in accordance with the note number assigned to the
array element NN[J]. This point will be explained in detail.
[0202] FIG. 26 is a view for explaining the registration process of
the animations of the direction guide "G", the position guide "g"
and the path guide "rg". As illustrated in FIG. 26, in the ROM 51
or the internal memory 207, there is prepared a table in which the
note numbers are associated with the animation information (the
storage location information of the animation table of the position
guide "G", the display coordinate information of the position guide
"G" on the screen 91, the display timing information of the
position guide "G", the storage location information of the
animation table of the direction guide "g"/the path guide "rg", the
display coordinate information of the direction guide "g"/the path
guide "rg" on the screen 91, and the display timing information of
the direction guide "g"/the path guide "rg").
[0203] Each of the note numbers in this table is a note number
which is used to control the display of a guide and shown in FIG.
20(a) through FIG. 20(c). For example, if the note number assigned
to the array element NN[J] is "55", the CPU 201 refers to this
table and registers (stores in the predetermined area of the
internal memory 207) the animation information (the storage
location information of the animation table of the position guide
"G", the display coordinates of the position guide "G" on the
screen 91, the display timing information of the position guide
"G", the storage location information of the animation table of the
direction guide "g", the display coordinate information of the
direction guide "g" on the screen 91, and the display timing
information of the direction guide "g") associated with the note
number "55".
[0204] In this case, the display timing information is information
indicative of when an object is to be displayed on the screen 91.
For example, since the display timing information of the position
guide "G" is "0", the guide number "55" indicates that the position
guide "G2" is to be displayed at the coordinates (x1, y1) in the
next frame following the frame which is currently displayed. Also,
for example, since the display timing information of the direction
guide "g" is 0, 6, 12, . . . and 24, the guide number "55"
indicates that the direction guide "g1" is to be displayed at the
coordinates (x3, y1) in the next frame following the frame which is
currently displayed, that the direction guide "g2" is to be
displayed at the coordinates (x4, y1) 6 frames later, . . . and
that the direction guide "g5" is to be displayed at the coordinates
(x7, y1) 24 frames later.
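The timing data of this example can be sketched as a small table; the names GUIDE_55 and due_this_frame, and the symbolic coordinate labels, are illustrative stand-ins for the table entries of FIG. 26:

```python
# Hypothetical timing table for guide number "55": the position guide
# "G2" appears at frame offset 0, and the guides "g1".."g5" appear every
# 6 frames at successive x coordinates (coordinates kept symbolic).
GUIDE_55 = {
    "G": [("G2", ("x1", "y1"), 0)],
    "g": [(f"g{i + 1}", (f"x{i + 3}", "y1"), 6 * i) for i in range(5)],
}

def due_this_frame(table, frames_elapsed):
    """Return the guides whose display timing matches the frame count."""
    return [name for entries in table.values()
            for name, _, offset in entries if offset == frames_elapsed]
```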
[0205] FIG. 27 is a view for showing an example of the animation
table which is designated by the animation table storage location
information of FIG. 26. As illustrated in FIG. 27, the animation
table is a table which associates the storage location information
of the animation image data (a plurality of image object data items
arranged in a time series), the reference numbers of the objects
used to perform the animation, arranged in a time series,
information indicative of for how many frames (the number of
duration frames) an object is continuously displayed, the size of
an object, the color palette information, the depth value
information, and the size of a sprite. Incidentally, the animation image
data is pixel pattern data. In this case, the pixel pattern data,
the color palette and the depth value are related to sprites for
forming objects, and the definitions thereof are the same as
explained in conjunction with the blocks of FIG. 14.
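A row of such an animation table might be modeled as follows; the field names are illustrative stand-ins for the packed table entries of FIG. 27:

```python
from dataclasses import dataclass

@dataclass
class AnimationStep:
    """One row of an animation table in the manner of FIG. 27.

    Field names are illustrative; the patent stores these values as
    packed table entries in the ROM 51 or the internal memory 207.
    """
    image_data_address: int   # storage location of the pixel pattern data
    object_ref: int           # reference number of the object in the series
    duration_frames: int      # frames the object stays continuously displayed
    object_size: tuple        # (width, height) of the object
    palette: int              # color palette information
    depth: int                # depth value
    sprite_size: tuple        # (width, height) of each constituent sprite
```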
[0206] The animation table pointed to by the animation table
storage location information "address 0" is an example of the
animation table of the position guide "G", the animation table
pointed to by the animation table storage location information
"address 1" is an example of the animation table of the direction
guide "g", the animation table pointed to by the animation table
storage location information "address 2" is an example of the
animation table of the path guide "rg", and the animation table
pointed to by the animation table storage location information
"address 3" is an example of the animation table of the position
guide "G" which is used when the player 94 successfully manipulates
the cursor 105.
[0207] [Dance Control Process] The CPU 201 determines whether or
not the player 94 correctly manipulates the cursor 105 in
correspondence with the position guide "G" and the direction guide
"g" or the position guide "G" and the path guide "rg". More
specific description is as follows.
[0208] The CPU 201 determines whether or not the cursor 105 (i.e.,
the target point of the operation article 150) is located in the
area which is currently designated by the position guide "G" and
the direction guide "g" on the basis of the result of determination
J1[M] performed by the target point existence area determination
process (1) (refer to FIG. 24). For example, in the case where the
area a2 is currently designated by the position guide "G" and the
direction guide "g", if the cursor 105 is located in the area a2,
the CPU 201 determines that the cursor 105 is correctly manipulated
in correspondence with the position guide "G" and the direction
guide "g".
[0209] Also, the CPU 201 determines whether or not the cursor 105
(i.e., the target point of the operation article 150) is moved
along the path which is currently designated by the position guide
"G" and the path guide "rg" on the basis of the result of
determination J2[M] performed by the target point existence area
determination process (2) (refer to FIG. 25). In this case, the
path which is designated by the guide number "53" as shown in FIG.
20(b) is the path of the area A3->the area A1->the area
A2->the area A4 as shown in FIG. 25. Also, the path which is
designated by the guide number "57" as shown in FIG. 20(c) is the
path of the area A4->the area A2->the area A1->the area A3
as shown in FIG. 25. Accordingly, for example, in the case where
the path corresponding to the guide number "53" is currently
designated by the position guide "G" and the path guide "rg", if
the cursor 105 is moved along the path of the area A3->the area
A1->the area A2->the area A4, the CPU 201 determines that the
cursor 105 is correctly manipulated in correspondence with the
position guide "G" and the path guide "rg".
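The path check against the recorded area history can be sketched as follows; collapsing consecutive duplicate frames before comparing is an assumed detail of how the CPU 201 evaluates the results J2[M]:

```python
# Designated paths from FIG. 20(b)/(c) and FIG. 25, keyed by guide number:
# "53" is A3 -> A1 -> A2 -> A4, and "57" is A4 -> A2 -> A1 -> A3.
PATHS = {53: [3, 1, 2, 4], 57: [4, 2, 1, 3]}

def followed_path(area_history, guide_number):
    """Check whether the per-frame area results pass through the
    designated areas in order (consecutive duplicates collapsed,
    since the cursor stays in one area for several frames)."""
    collapsed = []
    for a in area_history:
        if not collapsed or collapsed[-1] != a:
            collapsed.append(a)
    return collapsed == PATHS[guide_number]
```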
[0210] From the above results, in the case where it is determined
that the player 94 correctly manipulates the cursor 105 in
correspondence with the position guide "G" and the direction guide
"g" or the position guide "G" and the path guide "rg", the CPU 201
registers (stores in the predetermined area of the internal memory
207) the dance animation information corresponding to the position
guide "G" and the direction guide "g" or the position guide "G" and
the path guide "rg". In the same manner as the table of FIG. 26, in
the ROM 51 or the internal memory 207, a table is prepared in order
to associate the dance animation information with the note numbers
(refer to FIG. 20(a) through FIG. 20(c)) for controlling the
display of guides. However, the note numbers indicating the same
guide direction (for example, the note numbers "55" and "67") are
associated with the same dance animation information.
[0211] The dance animation information is designed in the same
manner as the animation information of FIG. 26 and contains the
storage location information of the dance animation table, the
display coordinates of the dance object 106 on the screen 91, and
the display timing information of the dance object 106. Also, the
dance animation table is designed in the same manner as the
animation table of FIG. 27 and provided as a table in which are
associated the storage location information of dance animation
image data (a plurality of image object data items of the dance
object 106 arranged in a time series), the reference numbers of the
dance objects 106 for use in performing animation arranged in a
time series, information indicative of how many frames (the number
of duration frames) the dance object is continuously displayed, the
size of the dance object 106, the information on the color palette,
the information on the depth value and the size of the sprite. The
dance animation image data is pixel pattern data.
[0212] Meanwhile, the second musical score data 306 of FIG. 17 may
contain a note number indicating that a high speed dance animation
is to be performed and a note number indicating that a low speed
dance animation is to be performed. In this case, a high speed
dance animation table and a low speed dance animation table are
prepared respectively as the dance animation table. Also, in this
case, there is a table prepared in the ROM 51 or the internal
memory 207 in order to associate the dance animation information
with the note numbers for controlling the display of guides and the
note numbers for controlling the speed of dance. In the same
manner, a dance animation table is prepared for each dance
speed.
[0213] Also, in the case where it is determined that the player 94
correctly manipulates the cursor 105 in correspondence with the
position guide "G" and the direction guide "g" or the position
guide "G" and the path guide "rg", the CPU 201 registers the
storage location information ("address 3" in the case of the
example of FIG. 27) of the animation table of the position guide
"G" which is used when the player 94 successfully manipulates the
cursor 105.
[0214] Furthermore, in the case where it is determined that the
player 94 correctly manipulates the cursor 105 in correspondence
with the position guide "G" and the direction guide "g" or the
position guide "G" and the path guide "rg", the CPU 201 registers
(stores in the predetermined area of the internal memory 207) the
animation information for the evaluation objects 107 to 109. This
animation information is designed in the same manner as the
animation information of FIG. 26. Accordingly, this animation
information contains the storage location information of the
animation table for the evaluation objects 107 to 109. This
animation table is designed in the same manner as the animation
table of FIG. 27.
[0215] Still further, in the case where it is determined that the
player 94 correctly manipulates the cursor 105 in correspondence
with the position guide "G" and the direction guide "g" or the
position guide "G" and the path guide "rg", the CPU 201 performs
the scrolling corresponding to the position guide "G" and the
direction guide "g" or the position guide "G" and the path guide
"rg". More specifically speaking, the CPU 201 changes the center
position of the background screen 140 to scroll the background 110
(refer to FIG. 16(a) and FIG. 16(b)) in correspondence with the
position guide "G" and the direction guide "g" or the position
guide "G" and the path guide "rg", and the note number for
controlling the dance speed. Still further, in the case where the
background screen 140 is scrolled in the lateral direction, the CPU
201 changes the data in the array elements PA and the array
elements CA corresponding thereto.
[0216] Incidentally, FIG. 28 is a timing diagram for explaining the
relationship among the first musical score data 305, the second
musical score data 306, the position guide "G", the direction guide
"g", the judgment of manipulation and the dance animation.
Incidentally, in FIG. 28, each thick line indicates the execution
period for which the process continues, the filled circle at the
left end of each thick line indicates the starting point of the
process, and the filled circle at the right end of each thick line
indicates the end of the process.
[0217] As illustrated in FIG. 28, the time point of starting
reading the second musical score data 306 is set earlier than the
time point T1, T2 . . . of starting reading the first musical score
data 305 by a predetermined time "t". Accordingly, the display of
the direction guide "g" starts the predetermined time "t" before
the corresponding note number of the first musical score data 305
is read at the corresponding time point of T1 to T3, and continues
until that time point (for example, for 60 frames). Likewise, the
animation of the position guide "G" starts the predetermined time
"t" before the corresponding note number of the first musical score
data 305 is read at the corresponding time point of T1 to T3, and
continues until a short time after that time point.
[0218] The CPU 201 starts the process of determining whether or not
the cursor 105 is correctly manipulated in correspondence with the
direction guide "g" and the position guide "G" a predetermined
period (for example, 30 frames) after starting the display of the
direction guide "g", and completes the process at the corresponding
time point of T1 to T3 in which the corresponding note number of
the first musical score data 305 is read. Then, if it is determined
that the cursor 105 is correctly manipulated in correspondence with
the direction guide "g" and the position guide "G", the CPU 201
registers the dance animation information at the time point when
the determination period ends. Accordingly, in this case, dance
animation is performed on the basis of the dance animation
information which is registered.
[0219] Meanwhile, the following is the reason why the time point of
starting reading the second musical score data 306 is earlier than
the time point T1, T2 . . . of starting reading the first musical
score data 305 by a predetermined time "t". Namely, since the
player 94 starts the manipulation of the operation article 150
after the guidance by the direction guide "g" and the position
guide "G" is started, the direction guide "g" and the position
guide "G" are displayed earlier than in the timing of music for the
purpose of adjusting the time lag.
[0220] The timing of displaying the path guide "rg" is provided in
the same manner as the timing of displaying the direction guide
"g". However, for example, the determination of whether or not the
cursor 105 is correctly manipulated in correspondence with the path
guide "rg" is performed between the start and end of the guidance
by the path guide "rg" (for example, for 60 frames).
[0221] [Image Display Process] The CPU 201 provides the graphics
processor 202 of FIG. 7 with the information required for drawing
during the vertical blanking period on the basis of the information
registered by the cursor control process, the guide control process
and the dance control process. Then, the graphics processor 202
generates a video signal on the basis of the information as given,
and outputs it to the video signal output terminal 47. By this
process, the game screen including the position guide "G", the
background 110 and so forth is displayed on the screen 91 of the
television monitor 90. More specific description is as follows.
[0222] The CPU 201 calculates the display coordinates of the
respective sprites forming the cursor 105 on the basis of the
coordinate information (the coordinate information of the target
point of the operation article 150) which is registered by the
cursor control process. Then, the CPU 201 provides the graphics
processor 202 with the display coordinate information, the color
palette information, the depth value, the size information and the
pixel pattern data storage location information of the respective
sprites for forming the cursor 105. The graphics processor 202
generates the image signal of the cursor 105 on the basis of the
respective information, and outputs it to the video signal output
terminal 47.
[0223] Also, the CPU 201 acquires the size information of the
object for forming the animation image of each guide (the position
guide "G", the direction guide "g" or the path guide "rg") and the
size information of the sprite for forming the object with
reference to the animation table on the basis of the animation
table storage location information contained in the animation
information which is registered by the guide control process. Then,
the CPU 201 calculates the display coordinates of the respective
sprites for forming the object on the basis of the above respective
information and the display coordinate information contained in the
animation information as registered. Furthermore, the CPU 201
calculates the pixel pattern data storage location information of
the respective sprites for forming the object on the basis of the
reference number of the position guide "G" to be displayed next,
the size information of the object and sprite contained in the
animation table, and the animation image data storage location
information of the position guide "G" contained in the animation
table.
[0224] Still further, the CPU 201 provides the graphics processor
202 with the color palette information, the depth value and the
size information of the respective sprites for forming the position
guide "G" together with the pixel pattern data storage location
information and the display coordinate information of the
respective sprites with reference to the animation table. In this
case, the CPU 201 provides the graphics processor 202 with the
above respective information on the basis of the display timing
information of the position guide "G" contained in the animation
information as registered and the information on the number of
duration frames of the animation table.
[0225] For the direction guide "g" and the path guide "rg", the
information to be given to the graphics processor 202 by the CPU
201 has a similar content and is acquired in a similar manner as
for the position guide "G". However, the direction guides "g1" to
"g4" and the path guides "rg1" to "rg10" are sequentially displayed
at a plurality of positions designated by the display coordinate
information contained in the animation information, at the timing
designated by the display timing information contained in the
animation information. Accordingly, when starting to display each
of the direction guides "g1" to "g4" and each of the path guides
"rg1" to "rg10", the CPU 201 provides the information on that guide
to the graphics processor 202 with reference to the display
coordinate information and the display timing information contained
in the animation information as registered.
[0226] The graphics processor 202 generates the image signals of
the guides (the position guide "G", the direction guide "g", the
path guide "rg") on the basis of the above information which is
given as described above, and outputs them to the video signal
output terminal 47.
[0227] Also, the CPU 201 acquires the size information of the dance
object 106 for forming the dance animation image and the size
information of the sprite for forming the dance object 106 with
reference to the dance animation table on the basis of the dance
animation table storage location information contained in the dance
animation information which is registered by the dance control
process. Then, the CPU 201 calculates the display coordinates of
the respective sprites for forming the dance object 106 on the
basis of the above respective information and the display
coordinate information contained in the dance animation information
as registered. Furthermore, the CPU 201 calculates the pixel
pattern data storage location information of the respective sprites
for forming the dance object 106 on the basis of the reference
number of the dance object 106 to be displayed next, the size
information of the dance object 106 and the sprite contained in the
dance animation table, and the dance animation image data storage
location information contained in the dance animation table.
[0228] Still further, the CPU 201 provides the graphics processor
202 with the color palette information, the depth value and the
size information of the respective sprites for forming the dance
object 106 together with the pixel pattern data storage location
information and the display coordinate information of the
respective sprites with reference to the dance animation table. In
this case, the CPU 201 provides the above respective information to
the graphics processor 202 on the basis of the display timing
information contained in the dance animation information as
registered and the information on the number of duration frames of
the dance animation table.
[0229] Still further, the CPU 201 acquires the information required
for generating image signals on the basis of the animation
information and the animation table for the evaluation objects 107
to 109 which are registered by the dance control process, and
provides the information to the graphics processor 202.
Incidentally, in this case, the information to be given to the
graphics processor 202 by the CPU 201 has a similar content and is
acquired in a similar manner as for the dance object 106.
[0230] The graphics processor 202 generates the image signals of
the dance object 106 and the evaluation objects 107 to 109 on the
basis of the above information which is given as described above,
and outputs them to the video signal output terminal 47.
[0231] [Music Playback] The playback of music is performed by an
interrupt operation. The CPU 201 reads and interprets the music
control information of FIG. 18 while incrementing the musical score
data pointer for music. Incidentally, the musical score data
pointer for music is a pointer pointing to the position of the
first musical score data 305 from which data is read.
[0232] Then, if the command contained in the music control
information as read is "Note On", the CPU 201 provides the sound
processor 203 with the head address from which the wave data is
stored in accordance with the frequency of sound vibration (pitch)
designated by the note number contained in the music control
information and the instrument (tone quality) designated by the
instrument designation information. Furthermore, if the command
contained in the music control information as read is "Note On",
the CPU 201 provides the sound processor 203 with the head address
from which the envelope data as required is stored. Still further,
if the command contained in the music control information as read
is "Note On", the CPU 201 provides the sound processor 203 with
pitch control information corresponding to the frequency of sound
vibration (pitch) designated by the note number contained in the
music control information, and volume information contained in the
music control information.
[0233] In what follows, the pitch control information will be
explained. The pitch control information is used to perform the
pitch conversion by changing the frequency of reading the wave
data. Namely, the sound processor 203 periodically reads the pitch
control information at a certain interval and accumulates the pitch
control information. The sound processor 203 then processes this
result of accumulation to obtain the address pointer to the wave
data. Accordingly, if the pitch control information is set to a
large value, the address pointer is quickly incremented by the
large value to raise the frequency of the wave data. Conversely, if
the pitch control information is set to a small value, the address
pointer is slowly incremented by the small value to lower the
frequency of the wave data. In this way, the sound processor 203
performs the pitch conversion of wave data.
[0234] Next, the sound processor 203 reads the wave data stored in
the location pointed to by the head address as given from the ROM
51, while incrementing the address pointer on the basis of the
pitch control information as given. Then, the sound processor 203
generates an audio signal by multiplying the wave data, which is
successively read, by the envelope data and the volume data. In
this way, an audio signal having the tone quality of the musical
instrument, the frequency of sound vibration (pitch) and the sound
volume which are designated by the first musical score data 305 is
generated and output to the audio signal output terminal 49.
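The phase-accumulator playback described in paragraphs [0233] and [0234] can be sketched as follows; the function name, the list-based wave table, and the simplified envelope handling are illustrative:

```python
def synthesize(wave, envelope, volume, pitch_step, n_samples):
    """Phase-accumulator playback: on each tick the pitch control value
    is accumulated, the integer part of the accumulator indexes the wave
    table, and the sample is scaled by the envelope and volume.

    A larger pitch_step advances the read pointer faster, raising the
    pitch; a smaller one lowers it (as in paragraph [0233])."""
    out, phase = [], 0.0
    for i in range(n_samples):
        sample = wave[int(phase) % len(wave)]
        env = envelope[min(i, len(envelope) - 1)]
        out.append(sample * env * volume)
        phase += pitch_step
    return out
```

For example, doubling the pitch step makes the wave table repeat twice as often, i.e. an octave higher.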
[0235] On the other hand, the CPU 201 manages the gate times
contained in the music control information as read. Accordingly,
the CPU 201 instructs the sound processor 203 to terminate the
output of the corresponding musical tone when its gate time
elapses. In response to this, the
sound processor 203 terminates the output of the corresponding
musical tone as designated.
[0236] Music is played back as described above on the basis of the
first musical score data 305 and output through a speaker (not
shown in the figure) of the television monitor 90.
[0237] Next, the entire process flow of the music game apparatus 1
of FIG. 1 will be explained with reference to a flow chart.
[0238] FIG. 29 is a flow chart showing the entire process flow of
the music game apparatus 1 of FIG. 1. As illustrated in FIG. 29,
the CPU 201 performs the initial settings of the system in step S1.
In step S2, the CPU 201 calculates the state information of the
operation article 150. In step S3, the CPU 201 performs the game
process on the basis of the state information of the operation
article 150 as calculated in step S2. In step S4, the CPU 201
determines whether or not "M" is smaller than a predetermined value
"K". If "M" is greater than or equal to the predetermined value
"K", the CPU 201 proceeds to step S5, in which "0" is assigned to
"M", and proceeds to step S6. On the other hand, if "M" is smaller
than the predetermined value "K", the CPU 201 proceeds from step S4
to step S6. This value "M" will be explained in the following
description.
[0239] In step S6, it is determined whether or not the CPU 201 is
waiting for the video system synchronous interrupt. The CPU 201
provides the graphics processor 202 with the image information for
updating the display screen of the television monitor 90 after
starting the vertical blanking period (step S7). Accordingly, after
the process necessary for updating the display screen is completed,
the process is halted until the next video system synchronous
interrupt is issued. If "YES" is determined in step S6, i.e., while
waiting for a video system synchronous interrupt (while there is no
video system synchronous interrupt), the same step S6 is repeated.
Conversely, if "NO" is determined in step S6, i.e., if the CPU 201
exits the state of waiting for a video system synchronous interrupt
(if there is a video system synchronous interrupt), the process
proceeds to step S7. In step S7, the CPU 201 provides the graphics
processor 202 with the image information required for generating
the game screen (refer to FIG. 11 through FIG. 13) during the
vertical blanking period on the basis of the game process in step
S3.
[0240] FIG. 30 is a flow chart showing the process flow for the
initial settings of the system in step S1 of FIG. 29. As shown in
FIG. 30, the CPU 201 initializes the musical score data pointer for
guidance in step S10. In step S11, the CPU 201 sets an execution
stand-by counter for guidance to "0".
[0241] In step S12, the CPU 201 initializes the musical score data
pointer for music. In step S13, the CPU 201 sets an
execution stand-by counter for music to "t".
[0242] In step S14, the CPU 201 performs the initial settings of
the image sensor 43. In step S15, the CPU 201 initializes various
flags and various counters.
[0243] In step S16, the CPU 201 sets the timer circuit 210 as an
interrupt source for outputting sound. Incidentally, as an
interrupt handler, the sound processor 203 performs a process to
output sound from the speaker of the television monitor 90.
[0244] FIG. 31 is a flow chart showing the process flow for sensor
initial settings in step S14 of FIG. 30. As shown in FIG. 31, in
the initial step S20, the high speed processor 200 sets setting
data to a command "CONF". In this case, this command "CONF" is a
command used to inform the image sensor 43 that the high speed
processor 200 enters a configuration mode in which a command is
transmitted to the image sensor 43. Then, in the next step S21, a
command transmission process is performed.
[0245] FIG. 32 is a flow chart showing the command transmission
process in step S21 of FIG. 31. As shown in FIG. 32, the high speed
processor 200 sets register data (I/O port) to the setting data
(the command "CONF" in the case of step S21) in the first step S30,
and sets the register setting clock (I/O port) to a low level in
the next step S31. Then, after waiting for a predetermined time in
step S32, the register setting clock CLK is set to a high level in
step S33. Then, after further waiting for another predetermined
time in step S34, the register setting clock CLK is set to a low
level again in step S35.
[0246] As has been discussed above, as illustrated in FIG. 33, the
process of transmitting a command (or a command associated with
data) can be performed by changing the level of the register
setting clock CLK to a low level, then to a high level and again to
a low level while waiting for the predetermined time before each
change.
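The clock sequence of FIG. 32 can be sketched as follows; the port functions are illustrative stand-ins for the I/O-port writes performed by the high speed processor 200:

```python
def transmit_command(set_data_port, set_clock_port, wait, setting_data):
    """Bit-bang transfer in the manner of FIG. 32/33: place the data on
    the register data port, then drive the register setting clock CLK
    low -> high -> low, waiting a predetermined time before each change.
    The port/wait callables are illustrative stand-ins for I/O writes."""
    set_data_port(setting_data)   # step S30: set register data
    set_clock_port(0)             # step S31: CLK low
    wait()                        # step S32: predetermined wait
    set_clock_port(1)             # step S33: CLK high
    wait()                        # step S34: predetermined wait
    set_clock_port(0)             # step S35: CLK low again
```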
[0247] Returning to FIG. 31, the explanation continues. In step
S22, the pixel mode is set as well as the exposure time. In the
case of the present embodiment, since the image sensor 43 is for
example a CMOS image sensor of 32×32 pixels as
described above, "0h" indicative of 32×32 pixels is
loaded into a pixel mode register at a setting address "0". In the
next step S23, the high speed processor 200 performs a register
setting process.
[0248] FIG. 34 is a flow chart showing the register setting process
in step S23 of FIG. 31. As shown in FIG. 34, the high speed
processor 200 sets the setting data to the command "MOV" associated
with an address in the first step S40, and then performs the
command transmission process in the next step S41 as explained
above with reference to FIG. 32 to transmit the command. Next, the
high speed processor 200 sets the setting data to the command "LD"
associated with data in the next step S42, and then performs the
command transmission process in the next step S43 to transmit the
command. Then, the high speed processor 200 sets the setting data
to the command "SET" in step S44, and transmits the command in the
next step S45. Incidentally, the command "MOV" is a command for
transmitting the address of a control register; the command "LD" is
a command for transmitting data; and the command "SET" is a command
for actually loading the data into the address. Incidentally, the
above process is repeated if there are a plurality of control
registers to be set.
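The three-command register write of FIG. 34 can be sketched as follows; the tuple encoding of a command with its operand is an illustrative convention:

```python
def set_register(transmit, address, data):
    """Three-command sequence of FIG. 34: "MOV" carries the address of
    the control register, "LD" carries the data, and "SET" actually
    loads the data into that address."""
    transmit(("MOV", address))   # steps S40-S41
    transmit(("LD", data))       # steps S42-S43
    transmit(("SET", None))      # steps S44-S45
```

In the exposure-time setting of paragraph [0249], for instance, this sequence would run once per nibble: address "1" with data "Fh" for the low nibble of "FFh", then address "2" with data "Fh" for the high nibble.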
[0249] Returning to FIG. 31, the explanation continues. In step
S24, the setting address is set to "1" (the address of the low
nibble of an exposure time setting register), and "Fh" is loaded
into the low nibble of the exposure time setting register as the
low nibble data of "FFh" indicative of the maximum exposure time.
Then, in step S25, the register setting process of FIG. 34 is
performed. In the same manner, in step S26, the setting address is
set to "2" (the address of the high nibble of the exposure time
setting register), and "Fh" is loaded into the high nibble of the
exposure time setting register as the high nibble data of "FFh"
indicative of the maximum exposure time, and the register setting
process is performed in step S27.
[0250] Thereafter, the setting data is set to a command "RUN" in
step S28 to indicate the completion of initialization and to have
the image sensor 43 start outputting data, followed by step S29 in
which the command "RUN" is transmitted. As has been discussed
above, the sensor initialization process is performed in step S14
of FIG. 30. However, the specific examples as illustrated in FIG.
31 to FIG. 34 may be modified in accordance with the specification
of the image sensor 43 actually employed.
[0251] FIG. 35 is a flow chart showing the process of calculating
the state information in step S2 of FIG. 29. As shown in FIG. 35,
the CPU 201 acquires digital pixel data from the ADC 208 in step
S50. This digital pixel data is data obtained by converting the
analog pixel data, which is transmitted from the image sensor 43,
into digital data by the ADC 208.
[0252] In step S51, the process of extracting a target point is
performed. More specifically speaking, the CPU 201 acquires
differential data by calculating the difference between the pixel
data acquired when the infrared light emitting diodes 15 are turned
on and the pixel data acquired when the infrared light emitting
diodes 15 are turned off. Then, the CPU 201 finds the maximum value
of the differential data and compares it with the predetermined
threshold value "Th". Furthermore, if the maximum value of the
differential data is greater than the predetermined threshold value
"Th", the CPU 201 converts the coordinates of the pixel having the
differential data corresponding to the maximum value into the
coordinates on the screen 91 of the television monitor 90 and sets
the coordinates of the target point of the operation article 150 to
the coordinates as converted.
[0253] In step S52, the CPU 201 determines which of the areas a1 to
a4 in FIG. 24 includes the target point of the operation article
150, and stores the result of determination in the array element
J1[M].
[0254] In step S53, the CPU 201 determines which of the areas A1 to
A4 in FIG. 25 includes the target point of the operation article
150, and stores the result of determination in the array element
J2[M].
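The area determinations of steps S52 and S53 might be sketched as follows; since the actual layouts of the areas a1 to a4 and A1 to A4 are defined by FIG. 24 and FIG. 25, a plain quadrant split of an assumed 256 x 224 screen is used here purely for illustration.

```python
def area_index(x, y, width=256, height=224):
    # Return which of four areas contains the target point.  The quadrant
    # split and screen size are assumptions; the real area layouts are
    # those of FIG. 24 and FIG. 25.
    col = 0 if x < width // 2 else 1
    row = 0 if y < height // 2 else 1
    return 1 + row * 2 + col        # 1..4, stored in J1[M] or J2[M]
```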
[0255] FIG. 36 is a flow chart showing the process flow of
acquiring a pixel data group in step S50 of FIG. 35. As shown in
FIG. 36, the CPU 201 sets "X" to "-1" and "Y" to "0" as element
indices of a pixel data array in the first step S60. In the case of
the present embodiment, while the pixel data array is a
two-dimensional array in which X=0 to 31 and Y=0 to 31, dummy data
is output as the data of the initial pixel as described above so
that the initial value of "X" is set to "-1". In the next step S61,
the process of acquiring pixel data is performed.
[0256] FIG. 37 is a flow chart showing the process flow of
acquiring pixel data in step S61 of FIG. 36. As shown in FIG. 37,
the CPU 201 checks the frame status flag signal FSF as input from
the image sensor 43 in the initial step S70, and judges whether or
not the rising edge thereof (from a low level to a high level) is
detected in step S71. Then, if the rising edge of the frame status
flag signal FSF is detected in step S71, in the next step S72, the
CPU 201 instructs the ADC 208 to start the conversion of the analog
pixel data input thereto into digital data. Thereafter, the pixel
strobe signal PDS as input from the image sensor 43 is checked in
step S73, and it is judged whether or not the rising edge of the
pixel strobe signal PDS from a low level to a high level is
detected in step S74.
[0257] If "YES" is determined in step S74, the CPU 201 determines
in step S75 whether or not X=-1, i.e., whether or not it is the
initial pixel. As has been discussed above, since the initial pixel
of each line is set as a dummy pixel, if "YES" is determined in
this step S75, the current pixel data is not acquired, but the
element index "X" is incremented in the following step S77.
[0258] If "NO" is determined in step S75, since it is the second or
later pixel data constructing the line, the current pixel data is
acquired and saved in a temporary register (not shown in the
figure) in steps S76 and S78. Thereafter, the process proceeds to
step S62 of FIG. 36.
[0259] In step S62 of FIG. 36, the pixel data as saved in the
temporary register is assigned to a pixel data element P[Y][X].
[0260] In the following step S63, "X" is incremented if "X" is
smaller than "32", and the process from step S61 to step S63 is
repeatedly performed. If "X" is equal to "32", i.e., if the
acquisition process of pixel data reaches the end of the current
line, "X" is set to "-1" in the following step S65, "Y" is
incremented in step S66, and the acquisition process of pixel data
is repeated from the top of the next line.
[0261] If "Y" is equal to "32" in step S67, i.e., if the
acquisition process of pixel data reaches the last pixel data array
element P[Y][X], the process proceeds to step S51 of FIG. 35.
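The acquisition loop of steps S60 to S67, including the dummy pixel at the head of each line, can be sketched as follows; `read_pixel` is a hypothetical stand-in for the strobe-driven acquisition of FIG. 37.

```python
def acquire_frame(read_pixel, size=32):
    # Fill a size x size pixel array P[Y][X] from a sensor stream whose
    # first pixel on every line is a dummy, mirroring steps S60-S67:
    # "X" starts at -1 so the dummy is read but discarded, and the line
    # restarts once "X" reaches the line width.
    frame = [[0] * size for _ in range(size)]
    for y in range(size):
        for x in range(-1, size):
            value = read_pixel()
            if x == -1:             # skip the dummy pixel (steps S75/S77)
                continue
            frame[y][x] = value     # step S62: assign to P[Y][X]
    return frame
```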
[0262] FIG. 38 is a flow chart showing the process flow of
extracting a target point in step S51 of FIG. 35. As shown in FIG.
38, in step S80, the CPU 201 acquires differential data by
calculating the difference between the pixel data acquired from the
image sensor 43 when the infrared light emitting diodes 15 are
turned on and the pixel data acquired from the image sensor 43 when
the infrared light emitting diodes 15 are turned off. In step S81,
the CPU 201 assigns the differential data as calculated to the
array elements Dif[X][Y]. In this case, since the image sensor 43
of 32 pixels × 32 pixels is used in the case of the present
embodiment, X=0 to 31 and Y=0 to 31.
[0263] In step S82, the CPU 201 scans all the array elements
Dif[X][Y]. In step S83, the CPU 201 finds the maximum value of all
the array elements Dif[X][Y]. If the maximum value is greater than the
predetermined threshold value "Th", the CPU 201 proceeds to step
S85, and if the maximum value is less than or equal to the
predetermined threshold value "Th", the CPU 201 proceeds to step S4
of FIG. 29 (step S84).
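Steps S80 to S84 can be sketched as follows, assuming the pixel data are held in simple two-dimensional lists.

```python
def differential(on, off):
    # Step S80: difference between pixel data with the infrared LEDs on
    # and off; ambient light cancels out, leaving the retroreflective
    # operation article.
    return [[a - b for a, b in zip(c_on, c_off)]
            for c_on, c_off in zip(on, off)]

def find_maximum(dif, th):
    # Steps S82-S84: scan all array elements Dif[X][Y], find the maximum,
    # and report its coordinates only if it exceeds the threshold "Th"
    # (otherwise the operation article is treated as not detected).
    best, best_xy = None, None
    for x, column in enumerate(dif):
        for y, value in enumerate(column):
            if best is None or value > best:
                best, best_xy = value, (x, y)
    return best_xy if best is not None and best > th else None
```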
[0264] In step S85, the CPU 201 calculates the coordinates (Xc, Yc)
of the target point of the operation article 150 on the basis of
the coordinates corresponding to the maximum value. In step S86,
the CPU 201 increments the value of the count "M" by one
(M=M+1).
[0265] In step S87, the CPU 201 converts the coordinates (Xc, Yc)
of the target point on the image sensor 43 into the coordinates
(xc, yc) on the screen 91 of the television monitor 90. In step
S88, the CPU 201 assigns "xc" to the array element Px[M] as the
x-coordinate of the M-th target point, and "yc" to the array
element Py[M] as the y-coordinate of the M-th target point.
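The conversion of step S87 might look as follows; the linear mapping and the 256 x 224 screen resolution are assumptions, since the text does not fix either.

```python
def sensor_to_screen(Xc, Yc, sensor=32, screen_w=256, screen_h=224):
    # Step S87: map target-point coordinates on the 32 x 32 image sensor
    # to coordinates on the screen 91 by simple linear scaling (assumed).
    return Xc * screen_w // sensor, Yc * screen_h // sensor
```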
[0266] FIG. 39 is a flow chart showing the process flow of
calculating the coordinates of a target point in step S85 of FIG.
38. As shown in FIG. 39, in step S100, the CPU 201 assigns the
X-coordinate and the Y-coordinate, which are obtained in step S83
in correspondence with the maximum value, respectively to "m" and
"n". In step S101, the CPU 201 increments "m" by one (m=m+1). If
the differential data Dif[m][n] is greater than the predetermined
threshold value "Th", the CPU 201 proceeds to step S103, otherwise
proceeds to step S104 (step S102). In step S103, the CPU 201
assigns the current value of "m" to "mr". The endmost X-coordinate
of the differential data greater than the predetermined threshold
value "Th" is obtained by repeating steps S101 to S103 while
scanning the X-axis from the maximum value position in the positive
direction.
[0267] In step S104, the CPU 201 assigns to "m" the X-coordinate
which is obtained in step S83 in correspondence with the maximum
value. In step S105, the CPU 201 decrements "m" by one. If the
differential data Dif[m][n] is greater than the predetermined
threshold value "Th", the CPU 201 proceeds to step S107, otherwise
proceeds to step S108 (step S106). In step S107, the CPU 201
assigns the current value of "m" to "ml". The endmost X-coordinate
of the differential data greater than the predetermined threshold
value "Th" is obtained by repeating steps S105 to S107 while
scanning the X-axis from the maximum value position in the negative
direction.
[0268] In step S108, the CPU 201 calculates the center coordinate
between the X-coordinate "mr" and the X-coordinate "ml", and
assigns it to the X-coordinate (Xc) of the target point. In step
S109, the CPU 201 assigns "Xc" which is obtained in step S108 and
the Y-coordinate which is obtained in step S83 in correspondence
with the maximum value, respectively to "m" and "n". In step S110,
the CPU 201 increments "n" by one (n=n+1). If the differential data
Dif[m][n] is greater than the predetermined threshold value "Th",
the CPU 201 proceeds to step S112, otherwise proceeds to step S113
(step S111). In step S112, the CPU 201 assigns the current value of
"n" to "md". The endmost Y-coordinate of the differential data
greater than the predetermined threshold value "Th" is obtained by
repeating steps S110 to S112 while scanning the Y-axis from the
maximum value position in the positive direction.
[0269] In step S113, the CPU 201 assigns to "n" the Y-coordinate
which is obtained in step S83 in correspondence with the maximum
value. In step S114, the CPU 201 decrements "n" by one. If the
differential data Dif[m][n] is greater than the predetermined
threshold value "Th", the CPU 201 proceeds to step S116, otherwise
proceeds to step S117 (step S115). In step S116, the CPU 201
assigns the current value of "n" to "mu". The endmost Y-coordinate
of the differential data greater than the predetermined threshold
value "Th" is obtained by repeating steps S114 to S116 while
scanning the Y-axis from the maximum value position in the negative
direction.
[0270] In step S117, the CPU 201 calculates the center coordinate
between the Y-coordinate "md" and the Y-coordinate "mu", and
assigns it to the Y-coordinate (Yc) of the target point. As has
been described above, the coordinates (Xc, Yc) of the target point
of the operation article 150 are calculated.
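The scans of FIG. 39 can be sketched as follows; boundary handling at the edges of the sensor is simplified for illustration.

```python
def target_center(dif, mx, my, th):
    # From the maximum position (mx, my), scan each axis in both
    # directions for the endmost coordinate whose differential data still
    # exceeds "Th", then take the midpoint (steps S100-S117).
    m = mx
    while m + 1 < len(dif) and dif[m + 1][my] > th:
        m += 1
    mr = m                              # endmost X in the positive direction
    m = mx
    while m - 1 >= 0 and dif[m - 1][my] > th:
        m -= 1
    ml = m                              # endmost X in the negative direction
    xc = (mr + ml) // 2                 # step S108: center X-coordinate
    n = my
    while n + 1 < len(dif[0]) and dif[xc][n + 1] > th:
        n += 1
    md = n                              # endmost Y in the positive direction
    n = my
    while n - 1 >= 0 and dif[xc][n - 1] > th:
        n -= 1
    mu = n                              # endmost Y in the negative direction
    yc = (md + mu) // 2                 # step S117: center Y-coordinate
    return xc, yc
```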
[0271] FIG. 40 is a flow chart showing the game process flow in
step S3 of FIG. 29. As shown in FIG. 40, in step S120, the CPU 201
checks a music end flag (refer to step S196 of FIG. 43); if the
music has ended, the game process is over, and conversely, if the
music has not ended yet, the process proceeds to step S121.
[0272] In step S121, the CPU 201 registers the x-coordinate Px[M]
and the y-coordinate Py[M] of the target point of the operation
article 150 as the display coordinates of the cursor 105 on the
screen 91.
[0273] The CPU 201 repeats the process between step S122 and step
S144 twice. In this case, "j" represents a guidance display number
"J" (refer to FIG. 43).
[0274] In step S123, the CPU 201 checks a guidance start flag GF[j]
(refer to step S194 of FIG. 43). If the guidance start flag GF[j]
is turned on, the CPU 201 proceeds to step S125, and if it is
turned off the CPU 201 proceeds to step S144 (step S124). In step
S125, the CPU 201 checks a frame counter C[j]. If the frame counter
C[j] is greater than "0", the CPU 201 proceeds to step S128,
conversely if the frame counter C[j] is equal to "0", the CPU 201
proceeds to step S127 (step S126). In step S127, in accordance with
the note number NN[j], the CPU 201 registers the animation
information of the direction guide "g" or the path guide "rg"
together with the animation information of the position guide "G".
The animation information is registered only when the frame counter
C[j] is "0" because once the animation information is registered
the animation is performed in accordance with the registration
information so that the registration is needed only when starting
the animation.
[0275] In step S128, the CPU 201 checks the note number NN[j], and
if it is the note number designating the turn of the cursor 105
(refer to FIG. 20(b) and FIG. 20(c)) the process proceeds to step
S131 otherwise (refer to FIG. 20(a)) proceeds to step S129. In step
S129, the CPU 201 checks the frame counter C[j]. If the frame
counter C[j] is greater than or equal to the predetermined number
"f1" of frames, the CPU 201 proceeds to step S131 otherwise
proceeds to step S141 (step S130). In step S131, the CPU 201
determines whether or not the cursor 105 is correctly manipulated
in correspondence with the guides (the position guide "G"/the
direction guide "g"/the path guide "rg") (success
determination).
[0276] Incidentally, as apparent from step S128 and step S131, in
the case where the note number NN[j] designates the turn of the
cursor 105, the success determination of the manipulation of the
cursor 105 is performed just after starting displaying the position
guide "G" and the path guide "rg" (the frame counter C[j] is "0")
irrespective of the value of the frame counter C[j]. On the other
hand, in the case where the note number NN[j] is a note number
other than the note number designating the turn of the cursor 105,
the success determination of the manipulation of the cursor 105 is
performed a predetermined number "f1" of frames (for example, 30
frames) after starting displaying the position guide "G" and the
direction guide "g" (the frame counter C[j] is "0") (refer to FIG.
28).
[0277] By the way, as a result of the determination in step S131,
if the manipulation of the cursor 105 succeeded, the CPU 201 proceeds
to step S133 otherwise proceeds to step S140 (step S132). In step
S133, the CPU 201 registers the dance animation information with
reference to the note number NN[j] and a dance speed flag DF (refer
to step S193, step S190 and step S192 of FIG. 43). Also, in the
case where the manipulation is successful, the CPU 201 changes the
center position of the background screen 140 and modifies the
corresponding data of the array elements PA and the array elements
CA with reference to the note number NN[j] and a dance speed flag
DF in order to scroll the background 110. Furthermore, the CPU 201
registers the storage location information of the animation table
of the position guide "G" which is used when the manipulation
succeeded.
[0278] In step S134, the CPU 201 checks the note numbers NN[j], and
if it is the note number designating the turn of the cursor 105,
the process proceeds to step S137 otherwise proceeds to step S135.
In step S135, the CPU 201 checks the frame counter C[j]. If the
frame counter C[j] is greater than or equal to a predetermined
number "f2" of frames, the CPU 201 proceeds to step S137 otherwise
proceeds to step S138 (step S136). In step S137, the CPU 201 adds
"3" to a score "S". On the other hand, in step S138, "1" is added
to the score "S".
[0279] Incidentally, the following is the reason why "3" is added
to the score "S" in step S137 while "1" is added to the score "S"
in step S138. In the case where the cursor 105 is located in the
area of the position guide "G" within a predetermined period (for
example, 10 frames) from the time point that is the predetermined
number "f2" of frames (for example, 50 frames) after the start of
displaying the position guide "G" and the direction guide "g" (the
frame counter C[j] is "0"), it is determined that the cursor 105 is
manipulated at the best timing, so that "3" is added. On the other
hand, in the case where the cursor 105 is located in the area of
the position guide "G" after the predetermined number "f1" of
frames and before the predetermined number "f2" of frames, it is
determined that the cursor 105 is manipulated in an ordinarily
successful manner, so that "1" is added. Also, when the manipulation is
performed in correspondence with the position guide "G" and the
path guide "rg" (the guide designating the turn of the cursor 105),
"3" is added equally to the score "S".
[0280] Next, in step S139, the CPU 201 checks the frame counter
C[j]. If the frame counter C[j] is equal to a predetermined number
"f3" (for example, 60 frames), the CPU 201 proceeds to step S142
otherwise proceeds to step S141 (step S140). In step S141, the CPU
201 increments the frame counter C[j] by one. On the other hand, in
step S142, the CPU 201 sets the frame counter C[j] to "0". In step
S143, the CPU 201 turns off the guidance start flag GF[j].
Incidentally, the predetermined number "f3" is used to define the
end of success determination.
[0281] FIG. 41 is a flow chart showing the interrupt process flow.
As shown in FIG. 41, in step S150, the CPU 201 performs the
playback of music. In step S151, the CPU 201 performs the process
of registering the guides (the position guide "G", the direction
guide "g" and the path guide "rg").
[0282] FIG. 42 is a flow chart showing the process flow of the
playback of music in step S150 of FIG. 41. As shown in FIG. 42, in
step S160, the CPU 201 checks the execution stand-by counter for
music. If the value of the execution stand-by counter for music is
"0", the process proceeds to step S162, conversely if it is not
"0", the process proceeds to step S170 (step S161). In step S170,
the CPU 201 decrements the execution stand-by counter for
music.
[0283] On the other hand, in step S162, the CPU 201 reads and
interprets the command pointed to by the musical score data pointer
for music. If the command is "Note On", the process proceeds to
step S164 (step S163). On the other hand, if the command is not
"Note On", i.e., "Waiting", the process proceeds to step S165. In
step S165, the CPU 201 sets a waiting time to the execution
stand-by counter for music.
[0284] On the other hand, in step S164, the CPU 201 instructs the
sound processor 203 to start outputting a sound corresponding to
the note number which is read. In step S166, the CPU 201 increments
the musical score data pointer for music.
[0285] In step S167, the CPU 201 checks the remaining sound
outputting time corresponding to the note number associated with
the outputting sound. If the remaining sound outputting time is
"0", the process proceeds to step S169 otherwise proceeds to step
S151 of FIG. 41 (step S168). In step S169, the CPU 201 instructs
the sound processor 203 to perform the sound termination process of
the note number having the remaining sound outputting time of
"0".
[0286] FIG. 43 is a flow chart showing the process flow of
registering guides in step S151 of FIG. 41. As shown in FIG. 43, in
step S180, the CPU 201 checks the execution stand-by counter for
guide. If the value of the execution stand-by counter for guide is
"0", the process proceeds to step S182, conversely if it is not
"0", the process proceeds to step S198 (step S181). In step S198,
the CPU 201 decrements the execution stand-by counter for
guide.
[0287] On the other hand, in step S182, the CPU 201 reads and
interprets the command pointed to by the musical score data pointer
for guide. If the command is "Note On", the CPU 201 proceeds to
step S184 (step S183). On the other hand, if the command is not
"Note On", i.e., "Waiting", the CPU 201 proceeds to step S197. In
step S197, the CPU 201 sets the execution stand-by counter for
guide to a waiting time.
[0288] If the note number designates the end of music, the CPU 201
proceeds to step S196 otherwise proceeds to step S185 (step S184).
In step S196, the CPU 201 turns on the music end flag.
[0289] On the other hand, if the note number designates the start
of music, the CPU 201 proceeds to step S195 otherwise proceeds to
step S186 (step S185). If the guidance display number "J" is "1",
the CPU 201 sets the guidance display number "J" to "0" in step
S188, conversely if the guidance display number "J" is not "1"
(i.e., it is "0"), the CPU 201 sets the guidance display number "J"
to "1" in step S187. Since the guidance of a certain position guide
"G" and the guidance of another position guide "G" are started in
different timings but can be overlappingly continued in a certain
period, the guidance display number "J" is used to perform the game
process in step S3 of FIG. 29.
[0290] By the way, if the note number designates that a high speed
dance animation is to be performed, the CPU 201 proceeds to step
S190 otherwise proceeds to step S191 (step S189). In step S190, the
CPU 201 sets the dance speed flag DF to "1" (a high speed dance
animation).
[0291] On the other hand, if the note number designates that a low
speed dance animation is to be performed, the CPU 201 proceeds to
step S192 otherwise proceeds to step S193 (step S191). In step
S192, the CPU 201 sets the dance speed flag DF to "0" (a low speed
dance animation).
[0292] By the way, in the case where the note number is none of the
note number designating the end of music, the note number
designating the start of music, the note number designating a high
speed dance animation and the note number designating a low speed
dance animation, such a note number shall be a note number which
designates a type of a guide (FIG. 20(a) through FIG. 20(c)) and
thereby the CPU 201 assigns the note number to the array element
NN[J] in step S193. In step S194, the CPU 201 turns on the guidance
start flag GF[J].
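The dispatch of FIG. 43 on the note number can be sketched as follows; the symbolic values "END", "START", "FAST" and "SLOW" stand in for the particular note numbers reserved for each role, and the display-number toggle of steps S186 to S188 is shown only on the guide-type branch for brevity.

```python
def register_guide(note, state):
    # Dispatch on a note number read from the musical score data for the
    # guides, as in steps S184-S194 of FIG. 43.
    if note == "END":                        # step S196: music end flag on
        state["music_end"] = True
    elif note == "START":                    # steps S185/S195: no guide
        pass
    elif note == "FAST":                     # step S190: high speed dance
        state["DF"] = 1
    elif note == "SLOW":                     # step S192: low speed dance
        state["DF"] = 0
    else:                                    # a guide-type note number
        state["J"] = 0 if state["J"] == 1 else 1   # toggle display number
        state["NN"][state["J"]] = note       # step S193: record NN[J]
        state["GF"][state["J"]] = True       # step S194: start flag on
    return state
```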
[0293] In step S195, the CPU 201 increments the musical score data
pointer for guide.
[0294] Next is the explanation of another example of the direction
guide "g". FIG. 44 is a view for showing an example of the game
screen in which another example of the direction guide "g" is
applied. As shown in FIG. 44, a direction guide "g20" is displayed
in the form of a belt which is extending from the position guide
"G1" toward the position guide "G2". This direction guide "g20"
grows as time passes from the position guide "G1" to the position
guide "G2". The direction in which the cursor 105 is to be
manipulated is guided in terms of the direction in which this
direction guide "g20" grows. Also, if a predetermined time period
after the direction guide "g20" reaches the position guide "G2" is
used as the period for performing the success determination of
manipulating the cursor 105, it is possible to guide the
manipulation timing of the cursor 105 by the direction guide "g20".
Incidentally, it can be said that the direction guide "g20" is
represented by gradually changing the color of the path from the
position guide "G1" to the position guide "G2".
[0295] FIG. 45 is a view for showing an example of the game screen
in which a further example of the direction guide "g" is applied.
As shown in FIG. 45, a direction guide "g30" is displayed on the
game screen between one position guide "G" and another position
guide "G". This direction guide "g30" consists of five partial
paths "g31" to "g35". Then, in the case of the example of FIG. 45,
the five partial paths "g31" to "g35" change color sequentially
from the position guide "G1" toward the position guide "G2". The
change in color is illustrated by hatching. By this method, the
manipulation direction of the cursor 105 can be guided in terms of
the direction in which the partial paths "g31" to "g35" change
color in sequence. Also, in the case of this example, if a
predetermined time period after changing the color of the partial
path "g35" adjacent to the position guide "G2" guiding the
destination position of the cursor 105 is used as the period for
performing the success determination of manipulating the cursor
105, it is possible to guide the manipulation timing of the cursor
105 by the direction guide "g30".
[0296] FIG. 46 is a view for showing an example of the game screen
in which a still further example of the direction guide "g" is
applied. As shown in FIG. 46, a direction guide "g40" is displayed
on the game screen between the position guide "G1" and the position
guide "G2". This direction guide "g40" moves from the position
guide "G1" to the position guide "G2" as time passes. The direction
in which the cursor 105 is to be moved is guided in terms of the
direction in which this direction guide "g40" moves. Also, if a
predetermined time period after the direction guide "g40" reaches
the position guide "G2" is used as the period for performing the
success determination of manipulating the cursor 105, it is
possible to guide the manipulation timing of the cursor 105 by the
direction guide "g40".
[0297] As has been discussed above, in the case of the present
embodiment, if the cursor 105 is correctly manipulated in
correspondence with the guides (the position guide "G", the
direction guide "g" and the path guide "rg"), the display of images
(the dance object 106 and the background 110 in the above example)
is controlled in accordance with the guidance by the guides. In
this case, since the cursor 105 is correctly manipulated in
correspondence with the guidance by the guides, the display of
images is controlled in accordance with the manipulation of the
cursor 105. In other words, since the cursor 105 moves in
association with the operation article 150, the display of the
images is controlled in accordance with the manipulation of the
operation article 150. Also, the image of the operation article
150, which is intermittently lighted by the stroboscope, is
captured by the imaging unit 13 in order to obtain the state
information of the operation article 150. Because of this, no
circuit which is driven by a power supply need be provided within
the operation article 150 for obtaining the state information of
the operation article 150. Furthermore, this music game apparatus 1
serves to automatically play music.
[0298] As a result, while the music is automatically played
independently of the player 94, the player 94 can enjoy, together
with the music, images which are displayed in synchronization with
the manipulation of the operation article 150, by manipulating the
operation article 150 having a simple structure.
[0299] Also, since the guides are controlled in the timing on the
basis of music, the operation article 150 is manipulated in
synchronization with music as long as the player 94 manipulates the
cursor 105 in correspondence with the guides. Accordingly, the
player 94 can enjoy the manipulation of the operation article 150
in synchronization with music.
[0300] In this case, for example, in correspondence with the note
numbers "55" and "67", the high speed processor 200 scrolls the
background 110 to the left, and prepares the dance animation
information and the dance animation table for turning the dance
object 106 in the counter clockwise direction. Also, for example,
in correspondence with the note numbers "45" and "64", the high
speed processor 200 scrolls the background 110 to the right, and
prepares the dance animation information and the dance animation
table for turning the dance object 106 in the clockwise direction.
Furthermore, for example, in correspondence with the note numbers
"76" and "77", the high speed processor 200 scrolls the background
110 in the downward direction, and prepares the dance animation
information and the dance animation table for turning the dance
object 106 in the counter clockwise direction. Still further, for
example, in correspondence with the note numbers "65" and "74", the
high speed processor 200 scrolls the background 110 in the upward
direction, and prepares the dance animation information and the
dance animation table for turning the dance object 106 in the
clockwise direction. Still further, for example, the dance
animation information and the dance animation table for widely
turning the dance object 106 in the clockwise direction are
prepared in correspondence with the note number "53". Still
further, for example, the dance animation information and the dance
animation table for widely turning the dance object 106 in the
counter clockwise direction are prepared in correspondence with the note
number "57".
[0301] Incidentally, the above types of the note numbers (refer to
FIG. 20(a) through FIG. 20(c)) are note numbers for controlling the
display of the guides, and thereby the background 110 and the dance
object 106 are controlled in accordance with the guides.
Furthermore, in other words, the background 110 and the dance
object 106 are controlled in accordance with the manipulation of
the operation article 150.
[0302] Also, in the case of the present embodiment, the position
guide "G" serves to guide the manipulation timing and the
destination position of the cursor 105. In addition, when the
cursor 105 is correctly manipulated by the operation article 150 in
correspondence with the guidance of the position guide "G", the
high speed processor 200 controls the display of images (the dance
object 106 and the background 110 in the case of the above example)
in correspondence with the direction toward the destination
position which is guided by the position guide "G".
[0303] Accordingly, when the player 94 manipulates the operation
article 150 in order to move the cursor 105 to the destination
position guided by the position guide "G" in the manipulation
timing guided by the position guide "G", the display of images is
controlled in correspondence with the direction toward the
destination position guided by the position guide "G". As a result,
it is possible to enjoy, together with music, the images which are
synchronized with the cursor 105 which is moving in association
with the motion of the operation article 150 (refer to FIG.
12).
[0304] Furthermore, in the case of the present embodiment, the path
guide "rg" serves to guide the moving path, the moving direction
and the manipulation timing of the cursor 105. Accordingly, when
the player 94 manipulates the operation article 150 in order to
move the cursor 105 in the manipulation timing guided by the path
guide "rg", in the moving direction guided by the path guide "rg"
and along the moving path guided by the path guide "rg", the
display of images (the dance object in the case of the above
example) is controlled in correspondence with the path guide "rg".
As a result, it is possible to enjoy, together with music, the
images which are synchronized with the cursor 105 which is moving
in association with the motion of the operation article 150 (refer
to FIG. 13).
[0305] Furthermore, in the case of the present embodiment, if the
position of the target point of the operation article 150 is found
in the area guided by the position guide "G" within the period
guided by the position guide "G", it is determined that the cursor
105 which is moving in association with the operation article 150
is correctly manipulated in correspondence with the guidance of the
position guide "G" (refer to FIG. 24). Also, if the position of the
target point of the operation article 150 is moved through a
plurality of predetermined areas guided by the path guide "rg" in
the predetermined order guided by the path guide "rg" within the
period guided by the path guide "rg", it is determined that the
cursor 105 which is moving in association with the operation
article 150 is correctly manipulated in correspondence with the
guidance of the path guide "rg" (refer to FIG. 25). As has been
discussed above, it is possible to determine the correctness of the
manipulation of the cursor 105 on the basis of the position of the
target point of the operation article 150 which can be calculated
by a simple process.
[0306] Furthermore, in the case of the present embodiment, the
position guide "G" is displayed in each of a plurality of positions
which are determined in advance on the screen 91. Then, the high
speed processor 200 changes the appearance of the position guide
"G" in the timing on the basis of music (the animation of a flower
blooming in the case of the example of FIG. 12). Accordingly, the
player 94 can easily recognize the position and the direction to
which the cursor 105 is to be moved with reference to the change of
the position guide "G" in appearance.
[0307] Furthermore, in the case of the present embodiment, the
direction guide "g" and the path guide "rg" are expressed in images
with which it is possible to visually recognize the motion from the
first predetermined position to the second predetermined position
on the screen 91. As has been discussed above, the manipulation of
the cursor 105 is guided not only by the position guide "G" but
also by the direction guide "g" and the path guide "rg".
Accordingly, the player 94 can clearly recognize the direction and
path in which the cursor 105 is to be moved. More specific description is as
follows.
[0308] The direction guide "g" and the path guide "rg" are
expressed by the change in appearance of a plurality of objects (in
the form of spheres in the case of the examples of FIG. 12 and FIG.
13) which are arranged in the path having a start point at the
first predetermined position and an end point at the second
predetermined position on the screen 91. In this case, the player
94 can easily recognize the direction and the path to which the
cursor 105 is to be moved with reference to the change in
appearance of the plurality of the objects.
[0309] Also, the direction guide "g" is expressed by the motion of
an object (in the form of a bird in the case of the example of FIG.
46) from the first predetermined position to the second
predetermined position on the screen 91. In this case, the player
94 can easily recognize the direction and the path to which the
cursor 105 is to be moved with reference to the motion of the
object.
[0310] In addition to this, the direction guide "g" is expressed by
the change in appearance of the path having a start point at the
first predetermined position and an end point at the second
predetermined position on the screen 91 (refer to FIG. 44 and FIG.
45). In this case, the player 94 can easily recognize the direction
and the path to which the cursor 105 is to be moved with reference
to the change in appearance of the path.
[0311] Furthermore, in the case of the present embodiment, the high
speed processor 200 can calculate, as the state information of
the operation article 150, any one of or any combination of two or
more of speed information, moving direction information, moving
distance information, velocity vector information, acceleration
information, movement locus information, area information, and
positional information. As has been discussed above, since a
variety of information can be used as the state information of the
operation article 150 for determining whether or not the cursor 105
is correctly manipulated in correspondence with the guides (the
position guide "G", the direction guide "g" and the path guide
"rg"), the possibility of expression of guides is greatly expanded,
and thereby the design freedom of the game content is also greatly
increased.
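As an illustration of how several of these kinds of state information can be derived from successive target-point positions by simple arithmetic, the following Python sketch computes a few of them. The function name and the sampling model (one position per frame at a fixed interval) are assumptions for illustration, not part of the embodiment.

```python
import math

def state_information(positions, dt):
    """Derive several kinds of state information from successive
    target-point positions of the operation article.

    positions -- list of (x, y) screen coordinates, one per frame
    dt        -- frame interval in seconds
    """
    if len(positions) < 2:
        return None
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt  # velocity vector
    distance = sum(
        math.hypot(bx - ax, by - ay)
        for (ax, ay), (bx, by) in zip(positions, positions[1:])
    )  # total moving distance along the locus
    return {
        "velocity": (vx, vy),
        "speed": math.hypot(vx, vy),
        "direction": math.atan2(vy, vx),  # moving direction, radians
        "distance": distance,
        "locus": list(positions),          # movement locus
        "position": positions[-1],         # positional information
    }
```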
[0312] Furthermore, in the case of the present embodiment, the
state information of the operation article 150 can be obtained by
intermittently emitting infrared light to the operation article 150
to which the reflection sheet 155 is attached and capturing the
image thereof. Because of this, no circuit which is driven by a
power supply need be provided within the operation article 150 for
obtaining the state information of the operation article 150.
Accordingly, it is possible to improve the manipulability and
reliability of the operation article 150, and to reduce the
cost.
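The principle can be illustrated by a frame-differencing sketch in Python: a frame captured with the stroboscope lit is compared against one captured with it off, so that ambient light cancels and only the retroreflective sheet 155 stands out. The function and the plain-list image representation are hypothetical illustrations, not the actual circuitry.

```python
def extract_reflector(lit_frame, unlit_frame, threshold):
    """Isolate the reflection sheet by frame differencing.

    The stroboscope is lit for one captured frame and off for the
    next; subtracting the two frames suppresses ambient light so
    that only the retroreflective sheet remains bright.

    lit_frame, unlit_frame -- 2-D lists of luminance values
    threshold              -- minimum difference counted as reflector

    Returns the list of (row, col) pixels attributed to the sheet.
    """
    pixels = []
    for r, (lit_row, unlit_row) in enumerate(zip(lit_frame, unlit_frame)):
        for c, (lit, unlit) in enumerate(zip(lit_row, unlit_row)):
            if lit - unlit > threshold:
                pixels.append((r, c))
    return pixels
```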
[0313] Meanwhile, the present invention is not limited to the
embodiments as described above, but can be applied in a variety of
aspects without departing from the spirit thereof, and for example
the following modifications may be effected.
[0314] (1) In the above embodiments, the dance object 106 and the
background 110 are explained as images (follow-up images) which are
controlled to follow the motion of the operation article 150.
However, the present invention is not limited thereto, but any
arbitrary object can be selected as such a follow-up image. Also,
instead of scrolling the background 110 in order to express the
motion of the dance object 106, it is possible to move the dance
object 106 itself in the up, down, right and left directions.
[0315] (2) While the direct manipulation of the cursor 105 is
guided by both the position guide "G" and the direction guide "g"
in the above embodiments, it is possible to perform the guidance
only by one of them. In the case where only the direction guide "g"
is used to instruct the manipulation of the cursor 105, it is
preferred that the position guide "G" in the form of a still image
is arranged at each of the start point and the end point of the
direction guide "g". Also, while both the position guide "G" and
the path guide "rg" are used to guide the turning manipulation of
the cursor 105, it is possible to perform the guidance only by the path guide
"rg". Furthermore, while the guides (the position guide "G", the
direction guide "g" and the path guide "rg") are expressed by
animation in the above description, the present invention is not
limited thereto. Still further, the implementation of a guide is
not limited to those as described above.
[0316] (3) While the operation article 150 comprising the stick 152
and the reflection ball 151 is employed as an operation article in
the above embodiments, the configuration of the operation article
is not limited to those as described above as long as a reflecting
object is provided.
[0317] (4) While the coordinates of the operation article 150 are
calculated as illustrated in FIG. 23(a) and FIG. 23(d) in the above
embodiments, it is possible to convert the coordinates of a pixel
having the maximum luminance value greater than the predetermined
threshold value "Th" (refer to step S83 of FIG. 38) into
coordinates on the screen 91 and to make use of them as the
coordinates of a target point.
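Modification (4) can be sketched as follows: the maximum-luminance pixel of the differential image is located and, if it exceeds the threshold "Th", its image coordinates are scaled linearly to screen coordinates. The resolutions and the linear conversion are hypothetical assumptions for illustration.

```python
def target_point(diff_image, th, img_w, img_h, scr_w, scr_h):
    """Find the pixel with the maximum luminance in the differential
    image and, if it exceeds the threshold, convert its image
    coordinates to screen coordinates for use as the target point.

    Returns (sx, sy) on the screen, or None when no pixel exceeds th.
    """
    best, best_rc = th, None
    for r, row in enumerate(diff_image):
        for c, value in enumerate(row):
            if value > best:
                best, best_rc = value, (r, c)
    if best_rc is None:
        return None
    r, c = best_rc
    # linear conversion from image resolution to screen resolution
    return (c * scr_w // img_w, r * scr_h // img_h)
```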
[0318] (5) While an arbitrary type of a processor can be used as
the high speed processor 200 of FIG. 6, it is preferred to make use
of the high speed processor of which the present applicant has
filed a patent application. This high speed processor is disclosed,
for example, in Japanese Patent Published Application No. Hei
10-307790 and U.S. Pat. No. 6,070,205 corresponding thereto.
[0319] While the present invention has been described in terms of
embodiments, it is apparent to those skilled in the art that the
invention is not limited to the embodiments as described in the
present specification. The present invention can be practiced with
modification and alteration within the spirit and scope which are
defined by the appended claims. Accordingly, the description of
this application is to be regarded as illustrative rather than
limiting of the present invention in any way.
* * * * *