U.S. patent application number 14/210347 was filed with the patent office on 2014-03-13 and published on 2014-07-10 for an information transmission system, information sending device, information receiving device, information transmission method, information sending method, information receiving method and program product.
This patent application is currently assigned to Casio Computer Co., Ltd. The applicant listed for this patent is Casio Computer Co., Ltd. The invention is credited to Nobuo IIZUKA, Keiichi KANEKO, and Masaaki KIKUCHI.
Application Number | 14/210347
Publication Number | 20140193162
Family ID | 46210145
Filed Date | 2014-03-13

United States Patent Application 20140193162
Kind Code | A1
IIZUKA; Nobuo; et al.
Publication Date | July 10, 2014
INFORMATION TRANSMISSION SYSTEM, INFORMATION SENDING DEVICE,
INFORMATION RECEIVING DEVICE, INFORMATION TRANSMISSION METHOD,
INFORMATION SENDING METHOD, INFORMATION RECEIVING METHOD AND
PROGRAM PRODUCT
Abstract
An information transmission system includes: an information
sending device including a light emitting section that emits light
in a plurality of colors, a modulating section that modulates
information to be transmitted into signals composed of changes in
color, and a light emission control section that controls the light
emitting section to emit light while changing color temporally
based on the signals generated by the modulating section; and a
receiving device including a camera that captures an image having
color, and a control and communication section that detects a
temporal color change of the light emitting section emitting light
under light emission control by the information sending device,
from images consecutively captured by the camera, decodes the
detected color change into information, and outputs the generated
information to a display section.
Inventors: | IIZUKA; Nobuo; (Tokyo, JP); KANEKO; Keiichi; (Kawasaki-shi, JP); KIKUCHI; Masaaki; (Tokyo, JP)
Applicant: | Casio Computer Co., Ltd. (Tokyo, JP)
Assignee: | Casio Computer Co., Ltd. (Tokyo, JP)
Family ID: | 46210145
Appl. No.: | 14/210347
Filed: | March 13, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
13528403 | Jun 20, 2012 |
14210347 | |
Current U.S. Class: | 398/172; 398/187; 398/202
Current CPC Class: | H04B 10/1129 20130101; H04B 10/548 20130101; H04B 10/516 20130101; H04B 10/11 20130101; H04B 10/69 20130101; H04B 10/116 20130101
Class at Publication: | 398/172; 398/202; 398/187
International Class: | H04B 10/116 20060101 H04B010/116; H04B 10/548 20060101 H04B010/548; H04B 10/69 20060101 H04B010/69
Foreign Application Data

Date | Code | Application Number
Jun 23, 2011 | JP | 2011-139213
Jun 23, 2011 | JP | 2011-139233
Claims
1. An information transmission system comprising: an information
sending device including a light emitting section which emits light
in a plurality of colors, a modulating section which modulates
information to be transmitted into signals composed of changes in
color, and a light emission control section which controls the
light emitting section to emit light while changing color
temporally based on the signals generated by the modulating
section; and an information receiving device including an imaging
section which captures an image having color, a converting section
which converts color change of images consecutively captured by the
imaging section to information on color space including at least
saturation and brightness as parameters, a generating section which
generates an image expressed by the parameters of saturation and
brightness on color space generated by the converting section from
the images consecutively captured by the imaging section, a
detecting section which detects the temporal color change by
identifying an image of the light emitting section whose light
emission control has been performed by the information sending
device, based on the image generated by the generating section, a
decoding section which decodes the color change detected by the
detecting section into information, and an information output
section which outputs the information generated by the decoding
section.
2. The information transmission system according to claim 1,
wherein: the information on color space generated by the converting
section includes a hue parameter; and the decoding section decodes
the color change into the information by judging the color change
detected by the detecting section as change in hue parameter in the
color space, and converting the color change to a corresponding signal sequence.
3. The information transmission system according to claim 1,
wherein: the information sending device and the information
receiving device each further include a table in which signals
included in information to be transmitted and received are
associated with information on color changes in a plurality of
stages; the modulating section modulates the information to be
transmitted into the signals composed of changes in color by
referencing the table; and the decoding section decodes the color
change detected by the detecting section into the information by
referencing the table.
4. The information transmission system according to claim 3,
wherein the table defines types of acceptable color changes based
on a redundancy level.
5. The information transmission system according to claim 1,
wherein the information sending device includes a display device
which displays a predetermined image by emitting light in a
plurality of colors in pixel units.
6. The information transmission system according to claim 5,
wherein the light emission control section controls light emission
while changing color temporally using a pixel area of the display
device as the light emitting section.
7. The information transmission system according to claim 6,
wherein the pixel area of the display device whose light emission
control has been performed by the light emission control section
has a frame display area set around a periphery thereof which is
used to differentiate from an image displayed outside of the pixel
area.
8. An information receiving device comprising: an imaging section
which captures an image having color; a converting section which
converts color change of images consecutively captured by the
imaging section to information on color space including at least
saturation and brightness as parameters; a generating section which
generates an image expressed by the parameters of saturation and
brightness on color space generated by the converting section from
the images consecutively captured by the imaging section; a
detecting section which detects a pixel area whose color changes
temporally by identifying the pixel area whose color changes, based
on the image generated by the generating section; a decoding
section which decodes the color change detected by the detecting
section into information; and an information output section which
outputs the information generated by the decoding section.
9. The information receiving device according to claim 8, wherein:
the information on color space generated by the converting section
includes a hue parameter; and the decoding section decodes the
color change into the information by judging the color change
detected by the detecting section as change in hue parameter in the
color space, and converting the color change to a corresponding signal sequence.
10. The information receiving device according to claim 8, further
comprising: a table in which signals included in information to be
decoded are associated with information on color changes in a
plurality of stages; wherein the decoding section decodes the color
change detected by the detecting section into the information by
referencing the table.
11. An information transmission method comprising: modulating
information to be transmitted into signals composed of changes in
color; performing light emission control by controlling a light
emitting section which emits light in a plurality of colors so that
the light emitting section emits light while changing color
temporally based on the signals generated in the modulating;
converting color change of images consecutively captured by an
imaging section which captures an image having color to information
on color space including at least saturation and brightness as
parameters; generating an image expressed by the parameters of
saturation and brightness on color space generated by the
converting from the images consecutively captured by the imaging
section; detecting a temporal color change of the light emitting
section emitting light by performing the light emission control,
based on the image generated by the generating; decoding the color
change detected in the detecting into information; and outputting
the information generated in the decoding.
12. An information receiving method comprising: converting color
change of images consecutively captured by an imaging section which
captures an image having color to information on color space
including at least saturation and brightness as parameters;
generating an image expressed by the parameters of saturation and
brightness on color space generated in the converting from the
images consecutively captured by the imaging section; detecting a
pixel area whose color changes temporally by identifying the pixel
area whose color changes, based on the image generated in the
generating; decoding color change detected in the detecting into
information; and outputting the information generated in the
decoding.
13. A non-transitory computer-readable storage medium having a
program stored thereon that is executable by a computer in an image
display device having an imaging section which captures an image
having color, the program being executable by the computer to
control the computer to perform functions comprising: conversion
processing for converting color change of images consecutively
captured by the imaging section to information on color space
including at least saturation and brightness as parameters;
generation processing for generating an image expressed by the
parameters of saturation and brightness on color space generated by
the conversion processing from the images consecutively captured by
the imaging section; detection processing for detecting a pixel
area whose color changes temporally by identifying the pixel area
whose color changes, based on the image generated by the generation
processing; decode processing for decoding color change detected by
the detection processing into information; and information output
processing for outputting the information generated by the decode
processing.
14. An information sending device comprising: a table in which
signals included in information to be transmitted are associated
with information on color changes in a plurality of stages based on
redundancy; a modulating section which modulates the information to
be transmitted into signals composed of changes in color by
referencing the table; and a light emission control section which
controls a light emitting section which emits light in a plurality
of colors to emit light while changing color temporally based on
the signals generated by the modulating section.
15. The information sending device according to claim 14, wherein a
configuration of the color changes is determined based on color
separation characteristics or color shade adjustment
characteristics of a light receiving device for receiving the color
changes during light reception.
16. The information sending device according to claim 15, wherein
the light receiving device includes an image sensor.
17. The information sending device according to claim 14, wherein
the light emitting section is a pixel area of an image display
section which emits light in a plurality of colors in pixel
units.
18. The information sending device according to claim 17, wherein
the pixel area has a frame display area set around a periphery
thereof which is used to differentiate from an image displayed
outside of the pixel area.
19. An information sending method comprising: modulating
information to be transmitted into signals composed of changes in
color by referencing a table in which signals included in the
information to be transmitted are associated with information on
color changes in a plurality of stages based on redundancy; and
controlling a light emitting section which emits light in a
plurality of colors to emit light while changing color temporally
based on the signals generated in the modulating.
20. A non-transitory computer-readable storage medium having a
program stored thereon that is executable by a computer to control
the computer to perform functions comprising: modulation processing
for modulating information to be transmitted into signals composed
of changes in color by referencing a table in which signals
included in the information to be transmitted are associated with
information on color changes in a plurality of stages based on
redundancy; and light emission control processing for controlling a
light emitting section which emits light in a plurality of colors
to emit light while changing color temporally based on the signals
generated by the modulation processing.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This is a Divisional of U.S. application Ser. No.
13/528,403, filed Jun. 20, 2012, which is based upon and claims the
benefit of priority from prior Japanese Patent Applications No.
2011-139213 and No. 2011-139233, both filed Jun. 23, 2011, the
entire contents of all of which are incorporated herein by
reference.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to an information transmission
system, an information sending device, an information receiving
device, an information transmission method, an information sending
method, an information receiving method, and a program product
using a spatial optical transmission technology.
[0004] 2. Description of the Related Art
[0005] In recent years, so-called digital signage is gaining
attention as an information transmission system using a spatial
optical transmission technology. Digital signage refers to a system for transmitting information in locations other than the home,
such as outdoors, transportation facilities, storefronts, and
public facilities, using a display device connected to a
network.
[0006] FIG. 1 is a diagram showing the use of the digital signage.
In FIG. 1, an outdoor scene 100 is shown in the center, which
includes pedestrians 101 and vehicles 102, as well as buildings 103
to 105 located in the background. In particular, the building 104
in the center has a large-scale display terminal 106 mounted on a
wall surface thereof.
[0007] The display terminal 106 is a display device for digital
signage that visualizes and displays information 107 transmitted
from a server (not shown). This information 107 is, for example,
information related to a certain product. In the example shown in
FIG. 1, an image of a wristwatch is displayed as the product.
[0008] In order to sell this product, the price, the sales period,
the sales location, etc. need to be announced.
[0009] In the example shown in FIG. 1, light modulation areas 1061
to 1064 are provided for the delivery of the above-described
information. These modulation areas 1061 to 1064 are provided in
portions (the four corners in FIG. 1) of the screen of the display
terminal 106, and the required information is transmitted by
time-serial changes of the light thereof.
[0010] As a result, when people on the street catch sight of the display on the display terminal 106 and capture an image of the advertisement using their mobile electronic units 108, they can obtain detailed information related to the product, such as the price, the sales period, and the sales location.
[0011] In the case of the example shown in FIG. 1, a picture 110 of
the product and a word balloon 111 containing detailed information
on the product (information on discounts, such as 50% OFF, the
discount period, such as from 11:00 to 15:00, and the like) are
displayed on the screen 109 of the mobile electronic unit 108.
[0012] As types of spatial "optical" transmission technology, for
example, the following are known.
[0013] In U.S. Pat. No. 6,933,956, a technology for a spatial
optical transmission system composed of a light transmitting unit
and a light receiving unit is described.
[0014] An overview of the technology is basically as follows: A light emitting unit logically determines a bit sequence that constitutes the information to be transmitted; selects one of two pre-prepared bit pattern sequences having a low correlation with each other, based on the determination result; modulates the light in accordance with the selection result; and transmits the modulated light. A light receiving unit receives the light and generates a binarized signal based on the intensity of the light; generates a logic signal 1 or a logic signal 0 when the bit pattern sequence included in the binarized signal corresponds to either one of the two bit pattern sequences; and thereby reproduces the information included in the light.
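The pattern-matching step of this prior-art scheme can be sketched as follows; the two 8-bit patterns below are illustrative placeholders, not the sequences used by U.S. Pat. No. 6,933,956:

```python
# Sketch of the prior-art scheme: the receiver compares a binarized
# sample window against two pre-agreed, low-correlation bit patterns
# and emits logic 1 or logic 0 accordingly (patterns are illustrative).
PATTERN_ONE = (1, 1, 0, 1, 0, 0, 1, 0)
PATTERN_ZERO = (0, 1, 1, 0, 1, 0, 0, 1)

def match_window(window):
    """Return 1 or 0 if the window equals a known pattern, else None."""
    if tuple(window) == PATTERN_ONE:
        return 1
    if tuple(window) == PATTERN_ZERO:
        return 0
    return None

print(match_window([1, 1, 0, 1, 0, 0, 1, 0]))  # → 1
```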
[0015] In Japanese Patent Application Laid-Open (Kokai) Publication
No. 2005-267169, a technology is described which actualizes
pointing by using flashing signals that are colored and have the
same hue value.
[0016] In Japanese Patent Application Laid-Open (Kokai) Publication
No. 2006-072778, a technology is described in which four states are
transmitted and received using prescribed changes in hue
difference.
[0017] In Japanese Patent Application Laid-Open (Kokai) Publication
No. 2010-287820, a technology for actualizing high-speed
communication is described in which, in addition to color changes,
one of the colors is flashed at high speed, the light intensity is
detected by a photodiode other than that of the image sensor being
added, and thereby another signal is superimposed.
[0018] In Japanese Patent Application Laid-Open (Kokai) Publication
No. 2009-186203, a technology is described in which transmission is
performed by a combination of light-emitting bodies in three
colors.
[0019] However, the technology described in U.S. Pat. No. 6,933,956 is merely a transmission technology based on the blinking of light, which performs optical transmission using the logic signal 1 and the logic signal 0. Therefore, there is a problem in that, when signals (binary modulation signals) generated by binary blinking of light are received by a widely available camera having a common frame rate (about 30 fps) and the information is reproduced, a considerable amount of time is required (about two seconds, as described in detail hereafter).
[0020] Although modulation of multivalued light using a combination of the colors red (R), green (G), and blue (B), as in the other known technologies, can be performed to solve this problem, simply performing such multivalue modulation increases the processing load on the light receiving device.
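The timing problem can be illustrated with a rough calculation; the 30-bit payload below is a hypothetical figure chosen only to reproduce the approximately two-second transfer time mentioned above:

```python
import math

FPS = 30                 # common camera frame rate (from the text)
SYMBOL_RATE = FPS / 2    # one symbol per two frames -> 15 symbols/s

def transfer_time(payload_bits, levels):
    """Seconds needed to send payload_bits using `levels`-ary symbols."""
    bits_per_symbol = math.log2(levels)
    return payload_bits / (bits_per_symbol * SYMBOL_RATE)

# Hypothetical 30-bit payload: binary blinking vs. three-color modulation
print(round(transfer_time(30, 2), 2))  # → 2.0 (binary)
print(round(transfer_time(30, 3), 2))  # → 1.26 (ternary, R/G/B)
```

The sketch shows why multivalue modulation shortens the transfer time: each symbol carries log2(levels) bits at the same symbol rate.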
SUMMARY OF THE INVENTION
[0021] An object of the present invention is to enable the decoding
of multivalued optical transmission information without increasing
processing load.
[0022] In order to achieve the above-described object, in
accordance with one aspect of the present invention, there is
provided an information transmission system comprising: an
information sending device including a light emitting section which
emits light in a plurality of colors, a modulating section which
modulates information to be transmitted into signals composed of
changes in color, and a light emission control section which
controls the light emitting section to emit light while changing
color temporally based on the signals generated by the modulating
section; and an information receiving device including an imaging
section which captures an image having color, a detecting section
which detects a temporal color change of the light emitting section
emitting light by light emission control by the light emission
control section from images consecutively captured by the imaging
section, a decoding section which decodes the color change detected
by the detecting section into information, and an information
output section which outputs the information generated by the
decoding section.
[0023] In accordance with another aspect of the present invention,
there is provided an information sending device comprising: an
image display section which has a light emitting section which
emits light in a plurality of colors in pixel units; a modulating
section which modulates information to be transmitted into signals
composed of changes in color; and a light emission control section
which controls a pixel area of the image display section to emit
light while changing color temporally based on the signals
generated by the modulating section.
[0024] In accordance with another aspect of the present invention,
there is provided an information receiving device comprising: an
imaging section which captures an image having color; a detecting
section which detects a pixel area whose color changes temporally
from images consecutively captured by the imaging section; a
decoding section which decodes color change detected by the
detecting section into information; and an information output
section which outputs the information generated by the decoding
section.
[0025] In accordance with another aspect of the present invention,
there is provided an information transmission method comprising: a
modulating step of modulating information to be transmitted into
signals composed of changes in color; a light emission control step
of controlling a light emitting section which emits light in a
plurality of colors so that the light emitting section emits light
while changing color temporally based on the signals generated in
the modulating step; a detecting step of detecting a temporal color
change of the light emitting section emitting light by light
emission control in the light emission control step, from images
consecutively captured by an imaging section which captures images
having color; a decoding step of decoding the color change detected
in the detecting step into information; and an information output
step of outputting the information generated in the decoding
step.
[0026] In accordance with another aspect of the present invention,
there is provided an information sending method comprising: a
modulating step of modulating information to be transmitted into
signals composed of changes in color; and a light emission control
step of controlling a pixel area of an image display section having
a light emitting section which emits light in a plurality of colors
in pixel units, so as to emit light while changing color temporally
based on the signals generated in the modulating step.
[0027] In accordance with another aspect of the present invention,
there is provided an information receiving method comprising: a
detecting step of detecting a pixel area whose color changes
temporally from images consecutively captured by an imaging section
which captures an image having color; a decoding step of decoding
color change detected in the detecting step into information; and
an information output step of outputting the information generated
in the decoding step.
[0028] In accordance with another aspect of the present invention,
there is provided a non-transitory computer-readable storage medium
having stored thereon a program that is executable by a computer
controlling an image display device having a light emitting section
which emits light in a plurality of colors in pixel units, the
program being executable by the computer to perform functions
comprising: modulation processing for modulating information to be
transmitted into signals composed of changes in color; and light
emission control processing for controlling a pixel area of the
image display device to emit light while changing color temporally
based on the signals generated by the modulation processing.
[0029] In accordance with another aspect of the present invention,
there is provided a non-transitory computer-readable storage medium
having stored thereon a program that is executable by a computer
having an imaging section which captures an image having color, the
program being executable by the computer to perform functions
comprising: detection processing for detecting a pixel area whose
color changes temporally from images consecutively captured by the
imaging section; decode processing for decoding color change
detected by the detection processing into information; and
information output processing for outputting the information
generated by the decode processing.
[0030] The above and further objects and novel features of the
present invention will more fully appear from the following
detailed description when the same is read in conjunction with the
accompanying drawings. It is to be expressly understood, however,
that the drawings are for the purpose of illustration only and are
not intended as a definition of the limits of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] FIG. 1 is a diagram showing the use of digital signage;
[0032] FIG. 2 is a block diagram of an information transmission
system according to an embodiment;
[0033] FIG. 3A is a diagram of modulation areas (light emitting
section 5 [1061 to 1064]) in a lighted state;
[0034] FIG. 3B is a diagram of the modulation areas (light emitting
section 5 [1061 to 1064]) in an unlighted state;
[0035] FIG. 4 is a diagram of a signal format for optical
communication according to the embodiment;
[0036] FIG. 5 is a diagram of color separation characteristics in a
color filter;
[0037] FIG. 6 is a diagram in which the results in FIG. 5 are
converted to HSV space;
[0038] FIG. 7 is a diagram of an encoding table;
[0039] FIG. 8 is a flowchart of internal processing by a light
receiving device 3;
[0040] FIG. 9 is a sub-flowchart of Step S3 in FIG. 8;
[0041] FIG. 10 is a sub-flowchart of Step S4 in FIG. 8;
[0042] FIG. 11A is a diagram showing an instance where the shape
evaluation of a modulation area is performed by aspect ratio, in
which the modulation area in the shape of a square is circumscribed
within a square;
[0043] FIG. 11B is a diagram showing an instance where the shape
evaluation of a modulation area is performed by aspect ratio, in
which the modulation area having an irregular shape is
circumscribed within a square;
[0044] FIG. 11C is a diagram showing an instance where the shape
evaluation of a modulation area is performed by area filling ratio,
in which the shape of the modulation area is substantially
oval;
[0045] FIG. 11D is a diagram showing an instance where the shape
evaluation of a modulation area is performed by area filling ratio,
in which the shape of the modulation area is elongated oval;
[0046] FIG. 12 is a diagram showing a buffering state of a buffer
memory provided in a RAM 123;
[0047] FIG. 13 is a diagram showing an example of a candidate area
table;
[0048] FIG. 14 is a sub-flowchart of Step S7 in FIG. 8;
[0049] FIG. 15 is a sub-flowchart of Step S72 in FIG. 14;
[0050] FIG. 16 is a diagram showing an image of the linking of area
tables;
[0051] FIG. 17 is a diagram showing a simplified image of the
successively linked areas;
[0052] FIG. 18 is a diagram showing an example of hue data
extraction;
[0053] FIG. 19 is an explanatory diagram of threshold setting;
and
[0054] FIG. 20 is a diagram showing an example of changes in hue
value.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
[0055] An embodiment of the present invention will be described
with reference to the drawings.
[0056] FIG. 2 is a block diagram of an information transmission
system used in FIG. 1. In FIG. 2, the information transmission
system 1 includes a light emitting device 2, a light receiving
device 3 and a service server 4.
[0057] The light emitting device 2 includes a light emitting
section 5, a storing section 6 that stores information such as tag
identification (ID), and a modulating section 7 for
modulation-driving the light emitting section 5 (equivalent to 1061
to 1064 in FIG. 1) using information stored in the storing section
6, and is provided in a system for transmitting information by a
display terminal 106, in locations such as outdoors, transportation
facilities, storefronts, and public facilities.
[0058] The light emitting section 5 transmits required information
(such as tag ID) based on the manner of light emission by the light
emitting section 5 (temporal color change of emitted light and
color intensity).
[0059] The light receiving device 3 is equivalent to a mobile
electronic unit 108 in FIG. 1. This light receiving device 3 (108)
is constituted by an optical system 8 including an imaging lens and
the like, a camera 9, an input sensor 10, a display section 11 such
as a liquid crystal display, a control and communication section
12, etc.
[0060] The service server 4 is, for example, a server that operates
an information providing site or a product sales site (so-called
online shop) on the Internet which is correlated to information
transmitted by digital signage.
[0061] The camera 9 is composed of a two-dimensional imaging device
mounted with a color filter, such as a charge-coupled device (CCD)
or a complementary metal-oxide semiconductor (CMOS). The camera 9
consecutively captures images within a predetermined viewing angle
at a cycle of several tens of frames per second, and outputs the
captured images to the control and communication section 12. In
this embodiment, the image-capturing cycle of the camera 9 is 30
frames per second (30 fps), taking the example of a typical (and
also general-purpose) two-dimensional imaging device. The modulation frequency of the modulating section 7 of the light emitting device 2 is half of the image-capturing frame rate, or in other words, 15 Hz.
[0062] The input sensor 10 is, for example, a sensor for detecting
various information inputted by user operation. Specifically, the
input sensor 10 is a QWERTY keyboard including a numeric keypad or
a touch panel.
[0063] The display section 11 is a high-definition display device,
such as a liquid crystal display. This display section 11
visualizes arbitrary information outputted as required from the control and communication section 12, and outputs and displays the visualized information.
[0064] The control and communication section 12 includes a
communication interface that interfaces with the service server 4,
a computer or a microcomputer (hereinafter, referred to as a
central processing unit [CPU]) 121, a read-only semiconductor
memory (hereinafter, referred to as a read-only memory [ROM]) 122,
and a writable/readable semiconductor memory (hereinafter, referred
to as a random access memory [RAM]) 123.
[0065] The control and communication section 12 is a control
element for a program control system including peripheral circuits
(not shown). This control and communication section 12 loads
control programs prestored in the ROM 122 into the RAM 123, and
executes them by the CPU 121.
[0066] [Modulation Method and Physical Format]
[0067] FIG. 3A and FIG. 3B are diagrams of the modulation areas (light emitting section 5 [1061 to 1064]) provided in portions of the display terminal 106 for digital signage. As shown in FIG. 3A, the light emitting section 5 (1061 to 1064) is expressed as a group composed of several pixels in a predetermined portion of the display terminal 106 (such as a corner of the terminal) for digital signage.
[0068] The periphery of the light emitting section 5 (1061 to 1064), composed of 2×2 pixels, is surrounded by a pixel frame (in this instance, a frame composed of 12 pixels labeled Bk) that enables the light receiving device 3 (108) to differentiate the light emitting section 5 (1061 to 1064) from an image for digital signage. This pixel frame is composed of black pixels that are in an unlighted state at all times.
[0069] For example, when all the pixels of the light emitting section 5 (1061 to 1064) are lit in red (R), the light emitting section 5 (1061 to 1064) enters the state shown in FIG. 3A. When all the pixels of the light emitting section 5 (1061 to 1064) are unlit, the light emitting section 5 (1061 to 1064) is black (Bk), as shown in FIG. 3B.
[0070] Note that the shape and the number of pixels of the frame
section are not limited to this example.
[0071] FIG. 4 is a diagram of a signal format according to the
present embodiment.
[0072] In FIG. 4, the signal format 13 is constituted by a header section 131 composed of one non-luminous pulse (black), and a data section 132 composed of the subsequent nine pulses, each lit in one of the three colors red (R), blue (B), and green (G). Since each pulse of the data section 132 takes one of three values (R, G, B), in contrast to the binary modulation of black (unlighted) and white (lighted) described earlier, three-value modulation, so to speak, is performed in this example.
[0073] The reason for using a single "black" pulse in the header is that the luminance value of black differs clearly and significantly from that of the chromatic colors, and therefore black is easy to separate without being affected by color mixing.
[0074] Note that, although three-value modulation using the three colors red, blue, and green is performed in FIG. 4, the present invention is not limited thereto; for example, three other colors, such as cyan, magenta, and yellow, may be used. Alternatively, seven colors, in which white is added to these six colors, may be used. That is, multiple values exceeding three values may be used.
[0075] The selection of the color configuration (the number of values) to be used is largely a matter of design choice.
[0076] For example, a color configuration considered to be suitable
based on the color separation characteristics, auto white balance
(AWB) characteristics, or the like of the camera 9 may be used.
[0077] FIG. 5 is a diagram of color separation characteristics of
the light emitting section 5 (1601 to 1604) and the color filter of
the camera 9.
[0078] As shown in FIG. 5, in actuality, slight amounts of the
green wavelength component and the blue wavelength component are
included even when only red is lighted.
[0079] Therefore, in the present embodiment, the three primary
colors, which are red, blue, and green and relatively separable,
are used.
[0080] FIG. 6 is a diagram in which the color space in FIG. 5 is
converted to hue, saturation, and value (HSV).
[0081] FIG. 6 indicates that red, green, blue, and the like that
are chromatic colors have a certain saturation or more and, even
when the luminance decreases, they can be easily separated and
distinguished from black (unlighted).
[0082] Accordingly, with the use of color modulation such as that according to the present embodiment, the following advantageous effect can be acquired. For example, when the nine pulses of the data section 132 are outputted, the number of values expressible by this data section 132 is 3^9 in three-color modulation. This 3^9 is "19,683" in decimal notation, and "19,683" is "100110011100011" in binary notation, or in other words, a 15-bit numerical sequence. Therefore, 14 bits or more can be expressed by nine pulses.
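The capacity arithmetic in the preceding paragraph can be checked with a short calculation (Python is used here purely for illustration; the embodiment describes no particular implementation):

```python
import math

# Capacity of the 9-pulse data section under three-color (ternary) modulation.
n_pulses = 9
n_colors = 3

values = n_colors ** n_pulses          # 3^9 = 19,683 distinct pulse sequences
bits = math.floor(math.log2(values))   # whole bits expressible per data section

print(values)        # 19683
print(bin(values))   # 0b100110011100011 (a 15-bit number)
print(bits)          # 14
```

Since 3^9 = 19,683 exceeds 2^14 = 16,384, nine ternary pulses carry at least 14 bits, as the text states.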
[0083] FIG. 7 is a diagram showing an encoding chart for converting
data values to a light-emission signal sequence.
[0084] This encoding chart is stored in advance in the ROM 122 of
the control and communication section 12.
[0085] Encoding signals "1", "2", and "3" in the chart indicate
"red", "blue", and "green", respectively.
[0086] Accordingly, "123" indicates that light is emitted in the
order of "red to blue to green".
[0087] For the encoding signal "132", or in other words, when light is emitted in the order of "red to green to blue", four encoding results are acquired, one for each of the four encodings in the chart.
[0088] For example, in the encoding chart in FIG. 7, the first encoding result, having no redundancy, is "8"; the second encoding result, having low redundancy, is "7"; the third encoding result, having medium redundancy, is "7"; and the fourth encoding result, having high redundancy, is "2".
[0089] Here, the first encoding having no redundancy is provided
with 27 values, from 1 to 27.
[0090] The second encoding having low redundancy is provided with
24 values, from 1 to 24.
[0091] The third encoding having medium redundancy is provided with
8 values, from 1 to 8.
[0092] The fourth encoding having high redundancy is provided with
6 values, from 1 to 6.
[0093] The first encoding can transmit more information at a
significantly higher speed than conventional binary information
transmission. However, when the control and communication section
12 receives this information (captures an image) by the camera 9, a
pixel area where the same color continues for a certain amount of
time may be erroneously recognized as a part of a background image
having no changes.
[0094] The second encoding eliminates the states in which light emission of the same color continues for a certain amount of time (in other words, only "111", "222", and "333" are excluded).
[0095] In the third encoding, an initially identified color is used as a reference, and the next color is chosen to be far in hue distance from that reference color.
[0096] In the fourth encoding, the colors of three consecutive
pulses are different at all times, whereby noise of color existing
in nature can be eliminated.
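The first, second, and fourth codebooks described above can be enumerated directly; a minimal sketch follows (the third encoding is omitted here because its hue-distance rule is not fully specified in the text; the signal-to-color mapping 1 = red, 2 = blue, 3 = green is the one given for the chart of FIG. 7):

```python
from itertools import product

# Encoding signals: 1 = red, 2 = blue, 3 = green.
colors = (1, 2, 3)
triples = list(product(colors, repeat=3))

# First encoding (no redundancy): all 27 three-pulse sequences.
first = triples

# Second encoding (low redundancy): exclude "111", "222", "333",
# i.e. sequences in which a single color continues throughout.
second = [t for t in triples if len(set(t)) > 1]

# Fourth encoding (high redundancy): three consecutive pulses always
# differ, i.e. the six permutations of the three colors.
fourth = [t for t in triples if len(set(t)) == 3]

print(len(first), len(second), len(fourth))  # 27 24 6
```

The counts match paragraphs [0089], [0090], and [0092]: 27, 24, and 6 values respectively.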
[0097] Note that the selection of encoding is made based on the
environment and characteristics of equipment in which the light
emitting device 2 is arranged.
[0098] Next, operations of the light receiving device 3 will be
described.
[0099] FIG. 8 is a diagram of internal processing performed by the
overall light receiving device 3.
[0100] In the processing, first, the light receiving device 3
resets a frame counter in the RAM 123 (Step S1), and stores a frame
captured by the camera 9 in the RAM 123 (Step S2).
[0101] Next, the light receiving device 3 generates a binary image
of the stored frame (Step S3), and after creating a candidate area
table in the RAM 123 (Step S4), registers it in a list (Step
S5).
[0102] Then, the light receiving device 3 judges whether or not the
frame counter is a predetermined value n (Step S6). When judged
that the frame counter is not the predetermined value n, the light
receiving device 3 returns to Step S2. When judged that the frame
counter is the predetermined value n, the light receiving device 3
performs decode processing (Step S7), and after performing
information display processing (Step S8), repeats Step S1 and the
subsequent steps.
[0103] More specifically, at Step S3, the following processing is
performed.
[Candidate Area Detection Per Frame and Registration into Candidate
Area Table]
[0104] (a) Color Enhancement Correction Processing:
[0105] As processing preceding the HSV color space processing, color separation from RGB to R'G'B' is performed on the modulated light of the light emitting section 5 (1601 to 1604) of the digital signage, using a conversion matrix such as that in the following expression (1).
[0106] As a result of this processing, color having high saturation
is further emphasized, and the separation is facilitated.
  (B')   (a11 a12 a13) (B)
  (G') = (b11 b12 b13) (G)    [Expression 1]
  (R')   (c11 c12 c13) (R)
[0107] The components a, b, and c in the matrix in expression (1) are, for example, values such as those in the following expression (2). In terms of vector space, the processing brings the values within the color space closer to the respective RGB axes.
a=(a11,a12,a13)=(-1.7,-0.65,-0.1)
b=(b11,b12,b13)=(-0.9,1.9,-0.1)
c=(c11,c12,c13)=(-0.1,-0.1,1.1) (2)
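A per-pixel sketch of expression (1) with the component values of expression (2) may look as follows. The channel order (B, G, R) follows the expression; the clipping of the result to the valid [0, 1] range is an assumption, since the text does not say how out-of-range values are handled:

```python
import numpy as np

# Color enhancement matrix from expressions (1) and (2).
# Input/output channel order is (B, G, R), as in the expression.
M = np.array([
    [-1.7, -0.65, -0.1],   # row a: produces B'
    [-0.9,  1.9,  -0.1],   # row b: produces G'
    [-0.1, -0.1,   1.1],   # row c: produces R'
])

def enhance(bgr):
    """Apply expression (1) to one pixel; clip to [0, 1] (an assumption)."""
    return np.clip(M @ np.asarray(bgr, dtype=float), 0.0, 1.0)

# A pure-red pixel keeps its red component (1.1, clipped to 1.0), while
# the residual blue/green leakage is driven negative and clipped to zero.
print(enhance([0.0, 0.0, 1.0]))  # [0. 0. 1.]
```

This illustrates the stated effect: color having high saturation is emphasized, and separation toward the RGB axes is facilitated.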
[0108] (b) Conversion to HSV Color Space
[0109] The saturation of the captured image is largely unaffected by the surrounding environment and remains mostly unchanged. Conversely, the value of each color vector obtained by RGB decomposition has the characteristic of being affected by the surrounding environment. To address this weakness of conventional luminance-based search, the search is performed after conversion to the HSV color specification system, rather than on the RGB expression.
[0110] In the instance of the color modulation method and the color
modulation indicated by the above-described physical format, the
imaging results of the pulses in the data section have high
saturation (S) values at all times, regardless of the color of the
emitted light, even when the hue slightly fluctuates.
[0111] FIG. 9 is a detailed diagram of the processing at Step S3 in
FIG. 8.
[0112] As shown in FIG. 9, in the processing at Step S3, the light
receiving device 3 converts the captured image in the RGB color
space to an image in the HSV color space (Step S31). Next, the
light receiving device 3 sets a suitable threshold value for each
pixel of an image expressed by the S (saturation) parameter
(hereinafter, referred to as an S image), and an image expressed by
the V (brightness: also referred to as I [intensity]) parameter
(hereinafter, referred to as a V image) in the converted image
expressed in the HSV color space, and binarizes them (Step
S32).
[0113] The S image and the V image acquired thereby and a logical
product thereof are set as a black and white binary image
expressing a candidate for a communication area (Step S33).
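The per-pixel logic of Steps S31 to S33 can be sketched as follows. The threshold values are assumptions (the text only says a "suitable threshold value" is set), and the standard-library colorsys conversion stands in for whole-image HSV conversion:

```python
import colorsys

S_THRESHOLD = 0.5   # assumed value for the S (saturation) image threshold
V_THRESHOLD = 0.3   # assumed value for the V (brightness) image threshold

def candidate_mask(pixel_rgb):
    """Step S31: convert one RGB pixel to HSV. Steps S32/S33: binarize
    the S value and the V value, and return their logical product, i.e.
    True where the pixel is a candidate for a communication area."""
    r, g, b = pixel_rgb
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return (s > S_THRESHOLD) and (v > V_THRESHOLD)

print(candidate_mask((1.0, 0.1, 0.1)))  # True  - bright, saturated red
print(candidate_mask((0.5, 0.5, 0.5)))  # False - gray has no saturation
print(candidate_mask((0.1, 0.0, 0.0)))  # False - too dark (low V)
```

Only pixels that are both saturated and bright survive, matching the observation in paragraph [0110] that modulated pulses have high saturation values at all times.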
[0114] [Labeling Processing]
[0115] Next, labeling processing that is one of the characteristic
features of the present embodiment will be described.
[0116] In this processing, a black and white binary image is acquired of the areas that always include a light-modulated area (excluding the header section 131), together with areas whose color characteristics coincidentally match. As a result, for example, an image is acquired in which areas having a high likelihood of the expected color characteristics appear white against a black background.
[0117] FIG. 10 is a detailed diagram of the processing at Step S4
in FIG. 8.
[0118] As shown in FIG. 10, first, the light receiving device 3
performs a so-called labeling processing in which a continuous area
and a basic shape parameter are identified (Step S41). More
specifically, the light receiving device 3 performs processing for
identifying each continuous white area in the above-described
candidate image and determining the shape information thereof.
[0119] Note that, in the present embodiment, the gravity center of
the area, the area (pixel area) thereof, and the coordinates of the
four corners of the circumscribed quadrangle area thereof are
acquired.
[0120] In the subsequent processing, the light receiving device 3
extracts one of the acquired continuous areas (Step S42) and
performs filtering based on shape-related conditions. First, the
light receiving device 3 eliminates areas that are too small (such
as a 2.times.2-pixel square or smaller) in terms of area (area
size) as noise (Step S43).
[0121] Next, the light receiving device 3 evaluates the shape of an
area detected as a result of the labeling processing at Step S41.
In the present embodiment, for simplification of the processing,
the light receiving device 3 performs the evaluation based on the
shape likelihood using the aspect ratio (Step S44).
[0122] FIG. 11A to FIG. 11D are diagrams for explaining the shape
evaluation of the modulation area.
[0123] FIG. 11A and FIG. 11B are diagrams showing the aspect ratio
of the shape (white portion) of the modulation area. H is the
long-side length of the circumscribed quadrangle of the shape of
the modulation area, and W is the short-side length of the
circumscribed quadrangle of the shape of the modulation area. FIG.
11C and FIG. 11D are diagrams showing the area filling ratio of the
shape of the modulation area in relation to a predetermined area P
(10.times.10 pixels in FIG. 11C and FIG. 11D). This area filling
ratio is acquired by a value that is the area A of the shape (white
portion) of the modulation area divided by the predetermined area P
(10.times.10 pixels).
[0124] As described above, conditions under which an area is
considered to be a modulation area are set in advance regarding the
aspect ratio and the area filling ratio, whereby areas that do not
meet the conditions are not considered to be modulation areas (Step
S45).
[0125] An area that has not been eliminated has a high possibility
of being a modulation area, and therefore the light receiving
device 3 registers it in the candidate area table (Step S46).
Subsequently, until it is judged that the processing has been
completed on all candidate areas that may possibly be a modulation
area (YES at Step S47), the light receiving device 3 repeats the
above-described Steps S42 to S45.
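The shape filtering of Steps S43 to S45 can be sketched as below. All concrete threshold values are assumptions, since the text only says the conditions are "set in advance"; the predetermined area P of 100 pixels follows FIG. 11C/11D:

```python
def passes_shape_filter(area_size, h_long, w_short,
                        fill_region=100, min_area=5,
                        max_aspect=2.0, min_fill=0.5):
    """Filter one labeled candidate area as in Steps S43 to S45.

    area_size:   pixel area A of the white region
    h_long:      long-side length H of the circumscribed quadrangle
    w_short:     short-side length W of the circumscribed quadrangle
    fill_region: predetermined area P (10 x 10 = 100 pixels, FIG. 11C/D)
    All threshold values here are illustrative assumptions.
    """
    if area_size < min_area:                # Step S43: too small -> noise
        return False
    if h_long / w_short > max_aspect:       # Step S44: aspect ratio H/W
        return False
    if area_size / fill_region < min_fill:  # Step S45: filling ratio A/P
        return False
    return True

print(passes_shape_filter(area_size=64, h_long=8, w_short=8))   # True
print(passes_shape_filter(area_size=3, h_long=2, w_short=2))    # False (noise)
print(passes_shape_filter(area_size=30, h_long=30, w_short=1))  # False (elongated)
```

Areas surviving this filter are the ones registered in the candidate area table at Step S46.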
[0126] As a result, list entries for a required number of frames
are acquired from Step S1 to Step S6, and registered as a table
list.
[0127] However, because the phase relationship is indefinite, frame sampling is performed at twice the pulse rate, or in other words, at 30 fps for 15 Hz pulse-based modulation. Accordingly, since the number of pulses constituting a block is 10 as described above, the number of list entries is n=2.times.10=20, and the buffering state is as shown in FIG. 12, described hereafter.
[0128] As described above, in the present embodiment, filtering
based on shape-related conditions is performed for a modulation
area, whereby shapes that are, for example, clearly different or
clearly too small are eliminated in advance. Therefore, processing
load related to the searching of an information light source area
by the light receiving device 3 can be reduced.
[0129] In an actual implementation, when frame analysis is performed and the current image-capturing condition (the surrounding environment, the state of camera shake, and the like) proves clearly unsuitable for communication during the process of creating a candidate area table and adding it to the table list, such as when the number of detected candidate areas is zero a certain number of times consecutively, the processing can be reset midway.
[0130] [Filtering Processing for Decode Processing Candidate]
[0131] FIG. 12 is a diagram showing a buffering state of a buffer
memory provided in the RAM 123.
[0132] F0 to Fn in the upper portion are frames, and the tables in
the lower portion indicate the buffering state of a candidate area
table for each frame F0 to Fn.
[0133] Note that a number of candidate area tables amounting to the
n-number of frames (n is a natural number) are provided. When a
predetermined number of frames is reached, the content is
rewritten.
[0134] The reason for providing a candidate area table for each
frame is to significantly reduce the amount of calculation through
use of compressed information as a candidate area table, rather
than processing a large number of images in a time direction at a
pixel data level, when processing time-series images.
[0135] Here, an example of the candidate area table which is
generated for each captured (imaged) frame will be described.
[0136] FIG. 13 is a diagram showing an example of a candidate area
table for frame number Fn=0.
[0137] In FIG. 13, gravity center coordinates (cx,cy), area (size), and hue value (hue) are shown for each area No. (A).
[0138] For example, gravity center coordinates (10,50), an area
(70), and a hue value (80) are shown for area No. 1.
[0139] Also, gravity center coordinates (111,321), an area (23), and a hue value (200) are shown for area No. 2.
[0140] As described above, in the present embodiment, the gravity center coordinates (cx,cy) of an area subjected to labeling processing, the area (size) thereof, and the hue value (hue) thereof are successively stored in a candidate area table.
[0141] Note that, in the description below, the identification of
an individual candidate area is expressed as Fn:Am, and the
identification of its internal parameter is expressed as
Fn:Am:(cx,cy).
[0142] Also, in the present embodiment, the distance between two
area gravity centers (x1,y1) and (x2,y2) is determined using area
information.
[0143] However, because the area (size) already has the dimension of the square of the coordinates, the square root of the area is taken so that it can be added as an evaluation measure of the same dimension as the coordinates.

.sqroot.((x2-x1)^2+(y2-y1)^2+(.sqroot.Size2-.sqroot.Size1)^2) (3)

[0144] Expression (3) indicates that, after the square root of the area is taken, an ordinary three-dimensional vector distance calculation is performed.

[0145] In actuality, expression (3) is operated within a range of relatively small threshold values (such as 0 to 10). Therefore, to obtain a similar evaluation value while reducing the calculation amount, expression (3) may be simplified as the following expression (4).

|x2-x1|+|y2-y1|+|.sqroot.Size2-.sqroot.Size1| (4)
[0146] As a result of the calculation, it can be considered that
"small evaluation value=same area" in similarity evaluation between
modulation areas.
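The evaluation value of expression (4), as reconstructed above, can be sketched as follows; the sample coordinates follow the candidate area table of FIG. 13, and the threshold of 30 is the one given later in paragraph [0159]:

```python
import math

def evaluation_value(a1, a2):
    """Similarity between two candidate areas per expression (4):
    |x2-x1| + |y2-y1| + |sqrt(Size2) - sqrt(Size1)|.
    Taking the square root of the area first brings it to the same
    dimension as the coordinates, as noted for expression (3)."""
    (x1, y1, s1), (x2, y2, s2) = a1, a2
    return abs(x2 - x1) + abs(y2 - y1) + abs(math.sqrt(s2) - math.sqrt(s1))

same = evaluation_value((10, 50, 70), (10, 50, 70))    # identical areas
near = evaluation_value((10, 50, 70), (12, 51, 66))    # slight drift
far  = evaluation_value((10, 50, 70), (111, 321, 23))  # a different area

print(same)             # 0.0 -> "small evaluation value = same area"
print(near < 30 < far)  # True: only the nearby area links under threshold 30
```

A small value means the two areas are likely the same light source seen in consecutive frames, which is exactly the similarity judgment used when linking candidate area tables.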
[0147] FIG. 14 is a flowchart showing the details of the processing
at Step S7 in FIG. 8.
[0148] In the flowchart, the light receiving device 3 inputs a candidate frame point of the processing target frame and a candidate frame point of each frame acquired one to three frames earlier into the above-described expression (4), and calculates the evaluation value. Next, the light receiving device 3 creates a link list of the candidate frame point of the processing target frame and the candidate frame points of the frames acquired one to three frames earlier, using the evaluation value (Step S71).
[0149] Next, the light receiving device 3 successively links the
link list between the two frames, while interpolating frames in
which a coordinate point is not present due to being black
(non-light-emitting) as black data within a permitted range (two
frames in this instance), and determines a chain of coordinate
point groups running 18 frames (Step S72).
[0150] Next, the light receiving device 3 eliminates discontinuous link elements in which three frames have passed without the presence of a candidate point, and rearranges each collection of coordinate point groups amounting to 18 frames such that the beginning of the link is black (Step S73).
[0151] Next, the light receiving device 3 extracts a complete link
area constituted by related coordinate point groups amounting to 18
frames (Step S74). Then, the light receiving device 3 performs
decode processing on a hue value (optical signal sequence)
corresponding to each coordinate point (Step S75), and judges
whether or not a valid decoding result (decoding value) has been
acquired (Step S76).
[0152] When judged that a valid decoding result has been acquired,
the light receiving device 3 performs request queuing for
coordinates and data to display system processing so that the word
balloon 111 and the like are displayed (Step S77). Next, the light
receiving device 3 judges whether or not the processing has been
completed for all candidate point groups amounting to 18 frames
(Step S78). Even when a valid decoding result is not acquired, the
light receiving device 3 judges whether or not the processing has
been completed for all candidate point groups amounting to 18
frames. In either case, when judged that the processing has not
been completed for all candidate point groups amounting to 18
frames, the light receiving device 3 repeats Step S74 and the
subsequent steps.
[0153] FIG. 15 is a flowchart showing the details of the processing
at Step S72.
[0154] In the flowchart, first, the light receiving device 3
defines a processing target frame Fx, and advances the processing
target frame Fx by one in the forward direction in terms of time
(Step S721). Next, the light receiving device 3 extracts one area candidate Ax from the processing target frame Fx (Step S722) and defines a comparison target frame Fd as Fd=Fx+1 (Step S723).
[0155] Next, the light receiving device 3 calculates an evaluation
value of the area candidate Ax and each element of the processing
target frame Fd using the above-described expression (4) (Step
S724), and determines a smallest link combination based on the
evaluation value (Step S725).
[0156] Next, the light receiving device 3 judges whether or not the
evaluation value of the determined smallest link combination is
equal to or less than a threshold set in advance (Step S726). When
judged that the evaluation value is not equal to or less than the
threshold, the light receiving device 3 judges whether or not the
current processing target frame Fd is the fn+3rd frame (Step S728).
When judged that the current processing target frame Fd is the
fn+3rd frame, the light receiving device 3 judges that a linking
area has not been found (Step S729), and repeats Step S722 and the
subsequent steps.
[0157] At Step S728, when judged that the current processing target
frame Fd is not the fn+3rd frame, the light receiving device 3
inserts dummy data ("skip") and advances the processing target
frame Fd by one in the forward direction in terms of time (Step
S730). Then, the light receiving device 3 repeats Step S724 and the
subsequent steps.
[0158] At Step S726, when judged that the evaluation value of the
smallest link combination is equal to or less than the threshold
set in advance, the light receiving device 3 registers the current
list in the link list as an adjacent list (Step S727), and judges
whether or not the processing has been completed for all areas of
the processing target frame (Step S731). When judged that the
processing has not been completed, the light receiving device 3
repeats the processing at Step S722 and the subsequent steps. When
judged that the processing has been completed, the light receiving
device 3 judges whether or not evaluation of all frames has been
completed (Step S732). When judged that the evaluation has not been
completed, the light receiving device 3 returns to Step S721. When
judged that the evaluation has been completed, the light receiving
device 3 ends the processing.
[0159] Note that, in expression (4) of the present embodiment, an
evaluation value of 30 or less is "considered the same".
[0160] In the physical format according to the present embodiment,
when saturation and brightness are high, linking is performed.
However, because a header (black) is included at all times, the
linking of area tables per frame when the processing at FIG. 15 is
performed is as shown in FIG. 16.
[0161] FIG. 16 is a link image of area tables.
[0162] In this image, sections connected by solid lines indicate
the linking status of each candidate area where visible optical
communication is performed.
[0163] On the other hand, the dotted lines indicate a status in
which, although whether or not communication is being performed is
not clearly determined, linking on an evaluation value level is
judged to have been made.
[0164] At Step S73, the determined link is evaluated. In an area
where a modulation signal is present at this time, a path (a
connection of decoding results) that is linked over all 20 frames
is present at all times. Therefore, other areas are eliminated.
[0165] FIG. 17 is a diagram showing a simplified image of areas
judged to be successively linked. Images that have skipped the link
are considered black (non-illuminated), and interpolated as having
a saturation of 0. The hue data of area Am of other images are
extracted, respectively.
[0166] FIG. 18 is a diagram showing an example of hue data extraction when the hue value has a range of 0 to 360. In FIG. 18, only the hue values held in each candidate area element data (individual F0:A0 and the like) are arrayed, corresponding to the result rearranged as the final link candidates in FIG. 17.
[0167] In this way, a candidate for a time-based link is determined from the color and shape candidates of a single frame. The optical signal value held by the link area in FIG. 17 (converted to a hue-value link in the case of the present embodiment) becomes that shown in FIG. 18, if the hue value is considered to have a range of 0 to 360, as in the typical definition. Areas judged to be in an unlighted state and skipped are given a value that is clearly out of the hue value range (such as -1).
[0168] Returning to the flowchart in FIG. 14, first, the CPU 121
eliminates link areas that cannot be present as modulation areas.
These areas are candidate areas meeting a condition "the header
section 131 (-1 state) is not present".
[0169] When 19 frames are sampled in the above-described physical format, a signal area always has one or two unlighted periods (depending on the phase relationship between the signal pulses and the frame capture timing). Therefore, an area in which a run of one or two consecutive -1 values does not occur exactly once is eliminated (an area in which -1 values still remain after one such run is removed is not considered to be a signal).
[0170] The value sequence of an area meeting this condition is selected. For example, in the case of the example in FIG. 18, area No. 1 is eliminated because no unlighted timing is present.
[0171] Next, the hue value sequence is cyclically shifted to start
from one or two consecutive -1.
[0172] The processing up to this point is the processing at Step
S73.
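The cyclic shift of Step S73 can be sketched as follows (a minimal illustration; the hue values in the example sequence are hypothetical):

```python
def shift_to_header(hues):
    """Cyclically shift a hue-value sequence so that it starts at the
    run of consecutive -1 values (the unlighted header section)."""
    n = len(hues)
    for i in range(n):
        # Start of a -1 run: a -1 not preceded (cyclically) by another -1.
        if hues[i] == -1 and hues[i - 1] != -1:
            return hues[i:] + hues[:i]
    return None  # no header found: the area is not a signal

seq = [30, 240, -1, -1, 120, 5, 355, 118]
print(shift_to_header(seq))  # [-1, -1, 120, 5, 355, 118, 30, 240]
```

A sequence with no -1 run at all returns None, which corresponds to the elimination rule of paragraph [0169].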
[0173] At Step S74 and Step S75, the CPU 121 checks validity
regarding modulation of each link result, selects a phase, and
performs demodulation.
[0174] [Threshold Setting]
[0175] FIG. 19 is an explanatory diagram of threshold setting.
[0176] As shown in FIG. 19, the threshold of a possible range of a
color emitting pulse of a modulation signal is set on a hue axis,
taking into consideration color characteristics on the camera side
(including the effect of dynamic AWB control) and light source on
the transmitting side. For example, the threshold of R is near 0
(or 360), the threshold of G is near 120, and the threshold of B is
near 240.
[0177] Note that, although the threshold is fixed in the present
embodiment, the threshold may be dynamically optimized by being set
in accordance with an environment matching the characteristics of
the camera 9 or set in a valley between peaks in hue
distribution.
[0178] In addition, it is preferable that these threshold values
are optimized for each light-emitting point, whereby a more stable
reception can be performed even when receiving a plurality of
light-emission signals having different color characteristics.
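The hue thresholds described above (R near 0 or 360, G near 120, B near 240) can be sketched as a classification function. The ±40 degree window width is an assumption, since the text leaves the exact ranges open; the returned encoding signals follow the mapping 1 = red, 2 = blue, 3 = green of the chart in FIG. 7:

```python
HUE_WINDOW = 40  # assumed half-width of each hue threshold range, in degrees

def classify_hue(h):
    """Map a hue value (0 to 360) to an encoding signal, or None when the
    value falls outside every threshold range (e.g. a mixed-phase sample)."""
    if h <= HUE_WINDOW or h >= 360 - HUE_WINDOW:
        return 1  # red, near 0 (or 360)
    if abs(h - 120) <= HUE_WINDOW:
        return 3  # green, near 120
    if abs(h - 240) <= HUE_WINDOW:
        return 2  # blue, near 240
    return None

print([classify_hue(h) for h in (5, 355, 118, 238, 300)])
# [1, 1, 3, 2, None] -- 300 is a transition value between B and R
```

Samples such as 300 fall between thresholds, which is how a phase sequence containing mid-transition hues can be rejected in favor of the other phase.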
[0179] As described above, a color that changes at 15 Hz (pulse
cycle of 66 ms) is sampled at 30 fps by the camera 9. Therefore,
the 18 sample sequences can be considered to be constituted by two
phases A and B.
[0180] An example of change in hue value in which a peak considered
to be an unlighted state comes at the beginning as described above
is as follows.
[0181] FIG. 20 is a diagram showing an example of change in hue
value, in which the vertical axis is a hue value, the horizontal
axis is a frame number, and phase A and phase B are arrayed along
the horizontal axis. As shown in FIG. 20, the pattern of change in
hue value varies. In this instance, phase A is comparatively an
optimal phase, and the color changes at phase B at all times.
[0182] Depending on the phase relationship with the light source, a value outside the thresholds may be acquired in phase A or phase B. For example, in a transition from R near 0 to B, the intermediate value is near 300, between B and R, rather than near G. Similarly, in a transition from R near 340 to G, the intermediate value is near the yellow value of 60, rather than near B.
[0183] In either case, when the area candidate is a modulation
signal, either of the phase A sequence and the phase B sequence is
within the value range thresholds at all times.
[0184] By the above-described processing, the area where communication is being performed has been determined, and the change in the optical signals at the observation-value level has been expressed as a color information sequence that can be applied to a decoding table. Next, when this color information sequence is collated with the encoding chart in FIG. 7, the transmitted bit data is acquired.
[0185] A noise area where saturation, spatial shape, time-based
linking and the like coincidentally match is, of course, eliminated
based on the rule in FIG. 7 regarding redundancy. Therefore, it is
highly unlikely that changes in nature are coincidentally taken as
data.
[0186] In the processing of the present embodiment, changes in nature may still coincidentally match. Accordingly, it is preferable that reception errors be prevented by error detection, correction, and the like in higher-order layers.
[0187] As a result, the following effects can be achieved by the
present embodiment.
[0188] A. Since an optical transmission method is used in which color modulation is performed with at least three values, and since, for example, 3^9 exceeds 2^14 in three-color modulation, 14 bits or more can be expressed in nine pulses, whereby transmission time can be shortened.
[0189] B. In the decoding processing for image sensor
communication, table-based time-direction processing of frames is
performed. Therefore, the amount of processing to be handled can be
significantly reduced.
[0190] C. Time-based change in an area is tracked by changes in its center of gravity and size. Therefore, link judgment having a high degree of relevance can be performed.
[0191] D. The transmitting side performs repeated transmissions of
a fixed length. Therefore, rather than finding a header and
starting sampling for communication, data amounting to a data block
length is stored and a header is retrieved from the stored data. As
a result, response time for communication acquisition can be
minimized.
[0192] E. The detection of a modulation area is performed after
filtering based on shape conditions, such as the elimination of
shapes that are clearly different or too small, being performed.
Therefore, processing load on the light receiving device 3 related
to the searching of an information light source can be reduced.
[0193] While the present invention has been described with
reference to the preferred embodiments, it is intended that the
invention be not limited by any of the details of the description
therein but includes all the embodiments which fall within the
scope of the appended claims.
* * * * *