U.S. patent application number 12/479978 was filed on June 8, 2009, and published on 2009-12-24 as publication number 20090315884, for a method and apparatus for outputting and displaying image data.
This patent application is currently assigned to Samsung Electronics Co., Ltd. The invention is credited to Hyun-kwon Chung, Kil-soo Jung, and Dae-jong Lee.
Publication Number: 20090315884
Application Number: 12/479978
Family ID: 41812276
Publication Date: 2009-12-24
United States Patent Application 20090315884
Kind Code: A1
Inventors: Lee; Dae-jong; et al.
Publication Date: December 24, 2009
METHOD AND APPARATUS FOR OUTPUTTING AND DISPLAYING IMAGE DATA
Abstract
A method of outputting three-dimensional (3D) images, the method
including: generating first-perspective image data and
second-perspective image data to display a 3D image by converting
a same two-dimensional (2D) image data; adding additional
information indicating a relationship between the first-perspective
image data and the second-perspective image data to the
first-perspective image data and/or the second-perspective image
data; and outputting the first-perspective image data and the
second-perspective image data.
Inventors: Lee; Dae-jong (Suwon-si, KR); Chung; Hyun-kwon (Seoul, KR); Jung; Kil-soo (Osan-si, KR)
Correspondence Address: STEIN MCEWEN, LLP, 1400 EYE STREET, NW, SUITE 300, WASHINGTON, DC 20005, US
Assignee: Samsung Electronics Co., Ltd. (Suwon-si, KR)
Family ID: 41812276
Appl. No.: 12/479978
Filed: June 8, 2009
Related U.S. Patent Documents

Application Number: 61/075,184 (filed Jun 24, 2008)
Current U.S. Class: 345/419; 345/545
Current CPC Class: G11B 27/10 (20130101); H04N 13/359 (20180501); G06T 5/009 (20130101); H04N 13/10 (20180501); G06T 5/40 (20130101); G09G 2320/0626 (20130101); H04N 13/194 (20180501); H04N 13/161 (20180501); G11B 27/034 (20130101); H04N 2213/005 (20130101); H04N 19/597 (20141101); G06T 2207/10016 (20130101); H04N 13/139 (20180501); H04N 13/261 (20180501); G06T 9/00 (20130101); H04N 13/286 (20180501); G06T 2207/20208 (20130101); H04N 13/156 (20180501); G06F 3/14 (20130101); H04N 13/341 (20180501); G11B 27/322 (20130101); H04N 13/361 (20180501); H04N 13/178 (20180501); H04N 13/183 (20180501); G06T 15/005 (20130101); H04N 13/339 (20180501); H04N 13/189 (20180501)
Class at Publication: 345/419; 345/545
International Class: G06T 15/20 (20060101); G09G 5/36 (20060101)
Foreign Application Data

Date: Sep 19, 2008
Code: KR
Application Number: 10-2008-0092417
Claims
1. A method of outputting three-dimensional (3D) image data, the
method comprising: generating first-perspective image data and
second-perspective image data to display a 3D image by converting a
same two-dimensional (2D) image data; generating additional
information indicating a relationship between the first-perspective
image data and the second-perspective image data; and outputting
the first-perspective image data, the second-perspective image
data, and the additional information.
2. The method as claimed in claim 1, wherein the additional
information comprises pair information indicating that the
first-perspective image data and the second-perspective image data
are paired image data generated by converting the same 2D image
data.
3. The method as claimed in claim 1, wherein the first-perspective
image data is left-perspective image data, and the
second-perspective image data is right-perspective image data.
4. The method as claimed in claim 3, wherein the additional
information comprises perspective information indicating that the
first-perspective image data is the left-perspective image data,
and/or that the second-perspective image data is the
right-perspective image data.
5. The method as claimed in claim 1, wherein the generating of the
additional information comprises: generating the additional
information; and adding the generated additional information to the
first-perspective image data and/or the second perspective image
data.
6. The method as claimed in claim 1, wherein the generating of the
additional information comprises generating the additional
information as a bit string of the first-perspective image data
and/or the second-perspective image data.
7. A method of displaying image data, the method comprising:
receiving first-perspective image data and second-perspective image
data, which are generated by converting a same two-dimensional (2D)
image data and are to display a three-dimensional (3D) image;
receiving additional information indicating a relationship between
the first-perspective image data and the second-perspective image
data; and displaying the first-perspective image data and the
second-perspective image data based on the additional
information.
8. The method as claimed in claim 7, wherein the additional
information comprises pair information indicating that the
first-perspective image data and the second-perspective image data
are paired image data generated by converting the same 2D image
data.
9. The method as claimed in claim 7, wherein the first-perspective
image data is left-perspective image data, and the
second-perspective image data is right-perspective image data.
10. The method as claimed in claim 9, wherein the additional
information comprises perspective information indicating that the
first-perspective image data is the left-perspective image data,
and/or that the second-perspective image data is the
right-perspective image data.
11. The method as claimed in claim 7, further comprising: storing
the received first-perspective image data and the received
second-perspective image data.
12. The method as claimed in claim 11, wherein the
first-perspective image data is stored in a first buffer, of a
plurality of buffers classified according to a predetermined
standard, and the second-perspective image data is stored in a
second buffer, of the plurality of buffers.
13. The method as claimed in claim 12, wherein the plurality of
buffers comprises: the first buffer corresponding to
left-perspective image data of a first pair of received image data;
the second buffer corresponding to right-perspective image data of
the first pair; a third buffer corresponding to the
left-perspective image data of a second pair of received image
data; and a fourth buffer corresponding to the right-perspective
image data of the second pair.
14. The method as claimed in claim 7, wherein the displaying of the
first-perspective image data and the second-perspective image data
comprises repeatedly displaying the first-perspective image data
and the second-perspective image data for a predetermined number of
times.
15. The method as claimed in claim 14, further comprising: storing
the received first-perspective image data in a first buffer, of a
plurality of buffers classified according to a predetermined
standard, and storing the received second-perspective image data in
a second buffer, of the plurality of buffers; and when the
first-perspective image data and the second-perspective image data
have been repeatedly displayed for the predetermined number of
times, deleting the first-perspective image data and the
second-perspective image data from the first and second buffers,
respectively.
16. The method as claimed in claim 15, wherein the first-perspective
image data is received every 1/N of a second, and the
first-perspective image data is displayed every 1/Y of a second,
where N and Y are integers and N is less than Y.
17. The method as claimed in claim 7, wherein the receiving of the
additional information comprises extracting the additional
information from the first-perspective image data and/or the
second-perspective image data.
18. An image data outputting device comprising: a generating unit
to generate first-perspective image data and second-perspective
image data to display a three-dimensional (3D) image by converting
a same two-dimensional (2D) image data; an adding unit to add
additional information indicating a relationship between the
first-perspective image data and the second-perspective image data
to the first-perspective image data and/or the second-perspective
image data; and an outputting unit to output the first-perspective
image data and the second-perspective image data.
19. The image data outputting device as claimed in claim 18, wherein
the additional information comprises pair information indicating
that the first-perspective image data and the second-perspective
image data are paired image data generated by converting the same
2D image data.
20. The image data outputting device as claimed in claim 18, wherein
the first-perspective image data is left-perspective image data,
and the second-perspective image data is right-perspective image
data.
21. The image data outputting device as claimed in claim 19, wherein
the additional information comprises perspective information
indicating that the first-perspective image data is
left-perspective image data and the second-perspective image data
is right-perspective image data.
22. An image data displaying device comprising: a receiving unit to
receive first-perspective image data and second-perspective image
data, which are generated by converting a same two-dimensional (2D)
image data and are to display a three-dimensional (3D) image, and
to receive additional information indicating a relationship between
the first-perspective image data and the second-perspective image
data; and a displaying unit to display the first-perspective image
data and the second-perspective image data based on the additional
information.
23. The image data displaying device as claimed in claim 22,
wherein the additional information comprises pair information
indicating that the first-perspective image data and the
second-perspective image data are paired image data generated by
converting the same 2D image data.
24. The image data displaying device as claimed in claim 22,
wherein the first-perspective image data is left-perspective image
data, and the second-perspective image data is right-perspective
image data.
25. The image data displaying device as claimed in claim 24,
wherein the additional information comprises perspective
information indicating that the first-perspective image data is the
left-perspective image data, and the second-perspective image data
is the right-perspective image data.
26. The image data displaying device as claimed in claim 22,
further comprising: a storage unit comprising a plurality of
buffers classified according to a predetermined standard; and a
control unit to control the image data displaying device based on
the additional information such that the first-perspective image
data is stored in a first buffer, of the plurality of buffers, and
the second-perspective image data is stored in a second buffer, of
the plurality of buffers.
27. The image data displaying device as claimed in claim 26,
wherein the control unit controls the displaying unit to repeatedly
display the first-perspective image data and the second-perspective
image data for a predetermined number of times.
28. The image data displaying device as claimed in claim 27,
wherein the control unit further controls the image data displaying
device such that the first-perspective image data and the
second-perspective image data are deleted from the first and second
buffers, respectively, in response to the displaying unit
repeatedly displaying the first-perspective image data and the
second-perspective image data for the predetermined number of
times.
29. The image data displaying device as claimed in claim 26,
wherein the receiving unit receives perspective image data every
1/N of a second, and the displaying unit displays the received
perspective image data every 1/Y of a second, where N and Y are
integers and N is less than Y.
30. The image data displaying device as claimed in claim 22,
wherein the receiving unit comprises an extracting unit to extract
the additional information from the first-perspective image data
and/or the second-perspective image data.
31. A computer readable recording medium having recorded thereon a
computer program for executing the method of claim 1 by at least
one computer.
32. A computer readable recording medium having recorded thereon a
computer program for executing the method of claim 7 by at least
one computer.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of Korean Patent
Application No. 10-2008-0092417, filed Sep. 19, 2008, in the Korean
Intellectual Property Office, and the benefit of U.S. Provisional
Patent Application No. 61/075,184, filed Jun. 24, 2008, in the U.S.
Patent and Trademark Office, the disclosures of which are
incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] Aspects of the present invention relate to methods and
apparatuses to output and display image data.
[0004] 2. Description of the Related Art
[0005] Recently, dramatic developments in image processing
technologies have resulted in users demanding increasingly realistic
images. In particular, the number of users who previously could only
watch two-dimensional (2D) images on screens, but who now demand
realistic three-dimensional (3D) images, continues to increase. Thus,
technologies to provide 3D images are in demand.
[0006] According to one method of providing 3D images, an image may
be created as a 3D image from the outset. Specifically, a base image
and a depth image are stored in a storage space, and a user
reproduces 3D images by processing the base image and the depth image
using an image outputting device that supports reproduction of 3D
images. However, images are rarely created as 3D images; in most
cases, images are created as 2D images. Thus, technologies that
reproduce 3D images by processing such 2D images are being used.
SUMMARY OF THE INVENTION
[0007] Aspects of the present invention provide methods and
apparatuses to output and to display three-dimensional (3D)
images.
[0008] According to an aspect of the present invention, there is
provided a method of outputting 3D image data, the method
including: generating first-perspective image data and
second-perspective image data to display a 3D image by converting
a same two-dimensional (2D) image data; generating additional
information indicating a relationship between the first-perspective
image data and the second-perspective image data; and outputting
the first-perspective image data, the second-perspective image
data, and the additional information.
[0009] According to an aspect of the present invention, the
additional information may include pair information indicating that
the first-perspective image data and the second-perspective image
data are paired image data generated by converting the same 2D
image data.
[0010] According to an aspect of the present invention, the
first-perspective image data may be left-perspective image data,
and the second-perspective image data may be right-perspective
image data.
[0011] According to an aspect of the present invention, the
additional information may include perspective information
indicating that the first-perspective image data is the
left-perspective image data, and the second-perspective image data
is the right-perspective image data.
[0012] According to another aspect of the present invention, there
is provided a method of displaying 3D image data, the method
including: receiving first-perspective image data and
second-perspective image data, which are generated by converting a
same 2D image data and are to display a 3D image; receiving
additional information indicating a relationship between the
first-perspective image data and the second-perspective image data;
and displaying the first-perspective image data and the
second-perspective image data based on the additional
information.
[0013] According to an aspect of the present invention, the
additional information may include pair information indicating that
the first-perspective image data and the second-perspective image
data are paired image data generated by converting the same 2D
image data.
[0014] According to an aspect of the present invention, the
first-perspective image data may be left-perspective image data,
and the second-perspective image data may be right-perspective
image data.
[0015] According to an aspect of the present invention, the
additional information may include perspective information
indicating that the first-perspective image data is the
left-perspective image data, and the second-perspective image data
is the right-perspective image data.
[0016] According to an aspect of the present invention, the
first-perspective image data may be stored in a first buffer, of a
plurality of buffers classified according to a predetermined
standard, and the second-perspective image data may be stored in a
second buffer, of the plurality of buffers.
[0017] According to an aspect of the present invention, the
first-perspective image data and the second-perspective image data
may be repeatedly displayed for a predetermined number of
times.
[0018] According to an aspect of the present invention, when the
first-perspective image data and the second-perspective image data
have been repeatedly displayed for the predetermined number of
times, the first-perspective image data and the second-perspective
image data may be deleted from the first and second buffers,
respectively.
[0019] According to another aspect of the present invention, there
is provided an image data outputting device including: a generating
unit to generate first-perspective image data and
second-perspective image data to display a 3D image by converting a
same 2D image data; an adding unit to add additional information
indicating a relationship between the first-perspective image data
and the second-perspective image data to the first-perspective
image data and/or the second-perspective image data; and an
outputting unit to output the first-perspective image data and the
second-perspective image data.
[0020] According to another aspect of the present invention, there
is provided an image data displaying device including: a receiving
unit to receive first-perspective image data and second-perspective
image data, which are generated by converting a same 2D image data
and are to display a 3D image, and to receive additional
information indicating a relationship between the first-perspective
image data and the second-perspective image data; and a displaying
unit to display the first-perspective image data and the
second-perspective image data based on the additional
information.
[0021] According to yet another aspect of the present invention,
there is provided a method of outputting three-dimensional (3D)
image data, the method including: generating first-perspective
image data and second-perspective image data to display a 3D image
by converting a same two-dimensional (2D) image data; generating
additional information indicating a relationship between the
first-perspective image data and the second-perspective image data;
and displaying the first-perspective image data and the
second-perspective image data based on the additional
information.
[0022] According to still another aspect of the present invention,
there is provided an image data outputting device including: a
generating unit to generate first-perspective image data and
second-perspective image data to display a three-dimensional (3D)
image by converting a same two-dimensional (2D) image data; and a
displaying unit to display the first-perspective image data and the
second-perspective image data based on additional information
indicating a relationship between the first-perspective image data
and the second-perspective image data.
[0023] According to another aspect of the present invention, there
is provided an image data outputting system, including: an image
data outputting device including: a generating unit to generate
first-perspective image data and second-perspective image data to
display a three-dimensional (3D) image by converting a same
two-dimensional (2D) image data, an adding unit to add additional
information indicating a relationship between the first-perspective
image data and the second-perspective image data to the
first-perspective image data and/or the second-perspective image
data, and an outputting unit to output the first-perspective image
data and the second-perspective image data; and an image data
displaying device including: a receiving unit to receive the
first-perspective image data, the second-perspective image data,
and the additional information, and a displaying unit to display
the first-perspective image data and the second-perspective image
data based on the additional information.
[0024] According to another aspect of the present invention, there
is provided a method of outputting three-dimensional (3D) image
data, the method including: generating additional information
indicating a relationship between first-perspective image data and
second-perspective image data that are generated from a same
two-dimensional (2D) image data to display a 3D image.
[0025] Additional aspects and/or advantages of the invention will
be set forth in part in the description which follows and, in part,
will be obvious from the description, or may be learned by practice
of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] These and/or other aspects and advantages of the invention
will become apparent and more readily appreciated from the
following description of the embodiments, taken in conjunction with
the accompanying drawings of which:
[0027] FIG. 1 is a block diagram of an image data outputting device
according to an embodiment of the present invention;
[0028] FIG. 2A is a diagram showing a detailed configuration of the
image data outputting device shown in FIG. 1;
[0029] FIG. 2B is a diagram showing an example wherein an image
data outputting unit according to an embodiment of the present
invention outputs 3D image data to which additional information is
added;
[0030] FIG. 2C is a diagram showing another example wherein the
image data outputting unit outputs 3D image data to which
additional information is added;
[0031] FIG. 3 is a block diagram of the image data displaying
device according to an embodiment of the present invention;
[0032] FIG. 4 is a block diagram of an image data displaying device
according to another embodiment of the present invention;
[0033] FIG. 5 is a diagram showing an example of displaying 3D
image data by using the image data displaying device according to
an embodiment of the present invention;
[0034] FIGS. 6A to 6C are diagrams of a storage unit within the
image data displaying device according to an embodiment of the
present invention;
[0035] FIG. 7 is a flowchart of a method of outputting image data
according to an embodiment of the present invention; and
[0036] FIG. 8 is a flowchart of a method of displaying image data
according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0037] Reference will now be made in detail to the present
embodiments of the present invention, examples of which are
illustrated in the accompanying drawings, wherein like reference
numerals refer to like elements throughout. The embodiments are
described below in order to explain the present invention by
referring to the figures.
[0038] FIG. 1 is a block diagram of an image data outputting device
100 according to an embodiment of the present invention. Referring
to FIG. 1, the image data outputting device 100 includes a
generating unit 110, an adding unit 120, and an outputting unit
130. While not required, the image data outputting device 100 can
be a transmission device, a data recorder that records the data on
a recording medium, a workstation, a desktop computer, a notebook
computer, etc., and can be implemented using one or more computers
and/or processors whose functions are executed using
software/firmware.
[0039] The generating unit 110 generates three-dimensional (3D)
image data by converting two-dimensional (2D) image data into 3D
image data. Here, 2D image data is image data used to reproduce 2D
images, whereas 3D image data is image data used to reproduce 3D
images. Specifically, when reproducing a stereoscopic image, the
generating unit 110 converts 2D image data and generates
left-perspective image data and right-perspective image data.
According to the present embodiment, the left-perspective image
data is image data for a left eye of a viewer, and the
right-perspective image data is image data for a right eye of the
viewer. When reproducing a multi-perspective image in three or more
perspectives, the generating unit 110 converts 2D image data and
generates three or more pieces of image data in different
perspectives. For convenience of explanation, it will be assumed
that a stereoscopic image is being reproduced and descriptions
thereof will focus on left-perspective image data, though it is
understood that aspects of the present invention are not limited
thereto. That is, aspects of the present invention, as described
below, can also be applied to right-perspective image data, and to
a case where a multi-perspective image is reproduced in three or
more perspectives.
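The patent does not specify the conversion algorithm itself. As a rough, hypothetical sketch only, one common approach synthesizes left- and right-perspective images from the same 2D frame by shifting pixels horizontally in opposite directions to create parallax; the function names and the fixed disparity below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical illustration only: synthesize a stereo pair from one 2D
# frame by shifting pixel rows horizontally in opposite directions.
# Real 2D-to-3D converters typically use per-pixel depth estimates; a
# constant disparity is used here purely to keep the sketch short.

def shift_row(row, dx):
    """Shift one row of pixels by dx, repeating the edge pixel at the border."""
    if dx == 0:
        return list(row)
    if dx > 0:
        return [row[0]] * dx + list(row[:-dx])
    return list(row[-dx:]) + [row[-1]] * (-dx)

def make_stereo_pair(frame, disparity=1):
    """Return (left, right) views of `frame` (a list of pixel rows),
    shifted in opposite directions to create horizontal parallax."""
    left = [shift_row(row, disparity) for row in frame]
    right = [shift_row(row, -disparity) for row in frame]
    return left, right
```

The same idea extends to multi-perspective output by generating additional views at further disparities.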
[0040] The adding unit 120 adds additional information to at least
one piece of generated 3D image data. The adding unit 120 may add
additional information to each piece of generated 3D image data.
Here, additional information is information indicating
relationships among 3D image data generated by the generating unit
110 converting the same 2D image data, and may be in any format.
For example, pair information and/or perspective information may be
included in additional information. Pair information is information
indicating that left-perspective image data and right-perspective
image data are a pair when the left-perspective image data and the
right-perspective image data are generated by the generating unit
110 converting the same 2D image data. Perspective information is
information indicating whether an item of the generated 3D image
data is left-perspective image data or right-perspective image
data. The adding unit 120 may record the pair information and/or
the perspective information to a plurality of bit strings. For
example, the adding unit 120 may record the pair information in
upper 4-bits of 8-bits existing in a predetermined region of 3D
image data, and/or may record the perspective information in lower
4-bits of the 8-bits, though it is understood that aspects of the
present invention are not limited thereto. For example, the adding
unit 120 may record the pair information in the lower 4-bits of
8-bits existing in a predetermined region of 3D image data, and/or
may record the perspective information in the upper 4-bits of the
8-bits.
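The nibble layout described above can be sketched as follows; this is a minimal illustration in which the function names are hypothetical, and the pair-information-upper/perspective-information-lower arrangement is just the first of the two layouts mentioned in the text.

```python
# Sketch of the 8-bit additional-information layout: pair information
# in the upper 4 bits, perspective information in the lower 4 bits.

def pack_additional_info(pair_info, perspective_info):
    """Pack two 4-bit values into one byte: pair information in the
    upper nibble, perspective information in the lower nibble."""
    assert 0 <= pair_info <= 0xF and 0 <= perspective_info <= 0xF
    return (pair_info << 4) | perspective_info

def unpack_additional_info(byte):
    """Recover (pair_info, perspective_info) from a packed byte."""
    return (byte >> 4) & 0xF, byte & 0xF
```

Swapping the two nibbles, as in the alternative layout the paragraph mentions, only changes which shift is applied to which field.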
[0041] Pair information is information to indicate 2D image data
used to generate 3D image data, and a plurality of pieces of 3D
image data generated by converting the same 2D image data may have
the same pair information. Types of pair information may vary
according to various embodiments. For example, according to an
embodiment of the present invention, pair information may be
classified into 16 types according to sequences of 2D image data
used to generate 3D image data. In this case, pair information of
3D image data generated by the generating unit 110 converting first
2D image data may have a value "0000," and pair information of 3D
image data generated by the generating unit 110 converting second
2D image data may have a value "0001." In this regard, pair
information of 3D image data generated by the generating unit 110
converting sixteenth 2D image data may have a value "1111," and
pair information of 3D image data generated by the generating unit
110 converting seventeenth 2D image data may have a value "0000"
again, though it is understood that other embodiments are not
limited thereto. For example, according to another embodiment of
the present invention, the pair information may be classified into
two types depending on whether 2D image data used to generate 3D
image data are even-numbered 2D image data or odd-numbered 2D image
data. In this case, 3D image data generated by the generating unit
110 converting even-numbered 2D image data may have a value "0000,"
and 3D image data generated by the generating unit 110 converting
odd-numbered 2D image data may have a value "1111."
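The 16-type scheme above amounts to taking the source frame's sequence number modulo 16. A minimal sketch, assuming frames are counted from 1 as in the text (the function name is hypothetical):

```python
# Sketch of the 16-type pair-information scheme: the 4-bit value cycles
# with the sequence number of the source 2D image data, so the first
# frame maps to "0000", the sixteenth to "1111", and the seventeenth
# wraps back to "0000".

def pair_info_16(sequence_number):
    """sequence_number counts source 2D frames starting from 1."""
    return format((sequence_number - 1) % 16, "04b")
```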
[0042] Perspective information may vary according to a number of
pairs of 3D image data to be generated by the generating unit 110
converting 2D image data. If stereoscopic images are to be
reproduced, perspective information may be limited to two types.
For example, perspective information of left-perspective image data
may have a value "0000," and perspective information of
right-perspective image data may have a value "1111." Thus,
additional information regarding left-perspective image data
generated by the generating unit 110 converting even-numbered 2D
image data may include a bit string having a value "00000000," and
additional information regarding right-perspective image data
generated by the generating unit 110 converting the same 2D image
data may include a bit string having a value "00001111." In this
regard, additional information regarding left-perspective image
data generated by the generating unit 110 converting odd-numbered
2D image data may include a bit string having a value "11110000,"
and additional information regarding right-perspective image data
generated by the generating unit 110 converting the same 2D image
data may include a bit string having a value "11111111."
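Combining the even/odd pair scheme with the perspective values above yields exactly the four bit strings just listed. A small sketch, assuming "even/odd" refers to the index of the source 2D frame (the function name and indexing convention are assumptions):

```python
# Sketch of the even/odd scheme combined with perspective information:
# pair information is "0000" for even-numbered source 2D frames and
# "1111" for odd-numbered ones; perspective information is "0000" for
# left and "1111" for right. The additional information concatenates
# the two 4-bit strings.

def additional_info_bits(frame_index, perspective):
    pair = "0000" if frame_index % 2 == 0 else "1111"
    view = "0000" if perspective == "left" else "1111"
    return pair + view
```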
[0043] The outputting unit 130 transmits the 3D image data, to
which the additional information is added by the adding unit 120,
to an image data displaying device 300 described below.
[0044] In the embodiment described above with reference to FIG. 1,
the generating unit 110 in the image data outputting device 100
generates 3D image data by converting 2D image data and generates
additional information indicating relationships among the 3D image
data. However, according to another embodiment of the present
invention, the generating unit 110 may not generate the additional
information. Instead, the generating unit 110 or the adding unit
120 may obtain additional information generated in advance and/or
externally and transmit the additional information to the image
data displaying device 300 together with 3D image data.
[0045] According to an embodiment of the present invention, the
image data outputting device 100 may further include a receiving
unit (not shown). The receiving unit receives metadata including
additional information from an external server via a network system
and/or from a broadcasting server that provides a cable
broadcasting service via a cable network, though it is understood
that aspects of the present invention are not limited thereto. For
example, according to other aspects, the receiving unit may receive
the metadata from an external storage device (such as a server) via
any wired and/or wireless connection (such as USB, Bluetooth,
Infrared, etc.). According to another embodiment of the present
invention, the image data outputting device 100 may further include
a reading unit (not shown). In this case, 2D image data and/or 3D
image data generated by the generating unit 110 converting 2D image
data is recorded in a recording medium. Furthermore, metadata
including additional information may also be recorded on the
recording medium. The reading unit may read the recording medium,
obtain metadata therefrom, and transmit the metadata to the image
data displaying device 300 together with the 3D image data.
[0046] FIG. 2A is a diagram showing a detailed configuration of the
image data outputting device 100 shown in FIG. 1. Referring to FIG.
2A, a decoder 210 decodes input 2D image data. The decoded 2D image
data is transmitted to the generating unit 110. The generating unit
110 generates a 3D image data pair, which is to be used to generate
3D images, by converting the input 2D image data. The adding unit
120 adds additional information that indicates relationships among
a plurality of 3D image data generated by the generating unit 110
converting the same 2D image data, to at least one of the 3D image
data. As described above, the additional information may include
pair information and/or perspective information.
[0047] According to the present embodiment, the adding unit 120
includes a perspective determining unit 122 and an information
recording unit 124. The perspective determining unit 122 determines
perspectives of a generated 3D image data pair. That is, the
perspective determining unit 122 determines which of the 3D image data
items is left-perspective image data and which is right-perspective
image data.
[0048] The information recording unit 124 adds pair information,
which indicates that the left-perspective image data and the
right-perspective image data belong to a 3D image data pair generated
by using the same 2D image data, and adds perspective information to
each item of the image data. Pair information and perspective information may be
generated as metadata. However, aspects of the present invention
are not limited thereto. For example, the pair information and/or
the perspective information may be added by being recorded in a
predetermined region within 3D image data.
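The embodiment fixes no concrete format for this metadata; as a purely hypothetical illustration, the pair information and perspective information could be modeled as a small record (the field names here are assumptions, not part of the disclosure):

```python
# Hypothetical metadata record for the pair and perspective information
# added by the information recording unit 124; field names are
# illustrative, since the embodiment fixes no concrete format.

from dataclasses import dataclass

@dataclass
class AdditionalInfo:
    pair_id: int          # same value for both halves of one 3D pair
    perspective: str      # "left" or "right"

left_info = AdditionalInfo(pair_id=0, perspective="left")
right_info = AdditionalInfo(pair_id=0, perspective="right")

# Two frames belong to one 3D image data pair when their pair
# information matches and their perspective information differs.
assert left_info.pair_id == right_info.pair_id
assert left_info.perspective != right_info.perspective
```

Under this sketch, the outputting unit 130 would emit the record either as separate metadata or embedded in a predetermined region of each image data item.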
[0049] The outputting unit 130 outputs a 3D image data pair to
which the pair information and the perspective information are
added by the information recording unit 124. If the outputting unit
130 outputs 3D image data at a transmitting rate of 60 Hz, it takes
1/30 of a second to completely output a 3D image data pair.
[0050] FIG. 2B is a diagram showing an example wherein an image
data outputting unit according to an embodiment of the present
invention outputs 3D image data to which additional information 231
is added. Referring to FIG. 2B, the additional information 231 is
output as metadata regarding 3D image data 232 and 233. In this
case, an output sequence may vary according to embodiments. The
output sequence shown in FIG. 2B is the additional information 231,
left-perspective image data 232, and right-perspective image data
233.
[0051] FIG. 2C is a diagram showing another example wherein the
image data outputting unit outputs 3D image data to which
additional information is added. Referring to FIG. 2C, additional
pieces of information 242 and 244 are recorded in predetermined
regions of 3D image data 241 and 243, respectively. In other words,
the additional information 242 regarding left-perspective image
data 241 is recorded in the left-perspective image data 241, and
the additional information 244 regarding right-perspective image
data 243 is recorded in the right-perspective image data 243. At
this point, a sequence of outputting 3D image data may vary
according to embodiments. In FIG. 2C, the 3D image data is output
in the sequence of the left-perspective image data 241 followed by
the right-perspective image data 243.
[0052] FIG. 3 is a block diagram of the image data displaying
device 300 according to an embodiment of the present invention.
Referring to FIG. 3, the image data displaying device 300 includes
a receiving unit 310, an extracting unit 320, and a displaying unit
330. While not required, the image data displaying device 300 can
be a workstation, a desktop computer, a notebook computer, a
portable multimedia player, a television, a set-top box, a
reproducing device that reads the image data from a medium, etc.,
and can be implemented using one or more computers and/or
processors whose functions are executed using
software/firmware.
[0053] The receiving unit 310 receives 3D image data from the image
data outputting device 100. As described above, the received 3D
image data includes additional information indicating relationships
among 3D image data generated by converting the same 2D image data.
The additional information includes pair information and/or
perspective information. Pair information is information indicating
that left-perspective image data and right-perspective image data
generated by the generating unit 110 converting the same 2D image
data are a 3D image data pair. Left-perspective image data and
right-perspective image data having the same pair information can
be determined as a 3D image data pair. Perspective information is
information indicating whether 3D image data is left-perspective
image data or right-perspective image data.
[0054] The extracting unit 320 extracts the additional information
from the received 3D image data. The displaying unit 330 displays
the 3D image data based on the additional information. More
particularly, whether to display 3D image data is determined based
on the pair information and the perspective information. If it is
determined to display the 3D image data, a 3D image data display
sequence is determined, and the displaying unit 330 displays the 3D
image data according to the determined sequence. An example of
displaying 3D image data will be described in detail later with
reference to FIG. 5.
[0055] FIG. 4 is a block diagram of an image data displaying device
400 according to another embodiment of the present invention.
Referring to FIG. 4, the image data displaying device 400 includes
a receiving unit 410, an extracting unit 420, a control unit 430, a
storage unit 440, and a displaying unit 450. Here, operations of
the receiving unit 410, the extracting unit 420, and the displaying
unit 450 are similar to those of the receiving unit 310, the
extracting unit 320, and the displaying unit 330 shown in FIG. 3,
and thus detailed descriptions thereof will be omitted here.
[0056] The control unit 430 controls the image data displaying
device 400 to store, to delete, and/or to display 3D image data,
based on additional information. The storage unit 440 includes a
plurality of buffers 442 through 448, which are defined according
to a predetermined standard. The predetermined standard may vary
according to embodiments. For example, the buffers 442 through 448
may be defined based on the perspective information and/or the pair
information that are added to the 3D image data. The storage unit
440, which is controlled by the control unit 430, stores 3D image
data in one or more of the plurality of buffers, transmits stored
3D image data to the displaying unit 450, and/or deletes stored 3D
image data.
[0057] Hereinafter, operations of the image data displaying device
400 will be described. As described above, the receiving unit 410
receives 3D image data from the image data outputting device 100.
The received 3D image data is image data generated by the
generating unit 110 converting 2D image data, and may include
additional information. The additional information is information
indicating relationships among 3D image data generated by the
generating unit 110 converting the same 2D image data, and may
include pair information and/or perspective information.
[0058] As an example, it is assumed that the pair information and
the perspective information are indicated by using 8-bit strings
within the 3D image data, wherein the upper 4 bits of an 8-bit string
indicate the pair information and the lower 4 bits of the 8-bit string
indicate the perspective information. The pair information may
indicate whether the 3D image data is generated by the generating
unit 110 converting even-numbered 2D image data or odd-numbered 2D
image data. For example, if a value of the upper 4-bits is "0000,"
the 3D image data is generated by the generating unit 110
converting even-numbered 2D image data. In contrast, if the value
of the upper 4-bits is "1111," the 3D image data is generated by
the generating unit 110 converting odd-numbered 2D image data. The
perspective information may indicate whether 3D image data is
left-perspective image data or right-perspective image data. For
example, if a value of the lower 4-bits is "0000," the 3D image
data is left-perspective image data. If the value of the lower
4-bits is "1111," the 3D image data is right-perspective image
data.
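The 8-bit layout in the example above can be sketched as follows; the function names are illustrative assumptions, and only the nibble placement (upper 4 bits for pair information, lower 4 bits for perspective information) and the "0000"/"1111" values come from the description:

```python
# Illustrative sketch of the 8-bit additional-information layout
# described above: upper 4 bits = pair information, lower 4 bits =
# perspective information. "0000" and "1111" are the only values used.

PAIR_EVEN, PAIR_ODD = 0b0000, 0b1111      # converted from even-/odd-numbered 2D data
PERSP_LEFT, PERSP_RIGHT = 0b0000, 0b1111  # left-/right-perspective image data

def pack_additional_info(pair_bits: int, perspective_bits: int) -> int:
    """Combine the pair and perspective nibbles into one 8-bit value."""
    return (pair_bits << 4) | perspective_bits

def unpack_additional_info(info: int) -> tuple[int, int]:
    """Split an 8-bit value back into (pair_bits, perspective_bits)."""
    return (info >> 4) & 0x0F, info & 0x0F

# "00000000": left-perspective data converted from even-numbered 2D data.
assert pack_additional_info(PAIR_EVEN, PERSP_LEFT) == 0b00000000
# "11110000": left-perspective data converted from odd-numbered 2D data.
assert pack_additional_info(PAIR_ODD, PERSP_LEFT) == 0b11110000
```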
[0059] The extracting unit 420 extracts the pair information and
the perspective information from the 3D image data. For example, as
described above, if the extracted pair information and perspective
information is "00000000," the 3D image data is left-perspective
image data generated by converting even-numbered 2D image data.
Hereinafter, a 3D image data pair generated by converting
even-numbered 2D image data will be referred to as a first pair, and a
3D image data pair generated by converting odd-numbered 2D image
data will be referred to as a second pair. The control unit 430
selects a buffer in which 3D image data is to be stored, based on
the pair information and the perspective information. The storage
unit 440 comprises four buffers 442, 444, 446, and 448. A first
buffer 442 stores left-perspective image data of the first pair,
and a second buffer 444 stores right-perspective image data of the
first pair. A third buffer 446 stores left-perspective image data
of the second pair, and a fourth buffer 448 stores
right-perspective image data of the second pair. Thus, 3D image
data having additional information indicated by the 8-bit string
"00000000" is stored in the first buffer 442. However, it is
understood that aspects of the present invention are not limited to
the four buffers 442 through 448 classified by the pair information
and the perspective information. According to other aspects, more
or fewer buffers may be provided according to other classification
schemes. For example, only two buffers may be provided, classified
by the pair information or the perspective information.
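The buffer selection described in this paragraph can be sketched as a lookup from the two nibbles of the additional information to a buffer reference numeral; the dictionary-based mapping is an assumed implementation consistent with the four-buffer arrangement of the storage unit 440:

```python
# Illustrative buffer selection for the four-buffer storage unit 440:
# buffer 442 = first-pair left, 444 = first-pair right,
# 446 = second-pair left, 448 = second-pair right.
# The dictionary keys are (pair_bits, perspective_bits) nibbles; the
# mapping itself is an assumption consistent with the description.

BUFFER_FOR = {
    (0b0000, 0b0000): 442,  # first pair (even-numbered 2D source), left
    (0b0000, 0b1111): 444,  # first pair, right
    (0b1111, 0b0000): 446,  # second pair (odd-numbered 2D source), left
    (0b1111, 0b1111): 448,  # second pair, right
}

def select_buffer(info: int) -> int:
    """Pick the buffer for an 8-bit additional-information value."""
    pair_bits, perspective_bits = (info >> 4) & 0x0F, info & 0x0F
    return BUFFER_FOR[(pair_bits, perspective_bits)]

assert select_buffer(0b00000000) == 442  # the "00000000" example above
assert select_buffer(0b11111111) == 448
```

A two-buffer variant, classified by pair information or perspective information alone, would simply key the dictionary on a single nibble.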
[0060] Thereafter, the control unit 430 controls the storage unit
440 such that the 3D image data stored in the first buffer 442 is
transmitted to the displaying unit 450. The displaying unit 450 may
display the 3D image data in a sequence of displaying
left-perspective image data of a pair and successively displaying
right-perspective image data of the same pair or vice versa.
[0061] The control unit 430 controls the display unit 450 and the
storage unit 440 such that the 3D image data is repeatedly
displayed a predetermined number of times and is deleted from a
buffer 442, 444, 446, or 448. Displaying and buffering the 3D image
data will be described in detail later with reference to FIGS. 5,
6A, and 6B. The displaying unit 450 displays the transmitted 3D
image data.
[0062] A conventional image data outputting device generates 3D
image data by converting 2D image data and transmits the 3D image
data to an image data displaying device without adding additional
information, such as pair information and/or perspective
information. An image data displaying device alternately displays
left-perspective image data and right-perspective image data in a
sequence that the 3D image data is received. Here, a user watches
left-perspective image data via his or her left eye and watches
right-perspective image data via his or her right eye by using an
auxiliary device to watch 3D images (e.g., 3D glasses or
goggles).
[0063] Even a conventional method enables a user to watch 3D images
in the case where 3D image data is sequentially transmitted and
displayed. However, in the case where 3D image data is not
sequentially transmitted or some pieces of 3D image data are not
transmitted and/or displayed due, for example, to a power failure,
a user cannot watch the 3D images. For example, it is assumed that,
while a user is watching left-perspective image data via his or her
left eye, power supplied to an image data displaying device
temporarily fails, and left-perspective image data is received
again when power supply to the image data displaying device is
recovered. In this case, 3D image data does not include perspective
information, and thus the image data displaying device cannot
determine whether the newly received 3D image data is
left-perspective image data or right-perspective image data.
Therefore, without separate synchronization, the image data
displaying device may determine left-perspective image data as
right-perspective image data. In this case, a user watches
left-perspective image data via his or her right eye and
right-perspective image data via his or her left eye, and thus the
3D effect cannot be sufficiently experienced. However, the image
data displaying device 300, 400 according to aspects of the present
invention uses perspective information included in 3D image data
such that left-perspective image data is viewed via the left eye
and right-perspective image data is viewed via the right eye. Thus,
a user can watch 3D images without distortion.
[0064] Furthermore, pair information is not included in
conventional 3D image data. Thus, the image data displaying device
300, 400 may not be able to display both left-perspective image
data and right-perspective image data, and the image data
displaying device 400 may only partially display a 3D image data
pair and then display a next pair. In this case, the 3D effect
cannot be sufficiently experienced by a user due to overlapping of
images. However, the image data displaying device 300, 400
according to aspects of the present invention guarantees the
display of both left-perspective image data and right-perspective
image data by using pair information included in 3D image data.
Thus, a user can watch natural 3D images.
[0065] FIG. 5 is a diagram showing an example of displaying 3D
image data by using the image data displaying device 400 according
to an embodiment of the present invention. Referring to FIG. 5, 3D
image data output from the image data outputting device 100
according to an embodiment of the present invention are shown in
chronological order in the lower part of FIG. 5. Here, if the image
data outputting device 100 outputs 3D image data at a rate of 60
Hz, the unit of time is set to 1/60 of a second.
[0066] 3D image data according to aspects of the present invention
includes pair information and perspective information. In the
present example, pair information is indicated by an upper bit
between two bits included in the 3D image data, and perspective
information is indicated by a lower bit. Specifically, 3D image
data where a bit indicating pair information (i.e., the upper bit)
is 0 is generated by using even-numbered 2D image data, whereas 3D
image data where a bit indicating pair information is 1 is
generated by using odd-numbered 2D image data. Furthermore, 3D
image data where a bit indicating perspective information (i.e.,
the lower bit) is 0 is left-perspective image data, whereas 3D
image data where a bit indicating perspective information is 1 is
right-perspective image data.
[0067] At 1/60 of a second, 3D image data 501 is output. In the 3D
image data 501, both a bit indicating pair information and a bit
indicating perspective information are 0. Thus, it is clear that
the 3D image data 501 is left-perspective image data generated by
using even-numbered 2D image data. At 2/60 of a second, 3D image
data 502 is output. In the 3D image data 502, a bit indicating pair
information is 0, and a bit indicating perspective information is
1. Thus, it is clear that the 3D image data 502 is
right-perspective image data generated by using even-numbered 2D
image data. Furthermore, it is clear that the 3D image data 501 and
the 3D image data 502 are paired image data, as the 3D image data
501 and the 3D image data 502 have the same pair information, have
different perspective information, and are transmitted
successively.
[0068] At 3/60 of a second, 3D image data 503 is output. In the 3D
image data 503, a bit indicating pair information is 1, and a bit
indicating perspective information is 0. Therefore, it is clear
that the 3D image data 503 is left-perspective image data generated
by using odd-numbered 2D image data. Furthermore, since pair
information of the 3D image data 502 and 503 are different from
each other, it is clear that the 3D image data 502 and the 3D image
data 503 are not paired image data.
[0069] 3D image data displayed by the image data displaying device
300, 400 according to aspects of the present invention are shown in
chronological order in the upper part of FIG. 5. At 1/60 of a
second, the 3D image data 501 is received from the image data
outputting device 100, and the received 3D image data 501 is stored
in the storage unit 440. At 2/60 of a second, the 3D image data 502
is received from the image data outputting device 100, and the
received 3D image data 502 is stored in the storage unit 440. At
3/60 of a second, the 3D image data 503 is received from the image
data outputting device 100, and the received 3D image data 503 is
stored in the storage unit 440. At this point, the image data
displaying device 400 displays stored left-perspective image data
(that is, the 3D image data 501).
[0070] At 3.5/60 of a second, no 3D image data is received from the
image data outputting device 100, as the 3D image data in the
present embodiment is received every 1/60 of a second. Here, the
image data displaying device 400 displays stored right-perspective
image data (that is, the 3D image data 502). In the prior art, the
previously received image data must continue to be displayed at a
point of time when no new image data is received from the image data
outputting device 100. Therefore, a 3D image data pair is displayed every 1/30
of a second. However, stored 3D image data is used in the present
embodiment, and thus an image data pair can be displayed every 1/60
of a second.
[0071] At 4/60 of a second, 3D image data 504 is received from the
image data outputting device 100, and the received 3D image data
504 is stored in the storage unit 440. At this point, the image
data displaying device 400 displays the 3D image data 501. Since
the 3D image data 501 is repeatedly displayed twice, the 3D image
data 501 is deleted from the storage unit 440. At 4.5/60 of a
second, no 3D image data is received from the image data outputting
device 100. At this point, the image data displaying device 400
displays the 3D image data 502. Since the 3D image data 502 is
repeatedly displayed twice, the 3D image data 502 is deleted from
the storage unit 440.
[0072] In the prior art, when a conventional image data outputting
device outputs image data every 1/60 of a second, the image data
displaying device 400 displays one image data every 1/60 of a
second. Accordingly, in the case of displaying 2D image data, it
takes 1/60 of a second to display a scene. However, in the case of
displaying 3D image data, it takes 2/60 of a second to display a
scene, because a user can watch a scene after two pieces of 3D
image data are displayed. However, images appear more natural to a
user when a scene is displayed every 1/60 of a second. Thus, 3D
images displayed by the conventional image data displaying device
do not seem natural to a user, compared to those displayed
according to aspects of the present invention. That is, the image
data displaying device 300, 400 according to aspects of the present
invention classifies and stores received 3D image data by using a
plurality of buffers, and repeatedly displays 3D image data a
predetermined number of times by using stored 3D image data. Thus,
a 3D image data pair can be displayed every 1/60 of a second.
Therefore, a user can watch natural 3D images.
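The timing claim above, that repeating each stored frame at half-period yields a complete pair every 1/60 of a second, can be checked with a toy model; the slot-based representation and the four-slot display delay are assumptions taken from the FIG. 5 walkthrough, not from a disclosed implementation:

```python
# Toy model of the FIG. 5 display schedule: each 1/120-of-a-second slot
# shows one frame, each left/right pair is alternated `repeats` times,
# and display lags reception by `delay_slots` half-periods (frame 501,
# received at 1/60 of a second, is first shown at 3/60, i.e. slot 4
# when slot 0 corresponds to 1/60 of a second).

def display_schedule(pairs, repeats=2, delay_slots=4):
    """List the frame shown at each 1/120-of-a-second slot."""
    slots = [None] * delay_slots               # nothing displayed yet
    for left, right in pairs:
        slots.extend([left, right] * repeats)  # e.g. 501, 502, 501, 502
    return slots

# Frames carry (pair_bit, perspective_bit): (501, 502) then (503, 504).
pairs = [((0, 0), (0, 1)), ((1, 0), (1, 1))]
schedule = display_schedule(pairs)
# Slots 4-7 show 501, 502, 501, 502: the full pair appears in every
# 1/60-of-a-second window, matching the description.
assert schedule[4:8] == [(0, 0), (0, 1), (0, 0), (0, 1)]
assert schedule[8:12] == [(1, 0), (1, 1), (1, 0), (1, 1)]
```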
[0073] FIGS. 6A and 6B are diagrams of the storage unit 440 of the
image data displaying device 400 according to an embodiment of the
present invention. Referring to FIGS. 6A and 6B, the storage unit
440 of the image data displaying device 400 includes the first
buffer 442, the second buffer 444, the third buffer 446, and the
fourth buffer 448.
[0074] The first buffer 442 stores left-perspective image data of
the first pair, and the second buffer 444 stores right-perspective
image data of the first pair. The third buffer 446 stores
left-perspective image data of the second pair, and the fourth
buffer 448 stores right-perspective image data of the second pair.
In FIGS. 6A and 6B, 3D image data where a bit indicating pair
information is 0 is 3D image data of the first pair, and 3D image
data where a bit indicating pair information is 1 is 3D image data
of the second pair.
[0075] Referring to FIGS. 5 and 6A, operations of the storage unit
440 at a time frame between 1/60 of a second and 2.5/60 of a second
will now be described. At 1/60 of a second, the 3D image data 501
is received from the image data outputting device 100, and the
received 3D image data 501 is stored in the storage unit 440. In
the 3D image data 501, a bit indicating pair information is 0, and
a bit indicating perspective information is also 0. Thus, the 3D
image data 501 is stored in the first buffer 442. At 1.5/60 of a
second, no data is received from the image data outputting device
100. At 2/60 of a second, the 3D image data 502 is received from
the image data outputting device 100, and the received 3D image
data 502 is stored in the storage unit 440. In the 3D image data
502, a bit indicating pair information is 0, and a bit indicating
perspective information is 1. Thus, the 3D image data 502 is stored
in the second buffer 444. At 2.5/60 of a second, no data is
received from the image data outputting device 100.
[0076] Referring to FIGS. 5 and 6B, operations of the storage unit
440 at a time frame between 3/60 of a second and 4.5/60 of a second
will now be described. At 3/60 of a second, the 3D image data 503
is received from the image data outputting device 100, and the
received 3D image data 503 is stored in the storage unit 440. In
the 3D image data 503, a bit indicating pair information is 1, and
a bit indicating perspective information is 0. Thus, the 3D image
data 503 is stored in the third buffer 446. At this point, the 3D
image data 501 stored in the first buffer 442 is output to the
displaying unit 450. At 3.5/60 of a second, no data is received
from the image data outputting device 100. At this point, the 3D
image data 502 stored in the second buffer 444 is output to the
displaying unit 450. At 4/60 of a second, the 3D image data 504 is
received from the image data outputting device 100, and the
received 3D image data 504 is stored in the storage unit 440. In
the 3D image data 504, a bit indicating pair information is 1, and
a bit indicating perspective information is 1. Thus, the 3D image
data 504 is stored in the fourth buffer 448. At this point, the 3D
image data 501 stored in the first buffer 442 is output to the
displaying unit 450. Since the 3D image data 501 stored in the
first buffer 442 is displayed twice, the 3D image data 501 is
deleted from the first buffer 442. At 4.5/60 of a second, no data
is received from the image data outputting device 100. At this
point, the 3D image data 502 stored in the second buffer 444 is
output to the displaying unit 450. Since the 3D image data 502
stored in the second buffer 444 is displayed twice, the 3D image
data 502 is deleted from the second buffer 444.
[0077] Referring to FIGS. 5 and 6C, operations of the storage unit
440 at a time frame between 5/60 of a second and 6.5/60 of a second
will now be described. At 5/60 of a second, the 3D image data 505
is received from the image data outputting device 100, and the
received 3D image data 505 is stored in the storage unit 440. In
the 3D image data 505, a bit indicating pair information is 0, and
a bit indicating perspective information is also 0. Thus, the 3D
image data 505 is stored in the first buffer 442. At this point,
the 3D image data 503 stored in the third buffer 446 is output to
the displaying unit 450. At 5.5/60 of a second, no data is
received from the image data outputting device 100. At this point,
the 3D image data 504 stored in the fourth buffer 448 is output to
the displaying unit 450. At 6/60 of a second, the 3D image data 506
is received from the image data outputting device 100, and the
received 3D image data 506 is stored in the storage unit 440. In
the 3D image data 506, a bit indicating pair information is 0, and
a bit indicating perspective information is 1. Thus, the 3D image
data 506 is stored in the second buffer 444. At this point, the 3D
image data 503 stored in the third buffer 446 is output to the
displaying unit 450. Since the 3D image data 503 stored in the
third buffer 446 is displayed twice, the 3D image data 503 is
deleted from the third buffer 446. At 6.5/60 of a second, no data
is received from the image data outputting device 100. At this
point, the 3D image data 504 stored in the fourth buffer 448 is
output to the displaying unit 450. Since the 3D image data 504
stored in the fourth buffer 448 is displayed twice, the 3D image
data 504 is deleted from the fourth buffer 448.
[0078] In FIGS. 6A and 6B, one item of 3D image data 501 or 502 is
displayed every 1/120 of a second by using the stored 3D image data
501 and 502. Thus, a 3D image data pair 501 and 502 is displayed
every 1/60 of a second. According to other embodiments, an image
data outputting rate of the image data outputting device 100 may be
less than or greater than 60 Hz. For example, if the image data
outputting rate is less than 60 Hz, the image data displaying
device 400 may still be controlled to display a 3D image data pair
every 1/60 of a second by adjusting the number of times 3D image
data is repeatedly displayed (i.e., adjusting the number of times
to be greater than two).
[0079] Furthermore, in FIGS. 6A and 6B, both left-perspective image
data 501 and 503 and right-perspective image data 502 and 504 of a
same pair are stored, and stored 3D image data 501 and 502 is
displayed as soon as left-perspective image data 503 of a next pair
is received. However, it is understood that aspects of the present
invention are not limited thereto. For example, according to other
aspects, stored left-perspective image data 501 may be displayed
from when the pair of left-perspective image data 501 and
right-perspective image data 502 is completely received (2/60 of a
second).
[0080] FIG. 7 is a flowchart of a method of outputting image data
according to an embodiment of the present invention. Referring to
FIG. 7, first-perspective image data and second-perspective image
data to display a 3D image are generated in operation S710. The
first-perspective image data and the second-perspective image data
are generated by converting the same 2D image data. Additional
information indicating relationships between the generated
first-perspective image data and the generated second-perspective
image data (operation S710) is generated, and the generated
additional information is added to the first-perspective image data
and/or the second-perspective image data in operation S720. That
is, the additional information may be added to both the
first-perspective image data and the second-perspective image data,
or just one of the first-perspective image data and the
second-perspective image data. The additional information may
include pair information that indicates that the first-perspective
image data and the second-perspective image data are an image data
pair generated by converting the same 2D image data. Furthermore,
the additional information may additionally or alternatively
include perspective information that indicates whether each of the
first-perspective image data and the second-perspective image data
is left-perspective image data or right-perspective image data.
Moreover, the pair information and the perspective information may
be recorded within the first-perspective image data and the
second-perspective image data. For example, the pair information
may be recorded by using the upper 4 bits of an 8-bit string in a
predetermined region of the first-perspective image data and the
second-perspective image data, and the perspective information may
be recorded by using the lower 4 bits of the 8-bit string. The
first-perspective image data and the second-perspective image data,
to which additional information is added, are output in operation
S730.
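A hypothetical end-to-end sketch of operations S710 through S730 follows; the frame representation and the labeled-copy "conversion" are placeholders, since the embodiment does not fix a particular 2D-to-3D conversion method:

```python
# Hypothetical sketch of the outputting method of FIG. 7: generate a
# left/right pair from the same 2D frame (S710), tag both frames with
# the 8-bit additional information (S720), and output them (S730).

def convert_2d_to_3d_pair(frame_2d):
    """Stand-in for operation S710: derive left/right views.

    A real converter would shift or re-project pixels; here the views
    are labeled copies, since the embodiment fixes no method.
    """
    return ({"view": "left", "pixels": frame_2d},
            {"view": "right", "pixels": frame_2d})

def add_additional_info(pair, frame_index):
    """Operation S720: add pair and perspective information."""
    pair_bits = 0b0000 if frame_index % 2 == 0 else 0b1111
    left, right = pair
    left["info"] = (pair_bits << 4) | 0b0000   # lower nibble: left
    right["info"] = (pair_bits << 4) | 0b1111  # lower nibble: right
    return left, right

def output_pair(pair):
    """Operation S730: hand the tagged pair to the transport layer."""
    return list(pair)

left, right = add_additional_info(convert_2d_to_3d_pair([0, 1, 2]),
                                  frame_index=0)
assert left["info"] == 0b00000000 and right["info"] == 0b00001111
```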
[0081] FIG. 8 is a flowchart of a method of displaying image data
according to an embodiment of the present invention. Referring to
FIG. 8, first-perspective image data and second-perspective image
data to display a 3D image are received in operation S810. The
first-perspective image data and the second-perspective image data
are generated by using the same 2D image data. According to aspects
of the present invention, the first-perspective image data and the
second-perspective image data include additional information. The
additional information is information indicating relationships
among 3D image data generated by converting the same 2D image data,
and may include pair information and/or perspective information.
Pair information is information indicating that first-perspective
image data and second-perspective image data, generated by
converting the same 2D image data, are paired image data.
Perspective information is information indicating whether each of
the first-perspective image data and the second-perspective image
data is left-perspective image data or right-perspective image
data. If the first-perspective image data is left-perspective image
data, the second-perspective image data is right-perspective image
data. If the first-perspective image data is right-perspective
image data, the second-perspective image data is left-perspective
image data.
[0082] The additional information is obtained from the
first-perspective image data and the second-perspective image data
in operation S820. The additional information may either be
transmitted after being included in the first-perspective image
data and/or the second-perspective image data, or be separately
transmitted. In the case where the additional information is
separately transmitted, the additional information may be
separately received from an external server or an image data
outputting device 100.
[0083] The first-perspective image data and the second-perspective
image data are displayed based on the additional information in
operation S830. Hereinafter, displaying the first-perspective image
data will be described in detail under an assumption that the
additional information includes pair information and perspective
information. First, the received first-perspective image data is
stored based on the pair information and perspective information.
For example, the first-perspective image data may be stored in a
first buffer 442 in which left-perspective image data of a first
pair is stored, a second buffer 444 in which right-perspective
image data of the first pair is stored, a third buffer 446 in which
left-perspective image data of a second pair is stored, and a
fourth buffer 448 in which right-perspective image data of the
second pair is stored, as illustrated in FIG. 4. When the
first-perspective image data and the second-perspective image data
are both stored in the buffers, the first-perspective image data
and the second-perspective image data are sequentially displayed.
At this point, the first-perspective image data and the
second-perspective image data are repeatedly displayed a
predetermined number of times, and are deleted from the buffers
after being repeatedly displayed the predetermined number of
times.
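The store, display, and delete cycle described above can be sketched as follows; the buffer keyed by pair and perspective bits and the display trigger are assumed details consistent with the description, not a disclosed implementation:

```python
# Minimal sketch (assumed buffer model) of operation S830: store each
# frame by (pair bit, perspective bit), display a pair only once both
# halves are buffered, and delete a frame after it has been displayed
# a predetermined number of times.

def run_display(frames, repeats=2):
    """Return payloads in display order for (pair, persp, payload) frames."""
    buffers = {}      # (pair_bit, perspective_bit) -> payload
    shown = {}        # same key -> times displayed so far
    displayed = []
    for pair_bit, persp_bit, payload in frames:
        key = (pair_bit, persp_bit)
        buffers[key] = payload
        shown[key] = 0
        left, right = (pair_bit, 0), (pair_bit, 1)
        # Once both halves of a pair are buffered, alternate them until
        # each has been shown `repeats` times, then delete both.
        while left in buffers and right in buffers:
            for k in (left, right):
                displayed.append(buffers[k])
                shown[k] += 1
            if shown[left] >= repeats and shown[right] >= repeats:
                del buffers[left], buffers[right]
    return displayed

order = run_display([(0, 0, "501"), (0, 1, "502"),
                     (1, 0, "503"), (1, 1, "504")])
assert order == ["501", "502", "501", "502", "503", "504", "503", "504"]
```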
[0084] While not restricted thereto, aspects of the present
invention can also be written as computer programs and can be
implemented in general-use or specific-use digital computers that
execute the programs using a computer-readable recording medium.
Examples of the computer-readable recording medium include magnetic
storage media (e.g., ROM, floppy disks, hard disks, etc.) and
optical recording media (e.g., CD-ROMs, or DVDs). Aspects of the
present invention may also be realized as a data signal embodied in
a carrier wave and comprising a program readable by a computer and
transmittable over the Internet.
[0085] Although a few embodiments of the present invention have
been shown and described, it would be appreciated by those skilled
in the art that changes may be made in this embodiment without
departing from the principles and spirit of the invention, the
scope of which is defined in the claims and their equivalents.
* * * * *