U.S. patent application number 14/122361, for an image processing apparatus, image processing method, and program, was published by the patent office on 2014-05-08. This patent application is currently assigned to SONY CORPORATION. The applicants listed for this patent are Hanae Higuchi, Daisuke Kurosaki, and Yasumasa Oda. The invention is credited to Hanae Higuchi, Daisuke Kurosaki, and Yasumasa Oda.
Publication Number: 20140125661
Application Number: 14/122361
Family ID: 44785456
Publication Date: 2014-05-08

United States Patent Application 20140125661
Kind Code: A1
Kurosaki; Daisuke; et al.
May 8, 2014
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND
PROGRAM
Abstract
The present disclosure provides a novel and improved image processing apparatus that can easily generate content having three-dimensional images. The image processing apparatus generates a plurality of plane images and sets virtual distances in a depth direction for the plurality of generated plane images, respectively. The image processing apparatus converts the plurality of plane images into a three-dimensional image in which the spatial positions of objects of the plurality of plane images are set, on the basis of the virtual distances set for the plurality of generated plane images, and obtains data of the three-dimensional image. In addition, the image processing apparatus displays an editing screen for generating the plurality of plane images. The editing screen displays the plurality of plane images individually or in an overlapped manner and displays the plane images with tabs provided for the plane images, respectively.
Inventors: Kurosaki; Daisuke (Nagano, JP); Oda; Yasumasa (Nagano, JP); Higuchi; Hanae (Tokyo, JP)

Applicant:
Name | City | State | Country | Type
Kurosaki; Daisuke | Nagano | | JP |
Oda; Yasumasa | Nagano | | JP |
Higuchi; Hanae | Tokyo | | JP |

Assignee: SONY CORPORATION, Tokyo, JP
Family ID: 44785456
Appl. No.: 14/122361
Filed: February 16, 2012
PCT Filed: February 16, 2012
PCT No.: PCT/JP2012/001012
371 Date: November 26, 2013

Current U.S. Class: 345/419
Current CPC Class: G06T 19/00 20130101; G06T 15/20 20130101; H04N 13/183 20180501; H04N 13/261 20180501; G06T 15/00 20130101; G06T 15/10 20130101; G06T 17/20 20130101; G06T 17/00 20130101
Class at Publication: 345/419
International Class: G06T 15/00 20060101 G06T015/00

Foreign Application Data
Date | Code | Application Number
Jun 6, 2011 | JP | 2011-126792
Claims
1. An image processing apparatus comprising: an image generating
unit that generates a plurality of plane images and sets virtual
distances in a depth direction to the plurality of generated plane
images, respectively; a three-dimensional image converting unit
that converts the plurality of plane images into a
three-dimensional image where objects' positions in space in each
of the plurality of plane images are set, based on the virtual
distances set to the plurality of plane images generated by the
image generating unit; a three-dimensional image generating unit
that outputs data of the three-dimensional image converted by the
three-dimensional image converting unit; an editing screen
generating unit that displays the plurality of plane images
generated by the image generating unit individually or in an overlapped manner and generates display data of an editing screen
displayed by providing tabs to the plane images, respectively; and
an input unit that receives an operation to generate or edit images
in the editing screen generated by the editing screen generating
unit.
2. The image processing apparatus according to claim 1, wherein the
editing screen that is generated by the editing screen generating
unit includes the tabs for the individual plane images that display
thumbnail images where the plane images corresponding to the
individual tabs are reduced, and the plane image corresponding to
any tab is displayed on the editing screen by a selection operation
of any tab in the input unit.
3. The image processing apparatus according to claim 2, wherein the
editing screen that is generated by the editing screen generating
unit includes tabs that display thumbnail images where images
showing the virtual distances of the plurality of plane images are
reduced.
4. The image processing apparatus according to claim 1, wherein
distance scales showing setting of the virtual distances of the
plane images and operation positions to operate the virtual
distances of the plane images by the input unit are displayed on
the editing screen that is generated by the editing screen
generating unit.
5. The image processing apparatus according to claim 4, wherein, in
the distance scales, settings of the distances of the plurality of
plane images are distinguished and displayed, and the operation
positions are prepared for every plane image.
6. The image processing apparatus according to claim 4, wherein
displaying to indicate the position of a virtual display surface is
performed in the distance scales.
7. The image processing apparatus according to claim 1, wherein the
three-dimensional image converting unit converts the plurality of
plane images into a three-dimensional image where an image portion
of a lower side from the horizontal line position set to one
specific plane image of the plurality of plane images becomes an inclined
surface gradually changing from the virtual distance.
8. The image processing apparatus according to claim 7, wherein
horizontal line position scales showing setting of the horizontal
line position are displayed on the editing screen generated by the
editing screen generating unit and the horizontal line position
shown by the horizontal line position scales is changed by
receiving an operation in the input unit.
9. The image processing apparatus according to claim 8, wherein the
plane images other than the specific plane image display the
position crossing the inclined surface in the three-dimensional
image, on the editing screen generated by the editing screen
generating unit.
10. The image processing apparatus according to claim 8, wherein
the three-dimensional image converting unit erases an object that
becomes the lower side of the position crossing the inclined
surface in the three-dimensional image, with respect to the plane
images other than the specific plane image.
11. The image processing apparatus according to claim 8, wherein
the image generating unit sets a lower end of a designated object
in the plane images other than the specific plane image to the
position matched with the position crossing the inclined surface in
the three-dimensional image.
12. An image processing apparatus comprising: an image control unit
that controls an image displayed on a display unit; a
three-dimensional image generating unit that generates a
three-dimensional image where the space positions of objects of a
plurality of plane images are set, from the plurality of plane
images having the virtual distances in a depth direction,
respectively; and an input unit that receives an operation from a
user, wherein the image control unit displays thumbnail images in
which the plurality of plane images and overlapped images, in which
the plane images are displayed in an overlapped manner at a
predetermined angle in a depth direction, are reduced, and the
image control unit displays the plane image or the overlapped image
corresponding to the selected thumbnail image in an editable state
along the thumbnail images of the plurality of plane images and the
overlapped images, when the input unit receives a command selecting
the thumbnail image.
13. The image processing apparatus according to claim 12, wherein
the image control unit displays the plane images and the overlapped
images in an overlapped manner and displays the thumbnail images as
tabs of the plurality of plane images and the overlapped images,
respectively.
14. The image processing apparatus according to claim 13, wherein
the image control unit further displays a tab to receive an
addition of the plane images.
15. The image processing apparatus according to claim 12, wherein
the image control unit displays the overlapped image on a front
surface and displays only the tabs of the non-selected plane
images, when the thumbnail image corresponding to the overlapped
image is selected, and the input unit accepts an input to change
the virtual distance of the selected plane image in a depth
direction, among the plane images displayed in the overlapped
image.
16. The image processing apparatus according to claim 12, wherein
the image control unit displays a screen where only an image of the
plane image in the screen is highlighted on a front surface, when
the thumbnail image corresponding to the plane image is
selected.
17. The image processing apparatus according to claim 12, wherein
the image control unit displays a screen where images of the
non-selected plane images in the screen are grayed out on a front
surface, when the thumbnail image corresponding to the plane image
is selected.
18. The image processing apparatus according to claim 12, wherein
the image control unit displays distance scales showing the virtual
distances of the plane images in the depth direction and displays a
depth change operation input unit to operate the virtual distances
of the plane images displayed in an editable state in the depth
direction by the input unit, and the input unit accepts an
operation to generate or edit the images in the plane images and
accepts an operation with respect to the depth change operation
input unit, when the plane images are displayed in the editable
state.
19. The image processing apparatus according to claim 18, wherein
the depth change operation input unit has a button to move the
virtual distances of the plane images displayed in the editable
state in the depth direction to the front side and a button to move
the virtual distances to the inner side.
20. The image processing apparatus according to claim 18, wherein
the depth change operation input unit is an object that corresponds
to the plane image displayed on the distance scales in the editable
state.
21. The image processing apparatus according to claim 12, wherein
the image control unit displays a horizontal line change operation
input unit to operate the horizontal line position set in the plane
image displayed in an editable state by the input unit, and the
input unit accepts an operation with respect to the horizontal line
change operation input unit, when the plane image is displayed in
an editable state.
22. The image processing apparatus according to claim 21, wherein
the image control unit does not display an image of the lower side
of the horizontal line position set in the plane image displayed in
the editable state.
23. The image processing apparatus according to claim 21, wherein
the three-dimensional image generating unit does not reflect an
image of the lower side of the horizontal line position set in each
plane image to a generated three-dimensional image.
24. The image processing apparatus according to claim 13, wherein
the image control unit performs a control operation to reflect the
edit result on the thumbnail image of the overlapped image, when
the plane image corresponding to the selected thumbnail image is
edited.
25. The image processing apparatus according to claim 12, further
comprising: the display unit.
26. An image processing method comprising: generating a plurality
of plane images and setting virtual distances in a depth direction
to the plurality of generated plane images, respectively;
converting the plurality of plane images into a three-dimensional
image where space positions of objects in each of the plurality of
plane images are set, based on the virtual distances set to the
plurality of generated plane images; outputting data of the
converted three-dimensional image; displaying the plurality of
generated plane images individually or to be overlapped and
generating display data of an editing screen displayed by providing
tabs to the plane images, respectively; and accepting an operation
to generate or edit images in the generated editing screen.
27. A program that causes a computer to execute an image process,
the program causing the computer to execute: generating a plurality
of plane images and setting the virtual distances of a depth
direction to the plurality of generated plane images, respectively;
converting the plurality of plane images into a three-dimensional
image where the space positions of objects of the plurality of
plane images are set, on the basis of the virtual distances set to
the plurality of generated plane images; outputting data of the
converted three-dimensional image; displaying the plurality of
generated plane images individually or to be overlapped and
generating display data of an editing screen displayed by providing
tabs to the plane images, respectively; and accepting an operation
to generate or edit images in the generated editing screen.
Description
TECHNICAL FIELD
[0001] The present disclosure relates to an image processing
apparatus, an image processing method, and a program and
particularly, to a technology for generating a three-dimensional
image.
BACKGROUND ART
[0002] In recent years, image display apparatuses that display an image perceived by a user as a three-dimensional image (a so-called 3D image) have been released and are beginning to spread (for example, Patent Literature 1). Apparatuses that can display three-dimensional images are not limited to televisions and other image display apparatuses; some personal computers can also display three-dimensional images.
[0003] Among applications running on personal computers, there are applications that can generate content having three-dimensional images. If such content is generated by these applications and the user views it in a predetermined manner, the user can perceive the images included in the content as three-dimensional images.
CITATION LIST
Patent Literature
[0004] PTL 1
[0005] JP 2010-210712A
SUMMARY
Technical Problem
[0006] However, according to the related art, generating content having three-dimensional images requires setting positional relations using dedicated software. Accordingly, it is difficult for an end user to generate such content.
[0007] In light of the foregoing, it is desirable to provide a novel and improved image processing apparatus, image processing method, and computer program which enable content having three-dimensional images to be generated easily.
[0008] An image processing apparatus of the present disclosure
comprises an image generating unit that generates a plurality of
plane images and sets virtual distances in a depth direction to the
plurality of generated plane images, respectively.
[0009] An image processing apparatus of the present disclosure
comprises a three-dimensional image converting unit that converts
the plurality of plane images into a three-dimensional image where
objects' positions in space in each of the plurality of plane
images are set, based on the virtual distances set to the plurality
of plane images generated by the image generating unit.
[0010] An image processing apparatus of the present disclosure
comprises a three-dimensional image generating unit that outputs
data of the three-dimensional image converted by the
three-dimensional image converting unit.
[0011] An image processing apparatus of the present disclosure
comprises an editing screen generating unit that displays the
plurality of plane images generated by the image generating unit
individually or in an overlapped manner and generates display data
of an editing screen displayed by providing tabs to the plane
images, respectively.
[0012] An image processing apparatus of the present disclosure
comprises an input unit that receives an operation to generate
or edit images in the editing screen generated by the editing
screen generating unit.
[0013] An image processing method of the present disclosure
comprises generating a plurality of plane images and setting
virtual distances in a depth direction to the plurality of
generated plane images, respectively; converting the plurality of
plane images into a three-dimensional image where space positions
of objects in each of the plurality of plane images are set, based
on the virtual distances set to the plurality of generated plane
images; outputting data of the converted three-dimensional image;
displaying the plurality of generated plane images individually or
to be overlapped and generating display data of an editing screen
displayed by providing tabs to the plane images, respectively; and
accepting an operation to generate or edit images in the generated
editing screen.
[0014] A program of the present disclosure causes a computer to execute an image process, the program causing the computer to
execute: generating a plurality of plane images and setting the
virtual distances of a depth direction to the plurality of
generated plane images, respectively; converting the plurality of
plane images into a three-dimensional image where the space
positions of objects of the plurality of plane images are set, on
the basis of the virtual distances set to the plurality of
generated plane images; outputting data of the converted
three-dimensional image; displaying the plurality of generated
plane images individually or to be overlapped and generating
display data of an editing screen displayed by providing tabs to
the plane images, respectively; and accepting an operation to
generate or edit images in the generated editing screen.
Solution to Problem
[0015] According to the present disclosure, a plurality of plane
images are appropriately displayed so that a user can generate and
display a desired three-dimensional image through a simple
operation.
Advantageous Effects of Invention
[0016] According to the present disclosure, there are provided an image processing apparatus, an image processing method, and a computer program which are novel and improved and which enable content having three-dimensional images to be generated easily.
BRIEF DESCRIPTION OF DRAWINGS
[0018] FIG. 1 is a block diagram illustrating an example of the
configuration of an image processing apparatus according to an
embodiment of the present disclosure.
[0020] FIG. 2 is a diagram illustrating an example of an editing
screen (example in which all layers are displayed) according to the
embodiment of the present disclosure.
[0022] FIG. 3 is an exemplary diagram illustrating an example of an
image of each layer according to the embodiment of the present
disclosure.
[0024] FIG. 4 is an exemplary diagram illustrating an example of a
depth display screen according to the embodiment of the present
disclosure.
[0026] FIG. 5 is an exemplary diagram illustrating a concept of
converting images of a plurality of layers into three-dimensional
images.
[0028] FIG. 6 is an exemplary diagram illustrating a state in which
images of a plurality of layers are converted into
three-dimensional images.
[0030] FIG. 7 is an exemplary diagram illustrating an example in
which images of a plurality of layers are converted into
three-dimensional left and right channel images.
[0032] FIG. 8 is an exemplary diagram illustrating an example of
three-dimensional images that are generated from images of a
plurality of layers.
[0034] FIG. 9 is an exemplary diagram illustrating a converted
state of the case in which there is a layer in which the front side
of a virtual display surface is set to a virtual position.
[0036] FIG. 10 is an exemplary diagram illustrating an example of a
three-dimensional image of the case where there is a layer in which
the front side of a virtual display surface is set to a virtual
position.
[0038] FIG. 11 is a flowchart illustrating an example of the flow
of an editing screen display process according to the embodiment of
the present disclosure.
[0040] FIG. 12 is an exemplary diagram illustrating an example of
an editing screen (example in which a third layer is displayed)
according to the embodiment of the present disclosure.
[0042] FIG. 13 is an exemplary diagram illustrating an example of
an editing screen (example in which a second layer is displayed)
according to the embodiment of the present disclosure.
[0044] FIG. 14 is an exemplary diagram illustrating an example of
an editing screen (example in which a first layer is displayed)
according to the embodiment of the present disclosure.
[0046] FIG. 15 is a flowchart illustrating an example of the flow
of a depth screen display process according to the embodiment of
the present disclosure.
[0048] FIG. 16 is an exemplary diagram illustrating an example
(first example) of displaying a depth screen according to the
embodiment of the present disclosure.
[0050] FIG. 17 is an exemplary diagram illustrating an example
(second example) of displaying a depth screen according to the
embodiment of the present disclosure.
[0052] FIG. 18 is an exemplary diagram illustrating an example
(third example) of displaying a depth screen according to the
embodiment of the present disclosure.
[0054] FIG. 19 is an exemplary diagram illustrating an example
(fourth example) of displaying a depth screen according to the
embodiment of the present disclosure.
[0056] FIG. 20 is a flowchart illustrating an example of the flow
of a horizontal line setting process according to the embodiment of
the present disclosure.
[0058] FIG. 21 is an exemplary diagram illustrating an example of a
horizontal line setting screen according to the embodiment of the
present disclosure.
[0060] FIG. 22 is an exemplary diagram illustrating an example
(first example) of a display screen of a specific layer at the time
of setting a horizontal line according to the embodiment of the
present disclosure.
[0062] FIG. 23 is an exemplary diagram illustrating an example
(second example) of a display screen of a specific layer at the
time of setting a horizontal line according to the embodiment of
the present disclosure.
[0064] FIG. 24 is an exemplary diagram illustrating an example
(third example) of a display screen of a specific layer at the time
of setting a horizontal line according to the embodiment of the
present disclosure.
[0066] FIG. 25 is an exemplary diagram illustrating an example
(fourth example) of a display screen of a specific layer at the
time of setting a horizontal line according to the embodiment of
the present disclosure.
[0068] FIG. 26 is an exemplary diagram illustrating an example of a
display screen at the time of capturing a camera image according to
the embodiment of the present disclosure.
[0070] FIG. 27 is an exemplary diagram illustrating a display
example of a three-dimensional image where camera images are
synthesized according to the embodiment of the present
disclosure.
[0072] FIG. 28 is an exemplary diagram illustrating an example of a
display screen at the time of importing an image file according to
the embodiment of the present disclosure.
[0074] FIG. 29 is an exemplary diagram illustrating an example of
an importing range setting at the time of importing an image file
according to the embodiment of the present disclosure.
[0076] FIG. 30 is an exemplary diagram illustrating a display
example of a three-dimensional image in which images imported from
an image file are synthesized according to the embodiment of the
present disclosure.
[0078] FIG. 31 is an exemplary diagram illustrating an example of a
list display screen of productions according to the embodiment of
the present disclosure.
[0080] FIG. 32 is a block diagram illustrating an example of
detailed hardware configuration of an image processing apparatus
according to the embodiment of the present disclosure.
DESCRIPTION OF EMBODIMENTS
[0081] Hereinafter, preferred embodiments of the present disclosure
will be described in detail with reference to the appended
drawings. Note that, in this specification and the appended
drawings, structural elements that have substantially the same
function and structure are denoted with the same reference
numerals, and repeated explanation of these structural elements is
omitted.
[0082] Hereinafter, preferred embodiments of the present disclosure
will be described in detail with reference to the appended
drawings, in the following order.
1. Embodiment of the Present Disclosure
[0083] 1-1. Configuration of an image processing apparatus (FIG.
1)
[0084] 1-2. Outline of a three-dimensional image (FIGS. 3 to
10)
[0085] 1-3. Display example of an editing screen (FIGS. 2 and FIGS.
11 to 14)
[0086] 1-4. Display example of a depth adjustment screen (FIGS. 15
to 19)
[0087] 1-5. Display example of a ground surface setting screen
(FIGS. 20 to 25)
[0088] 1-6. Example where a camera image is captured (FIGS. 26 and
27)
[0089] 1-7. Example where an image file is imported (FIGS. 28 to
30)
[0090] 1-8. Display example of a generated image list (FIG. 31)
[0091] 1-9. Specific example of hardware configuration (FIG.
32)
1. Embodiment of the Present Disclosure
[0092] 1-1. The functional configuration of an image processing
apparatus
[0093] First, the configuration of an image processing apparatus
according to an embodiment (hereinafter, called "this example") of
the present disclosure will be described. FIG. 1 is a diagram
illustrating the configuration of an image processing apparatus 100
according to this example.
[0094] An image processing apparatus 100 according to this example
illustrated in FIG. 1 is configured to generate an image by a user
operation and display and store the generated image. As illustrated
in FIG. 1, the image processing apparatus 100 includes an image
generating/processing unit 110, an image storage unit 120, an input
unit 130, and an image display unit 140.
[0095] The image generating/processing unit 110 provides the user
with an image generation screen through the image display unit 140
or generates a three-dimensional image from the image generated by
the user. As illustrated in FIG. 1, the image generating/processing
unit 110 that is included in the image processing apparatus 100
according to this example includes an image generating unit 112, a
three-dimensional image converting unit 114, a three-dimensional
image generating unit 116, and an editing screen generating unit
118. The editing screen generating unit 118 also functions as an
image control unit that controls image generation in each unit.
Alternatively, the image generating/processing unit 110 may include
an image control unit that controls image generation in each unit,
separately from the editing screen generating unit 118.
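As a purely illustrative sketch of the unit composition just described (Python is used only for illustration; the class names simply mirror the reference numerals and none of this code appears in the disclosure), the image generating/processing unit 110 can be pictured as an object that aggregates the four units:

    class ImageGeneratingUnit:                      # unit 112: generates the plane images (layers)
        pass

    class ThreeDimensionalImageConvertingUnit:      # unit 114: coordinate conversion into a 3D image
        pass

    class ThreeDimensionalImageGeneratingUnit:      # unit 116: outputs the converted 3D image data
        pass

    class EditingScreenGeneratingUnit:              # unit 118: generates editing screen display data;
        pass                                        # here it also acts as the image control unit

    class ImageGeneratingProcessingUnit:            # unit 110
        def __init__(self):
            self.image_generating_unit = ImageGeneratingUnit()
            self.converting_unit = ThreeDimensionalImageConvertingUnit()
            self.generating_unit = ThreeDimensionalImageGeneratingUnit()
            self.editing_screen_unit = EditingScreenGeneratingUnit()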
[0096] The image generating/processing unit 110 generates a
plurality of plane images (for example, three plane images) on the
basis of a user operation in a state in which an editing screen for
image generation is displayed on the image display unit 140 and
generates a three-dimensional image from the plurality of generated
plane images (two-dimensional images). The editing screen for the
image generation is a screen illustrated in FIG. 2 and displays an
intermediate image in generation processing at the center and
displays operation buttons or tabs around the image. The editing
screen illustrated in FIG. 2 will be described below in detail.
Information of the depth from the reference position (for example,
virtual display surface) is set to each of the plurality of
generated plane images and the three-dimensional image is generated
on the basis of the depth.
[0097] In addition, the image generating/processing unit 110
supplies image data of the three-dimensional image generated by the
image generating/processing unit 110 to the image display unit 140
and the image display unit 140 displays the three-dimensional
image. The user views the three-dimensional image using a predetermined method (for example, while wearing time-division-driven shutter glasses) and thereby perceives the image displayed on the image display unit 140 as a three-dimensional image.
[0098] The image generating unit 112 displays the editing screen
for the image generation on the image display unit 140 and
generates an image by the user operation. If the image generating
unit 112 generates images including a plurality of layers using the
image generation screen provided by the image generating unit 112,
the images that include the plurality of layers are converted into
a three-dimensional image by the three-dimensional image converting
unit 114 and the three-dimensional image generating unit 116. The
images including the plurality of layers that are generated by the
image generating unit 112 are stored in the image storage unit 120
according to the user operation.
[0099] The three-dimensional image converting unit 114 executes a
conversion process for displaying the images including the
plurality of layers transmitted from the image generating unit 112
as the three-dimensional image on the image display unit 140. The
image processing apparatus 100 according to this example assumes in advance the distance between the user's eyes and the distance between the user and the display surface, and executes a conversion
process for displaying the image as the three-dimensional image on
the image display unit 140, on the basis of the virtual distance
between the layers (information of the depth of each layer of the
images). Specifically, in order for the three-dimensional image to
be generated by performing a coordinate conversion on the images
including the plurality of layers, the three-dimensional image
converting unit 114 executes a coordinate conversion process with
respect to the images including the plurality of layers.
[0100] At the time of the conversion process in the
three-dimensional image converting unit 114, if the user adjusts
the depth of each layer of the images while the three-dimensional
image is displayed on the image display unit 140 and changes a
depth state, the three-dimensional image converting unit 114
executes the conversion process in real time according to the
change. Thereby, the user can adjust the depth of each layer of the
images and confirm the three-dimensional image after the adjustment
through the display on the editing screen in real time. An example
of a process for adjusting the depth of each layer of the images
will be described in detail below.
[0101] When the image processing apparatus 100 generates the
three-dimensional image from the planar images including the
plurality of layers generated by the user, the image processing
apparatus 100 executes preview display of the three-dimensional
image. By the preview display of the three-dimensional image, the user can check in advance how the images will appear in three dimensions before storing the generated image as the three-dimensional image.
[0102] The three-dimensional image generating unit 116 generates
the three-dimensional image from the images including the plurality
of layers, on the basis of the conversion process executed by the
three-dimensional image converting unit 114.
[0103] The three-dimensional image generated by the
three-dimensional image generating unit 116 is displayed on the
image display unit 140 and is stored in the image storage unit 120
according to the operation of the input unit 130 from the user.
[0104] The editing screen generating unit 118 generates display
data of the editing screen, on the basis of a reception state of an
input operation in the input unit 130. The editing screen
generating unit 118 supplies the display data generated by the
editing screen generating unit 118 to the image display unit 140
and displays the editing screen.
[0105] The image storage unit 120 stores the images including the
plurality of layers generated by the image generating/processing
unit 110 or the three-dimensional image generated by converting the
images including the plurality of layers. The images stored in the
image storage unit 120 are read from the image storage unit 120
according to the operation of the input unit 130 by the user, are
processed by the image generating/processing unit 110, and are
displayed by the image display unit 140.
[0106] The input unit 130 includes various input devices that
execute an input operation with respect to the image processing
apparatus 100 by the user, for example, a keyboard, a mouse, a
graphic tablet, and a touch panel. The user can generate the images
including the plurality of layers by operating the input unit 130
and adjust the depth of each layer of the images when the images
are converted into the three-dimensional image.
[0107] The image display unit 140 is a display that displays an
image. For example, the image display unit 140 displays the images
including the plurality of layers generated by the image
generating/processing unit 110 or the three-dimensional image
generated by converting the images including the plurality of
layers. The image display unit 140 displays a screen to allow the
images to be generated by the user of the image processing
apparatus 100. An example of the display screen will be described
below. A touch panel may be disposed on an image display surface of
the image display unit 140 and the user may directly operate
buttons in a displayed image. The touch panel that is included in
the image display unit 140 functions as a part of the input unit
130. The image display unit 140 may be a device that is separated
from the image processing apparatus 100.
[0108] The image display unit 140 may be configured using a display
device that can display the three-dimensional image. A method that
displays the three-dimensional image is not limited to a specific
display method. For example, as the method that displays the
three-dimensional image, a method that switches an image for a
right eye and an image for a left eye at a high speed and displays
the images is known. As a method that transmits the
three-dimensional image to the image display unit 140, a frame sequential method, a side-by-side method, a top-and-bottom method, and the like are known.
[0109] The images that are generated by the image
generating/processing unit 110 may be output to a television receiver or other display devices that are connected to the image
processing apparatus 100 and can display the three-dimensional
image.
[0110] 1-2. Outline of generation of a three-dimensional image
[0111] Next, an outline of generation of the three-dimensional
image by the image processing apparatus 100 according to the
embodiment of the present disclosure will be described with
reference to FIGS. 3 to 10.
[0112] First, as illustrated in FIG. 3, the image processing
apparatus 100 according to this example has plane images 251, 252,
and 253 including three layers of a long-distance view, a
middle-distance view, and a short-distance view generated by the
user. The image processing apparatus 100 synthesizes the plane
images 251, 252, and 253 including the three layers of the
long-distance view, the middle-distance view, and the
short-distance view and the plane images are converted into a
three-dimensional image by the three-dimensional image converting
unit 114. As such, by converting the images including the three
layers into the three-dimensional image, the user can generate the
three-dimensional image without executing a complicated imaging
process.
[0113] In the example of FIG. 3, objects a, b, c, and g, respectively corresponding to a mountain, the sun, a road, and a horizontal line, are drawn in the plane image 251 of the long-distance view, an object d of a tree is drawn in the plane image 252 of the middle-distance view, and objects e and f, respectively corresponding to a dog and an insect, are drawn in the plane image 253 of the short-distance view. The use of three layers is exemplary; any plural number of layers may be used.
[0114] A depth, represented by the distance from the reference position, is set for the image of each layer. FIG. 4 is a diagram illustrating a setting situation of the depth. The setting state illustrated in FIG. 4 is reduced and displayed in a 3D state thumbnail tab 202 of the editing screen, which will be described below. When the 3D state thumbnail tab 202 is selected, the setting state illustrated in FIG. 4 is enlarged and displayed on the editing screen.
[0115] The setting state of the depth illustrated in FIG. 4 will be
described. A depth axis 301 is set to a direction orthogonal to a
display screen and the depth positions of the layer images 311,
312, and 313 are set on the depth axis 301. In the example of FIG.
4, the depth axis 301 is illustrated as an oblique axis that has a
predetermined angle.
[0116] An image frame that is illustrated by a broken line in FIG.
4 illustrates a virtual display surface 304 at the position of the
image display surface of the display. The virtual display surface
304 exists at the depth position 301a on the depth axis 301. The
position of the most front side of the depth axis 301 is
illustrated as a front edge portion 302.
[0117] In the example of FIG. 4, the first layer image 311 of the
long-distance view exists at the innermost depth position 301b on
the depth axis 301 and the second layer image 312 of the
middle-distance view exists at the depth position 301c near almost
the center on the depth axis 301. The third layer image 313 of the
short-distance view exists at the depth position 301d of the front
side of the virtual display surface 304 on the depth axis 301. In
this example, the layer image 313 of the short-distance view exists at a depth position on the front side of the virtual display surface 304. However, the layer image 313 of the short-distance view may instead be set deeper than the virtual display surface 304 (closer to the long-distance view).
[0118] When the three layer images are prepared in this way, the first layer image 311 may be automatically set, in an initial state, to a predetermined depth position suitable for the long-distance view. Likewise, the second layer image 312 may be automatically set to a predetermined depth position suitable for the middle-distance view, and the third layer image 313 may be automatically set to a predetermined depth position suitable for the short-distance view.
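A minimal data-structure sketch of these per-layer depth settings is given below (Python is used only for illustration; the field names and the numeric default depths are assumptions, not values stated in the disclosure):

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Layer:
        name: str
        depth: float  # virtual distance in the depth direction; 0.0 corresponds to the
                      # virtual display surface, positive values are deeper (inner side),
                      # and negative values are on the front side of the surface

    def default_layers() -> List[Layer]:
        # Hypothetical initial depths automatically assigned to the three prepared layers.
        return [
            Layer("long-distance view", depth=3.0),    # first layer image 311 (innermost)
            Layer("middle-distance view", depth=1.5),  # second layer image 312
            Layer("short-distance view", depth=-0.5),  # third layer image 313, in front of the surface
        ]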
[0119] As illustrated in FIG. 4, the three-dimensional image is
generated such that the objects a to g in the layer images 311,
312, and 313 are disposed at the depth positions of the layer
images in a virtual three-dimensional space.
[0120] However, when the position of a horizontal line c is set to the horizontal line position 321 with respect to the layer image 311 of the long-distance view, the image portion below the horizontal line position 321 of the layer image 311 of the long-distance view is disposed to be inclined in the three-dimensional space so as to protrude to the front side. That is, the image portion below the horizontal line c becomes an inclined surface 303 that gradually protrudes to the front side, from the depth position 301b of the layer image 311 toward the front edge portion 302, as the image portion progresses downward. In the example of FIG. 4, the object g of the road that is drawn below the horizontal line position 321 in the layer image 311 is disposed on the inclined surface 303.
[0121] In the example of FIG. 4, the inclined surface 303 protrudes to the front edge portion 302 on the front side of the third layer image 313. However, the inclined surface 303 may instead protrude only to the virtual display surface 304. In that case, the frontmost layer (the third layer image 313) protrudes further to the front than the front end point of the inclined surface 303.
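One way to read the inclined surface described above is as a linear interpolation of depth against vertical pixel position: at the horizontal line the depth equals the layer's own depth, and at the bottom edge of the image it reaches the front edge portion (or the virtual display surface). The following sketch only illustrates that reading; the function and parameter names are assumptions and do not appear in the disclosure.

    def inclined_depth(y, horizon_y, image_height, layer_depth, front_depth):
        """Depth assigned to pixel row y of the long-distance layer after a horizon is set.

        y            -- vertical pixel position (0 = top of the image, increasing downward)
        horizon_y    -- vertical position of the horizontal line (horizon)
        layer_depth  -- depth set for the layer itself (depth position 301b)
        front_depth  -- depth of the front edge portion 302, or of the virtual display
                        surface 304 if the incline is made to stop there
        """
        if y <= horizon_y or image_height <= horizon_y:
            return layer_depth          # above the horizon the layer keeps its own depth
        t = (y - horizon_y) / (image_height - horizon_y)
        # Below the horizon the depth changes gradually toward the front as y increases.
        return layer_depth + t * (front_depth - layer_depth)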
[0122] As such, when setting of the horizontal line is performed
with respect to the first layer image 311 of the long-distance view
and setting matched with the horizontal line is performed with
respect to the objects in the other layer images 312 and 313, each
object is automatically appropriately disposed on the inclined
surface 303. Alternatively, a process for erasing inappropriate
portions of the objects is executed. A specific example of the
setting of the horizontal line and the process associated with the
horizontal line will be described below.
[0123] Next, an example of a process for converting the images into
the three-dimensional image in the three-dimensional image
converting unit 114 will be described.
[0124] FIG. 5 is a diagram illustrating a concept of converting
normal two-dimensional images including a plurality of layers into
a three-dimensional image. FIG. 5 illustrates an aspect where
two-dimensional images including a plurality of layers (plane
images 251, 252, and 253) are converted into an image 250R for a
right eye which the user views using the right eye and an image
250L for a left eye which the user views using the left eye.
However, in FIG. 5, the process of the horizontal line described
above is not illustrated. The three-dimensional image converting
unit 114 calculates the drawing positions of the image for the
right eye and the image for the left eye to generate the image 250R
for the right eye and the image 250L for the left eye from the
two-dimensional images.
[0125] Next, an example of a specific method to calculate the
drawing positions of the image for the right eye and the image for
the left eye will be described.
[0126] FIGS. 6 to 8 are exemplary diagrams illustrating an example
where normal two-dimensional images including a plurality of layers
are converted into a three-dimensional image. FIG. 6 illustrates a
coordinate conversion when an image for a right eye and an image
for a left eye are generated from two-dimensional images including
three layers illustrated in FIG. 7 and illustrates a coordinate
conversion for the case where the three layers are deeper than the display surface, as illustrated in FIG. 8. FIG. 6 schematically illustrates
a state in which each layer and the display surface are viewed from
the top side.
[0127] As illustrated in FIG. 6, a distance E between the left eye and the right eye and a virtual viewing distance L are assumed in advance. The three-dimensional image converting unit 114 executes a
projection coordinate conversion for the image for the right eye
and a projection coordinate conversion for the image for the left
eye, using the layer depths D1, D2, and D3 between the virtual
display surface 259 and the layers. With the projection coordinate
conversion, the pixel positions of the objects of each layer image
in the virtual display surface 259 are calculated.
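Under the stated assumptions (inter-eye distance E, virtual viewing distance L, and a layer depth D measured from the virtual display surface, positive toward the back), the projection coordinate conversion reduces to a similar-triangles calculation for each eye. The sketch below illustrates one common formulation of such a conversion; the names are assumptions and the code is not taken from the disclosure. Horizontal positions are measured from the point on the display surface directly in front of the midpoint between the eyes.

    from typing import Tuple

    def project_to_display(x: float, depth: float,
                           eye_distance: float, viewing_distance: float) -> Tuple[float, float]:
        """Project a point at horizontal position x on a layer onto the virtual display surface.

        depth            -- D: distance of the layer behind the virtual display surface
                            (negative when the layer is on the front side of the surface)
        eye_distance     -- E: assumed distance between the left eye and the right eye
        viewing_distance -- L: assumed distance between the viewer and the display surface
        Returns (x_left, x_right): the projected positions in the left-eye and right-eye images.
        """
        scale = viewing_distance / (viewing_distance + depth)
        shift = (eye_distance / 2.0) * depth / (viewing_distance + depth)
        x_left = x * scale - shift
        x_right = x * scale + shift
        # For depth < 0 (a layer on the front side of the display surface) the shift changes
        # sign, so the left/right positions swap relative to a deeper layer, which is what
        # makes such objects appear to protrude toward the viewer (see FIG. 9).
        return x_left, x_right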
[0128] As such, the three-dimensional image converting unit 114
executes the projection coordinate conversion with respect to the
virtual display surface 259 and the image processing apparatus 100
can convert the normal two-dimensional images including the
plurality of layers into the three-dimensional image.
[0129] The examples of FIGS. 6 to 8 are examples where each layer
is set to the depth deeper than the depth of the virtual display
surface 259. However, each layer may be set to the depth of the
front side of the virtual display surface 259.
[0130] FIGS. 9 and 10 illustrate an example where two-dimensional
images are converted into a three-dimensional image when an image
of the layer of the short-distance view is disposed on the front
side of a virtual display surface 259'. FIG. 9 schematically
illustrates a coordinate conversion when an image for a right eye
and an image for a left eye are generated from two-dimensional
images including three layers illustrated in FIG. 10, in an
upwardly viewed state.
[0131] As illustrated in FIG. 9, for the layer on the front side of the virtual display surface 259', the pixel positions at which its objects are projected onto the virtual display surface 259' for the right-eye image and the left-eye image are reversed left and right compared with the pixel positions for a layer deeper than the virtual display surface 259'.
[0132] In the case of this example, as illustrated in FIG. 10, a
three-dimensional image where the objects e and f disposed on the
layer of the short-distance view are viewed to protrude to the
front side of the display surface is obtained.
[0133] 1-3. Display example of the editing screen
[0134] Next, an image generation process using the editing screen
that is needed to generate the three-dimensional image will be
described.
[0135] The editing screen is the screen illustrated in FIG. 2 that
is displayed on the image display unit 140, on the basis of the
process in the editing screen generating unit 118. A flowchart of
FIG. 11 illustrates a flow of the process in the editing screen
generating unit 118. The description will be given with reference
to FIG. 11. First, it is determined whether the input unit 130
receives a user operation to display the editing screen (step S11).
In this case, when the user operation is not received, a waiting
state is maintained and when the user operation to display the
editing screen is received, the process for displaying the 3D
editing screen illustrated in FIG. 2 is executed (step S12). In
this state, in the 3D editing screen, all of the layers are overlapped and displayed as the intermediate image in generation processing. In addition, multi-layer thumbnails that
correspond to the number of layers are displayed.
[0136] In addition, it is determined whether an operation to select
any one of the displayed multi-layer thumbnails by the input unit
130 is received (step S13). When the operation is not received, a
waiting state is maintained and the image of the editing screen is
not changed.
[0137] In addition, when the operation to select any one of the
layer thumbnails is received, the image of the layer that
corresponds to the selected layer thumbnail is displayed on the
editing screen (step S14). In this case, however, the images of the other layers are also displayed simultaneously with decreased display brightness, so that the composition of the entire three-dimensional image can still be recognized.
[0138] In a state in which the image of the specific layer is
displayed in step S14, it is determined whether an operation to
change the layer to another layer by selecting the layer thumbnail
exists (step S15). In this case, when it is determined that the
operation to change the layer to another layer exists, a change
process of the layer that is displayed on the editing screen is
executed (step S16), the process returns to step S14, and an image
of the layer after the change is displayed.
[0139] When it is determined that the operation to change the layer
to another layer does not exist in step S15, it is determined
whether an operation to change the display to the overlapped
display of all of the layers exists (step S17). In this case, when
it is determined that the operation to change the display to the
overlapped display of all of the layers exists, the display is
changed to the overlapped display of all of the layers and the
process returns to the 3D editing screen display in step S12.
[0140] In step S17, when it is determined that the operation to
change the display to the overlapped display of all of the layers
does not exist, the screen display of the selected layer in step
S14 is continuously executed.
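Read as pseudocode, the flow of FIG. 11 is a simple loop over tab-selection operations. The following sketch mirrors steps S11 to S17 only; the event names and the interfaces of the input_unit and screen objects are assumptions made for illustration.

    def editing_screen_loop(input_unit, screen):
        # Step S11: wait for the user operation that requests the editing screen.
        input_unit.wait_for("open_editing_screen")
        # Step S12: display the 3D editing screen with all layers overlapped,
        # together with the multi-layer thumbnails.
        screen.show_all_layers_overlapped()
        while True:
            event = input_unit.next_event()
            if event.kind == "select_layer_thumbnail":
                # Steps S13 to S16: display the layer corresponding to the selected
                # thumbnail; the other layers are shown with decreased brightness.
                screen.show_layer(event.layer, gray_out_others=True)
            elif event.kind == "show_all_layers":
                # Step S17: return to the overlapped display of all layers (step S12).
                screen.show_all_layers_overlapped()
            # Otherwise the current display is kept (waiting state)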
[0141] Next, a specific display example of the editing screen will
be described with reference to FIGS. 2 and 12 to 14. FIG. 2
illustrates an example of an editing screen in a state in which all
layers of intermediate images in generation processing are
overlapped and displayed. In the editing screen, intermediate image
display 203 in edition processing is performed with a relatively
large image at a center portion. In the example of FIG. 2, an image in which the images 251 to 253 of the three layers illustrated in FIG. 3 are overlapped is displayed as the editing image display 203. In order to illustrate the image including the images
251, 252, and 253 of the three layers, three layer edge portions of
a first layer edge portion 203a, a second layer edge portion 203b,
and a third layer edge portion 203c are overlapped and illustrated
with respect to an upper edge portion of the editing image display
203. In the edge portions 203a to 203c of the three images, the
edge portion (for example, edge portion 203a) of the image that
corresponds to the selected tab is displayed on the front side and
the edge portions of the other images are displayed on the inner
side.
[0142] In addition, plural tabs 201, 202, and 211 to 213 to select
a display image are disposed on the upper side of the intermediate
image display 203 in edition processing in the editing screen. The
tabs 201, 202, and 211 to 213 display the images allocated to the
selected tabs, when a user operation to select the corresponding
tab display places by the touch panel operation or the like
exists.
[0143] The image that is allocated to each tab will be described.
The layer thumbnail tabs 211, 212, and 213 are tabs to individually
display the layers. Specifically, the first layer thumbnail tab 211
is a tab that displays a first layer image, the second layer
thumbnail tab 212 is a tab that displays a second layer image, and
the third layer thumbnail tab 213 is a tab that displays a third
layer image. In the thumbnail tabs 211 to 213, images of the layers
are reduced and displayed as the thumbnail images. Therefore, when
the image of each layer is modified, the thumbnail image is also
modified in the same way.
[0144] The tab 201 that is displayed adjacent to the layer
thumbnail tabs 211, 212, and 213 is a tab that adds the layer
image. That is, when the tab 201 is selected, a layer image is
newly added and an editing screen of the newly added layer image is
displayed.
[0145] In addition, the 3D state thumbnail tab 202 that is disposed
to be closer to the right side of the upper side of the
intermediate image display 203 in edition processing is a tab that
displays a depth state of the image of each layer. When the 3D
state thumbnail tab 202 is selected, the display screen of the
depth state illustrated in FIG. 4 is enlarged and displayed in the
place of the editing image display 203. In the 3D state thumbnail
tab 202, the display screen of the depth state illustrated in FIG.
4 is reduced and displayed. The content of this reduced display in the 3D state thumbnail tab 202 changes according to adjustments of the depth state.
[0146] A peripheral portion of the intermediate image display 203
in edition processing of the editing screen illustrated in FIG. 2
will be described. On the left end of the editing image display
203, a plurality of buttons 221 to 225 are disposed. In this
example, a file importing button 221, a camera capturing button
222, a stamp button 223, a character input button 224, and a depth
operation button 225 are prepared.
[0147] If a user operation to select the file importing button 221
exists, a process for importing a file image that is prepared in
the image storage unit 120 or an external memory starts.
[0148] If a user operation to select the camera capturing button
222 exists, a process for capturing image data from a camera device
connected to the apparatus starts.
[0149] If a user operation to select the stamp button 223 or the character input button 224 exists, a process for inputting prepared figures or characters starts.
[0150] If a user operation to select the depth operation button 225
exists, a depth adjustment screen is displayed on the editing image
display 203. A display process of the depth adjustment screen will
be described below.
[0151] As illustrated in FIG. 2, a generation start button 241, a
save button 242, and a three-dimensional display button 243 are
displayed on the right end of the editing image display 203. The
generation start button 241 is a button to instruct to generate a
new three-dimensional image. The save button 242 is a button to
instruct to store an intermediate image in generation processing in
the image storage unit 120. The three-dimensional display button
243 is a button to switch three-dimensional display and normal
display (2D display) of the image displayed on the editing image
display 203.
[0152] On the lower side of the right end of the editing image
display 203, a pen tool 290 is displayed. The pen tool 290 of this
example displays a first pen 291, a second pen 292, a third pen
293, and an eraser 294. If a user operation to select display
places of the pens 291, 292, and 293 exists, a line can be drawn
with a color or a line type allocated to each pen. If a user
operation to select a display place of the eraser 294 exists, the
drawn line is erased. The pen tool 290 may have a function of
supporting other drawing or erasure.
[0153] As illustrated in FIG. 2, object display units 231 to 237
are provided on a lower end of the editing image display 203. In
the plurality of object display units 231 to 237, for example,
seven newest objects that are used by the user are displayed. In
the example of FIG. 2, each object of the intermediate image in
generation processing is illustrated.
[0154] FIG. 2 illustrates an example where images of the individual
layers are overlapped and displayed in the editing image display
203. Even when the other images are displayed in the editing image
display 203, the same display is performed around the editing image
display 203. In the editing screen of FIG. 2, a tab that selects
overlapped display where the image of each layer illustrated in
FIG. 2 is overlapped is not provided in particular. However, the
tab that selects the overlapped display may be prepared separately
from the tab for the image of each layer. When the tab to select
the overlapped display where the image of each layer is overlapped
is provided, a thumbnail image where an overlapped image of the
image of each layer is reduced and displayed is displayed on the
tab. Therefore, when the image of each layer is modified, the
thumbnail image where the overlapped image is reduced and displayed
is modified according to the change.
[0155] FIG. 12 illustrates an example where the image of the third
layer (image 253 of the short-distance view of FIG. 3) is displayed
in the editing image display 203. The image display of the third
layer is performed by selecting the third layer thumbnail tab 213
by a user operation.
[0156] When the image 253 of the third layer is displayed, the
objects e and f in the image of the third layer are displayed with
set colors or brightness. The objects of the images of the other
layers are overlapped and displayed after decreasing the display
brightness. That is, only the image of the third layer is
highlighted and displayed and the images of the other layers are
grayed out and displayed.
[0157] By performing an operation to add the objects in the images
in the intermediate image display 203 in edition processing or
drawing by the user in a state illustrated in FIG. 12, the image of
the third layer is generated or edited.
[0158] FIG. 13 illustrates an example where the image of the second
layer (image 252 of the middle-distance view of FIG. 3) is
displayed in the editing image display 203. The image display of
the second layer is performed by selecting the second layer
thumbnail tab 212 by a user operation.
[0159] When the image 252 of the second layer is displayed, the
object d in the image of the second layer is displayed with set
colors or brightness. The objects of the images of the other layers
are overlapped and displayed after decreasing the display
brightness. By performing an operation to add the objects in the
images in the intermediate image display 203 in edition processing
or drawing by the user in a state illustrated in FIG. 13, the image
of the second layer is generated or edited.
[0160] FIG. 14 illustrates an example where the image of the first
layer (image 251 of the long-distance view of FIG. 3) is displayed
in the editing image display 203. The image display of the first
layer is performed by selecting the first layer thumbnail tab 211
by a user operation.
[0161] When the image 251 of the first layer is displayed, the objects a, c, and g in the image of the first layer are displayed with their set colors and brightness. At this time, the objects in the images of the other layers are overlapped and displayed with reduced display brightness.
[0162] In the state illustrated in FIG. 14, the user generates or edits the image of the first layer by adding objects to, or drawing in, the image in the intermediate image display 203 in edition processing.
[0163] In the display examples of FIGS. 12 to 14, the image of the selected layer is highlighted and the images of the other layers are grayed out. Alternatively, only the objects of the image of the selected layer may be displayed as the editing image, and the objects of the images of the other layers may be hidden.
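The selected-layer emphasis described in this section can be summarized with a short sketch. The following Python fragment is only an illustrative model, not part of the disclosure; the Layer structure, the pixel representation, and the dimming factor are assumptions.

```python
# Minimal sketch of the layer-emphasis display; all names are hypothetical.
from dataclasses import dataclass
from typing import List, Optional, Tuple

RGB = Tuple[int, int, int]
DIM_FACTOR = 0.35  # assumed brightness reduction applied to non-selected layers

@dataclass
class Layer:
    name: str
    pixels: List[RGB]

def dim(pixel: RGB, factor: float) -> RGB:
    """Reduce the brightness of one pixel (a simple gray-out)."""
    return tuple(int(c * factor) for c in pixel)

def compose_editing_view(layers: List[Layer], selected: int,
                         hide_others: bool = False) -> List[Optional[List[RGB]]]:
    """Return per-layer pixel lists for the editing image display: the selected
    layer keeps its set colors, the other layers are grayed out or hidden."""
    view: List[Optional[List[RGB]]] = []
    for i, layer in enumerate(layers):
        if i == selected:
            view.append(layer.pixels)                       # highlighted layer
        elif hide_others:
            view.append(None)                               # not displayed at all
        else:
            view.append([dim(p, DIM_FACTOR) for p in layer.pixels])  # grayed out
    return view

# Example: highlight the third layer (index 2); the other two layers are dimmed.
layers = [Layer("first", [(200, 200, 200)]),
          Layer("second", [(0, 128, 0)]),
          Layer("third", [(255, 0, 0)])]
print(compose_editing_view(layers, selected=2))
```

The hide_others flag corresponds to the alternative display in which the objects of the non-selected layers are not shown at all.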
[0164] 1-4. Display example of a depth adjustment screen
[0165] Next, a flow of a depth adjustment process of an image of
each layer will be described with reference to a flowchart of FIG.
15.
[0166] The depth adjustment process is started, for example, by a user operation to select the depth operation button 225 of the editing screen illustrated in FIG. 2. That is, as illustrated in FIG. 15, the editing screen generating unit 118 determines whether an operation to select the depth operation button 225 exists while the editing screen is displayed (step S21). When the operation to select the depth operation button 225 does not exist, a waiting state is maintained until the operation occurs.
[0167] In step S21, when it is determined that the operation to select the depth operation button 225 exists, the images of all of the layers are overlapped and displayed as the intermediate image display 203 in edition processing, and a depth bar is displayed on the upper side of the intermediate image display 203 in edition processing (step S22). The depth bar is a scale that shows the depth of each layer. When the three-dimensional image is configured using the images of three layers, as in this example, the depth positions of the three layers are shown on the depth bar. Depth adjustment buttons are displayed on the lower side of the editing image display 203. A specific display example will be described below.
[0168] In this example, an adjustment button to move the depth of the image to the inner side and an adjustment button to move the depth of the image to the front side are prepared and displayed for each layer.
[0169] The editing screen generating unit 118 determines whether any adjustment button is operated by the user (step S23). When it is determined that no adjustment button is operated, the display of step S22 is continued. When it is determined that an adjustment button is operated, the image of the layer corresponding to the operated button is displayed as the intermediate image display 203 in edition processing (step S24). The depth position set to the image of the corresponding layer is changed according to the operation of the adjustment button, and the depth position shown on the depth bar is moved to the corresponding position (step S25). After the setting and the display are changed in step S25, the display returns to that of step S22. Alternatively, the intermediate image display 203 in edition processing may show only the operated layer until the next operation occurs.
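The button-handling portion of this flow (steps S23 to S25) can be modeled as a pure function over a sequence of adjustment-button events. The sketch below is only illustrative; the Layer structure, the event representation, and the step size DEPTH_STEP are assumptions not taken from the disclosure.

```python
# Sketch of steps S23 to S25 of FIG. 15 as a deterministic event-processing function.
from dataclasses import dataclass
from typing import List, Tuple

DEPTH_STEP = 5  # assumed depth change per button press, in scale units

@dataclass
class Layer:
    name: str
    depth: int  # 0 = virtual display surface, negative = inner side, positive = front side

def apply_adjustments(layers: List[Layer],
                      events: List[Tuple[int, int]]) -> List[Layer]:
    """Apply a list of (layer_index, direction) button events.

    direction is +1 for the 'move to front side' button and -1 for the
    'move to inner side' button; each event corresponds to one pass
    through steps S23 to S25 of the flowchart."""
    for layer_index, direction in events:
        layers[layer_index].depth += direction * DEPTH_STEP
    return layers

# Example: three layers, press 'inner side' twice on the first (long-distance) layer.
layers = [Layer("first", -60), Layer("second", -30), Layer("third", -10)]
print(apply_adjustments(layers, [(0, -1), (0, -1)])[0].depth)  # -70
```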
[0170] FIG. 16 is a diagram illustrating a display example of the
depth bar.
[0171] When the depth operation button 225 of the editing screen is operated, as illustrated in FIG. 16, the images of all of the layers are overlapped and displayed as the intermediate image display 203 in edition processing, and the depth bar 401 is displayed on the upper side of the editing image display 203.
[0172] In this example, the depth positions of the images of the three layers are shown on one depth bar 401. That is, on the depth bar 401, the depth position 401a of the image of the first layer, the depth position 401b of the image of the second layer, and the depth position 401c of the image of the third layer are shown in different display colors.
[0173] On the scale of the depth bar 401, "0" indicates the depth position of the virtual display surface, the inner side is indicated by a negative value, and the front side is indicated by a positive value (positive values are not illustrated).
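For display purposes, each depth value has to be mapped onto a position along the depth bar 401. The following is a minimal sketch under the assumption of a linear scale; the function name, the depth range, and the bar geometry are illustrative values, not taken from the disclosure.

```python
def depth_to_bar_pos(depth: float, depth_min: float = -100.0,
                     depth_max: float = 25.0, bar_start: int = 0,
                     bar_length: int = 400) -> int:
    """Map a depth value (0 = virtual display surface, negative = inner side,
    positive = front side) to a pixel position along the depth bar."""
    t = (depth - depth_min) / (depth_max - depth_min)
    return bar_start + round(t * bar_length)

print(depth_to_bar_pos(0))    # position of the virtual display surface mark
print(depth_to_bar_pos(-25))  # a layer on the inner side of the display surface
```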
[0174] On the lower side of the editing image display 203, buttons
that adjust the depth position are displayed for the image of each
layer. Specifically, a depth adjustment button 411 that moves the
depth to the front side and a depth adjustment button 412 that
moves the depth to the inner side are displayed as adjustment
buttons for the image of the first layer. A depth adjustment button
421 that moves the depth to the front side and a depth adjustment
button 422 that moves the depth to the inner side are displayed as
adjustment buttons for the image of the second layer. A depth
adjustment button 431 that moves the depth to the front side and a
depth adjustment button 432 that moves the depth to the inner side
are displayed as adjustment buttons for the image of the third
layer.
[0175] When the user performs an operation to select the display place of any of the depth adjustment buttons 411 to 432, the depth setting of the image of the corresponding layer is changed. For example, the depth position of the image of the first layer is changed by operating the depth adjustment buttons 411 and 412, and the display of the depth position 401a of the image of the first layer in the depth bar 401 moves as illustrated by an arrow La.
[0176] In addition, the depth position of the image of the second
layer is changed by the operations of the depth adjustment buttons
421 and 422 and the display of the depth position 401b of the image
of the second layer in the depth bar 401 is moved as illustrated by
an arrow Lb.
[0177] In addition, the depth position of the image of the third
layer is changed by the operations of the depth adjustment buttons
431 and 432 and the display of the depth position 401c of the image
of the third layer in the depth bar 401 is moved as illustrated by
an arrow Lc.
[0178] The adjustment by the depth adjustment buttons 411 to 432 is limited so that a layer cannot be moved past the depth positions of the images of the adjacent layers. For example, the range of the movement La of the image of the first layer extends from the deepest position to the position adjacent to the depth position of the image of the adjacent second layer.
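This movement limit amounts to clamping a requested depth between the depths of the neighboring layers. The sketch below assumes a fixed minimum separation MARGIN and an overall depth range; both values are illustrative and not specified in the disclosure.

```python
from typing import Optional

MARGIN = 1                 # assumed minimum separation between adjacent layers
DEPTH_RANGE = (-100, 25)   # assumed full depth range handled by the apparatus

def clamp_depth(new_depth: float, inner_neighbor: Optional[float],
                front_neighbor: Optional[float]) -> float:
    """Clamp a requested depth so the layer stays between its neighbors.

    inner_neighbor is the depth of the next layer toward the inner side,
    front_neighbor the next layer toward the front; None means there is no
    neighbor on that side."""
    low = DEPTH_RANGE[0] if inner_neighbor is None else inner_neighbor + MARGIN
    high = DEPTH_RANGE[1] if front_neighbor is None else front_neighbor - MARGIN
    return max(low, min(high, new_depth))

# The second layer (depth -30) cannot be pushed past the first layer (-60):
print(clamp_depth(-80, inner_neighbor=-60, front_neighbor=-10))  # -59
```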
[0179] FIG. 17 illustrates another display example of the depth
bar.
[0180] In this example, the image of the selected layer (in this example, the image of the third layer) is highlighted in the editing image display 203. In the depth bar 501, only the depth position 502 of the highlighted layer is illustrated. In the example of FIG. 17, the depth bar 501 shows the entire depth range set by the image processing apparatus 100 together with a position display 503 of the virtual display surface.
[0181] In the example of FIG. 17, only the depth adjustment buttons 511 and 512 for the image of the third layer, which is the selected layer, are displayed as depth adjustment buttons. The depth position of the image of the highlighted layer is adjusted by operating the depth adjustment buttons 511 and 512, and the display position of the depth position 502 on the depth bar 501 is changed as illustrated by an arrow Ld.
[0182] FIG. 18 illustrates another display example of the depth
bar.
[0183] In the example of FIG. 18, similar to the example of FIG. 17, the image of the selected layer (in this example, the image of the second layer) is highlighted in the editing image display 203. The depth bar 601 shows, with a scale, only the range in which the depth of the image of that layer can be adjusted, and the depth position 602 is displayed on the depth bar 601. That is, the depth of the image of the second layer is limited on the inner side by the depth position of the image of the first layer and on the front side by the depth position of the image of the third layer. This limited adjustable range becomes the scale display range of the depth bar 601.
[0184] Therefore, when the depth position is adjusted by the
operations of the depth adjustment buttons 611 and 612, the depth
position 602 moves in the range of the displayed depth bar 601, as
illustrated by an arrow Le.
[0185] FIG. 19 illustrates another display example of the depth
bar.
[0186] In the example of FIG. 19, the images of all of the layers are overlapped and displayed in the intermediate image display 203 in edition processing, and the depth position 702 of the image of the layer whose depth is being adjusted is illustrated on a depth bar 701. In addition, a position 703 of the virtual display surface is indicated. Similar to the example of FIG. 16, the depth adjustment buttons 711, 712, 721, 722, 731, and 732 are provided for the individual layers, and the depth position 702 is changed by adjusting the depth with a button operation, as illustrated by an arrow Lf.
[0187] When the depth of a specific layer is adjusted by operating the depth adjustment buttons 711, 712, 721, 722, 731, and 732, the image of the layer targeted by the depth adjustment is indicated in the editing image display 203 by an image frame 704 marking its four corners.
[0188] Near the center of the image, a display 705 of a numerical value (in this example, "-25") indicating the set depth position is provided.
[0189] In this way, a display from which the depth setting can be recognized may be provided in the displayed image.
[0190] 1-5. Display example of a ground surface setting screen
[0191] Next, a flow of a ground surface setting process of an image
will be described with reference to a flowchart of FIG. 20.
[0192] The ground surface setting process starts when the user operates a button (not illustrated in the drawings) to instruct setting of the ground surface in the editing screen illustrated in FIG. 2. That is, as illustrated in FIG. 20, the editing screen generating unit 118 determines whether an operation to instruct setting of the ground surface exists while the editing screen is displayed (step S31). When the operation to instruct setting of the ground surface does not exist, a waiting state is maintained until the operation occurs.
[0193] In step S31, when it is determined that the operation to instruct setting of the ground surface exists, the images of all of the layers are overlapped and displayed as the intermediate image display 203 in edition processing, and a slider bar for horizontal line adjustment is displayed vertically at one end of the intermediate image display 203 in edition processing (step S32). A slider handle indicating the position of the horizontal line is displayed on the slider bar for horizontal line adjustment, and it is determined whether a drag operation of the slider handle exists (step S33).
[0194] When it is determined that the slider handle has been operated, a process to change the position of the horizontal line is executed according to the operation (step S34). According to the changed position of the horizontal line, the portion of the image of the innermost layer (the first layer) below the horizontal line is set as an inclined surface in the three-dimensional space. The setting of the inclined surface is the process already described with reference to FIG. 4, and the inclined surface corresponds to the ground surface.
[0195] Next, it is determined whether a mode to erase the portion below the ground surface in the images of the layers other than the first layer is set (step S35). When the mode to erase the portion below the ground surface is set, the objects located below the inclined surface (ground surface) in the images of the layers other than the first layer are erased (step S36).
[0196] When it is determined in step S35 that the mode to erase the portion below the ground surface is not set, it is determined whether an object set to be disposed on the ground surface exists among the objects of the images of the individual layers (step S37). When it is determined that such an object exists, the position of the lower end of that object is adjusted to the position where the image of the layer containing the object crosses the inclined surface (step S38).
[0197] When it is determined in step S33 that no drag operation exists, after the processes of steps S36 and S38 are executed, or when it is determined in step S37 that no object set to be disposed on the ground surface exists, the process returns to the horizontal line slider bar display process of step S32.
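The branch of the flowchart from step S34 onward can be sketched as follows. The data structures, the callback that returns the ground-crossing position for a given layer depth, and the example values are assumptions made for illustration only.

```python
# Sketch of steps S34 to S38 of the ground-surface setting flow of FIG. 20.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Obj:
    name: str
    bottom_y: float          # lower end of the object (image y increases downward)
    on_ground: bool = False  # whether the object is set to be disposed on the ground

@dataclass
class PlaneLayer:
    depth: float
    objects: List[Obj] = field(default_factory=list)

def apply_ground_surface(layers: List[PlaneLayer], horizon_y: float,
                         erase_below: bool,
                         ground_y_for_depth: Callable[[float, float], float]) -> None:
    """After the horizon is set (S34): for every layer other than the first,
    either erase objects below the ground surface (S35/S36) or snap objects
    marked as ground objects onto it (S37/S38)."""
    for layer in layers[1:]:
        crossing_y = ground_y_for_depth(layer.depth, horizon_y)
        if erase_below:
            # 'Below the ground surface' means further down the image (larger y).
            layer.objects = [o for o in layer.objects if o.bottom_y <= crossing_y]
        else:
            for o in layer.objects:
                if o.on_ground:
                    o.bottom_y = crossing_y  # S38: align the lower end with the crossing

# Example: snap a tree in the second layer (depth -30) onto the ground line,
# using a toy linear crossing model (horizon at y=300, image 600 pixels tall).
layers = [PlaneLayer(-60), PlaneLayer(-30, [Obj("tree", bottom_y=500, on_ground=True)])]
apply_ground_surface(layers, horizon_y=300, erase_below=False,
                     ground_y_for_depth=lambda d, hy: hy + (d + 60) / 60 * 300)
print(layers[1].objects[0].bottom_y)  # 450.0
```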
[0198] Next, a specific display example at the time of adjusting
the horizontal line will be described.
[0199] FIG. 21 illustrates an example where the horizontal line adjustment bar 261, which is the horizontal line slider bar, is displayed in the intermediate image display 203 in edition processing in the editing screen. On the horizontal line adjustment bar 261, the ground surface setting position display 262 is indicated. In this example, the ground surface setting position display 262 is aligned with the ground line c drawn in the image of the first layer. This alignment is performed by the user operation.
[0200] By setting the horizontal line in this manner, the ground surface below the horizontal line of the image of the first layer is set as the inclined surface illustrated in FIG. 4. The position of the ground surface may also be displayed in the images of the layers other than the first layer (the images of the second layer and the third layer).
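One simple way to model the inclined surface is to interpolate the depth linearly between the first layer's depth at the horizontal line and a front-side depth at the bottom edge of the image. This is only an assumed model given for illustration; the actual surface is the one defined with reference to FIG. 4.

```python
def ground_depth_at(y: float, horizon_y: float, image_height: float,
                    far_depth: float, front_depth: float) -> float:
    """Depth of the inclined ground surface at vertical position y
    (y increases downward and must lie at or below the horizontal line)."""
    if y < horizon_y:
        raise ValueError("the ground surface exists only below the horizontal line")
    t = (y - horizon_y) / (image_height - horizon_y)   # 0 at the horizon, 1 at the bottom
    return far_depth + t * (front_depth - far_depth)

# Depth of the ground half-way between the horizon (depth -60) and the bottom edge (depth 0):
print(ground_depth_at(450, horizon_y=300, image_height=600,
                      far_depth=-60, front_depth=0))   # -30.0
```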
[0201] For example, as illustrated in FIG. 22, in the intermediate image display 203 in edition processing in the editing screen, a ground surface position display 271 indicating where the image 252 of the second layer crosses the ground surface (inclined surface) is shown by a broken line. From this display of the ground surface position, it can be determined whether the arrangement position of the object in the image 252 of the layer (in this example, the object d of a tree) is appropriate. That is, when the lower end of the object d of the tree is almost aligned with the ground surface position display 271 shown by the broken line, an appropriate three-dimensional image is obtained. Meanwhile, when the lower end of the object d of the tree is above the ground surface position display 271 shown by the broken line, the tree appears to float, and when the lower end of the object d is below the ground surface position display 271, the tree appears to sink into the ground surface. In both cases, an unnatural three-dimensional image results. As illustrated in FIG. 22, the ground surface position display 271 is therefore effective in preventing the image from becoming an unnatural three-dimensional image.
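Under the same assumed linear model introduced above, the vertical position indicated by the ground surface position display 271 can be obtained by inverting the interpolation, that is, by finding the height at which the ground surface reaches the depth of the layer. The function below is an illustrative sketch, not part of the disclosure.

```python
def crossing_y_for_layer(layer_depth: float, horizon_y: float,
                         image_height: float, far_depth: float,
                         front_depth: float) -> float:
    """Vertical position at which the inclined ground surface reaches layer_depth,
    i.e. where the layer's plane crosses the ground surface."""
    t = (layer_depth - far_depth) / (front_depth - far_depth)
    return horizon_y + t * (image_height - horizon_y)

# The second layer at depth -30 crosses the ground half-way down the slope:
print(crossing_y_for_layer(-30, horizon_y=300, image_height=600,
                           far_depth=-60, front_depth=0))   # 450.0
```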
[0202] FIG. 23 illustrates another display example of the position of the ground surface. In the example of FIG. 23, only the portion above the position crossing the ground surface is displayed as the image of the second layer, and the portion below the ground surface is shown as a non-display portion 272 (black display portion). In FIG. 23, the image of the third layer on the front side of the second layer is displayed, for example, with decreased brightness. Alternatively, the objects e and f in the image of the third layer may be hidden.
[0203] FIG. 24 illustrates an example where, in the layers other than the first layer (the long-distance view), the objects below the ground surface, that is, the inclined surface, are erased, and the result is shown as the intermediate image display 203 in edition processing in the editing screen. The process of FIG. 24 corresponds to the process in step S36 of the flowchart of FIG. 20.
[0204] In the example of FIG. 24, the lower part of the object e (a dog) of the third layer, which is the short-distance view, falls below the ground surface. At this time, the object e is displayed with the part below the ground surface erased.
[0205] By not displaying the portion of an object that falls below the ground surface in this way, an unnatural display in which an object appears below the ground surface when the generated image is viewed three-dimensionally can be prevented. The partial erasure of the object illustrated in FIG. 24 may be executed only in the three-dimensional image display, and the object may be displayed completely when the image of each layer is displayed individually.
[0206] FIG. 25 illustrates an example of an operation screen, shown as the intermediate image display 203 in edition processing in the editing screen, for matching the lower end of an object in the image of a layer with the ground surface, that is, the inclined surface. The process of FIG. 25 corresponds to the process in step S38 of the flowchart of FIG. 20.
[0207] In the example of FIG. 25, an operation screen for the object e (the dog) of the third layer, which is the short-distance view, is illustrated. In this example, as illustrated in FIG. 25, position movement buttons 281 and 282, a returning button 283, an erasure button 284, and a ground surface adjustment button 285 are displayed around the object e.
[0208] The user selects these buttons to modify the position of the object e. When the operation to select the ground surface adjustment button 285 exists, the editing screen generating unit 118 executes a process that automatically matches the position of the lower end of the object e with the position where the layer crosses the ground surface.
[0209] Therefore, by selecting the ground surface adjustment button 285, the object e is automatically placed on the ground surface, and an appropriate three-dimensional image can be generated.
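The automatic matching triggered by the ground surface adjustment button 285 reduces to moving the object's lower end onto the crossing position of its layer. The sketch below assumes image coordinates that increase downward and a known crossing position; the EditObject structure and the helper names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EditObject:
    name: str
    bottom_y: float  # lower end of the object in image coordinates

def placement(obj: EditObject, crossing_y: float) -> str:
    """Report whether the object floats above, sinks below, or sits on the ground."""
    if obj.bottom_y < crossing_y:
        return "floating"
    if obj.bottom_y > crossing_y:
        return "sunken"
    return "on ground"

def snap_to_ground(obj: EditObject, crossing_y: float) -> None:
    """Match the object's lower end with the position crossing the ground surface."""
    obj.bottom_y = crossing_y

dog = EditObject("e", bottom_y=520.0)
print(placement(dog, crossing_y=450.0))  # "sunken"
snap_to_ground(dog, crossing_y=450.0)
print(placement(dog, crossing_y=450.0))  # "on ground"
```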
[0210] 1-6. Example where a camera image is captured
[0211] FIGS. 26 and 27 illustrate an example where a camera image is taken into an intermediate image in generation processing.
[0212] For example, when an operation to select the camera capturing button 222 in the editing screen illustrated in FIG. 2 exists, a camera capturing operation screen 810 is displayed on the editing screen, as illustrated in FIG. 26. A process for reading a camera image from an external camera device (or a storage device where the camera image is stored) connected to the image processing apparatus 100 is executed by an operation using the camera capturing operation screen 810. As illustrated in FIG. 26, the camera capturing image 811 is displayed by this reading. An extraction image 812, in which the background is removed from the camera capturing image 811, is obtained by an operation in the camera capturing operation screen 810.
[0213] By disposing the extraction image 812 on the image of any layer, the extraction image 812 can be placed as one of the objects in the intermediate image in generation processing, as illustrated in FIG. 27. The depth position of the layer where the extraction image is disposed is selected by the user using the camera capturing operation screen 810. Alternatively, the camera capturing image may be automatically disposed on the front-most layer (the short-distance view).
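A rough sketch of this capture-and-place flow is given below. The Pillow-based, color-key style background removal, the ImageLayer structure, and the assumption that layers are ordered from the innermost to the front-most are all illustrative choices; the disclosure does not specify how the background is removed.

```python
from dataclasses import dataclass, field
from typing import List, Optional

from PIL import Image

@dataclass
class ImageLayer:
    depth: float
    objects: List[Image.Image] = field(default_factory=list)

def extract_foreground(img: Image.Image, bg_color=(255, 255, 255),
                       tolerance: int = 30) -> Image.Image:
    """Make near-background pixels transparent to obtain an extraction image."""
    rgba = img.convert("RGBA")
    pixels = [
        (r, g, b, 0) if all(abs(c - t) <= tolerance for c, t in zip((r, g, b), bg_color))
        else (r, g, b, a)
        for (r, g, b, a) in rgba.getdata()
    ]
    rgba.putdata(pixels)
    return rgba

def place_on_layer(layers: List[ImageLayer], extraction: Image.Image,
                   layer_index: Optional[int] = None) -> None:
    """Place the extraction image as an object on the chosen layer, or on the
    front-most (short-distance view) layer when no layer is chosen.
    Layers are assumed to be ordered from innermost to front-most."""
    target = layers[-1] if layer_index is None else layers[layer_index]
    target.objects.append(extraction)
```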
[0214] 1-7. Example where an image file is imported
[0215] FIGS. 28 to 30 illustrate an example where a file image is imported into an intermediate image in generation processing.
[0216] For example, if an operation to select the file importing
button 221 in the editing screen illustrated in FIG. 2 exists, as
illustrated in FIG. 28, an image file importing operation screen
820 is displayed in the editing screen. A process for reading
selected image data from the image file stored in the designated
place is executed by an operation using the image file importing
operation screen 820. In addition, as illustrated in FIG. 28, an
imported image 821 is displayed by the reading. As illustrated in
FIG. 29, an extraction image 822 that is partially extracted from
the imported image 821 is obtained by an operation in the image
file importing operation screen 820.
[0217] By disposing the extraction image 822 on the image of any
layer, as illustrated in FIG. 30, the extraction image 822 can be
disposed as one of the objects in the generated image. In this
case, the depth position of the layer where the extraction image is
disposed is selected by the user using the image file importing
operation screen 820.
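The file-import counterpart can be sketched in the same way, here assuming a simple rectangular crop as the partial extraction; the actual extraction operation offered by the image file importing operation screen 820 is not specified in this sketch, and the file name in the usage comment is hypothetical.

```python
from PIL import Image

def import_and_extract(path: str, box: tuple) -> Image.Image:
    """Read the selected image file (imported image 821) and return a partial
    extraction (extraction image 822); box is (left, upper, right, lower)."""
    imported = Image.open(path)
    return imported.crop(box)

# Example (hypothetical file name):
# extraction = import_and_extract("drawing.png", (10, 10, 200, 150))
```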
[0218] 1-8. Display example of a list of generated images
[0219] The three-dimensional image that is generated using the
editing screen in the process described above is stored in the
image storage unit 120 of the image processing apparatus 100. A
list of data of the stored three-dimensional images can be
displayed on one screen.
[0220] FIG. 31 illustrates an example where a list of generated images is displayed. In this example, generated images 11, 12, 13, . . . are reduced and displayed. Each image may be displayed as a two-dimensional image or a three-dimensional image.
[0221] With each generated image display, a number-of-layers display 11a, 12a, 13a, . . . indicating the number of layers by a figure is shown. For example, as the figure indicating the number of layers, a figure in which three images are overlapped is displayed when the number of layers is three.
[0222] By displaying the list of generated images, the generated
images can be easily selected. The selected images may be displayed
in the editing screen illustrated in FIG. 2 and editing work may be
performed.
[0223] 1-9. Specific example of hardware configuration
[0224] Next, a specific example of the hardware configuration of
the image processing apparatus 100 according to this example will
be described with reference to FIG. 32.
[0225] FIG. 32 illustrates an example where the image processing
apparatus 100 is configured as an information processing device
such as a computer device.
[0226] The image processing apparatus 100 mainly includes a CPU
901, a ROM 903, a RAM 905, a host bus 907, a bridge 909, an
external bus 911, an interface 913, an input device 915, an output
device 917, an image capturing device 918, a storage device 919, a
drive 921, a connection port 923, and a communication device
925.
[0227] The CPU 901 functions as an operation processing device and
a control device and controls all or a part of operations in the
image processing apparatus 100, according to various programs
stored in the ROM 903, the RAM 905, the storage device 919, and a
removable recording medium 927. The ROM 903 stores programs or
operation parameters that are used by the CPU 901. The RAM 905 temporarily stores programs used in the execution of the CPU 901 and parameters that change as appropriate during the execution. These devices are connected to each other by the host bus 907, which is configured by an internal bus such as a CPU bus.
[0228] The host bus 907 is connected to an external bus 911 such as
a peripheral component interconnect/interface (PCI) bus through the
bridge 909.
[0229] The input device 915 is an operation unit such as a mouse, a keyboard, a touch panel, a button, a switch, or a lever operated by the user. The input device 915 may be a remote control unit (a so-called remote controller) using infrared rays or other radio waves, or an external connection apparatus 929 such as a mobile phone or a PDA that is compatible with the operation of the image processing apparatus 100. The input device 915 is configured using an input control circuit that generates an input signal on the basis of information input by the user with the operation unit and outputs the input signal to the CPU 901. The user of the image processing apparatus 100 can operate the input device 915 to input various data to the image processing apparatus 100 and to instruct the image processing apparatus 100 to execute processing operations.
[0230] The output device 917 is configured using a display device
such as a liquid crystal display device, a plasma display device,
an EL display device, and a lamp, a sound output device such as a
speaker and a headphone, or a device such as a printer device, a
mobile phone, and a facsimile that can visually or audibly notify
the user of acquired information. The output device 917 outputs the
result that is obtained by various processes executed by the image
processing apparatus 100. Specifically, the display device displays
the result obtained by the various processes executed by the image
processing apparatus 100 with a text or an image. Meanwhile, the
sound output device converts an audio signal configured using
reproduced sound data or acoustic data into an analog signal and
outputs the analog signal.
[0231] For example, the image capturing device 918 is provided on
the display device and the image processing apparatus 100 can
capture a still image or a moving image of the user with the image
capturing device 918. The image capturing device 918 includes a
charge coupled device (CCD) image sensor or a complementary metal
oxide semiconductor (CMOS) image sensor and converts light
condensed by a lens into an electric signal and can capture a still
image or a moving image.
[0232] The storage device 919 is a data storage device that is
configured as an example of a storage unit of the image processing
apparatus 100. For example, the storage device 919 is configured
using a magnetic storage device such as a hard disk drive (HDD), a
semiconductor storage device, an optical storage device, or a
magneto-optical storage device. The storage device 919 stores
programs executed by the CPU 901, various data, and acoustic signal
data or image signal data acquired from the outside.
[0233] The drive 921 is a reader/writer for a storage medium and is
incorporated in the image processing apparatus 100 or is attached
to the outside of the image processing apparatus 100. The drive 921
reads information that is recorded in the mounted removable
recording medium 927 such as a magnetic disk, an optical disk, a
magneto-optical disk, or a semiconductor memory and outputs the
information to the RAM 905. The drive 921 can also record information on the mounted removable recording medium 927 such as the magnetic
disk, the optical disk, the magneto-optical disk, or the
semiconductor memory. The removable recording medium 927 is a DVD
medium, a Blu-ray medium, a compact flash (registered trademark)
(CompactFlash: CF), a memory stick, a secure digital (SD) memory
card, or the like. The removable recording medium 927 may be an
integrated circuit (IC) card or an electronic apparatus where an IC
chip of a non-contact type is mounted.
[0234] The connection port 923 is a port to directly connect an external apparatus to the image processing apparatus 100, such as a universal serial bus (USB) port, an IEEE1394 port such as i.Link, a small computer system interface (SCSI) port, an RS-232C port, an optical audio terminal, or a high-definition multimedia interface (HDMI) port. By connecting the external connection apparatus 929 to the connection port 923, the image processing apparatus 100 can acquire acoustic signal data or image signal data directly from the external connection apparatus 929 or provide acoustic signal data or image signal data to the external connection apparatus 929.
[0235] The communication device 925 is a communication interface configured by a communication device for connection with a communication network 931. The communication device 925 is, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth, or wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various kinds of communication. The communication device 925 can transmit and receive signals based on a predetermined protocol such as TCP/IP to and from the Internet or other communication apparatuses. The communication network 931 connected to the communication device 925 is configured by a wired or wireless network. For example, the communication network 931 may be the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
[0236] An example of the hardware configuration that can realize the functions of the image processing apparatus 100 according to this example has been described above. Each of the components may be configured using general-purpose members or may be configured by hardware specialized for the function of that component. Therefore, the hardware configuration used may be changed as appropriate according to the technological level at the time this embodiment is carried out.
[0237] A program (software) that executes the process steps executed by the image processing apparatus 100 according to this example may be generated and deployed in a general-purpose computer device, and the same processes may be executed there. The program may be stored in various media or may be downloaded from a server to the computer device through the Internet.
[0238] It should be understood by those skilled in the art that
various modifications, combinations, sub-combinations and
alterations may occur depending on design requirements and other
factors insofar as they are within the scope of the appended claims
or the equivalents thereof.
[0239] Note that the following configurations are within the scope
of the present disclosure. (1)
[0240] An image processing apparatus comprising: [0241] an image
generating unit that generates a plurality of plane images and sets
virtual distances in a depth direction to the plurality of
generated plane images, respectively; a three-dimensional image
converting unit that converts the plurality of plane images into a
three-dimensional image where objects' positions in space in each
of the plurality of plane images are set, based on the virtual
distances set to the plurality of plane images generated by the
image generating unit; [0242] a three-dimensional image generating
unit that outputs data of the three-dimensional image converted by
the three-dimensional image converting unit; [0243] an editing
screen generating unit that displays the plurality of plane images
generated by the image generating unit individually or in an
overlapped manner and generates display data of an editing screen
displayed by providing tabs to the plane images, respectively; and
[0244] an input unit that receives an operation to generate or edit
images in the editing screen generated by the editing screen
generating unit.
[0245] (2)
[0246] The image processing apparatus according to (1), [0247]
wherein the editing screen that is generated by the editing screen
generating unit includes the tabs for the individual plane images
that display thumbnail images where the plane images corresponding
to the individual tabs are reduced, and the plane image
corresponding to any tab is displayed on the editing screen by a
selection operation of any tab in the input unit.
[0248] (3)
[0249] The image processing apparatus according to (1) or (2),
[0250] wherein the editing screen that is generated by the editing
screen generating unit includes tabs that display thumbnail images
where images showing the virtual distances of the plurality of
plane images are reduced.
[0251] (4)
[0252] The image processing apparatus according to any one of (1)
to (3), [0253] wherein distance scales showing setting of the
virtual distances of the plane images and operation positions to
operate the virtual distances of the plane images by the input unit
are displayed on the editing screen that is generated by the
editing screen generating unit.
[0254] (5)
[0255] The image processing apparatus according to any one of (1)
to (4), [0256] wherein, in the distance scales, settings of the
distances of the plurality of plane images are distinguished and
displayed, and the operation positions are prepared for every plane
image.
[0257] (6)
[0258] The image processing apparatus according to any one of (1)
to (5), [0259] wherein displaying to indicate the position of a
virtual display surface is performed in the distance scales.
[0260] (7)
[0261] The image processing apparatus according to any one of (1)
to (6), [0262] wherein the three-dimensional image converting unit
converts the plurality of plane images into a three-dimensional
image where an image portion of a lower side from the horizontal
line position set to one specific plane image of the plural plane
images becomes an inclined surface gradually changing from the
virtual distance.
[0263] (8)
[0264] The image processing apparatus according to any one of (1)
to (7), [0265] wherein horizontal line position scales showing
setting of the horizontal line position are displayed on the
editing screen generated by the editing screen generating unit and
the horizontal line position shown by the horizontal line position
scales is changed by receiving an operation in the input unit.
[0266] (9)
[0267] The image processing apparatus according to any one of (1)
to (8), [0268] wherein the plane images other than the specific
plane image display the position crossing the inclined surface in
the three-dimensional image, on the editing screen generated by the
editing screen generating unit.
[0269] (10)
[0270] The image processing apparatus according to any one of (1)
to (9), [0271] wherein the three-dimensional image converting unit
erases an object that becomes the lower side of the position
crossing the inclined surface in the three-dimensional image, with
respect to the plane images other than the specific plane
image.
[0272] (11)
[0273] The image processing apparatus according to any one of (1)
to (10), [0274] wherein the image generating unit sets a lower end
of a designated object in the plane images other than the specific
plane image to the position matched with the position crossing the
inclined surface in the three-dimensional image.
[0275] (12)
[0276] An image processing apparatus comprising: [0277] an image
control unit that controls an image displayed on a display unit;
[0278] a three-dimensional image generating unit that generates a
three-dimensional image where the space positions of objects of a
plurality of plane images are set, from the plurality of plane
images having the virtual distances in a depth direction,
respectively; and [0279] an input unit that receives an operation
from a user, [0280] wherein the image control unit displays
thumbnail images in which the plurality of plane images and
overlapped images, in which the plane images are displayed in an
overlapped manner at a predetermined angle in a depth direction,
are reduced, and the image control unit displays the plane image or
the overlapped image corresponding to the selected thumbnail image
in an editable state along the thumbnail images of the plurality of
plane images and the overlapped images, when the input unit
receives a command selecting the thumbnail image.
[0281] (13)
[0282] The image processing apparatus according to (12), [0283]
wherein the image control unit displays the plane images and the
overlapped images in an overlapped manner and displays the
thumbnail images as tabs of the plurality of plane images and the
overlapped images, respectively.
[0284] (14)
[0285] The image processing apparatus according to (12) or (13),
[0286] wherein the image control unit further displays a tab to
receive an addition of the plane images.
[0287] (15)
[0288] The image processing apparatus according to any one of (12)
to (14), [0289] wherein the image control unit displays the
overlapped image on a front surface and displays only the tabs of
the non-selected plane images, when the thumbnail image
corresponding to the overlapped image is selected, and [0290] the
input unit accepts an input to change the virtual distance of the
selected plane image in a depth direction, among the plane images
displayed in the overlapped image.
[0291] (16)
[0292] The image processing apparatus according to any one of (12)
to (15), [0293] wherein the image control unit displays a screen
where only an image of the plane image in the screen is highlighted
on a front surface, when the thumbnail image corresponding to the
plane image is selected.
[0294] (17)
[0295] The image processing apparatus according to any one of (12)
to (16), [0296] wherein the image control unit displays a screen
where images of the non-selected plane images in the screen are
grayed out on a front surface, when the thumbnail image
corresponding to the plane image is selected.
[0297] (18)
[0298] The image processing apparatus according to any one of (12)
to (17), [0299] wherein the image control unit displays distance
scales showing the virtual distances of the plane images in the
depth direction and displays a depth change operation input unit to
operate the virtual distances of the plane images displayed in an
editable state in the depth direction by the input unit, and [0300]
the input unit accepts an operation to generate or edit the images
in the plane images and accepts an operation with respect to the
depth change operation input unit, when the plane images are
displayed in the editable state.
[0301] (19)
[0302] The image processing apparatus according to any one of (12)
to (18), [0303] wherein the depth change operation input unit has a
button to move the virtual distances of the plane images displayed
in the editable state in the depth direction to the front side and
a button to move the virtual distances to the inner side.
[0304] (20)
[0305] The image processing apparatus according to any one of (12)
to (19), [0306] wherein the depth change operation input unit is an
object that corresponds to the plane image displayed on the
distance scales in the editable state.
[0307] (21)
[0308] The image processing apparatus according to any one of (12)
to (20), [0309] wherein the image control unit displays a
horizontal line change operation input unit to operate the
horizontal line position set in the plane image displayed in an
editable state by the input unit, and [0310] the input unit accepts
an operation with respect to the horizontal line change operation
input unit, when the plane image is displayed in an editable
state.
[0311] (22)
[0312] The image processing apparatus according to any one of (12)
to (21), [0313] wherein the image control unit does not display an
image of the lower side of the horizontal line position set in the
plane image displayed in the editable state.
[0314] (23)
[0315] The image processing apparatus according to any one of (12)
to (22), [0316] wherein the three-dimensional image generating unit
does not reflect an image of the lower side of the horizontal line
position set in each plane image to a generated three-dimensional
image.
[0317] (24)
[0318] The image processing apparatus according to any one of (12)
to (23), [0319] wherein the image control unit performs a control
operation to reflect the edit result on the thumbnail image of the
overlapped image, when the plane image corresponding to the
selected thumbnail image is edited.
[0320] (25)
[0321] The image processing apparatus according to any one of (12)
to (24), further comprising: [0322] the display unit.
[0323] (26)
[0324] An image processing method comprising: [0325] generating a
plurality of plane images and setting virtual distances in a depth
direction to the plurality of generated plane images, respectively;
[0326] converting the plurality of plane images into a
three-dimensional image where space positions of objects in each of
the plurality of plane images are set, based on the [0327] virtual
distances set to the plurality of generated plane images; [0328]
outputting data of the converted three-dimensional image; [0329]
displaying the plurality of generated plane images individually or
to be overlapped and generating display data of an editing screen
displayed by providing tabs to the plane images, respectively; and
[0330] accepting an operation to generate or edit images in the
generated editing screen.
[0331] (27)
[0332] A program that causes a computer to execute an image
process, the program causing the computer to execute: [0333]
generating a plurality of plane images and setting the virtual
distances of a depth direction to the plurality of generated plane
images, respectively; [0334] converting the plurality of plane
images into a three-dimensional image where the space positions of
objects of the plurality of plane images are set, on the basis of
the virtual distances set to the plurality of generated plane
images; [0335] outputting data of the converted three-dimensional
image; [0336] displaying the plurality of generated plane images
individually or to be overlapped and generating display data of an
editing screen displayed by providing tabs to the plane images,
respectively; and [0337] accepting an operation to generate or edit
images in the generated editing screen.
Reference Signs List
[0338] a to f display object
[0339] 11, 12, 13 generated image
[0340] 11a, 12a, 13a number of layers display
[0341] 100 image processing apparatus
[0342] 110 image generating/processing unit
[0343] 112 image generating unit
[0344] 114 three-dimensional image converting unit
[0345] 116 three-dimensional image generating unit
[0346] 118 editing screen generating unit
[0347] 120 image storage unit
[0348] 130 input unit
[0349] 140 image display unit
[0350] 201 tab
[0351] 202 3D state thumbnail tab
[0352] 203 image display in edition processing
[0353] 203a first layer edge portion
[0354] 203b second layer edge portion
[0355] 203c third layer edge portion
[0356] 211 to 213 layer thumbnail tab
[0357] 221 file importing button
[0358] 222 camera capturing button
[0359] 223 stamp button
[0360] 224 character input button
[0361] 225 depth operation button
[0362] 231 to 237 object display unit
[0363] 241 generation start button
[0364] 242 save button
[0365] 243 three-dimensional display button
[0366] 250L image for left eye
[0367] 250R image for right eye
[0368] 251, 251' first layer image
[0369] 252, 252' second layer image
[0370] 253, 253' third layer image
[0371] 259, 259' display surface
[0372] 261 horizontal line adjustment bar
[0373] 262 ground surface setting position display
[0374] 271 ground surface position display in layer
[0375] 272 non-display portion
[0376] 281, 282 position movement button
[0377] 283 returning button
[0378] 284 erasure button
[0379] 285 ground surface adjustment button
[0380] 290 pen tool
[0381] 291 first pen
[0382] 292 second pen
[0383] 293 third pen
[0384] 294 eraser
[0385] 301 depth axis
[0386] 301a to 301d depth position
[0387] 302 front edge portion
[0388] 303 ground surface setting position
[0389] 304 virtual display surface
[0390] 311 first layer image
[0391] 312 second layer image
[0392] 313 third layer image
[0393] 321 horizontal line position
[0394] 401 depth bar display
[0395] 401a, 401b, 401c layer position
[0396] 411, 412, 421, 422, 431, 432 depth adjustment button
[0397] 501 depth bar display
[0398] 502 layer position
[0399] 503 virtual display surface position
[0400] 511, 512 depth adjustment button
[0401] 601 depth bar display
[0402] 602 layer position
[0403] 611, 612 depth adjustment button
[0404] 701 depth bar display
[0405] 702 layer position
[0406] 703 virtual display surface position
[0407] 704 depth frame
[0408] 705 depth value display
[0409] 711, 712, 721, 722, 731, 732 depth adjustment button
[0410] 810 camera capturing operation screen
[0411] 811 camera capturing image
[0412] 812 extraction image
[0413] 820 image file importing operation screen
[0414] 821 imported image
[0415] 822 extraction image
* * * * *