U.S. patent application number 14/475,289, filed on September 2, 2014, was published by the patent office on 2014-12-18 for "Image Processing Apparatus, Image Display System, and Image Processing Method and Program." The applicant listed for this patent is CANON KABUSHIKI KAISHA. The invention is credited to Takao Tani and Takuya Tsujimoto.

United States Patent Application 20140368632
Kind Code: A1
Application Number: 14/475,289
Family ID: 48697507
Inventors: Tsujimoto; Takuya; et al.
Publication Date: December 18, 2014

IMAGE PROCESSING APPARATUS, IMAGE DISPLAY SYSTEM, AND IMAGE PROCESSING METHOD AND PROGRAM
Abstract
There is provided an image processing apparatus which prevents combined boundary regions in a combined image from hindering highly accurate diagnosis. An
image processing apparatus includes image data acquisition means,
combined-image data generation means, and combined-boundary-region
display data generation means. The image data acquisition means
acquires multiple pieces of divided image data obtained by
capturing images of multiple regions into which a captured area for
an imaging target is divided. The combined-image data generation
means generates combined-image data on the basis of the multiple
divided image data. The combined-boundary-region display data
generation means generates display image data to be used for an
observer to recognize combined boundary regions in the
combined-image data. The combined-boundary-region display data
generation means changes at least one of a color and a brightness
for all of the combined boundary regions included in a display
area.
Inventors: Tsujimoto; Takuya (Kawasaki-shi, JP); Tani; Takao (Tokyo, JP)

Applicant: CANON KABUSHIKI KAISHA (Tokyo, JP)

Family ID: 48697507
Appl. No.: 14/475,289
Filed: September 2, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
13/909,847 (parent of 14/475,289) | Jun 4, 2013 | 8,854,448
PCT/JP2012/083830 (parent of 13/909,847) | Dec 27, 2012 |
Current U.S. Class: 348/79
Current CPC Class: H04N 5/3415 (20130101); G02B 21/36 (20130101); G06T 3/00 (20130101); H04N 7/18 (20130101); G09G 5/02 (20130101); H04N 5/23238 (20130101); G06T 3/4038 (20130101)
Class at Publication: 348/79
International Class: G02B 21/36 (20060101) G02B021/36; G09G 5/02 (20060101) G09G005/02
Foreign Application Data

Date | Code | Application Number
Dec 27, 2011 | JP | 2011-286785
Dec 26, 2012 | JP | 2012-282781
Claims
1. An image processing apparatus comprising: image data acquisition
means that acquires a plurality of pieces of divided image data
obtained by capturing images of a plurality of regions into which a
captured area for an imaging target is divided; combined-image data
generation means that generates combined-image data on the basis of
the plurality of divided image data; and combined-boundary-region
display data generation means that generates display image data to
be used for an observer to recognize combined boundary regions in
the combined-image data, wherein the combined-boundary-region
display data generation means changes at least one of a color and a
brightness for all of the combined boundary regions included in a
display area.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a Continuation of co-pending U.S. patent
application Ser. No. 13/909,847, filed Jun. 4, 2013, which is a
Continuation of International Patent Application No.
PCT/JP2012/083830, filed Dec. 27, 2012, which claims the benefit of
Japanese Patent Application No. 2011-286785, filed Dec. 27, 2011
and Japanese Patent Application No. 2012-282781, filed Dec. 26,
2012, all of which are hereby incorporated by reference herein in
their entirety.
TECHNICAL FIELD
[0002] The present invention relates to image processing
apparatuses, and particularly, to digital image processing for
observing an imaging target.
BACKGROUND ART
[0003] Recently, in the field of pathology, attention has been
focused on virtual slide systems that serve as a substitute for an
optical microscope which is a tool for pathological diagnosis and
that enable pathological diagnosis to be performed on a display by
photographing a sample to be examined (imaging target) which is
mounted on a preparation and by digitizing images. By digitizing
pathological diagnosis images by using a virtual slide system,
optical microscope images of a sample to be examined in the related
art can be handled as digital data. As a result, advantages, such
as faster telediagnosis, the usage of digital images as an
explanatory aid for patients, sharing of rare cases, and more
efficient education and training, are expected to be achieved.
[0004] To achieve operations using a virtual slide system which are
approximately equivalent to those using an optical microscope, it
is necessary to digitize the entire sample to be examined which is
on a preparation. By digitizing the sample to be examined, digital
data generated using a virtual slide system can be observed using
viewer software which operates on a personal computer (PC) or a
workstation. The number of pixels for the entire digitized sample
to be examined is typically several hundreds of millions to several
billions, which is a very large amount of data.
[0005] The amount of data generated using a virtual slide system is
enormous, and various observations can be performed from
micro-observation (enlarged detail image) to macro-observation
(bird's-eye view of the whole) by using a viewer to perform an
enlargement/reduction process, and the system is thus convenient in
various different ways. By obtaining all pieces of necessary
information in advance, instant display of a low-magnification
image to a high-magnification image can be performed using a
resolution and a magnification which are desired by a user. In
addition, obtained digital data is subjected to image analysis,
and, for example, determination of the shape of a cell, calculation
of the number of cells, and calculation of the nucleus to cytoplasm
area ratio (N/C ratio) are performed. Accordingly, various types of
information useful in pathological diagnosis can be also
presented.
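As an illustration of the image-analysis step mentioned above, the nucleus-to-cytoplasm area ratio can be computed from binary segmentation masks. This is only a hedged sketch; the function and mask names are assumptions for this example and are not part of the disclosure.

```python
import numpy as np

def nc_ratio(nucleus_mask, cytoplasm_mask):
    """Nucleus-to-cytoplasm (N/C) area ratio from two binary
    segmentation masks: nucleus pixel count over cytoplasm pixel count."""
    return float(nucleus_mask.sum()) / float(cytoplasm_mask.sum())
```

In practice the masks would come from a segmentation step (e.g. thresholding a stained-tissue image), which is outside the scope of this sketch.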
[0006] As such a technique of obtaining a high-magnification image
of an imaging target, a method has been proposed in which
high-magnification images obtained by photographing portions of an
imaging target are used to obtain a high-magnification image of the
entire imaging target. Specifically, in PTL 1, a microscope system
is disclosed which divides an imaging target into small portions
that are photographed, and which combines the images for the small
portions thus obtained into a combined image for the imaging target
which is displayed. In PTL 2, an image display system is disclosed
which obtains partial images for an imaging target by capturing
images while the stage of a microscope is being moved and corrects
distortions in the images so as to combine the images. In PTL 2, a
combined image having unobtrusive connecting regions is
generated.
CITATION LIST
Patent Literature
[0007] PTL 1: Japanese Patent Laid-Open No. 2007-121837 [0008] PTL
2: Japanese Patent Laid-Open No. 2010-134374
[0009] A connecting region in a combined image obtained using the
microscope system of PTL 1 and the image display system of PTL 2 is
highly likely to be an image different from that obtained when a
pathologist performs observation using an optical microscope, due
to an adverse effect of artifacts caused by, for example,
misregistration between partial images which inevitably occurs and
distortion correction. Nevertheless, if diagnosis is performed using such a combined image without awareness of this risk, the diagnosis may be based on the connecting regions in the combined image, hindering highly accurate diagnosis.
SUMMARY OF INVENTION
[0010] The present invention is essentially embodied in an image
processing apparatus including image data acquisition means,
combined-image data generation means, and combined-boundary-region
display data generation means. The image data acquisition means
acquires multiple pieces of divided image data obtained by
capturing images of multiple regions into which a captured area for
an imaging target is divided. The combined-image data generation
means generates combined-image data on the basis of the multiple
divided image data. The combined-boundary-region display data
generation means generates display image data to be used for an
observer to recognize combined boundary regions in the
combined-image data. The combined-boundary-region display data
generation means changes at least one of a color and a brightness
for all of the combined boundary regions included in a display
area.
[0011] In addition, the present invention is essentially embodied
in a microscope image display system including at least the image
processing apparatus and an image display apparatus. The image
display apparatus displays combined-image data which is for the
imaging target and which is transmitted from the image processing
apparatus, and image data to be used for an observer to recognize
the combined boundary regions.
[0012] Further, the present invention is essentially embodied in an
image processing method including acquiring multiple pieces of
divided image data obtained by capturing images of multiple regions
into which a captured area for an imaging target is divided,
generating combined-image data on the basis of the multiple divided
image data, and generating display image data to be used for an
observer to recognize combined boundary regions in the
combined-image data. In the generating of display image data, at
least one of a color and a brightness is changed for all of the
combined boundary regions included in a display area.
[0013] Furthermore, the present invention is essentially embodied
in a program causing a computer to execute a process including
acquiring multiple pieces of divided image data obtained by
capturing images of multiple regions into which a captured area for
an imaging target is divided, generating combined-image data on the
basis of the multiple divided image data, and generating display
image data to be used for an observer to recognize combined
boundary regions in the combined-image data. In the generating of
display image data, at least one of a color and a brightness is
changed for all of the combined boundary regions included in a
display area.
[0014] Further features of the present invention will become
apparent from the following description of exemplary embodiments
with reference to the attached drawings.
BRIEF DESCRIPTION OF DRAWINGS
[0015] FIG. 1 is an exemplary overall view of the apparatus
configuration of an image display system using an image processing
apparatus according to the present invention.
[0016] FIG. 2 is an exemplary functional block diagram for an
imaging apparatus in an image display system using an image
processing apparatus according to the present invention.
[0017] FIG. 3 is an exemplary functional block diagram for an image
processing apparatus according to the present invention.
[0018] FIG. 4 is an exemplary hardware configuration diagram for an
image processing apparatus according to the present invention.
[0019] FIGS. 5A and 5B are diagrams for describing concepts of
combined-image data generation and combined-boundary-region data
generation according to a first embodiment.
[0020] FIG. 6 is an exemplary flowchart of generation of
combined-boundary-region display data in an image processing
apparatus according to the present invention.
[0021] FIG. 7 is an exemplary flowchart of generation of
combined-boundary-region rendered data.
[0022] FIG. 8 is an exemplary flowchart of a superimposing
process.
[0023] FIGS. 9A to 9E illustrate an exemplary display screen of an
image display system according to the present invention.
[0024] FIG. 10 is an exemplary flowchart of switching of display
performed by switching means of an image processing apparatus
according to the present invention.
[0025] FIG. 11 is an overall view of an image display system using
an image processing apparatus according to a second embodiment.
[0026] FIG. 12 is a diagram for describing a concept of
combined-boundary-region data generation according to the second
embodiment.
[0027] FIG. 13 is a flowchart of combined-boundary-region data
generation according to the second embodiment.
[0028] FIG. 14 is a flowchart of a superimposing process according
to the second embodiment.
[0029] FIGS. 15A to 15C illustrate an exemplary display screen of
an image display system according to the second embodiment.
DESCRIPTION OF EMBODIMENTS
[0030] Embodiments of the present invention will be described below
with reference to the drawings. The entire description below is about preferable embodiments of the present invention, and the present invention is not limited to them.
[0031] A preferable image processing apparatus according to the
present invention includes image data acquisition means,
combined-image data generation means, and combined-boundary-region
display data generation means. The image data acquisition means
acquires multiple pieces of divided image data obtained by
capturing images of multiple regions into which a captured area for
an imaging target is divided. The combined-image data generation
means generates combined-image data on the basis of the multiple
divided image data. The combined-boundary-region display data
generation means generates display image data to be used for an
observer to recognize combined boundary regions in the
combined-image data. The combined-boundary-region display data
generation means changes at least one of a color and a brightness
for all of the combined boundary regions included in a display
area. Thus, when images (data) captured from the small divided portions of the region to be photographed are combined to generate an image (data) of the imaging target, this prevents the situation in which highly accurate diagnosis is hindered because the combined boundary regions, which may differ from the original image of the imaging target, are displayed without being recognizable as such. The image
processing apparatus according to the present invention can be
applied to an image obtained using a microscope. In addition, the
image processing apparatus according to the present invention can
be used in an image display system, and particularly in a
microscope image display system or a virtual slide system.
[0032] Examples of a method for combining images (data) which is
used herein include connection of pieces of image data
(hereinafter, may be also referred to as "partial image data"),
superimposition of pieces of partial image data, alpha-blending of
pieces of partial image data, and interpolation to combine pieces
of partial image data smoothly. Examples of the method for
connecting pieces of image data to be overlapped include a method
in which the pieces of image data are connected by aligning them on
the basis of the position information of the stage, a method in
which the pieces of image data are connected by associating the
corresponding points or the corresponding lines in the pieces of
divided image, and a method in which the pieces of image data are
connected on the basis of the position information of the divided
image data. Superimposition means, in a broad sense, that pieces of image data overlap; for example, portions or the entireties of the pieces of image data may overlap in a region for which overlapping image data exists. Alpha-blending indicates that two images are combined using a coefficient (an alpha value). Examples of the method in which
interpolation is performed to connect pieces of image data smoothly
include a process using zero-order interpolation, a process using
linear interpolation, and a process using higher-degree
interpolation. To connect images smoothly, a process using
higher-degree interpolation is preferable.
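The alpha-blending of overlapping regions described above can be sketched as follows for two horizontally adjacent tiles. This is an illustrative example only, assuming numpy arrays and a linear alpha ramp across the overlap; the function name and ramp shape are assumptions, not taken from this disclosure.

```python
import numpy as np

def blend_overlap(left_tile, right_tile, overlap):
    """Combine two horizontally adjacent tiles whose last/first
    `overlap` columns image the same region, alpha-blending across
    the overlap (the left tile's weight ramps from 1 down to 0)."""
    alpha = np.linspace(1.0, 0.0, overlap)                 # weight for the left tile
    blended = left_tile[:, -overlap:] * alpha + right_tile[:, :overlap] * (1.0 - alpha)
    return np.hstack([left_tile[:, :-overlap], blended, right_tile[:, overlap:]])
```

A higher-degree interpolation across the overlap would replace the linear ramp with a smoother weighting curve.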
[0033] The combined-boundary-region display data generation means
is means for generating data to be used for an observer to visually
recognize the combined boundary regions in the displayed image.
[0034] A combined boundary region arises when pieces of image data are combined: it is either a connecting region between pieces of original partial image data, or a region in the combined image in which the combining process has produced image data whose appearance is different from that of the original partial image data. When a combined boundary region is displayed, an observer needs to be able to recognize it visually. Therefore, a connecting region between pieces of original partial image data is treated as an area that includes some surrounding margin, not merely a line. The width of this margin may depend on the display magnification.
[0035] The term "data for displaying a combined boundary region" does not merely mean data about a combined boundary region (such as its position information), but one of the following: data on the basis of which combined-image data is processed so that the combined boundary regions in the displayed image can be visually recognized; or a portion of the combined-image data that is rewritten into data different from the partial image data so that the combined boundary regions in the combined-image data can be visually recognized.
[0036] The combined-boundary-region display data generation means
may extract combined boundary regions after combined-image data is
generated, so as to generate image data to be used for an observer
to recognize the combined boundary regions, or may generate image
data to be used for an observer to recognize combined boundary
regions, on the basis of, for example, position information of
divided image data.
[0037] The generation of combined-image data and the generation of
image data to be used for an observer to recognize combined
boundary regions are performed in any sequence, and, for example,
may be simultaneously performed. As the method for displaying a
combined boundary region, the color or the brightness is preferably
changed.
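The change of color or brightness for all combined boundary regions in a display area might, purely as an illustrative sketch, look like the following. The mask representation, gain, and tint values are assumptions for this example, not the patented implementation.

```python
import numpy as np

def highlight_boundaries(image, boundary_mask, mode="brightness",
                         gain=0.6, tint=(255, 0, 0)):
    """Return a display copy of `image` (H x W x 3, uint8) in which
    every pixel inside `boundary_mask` (H x W, bool) has its
    brightness or color changed, so that all combined boundary
    regions in the display area are visually recognizable."""
    out = image.astype(np.float32).copy()
    if mode == "brightness":
        out[boundary_mask] *= gain                                   # darken boundary pixels
    else:                                                            # mode == "color": blend toward a tint
        out[boundary_mask] = 0.5 * out[boundary_mask] + 0.5 * np.array(tint, np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```

The boolean mask stands in for whatever boundary-position information the combining step produces (e.g. from tile positions).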
[0038] Preferably, the image processing apparatus obtains the
pieces of divided image data by using the image data acquisition
means which captures microscope images (optical microscope images),
and uses the obtained data in a virtual slide system.
[0039] The image data acquisition means may acquire pieces of
divided image data obtained by capturing images in such a manner
that the pieces of divided image data have overlapped regions, and
the combined-boundary-region display data generation means may
generate area data in which the pieces of divided image data
overlap, as image data to be used for an observer to recognize the
combined boundary regions.
[0040] When pieces of divided image data are obtained by capturing
images in such a manner that the pieces of divided image data have
overlapped regions, the combined-image data generation means
preferably performs superimposition or blending on the pieces of
divided image data so as to generate combined-image data.
[0041] When pieces of divided image data are obtained by capturing
images in such a manner that the pieces of divided image data have
overlapped regions, the combined-image data generation means
preferably interpolates the regions in which the pieces of divided
image data overlap, so as to generate combined-image data.
[0042] The combined-image data generation means may combine pieces
of divided image data so as to generate combined-image data to be
displayed, and the combined-boundary-region display data generation
means may generate a line for a region in which the pieces of
divided image data are connected, as combined-boundary-region
data.
[0043] Preferably, the image processing apparatus further includes
combined-boundary-region data switching means that performs
switching of image data to be used for an observer to recognize the
combined boundary regions generated by the combined-boundary-region
display data generation means. The combined-boundary-region data
switching means may preferably switch the mode of display of
combined boundary regions between a mode in which the combined
boundary regions are displayed and a mode in which the combined
boundary regions are not displayed.
[0044] The combined-boundary-region data switching means preferably
performs switching of image data to be used for an observer to
recognize the combined boundary regions generated by the
combined-boundary-region display data generation means, at a
certain boundary. As a certain boundary, a predetermined
magnification or a predetermined scroll speed (of an image
displayed on the image display apparatus) may be used. For example,
only in the case of a magnification higher than a certain
magnification or in the case of a scroll speed (of an image
displayed on the image display apparatus) lower than a certain
speed, it is preferable to generate image data to be used for an
observer to recognize the combined boundary regions.
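The switching condition described above (show boundary data only above a certain magnification and below a certain scroll speed) can be sketched as a simple predicate. The threshold values here are illustrative assumptions only.

```python
def should_show_boundaries(magnification, scroll_speed,
                           min_magnification=10.0, max_scroll_speed=200.0):
    """Decide whether to generate combined-boundary-region display
    data: only at high magnification and while the displayed image
    is scrolling slowly (thresholds are illustrative, in arbitrary
    magnification and pixels-per-second units)."""
    return magnification > min_magnification and scroll_speed < max_scroll_speed
```

The switching means would call such a predicate whenever the viewer's magnification or scroll state changes.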
[0045] A preferable image display system according to the present
invention includes at least the above-described image processing
apparatus, and an image display apparatus that displays
combined-image data which is for the imaging target and which is
transmitted from the image processing apparatus, and that displays
image data to be used for an observer to recognize the combined
boundary regions in the combined-image data.
[0046] A preferable image processing method according to the
present invention includes acquiring multiple pieces of divided
image data obtained by capturing images of multiple regions into
which a captured area for an imaging target is divided, generating
combined-image data on the basis of the multiple divided image
data, and generating display image data to be used for an observer
to recognize combined boundary regions in the combined-image data.
In the generating of display image data, at least one of a color
and a brightness is changed for all of the combined boundary
regions included in a display area. The generating of
combined-image data and the generating of display image data may be
simultaneously performed.
[0047] A preferable program according to the present invention
causes a computer to execute a process including acquiring multiple
pieces of divided image data obtained by capturing images of
multiple regions into which a captured area for an imaging target
is divided, generating combined-image data on the basis of the
multiple divided image data, and generating display image data to
be used for an observer to recognize combined boundary regions in
the combined-image data. In the generating of display image data,
at least one of a color and a brightness is changed for all of the
combined boundary regions included in a display area.
[0048] The preferable aspects in the description about the image
processing apparatus according to the present invention may be
reflected in the image processing method or program according to
the present invention.
First Embodiment
[0049] An image processing apparatus according to the present
invention may be used in an image display system including an
imaging apparatus and an image display apparatus. This image
display system will be described by using FIG. 1.
[0050] Configuration of Image Pickup System
[0051] FIG. 1 illustrates an image display system using an image
processing apparatus according to the present invention. The image
display system includes an imaging apparatus (a microscope
apparatus or a virtual slide scanner) 101, an image processing
apparatus 102, and an image display apparatus 103, and has a
function of obtaining and displaying two-dimensional images of an
imaging target (sample to be examined) which is to be photographed.
The imaging apparatus 101 and the image processing apparatus 102
are connected to each other through a cable 104 which is a
dedicated I/F or a general-purpose I/F, whereas the image
processing apparatus 102 and the image display apparatus 103 are
connected to each other through a cable 105 which is a
general-purpose I/F.
[0052] As the imaging apparatus 101, a virtual slide apparatus may
be used which has a function of capturing multiple two-dimensional
images at different positions in the two-dimensional direction and
outputting digital images. A solid-state image sensing element,
such as a charge-coupled device (CCD) or a complementary metal
oxide semiconductor (CMOS), is used to obtain a two-dimensional
image. Instead of a virtual slide apparatus, the imaging apparatus
101 may include a digital microscope apparatus in which a digital
camera is attached to an eyepiece portion of a typical optical
microscope.
[0053] The image processing apparatus 102 has, for example, a
function of generating combined-image data by using multiple pieces
of divided original image data obtained from the imaging apparatus
101. The image processing apparatus 102 is constituted by a
general-purpose computer or workstation which includes hardware
resources, such as a central processing unit (CPU), a RAM, a
storage device, an operation unit, and an I/F. The storage device
is a mass information storage device such as a hard disk drive, and
stores, for example, programs, data, and an operating system (OS)
for achieving processes described below. The above-described
functions are achieved with the CPU loading necessary programs and
data from the storage device onto the RAM and executing the
programs. The operation unit is constituted by, for example, a
keyboard and a mouse, and is used by an operator to input various
instructions. The image display apparatus 103 is a monitor that
displays the images for observation which are the results of
computation performed by the image processing apparatus 102, and is
constituted by, for example, a CRT or a liquid crystal display.
[0054] In the example in FIG. 1, an image pickup system is
constituted by three apparatuses which are the imaging apparatus
101, the image processing apparatus 102, and the image display
apparatus 103. However, the configuration of the present invention
is not limited to this. For example, an image processing apparatus
into which an image display apparatus is integrated may be used, or
the function of an image processing apparatus may be incorporated
into an imaging apparatus. Alternatively, the functions of an
imaging apparatus, an image processing apparatus, and an image
display apparatus may be achieved in a single apparatus. In
contrast, the function of, for example, an image processing
apparatus may be divided into small functions which are performed
in multiple apparatuses.
[0055] Configuration of Imaging Apparatus
[0056] FIG. 2 is a block diagram illustrating the functional
configuration of the imaging apparatus 101.
[0057] The imaging apparatus 101 generally includes a lighting unit
201, a stage 202, a stage control unit 205, an imaging optical
system 207, an image pickup unit 210, a development processing unit
216, a pre-measurement unit 217, a main control system 218, and a
data output unit 219.
[0058] The lighting unit 201 is means which uniformly irradiates,
with light, a preparation 206 located on the stage 202, and
includes a light source, an illumination optical system, and a
control system for driving the light source. The stage 202 is
driven and controlled by the stage control unit 205, and can be
moved along the three axes X, Y, and Z. The preparation 206 is a member in which a tissue slice or smeared cells serving as the observation object are placed on a slide glass and held, together with a mounting agent, under a cover glass.
[0059] The stage control unit 205 includes a drive control system
203 and a stage driving mechanism 204. The drive control system 203
receives an instruction from the main control system 218, and
controls driving of the stage 202. The movement direction, the
movement amount, and the like of the stage 202 are determined on
the basis of the position information and the thickness information
(distance information) of an imaging target which are measured by
the pre-measurement unit 217, and on the basis of an instruction
from a user when necessary. The stage driving mechanism 204 drives
the stage 202 in accordance with an instruction from the drive
control system 203.
[0060] The imaging optical system 207 is a lens unit for forming an
optical image of an imaging target on the preparation 206 onto an
imaging sensor 208.
[0061] The image pickup unit 210 includes the imaging sensor 208
and an analog front end (AFE) 209. The imaging sensor 208 is a
one-dimensional or two-dimensional image sensor which converts a
two-dimensional optical image into an electrical physical quantity
through photoelectric conversion, and, for example, a CCD or CMOS
device is used. When a one-dimensional sensor is used, a
two-dimensional image is obtained by performing scanning in a
scanning direction. An electric signal having a voltage value
according to light intensity is output from the imaging sensor 208.
In the case where a color image is desired as a captured image, for
example, a single-chip image sensor to which a color filter using a
Bayer array is attached may be used. The image pickup unit 210
captures divided images for an imaging target with the stage 202
being driven in the XY axes directions.
[0062] The AFE 209 is a circuit that converts an analog signal
which is output from the imaging sensor 208 into a digital signal.
The AFE 209 includes an H/V driver, a correlated double sampling
(CDS), an amplifier, an AD converter, and a timing generator, which
are described below. The H/V driver converts a vertical
synchronizing signal and a horizontal synchronizing signal for
driving the imaging sensor 208 into a potential which is necessary
to drive the sensor. The CDS is a correlated double sampling
circuit which removes fixed-pattern noise. The amplifier is an
analog amplifier which adjusts a gain of an analog signal which has
been subjected to noise reduction in the CDS. The AD converter
converts an analog signal into a digital signal. Even when the final output of the imaging apparatus is 8-bit, the AD converter quantizes the analog signal to between 10 and roughly 16 bits, in consideration of downstream processing, and outputs the digital data. The converted sensor output data is called RAW data. The RAW
data is subjected to a development process in the development
processing unit 216 which is located downstream. The timing
generator generates a signal for adjusting timing for the imaging
sensor 208 and timing for the development processing unit 216 which
is located downstream.
[0063] In the case where a CCD is used as the imaging sensor 208,
the above-described AFE 209 is necessary. In contrast, in the case where a CMOS image sensor that produces a digital output directly is used, the sensor itself includes the function of the AFE 209. In addition, an image pickup controller (not illustrated) that controls the imaging sensor 208 is present; it not only controls the operations of the imaging sensor 208 but is also responsible for operation timing and for control of the shutter speed, the frame rate, the region of interest (ROI), and the like.
[0064] The development processing unit 216 includes a black
correction unit 211, a white balance adjustment unit 212, a
demosaicing unit 213, a filtering unit 214, and a gamma
correction unit 215. The black correction unit 211 subtracts data
for black correction obtained with light being shielded, from each
of the pixels of the RAW data. The white balance adjustment unit
212 adjusts a gain of each of the RGB colors in accordance with the
color temperature of light from the lighting unit 201 so as to
reproduce desired white. Specifically, data for white balance
correction is added to the RAW data after the black correction. In
the case where a monochrome image is handled, the white balance
adjustment process is not necessary. The development processing
unit 216 generates divided image data for an imaging target
photographed by the image pickup unit 210.
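As a hypothetical sketch of the two corrections above (illustrative only, not part of the claimed embodiment), assuming NumPy arrays and an RGGB Bayer layout, the black correction and white balance adjustment may be expressed as follows; the function names and the RGGB assumption are illustrative:

```python
import numpy as np

def black_correct(raw, black_frame):
    # Subtract the black-reference frame (captured with light shielded)
    # from each pixel of the RAW data, clipping negative values to zero.
    return np.clip(raw.astype(np.int32) - black_frame.astype(np.int32), 0, None)

def white_balance_bayer(raw, gain_r, gain_g, gain_b):
    # Apply a per-color gain to an RGGB Bayer mosaic so that a neutral
    # subject is reproduced as white under the current illumination.
    out = raw.astype(np.float64)
    out[0::2, 0::2] *= gain_r   # R sites
    out[0::2, 1::2] *= gain_g   # G sites on even rows
    out[1::2, 0::2] *= gain_g   # G sites on odd rows
    out[1::2, 1::2] *= gain_b   # B sites
    return out
```

For a monochrome image, as noted above, only the black correction step would be applied.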
[0065] The demosaicing unit 213 generates image data for each of
the RGB colors from the RAW data according to the Bayer array. The
demosaicing unit 213 calculates RGB-color values of a target pixel
through interpolation using values of the surrounding pixels
(including pixels of the same color and pixels of the other colors)
in the RAW data. In addition, the demosaicing unit 213 performs a
correction process (interpolation process) on a defective pixel. In
the case where the imaging sensor 208 has no color filters and
where a monochrome image is obtained, the demosaicing process is
not necessary.
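A minimal zero-order demosaicing sketch is shown below for illustration; an actual implementation would, as described above, interpolate each pixel's RGB values from surrounding pixels at full resolution, whereas this simplified version collapses each 2x2 RGGB cell into one half-resolution RGB pixel:

```python
import numpy as np

def demosaic_zero_order(raw):
    # Collapse each 2x2 RGGB cell into a single RGB pixel; G is taken
    # as the mean of the two green samples. Output is half-resolution.
    r = raw[0::2, 0::2].astype(np.float64)
    g = (raw[0::2, 1::2].astype(np.float64) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float64)
    return np.stack([r, g, b], axis=-1)
```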
[0066] The filtering unit 214 is a digital filter which achieves
suppression of high-frequency components included in an image,
noise reduction, and emphasis of high resolution. The .gamma.
correction unit 215 applies to an image the inverse of the
gradation expression characteristics of a typical display device,
and performs gradation conversion in accordance with human visual
characteristics through gradation compression in high-luminance
portions or dark-area processing. According to the present embodiment, to
obtain an image for morphological observation, image data is
subjected to gradation conversion which is adequate for a combined
process and a display process which are located downstream.
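The gradation conversion may be sketched as follows, assuming 8-bit input and a display gamma of 2.2 (both values are illustrative assumptions, not taken from the embodiment):

```python
import numpy as np

def gamma_correct(rgb, gamma=2.2, max_val=255.0):
    # Pre-apply the inverse of the display's gradation characteristic:
    # the display decodes with V**gamma, so we encode with V**(1/gamma),
    # which compresses gradation most strongly at high luminance.
    norm = np.clip(rgb / max_val, 0.0, 1.0)
    return (norm ** (1.0 / gamma)) * max_val
```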
[0067] The pre-measurement unit 217 performs pre-measurement for
calculating position information of an imaging target on the
preparation 206, distance information to a desired focal position,
and parameters for light-quantity adjustment caused by the
thickness of the imaging target. The pre-measurement unit 217
obtains information before the main measurement, enabling images to
be efficiently captured. To obtain position information in a
two-dimensional plane, a two-dimensional imaging sensor having a
resolution lower than that of the imaging sensor 208 is used. The
pre-measurement unit 217 determines the position of an imaging
target in the XY plane from an obtained image. A laser displacement
sensor or a Shack-Hartmann measuring device is used to obtain the
distance information and the thickness information.
[0068] The main control system 218 controls various units described
above. The functions of the main control system 218 and the
development processing unit 216 are achieved by a control circuit
having a CPU, a ROM, and a RAM. That is, the ROM stores programs
and data, and the CPU uses the RAM as a work memory so as to
execute the programs, achieving the functions of the main control
system 218 and the development processing unit 216. A device, such
as an EEPROM or a flash memory, is used as the ROM, and a DRAM
device such as DDR3 is used as the RAM.
[0069] The data output unit 219 is an interface for transmitting an
RGB color image generated by the development processing unit 216 to
the image processing apparatus 102. The imaging apparatus 101 and
the image processing apparatus 102 are connected to each other
through an optical communications cable. Alternatively, an
interface such as USB or GigabitEthernet (registered trademark)
may be used.
[0070] Configuration of Image Processing Apparatus
[0071] FIG. 3 is a block diagram illustrating the functional
configuration of the image processing apparatus 102 according to
the present invention.
[0072] The image processing apparatus 102 generally includes data
input/output units 301 and 308, a storage holding unit 302, a
combining processor 303, a combined-boundary-region extracting unit
304, a combined-boundary-region rendering unit 305, a superimposing
processor 306, and a mode selector 307.
[0073] The storage holding unit 302 receives, via the data input
unit 301, divided color image data of RGB which is obtained from an
external apparatus and which is obtained by photographing divided
portions of an imaging target, and stores and holds the data. The
color image data includes not only image data but also position
information. The position information is information describing
which piece of divided image data corresponds to which photographed
portion of the imaging target. For example, the position
information may be obtained by recording the XY coordinates of the
stage 202 which is being driven, as well as divided image data,
when the divided image data is captured.
[0074] The combining processor 303 generates combined-image data
for the imaging target by using the color image data (divided image
data) obtained by photographing the divided portions of the imaging
target, on the basis of the position information of the pieces of
divided image data.
[0075] The combined-boundary-region extracting unit 304 extracts
combined boundary regions which have been subjected to, for
example, interpolation, in the combined-image data generated by the
combining processor 303. For example, when the pieces of divided
image data are connected in a simple manner, connecting regions are
extracted as a combined boundary region. When the pieces of divided
image data are smoothly connected through, for example,
interpolation, connecting regions to which the interpolation or the
like has been applied are extracted as a combined boundary region.
In the present embodiment, it is assumed that images are captured
in such a manner that areas corresponding to connecting regions
overlap each other, and that an interpolation process is applied to
the obtained pieces of divided image data so that the pieces of
divided image data are smoothly connected.
[0076] The mode selector 307 selects a mode with which combined
boundary regions are displayed. As the mode with which combined
boundary regions are displayed, change of the color, change of the
brightness, display of a dotted line, blinking, or the like is
specified. Details will be described using FIGS. 5A and 5B.
[0077] The combined-boundary-region rendering unit 305 renders
combined boundary regions extracted by the combined-boundary-region
extracting unit 304, by using the mode selected by the mode
selector 307.
[0078] The superimposing processor 306 superimposes the rendered
data for combined boundary regions which have been rendered by the
combined-boundary-region rendering unit 305 on the combined-image
data generated by the combining processor 303. In the
combined-image data obtained after the superimposing process, the
regions which have been subjected to the combining process (the
generated connecting regions) are distinguished from the original
divided image data which has not been subjected to the combining
process. The
combined-image data which is obtained after the superimposing
process and in which the combined boundary regions are
distinguished is transmitted via the data output unit 308 to, for
example, an external monitor.
[0079] Hardware Configuration of Image Processing Apparatus
[0080] FIG. 4 is a block diagram illustrating the hardware
configuration of the image processing apparatus according to the
present invention. For example, a personal computer (PC) 400 is
used as an information processing apparatus.
[0081] The PC 400 includes a central processing unit (CPU) 401, a
hard disk drive (HDD) 402, a random access memory (RAM) 403, a data
input/output unit 405, and a bus 404 which connects these to each
other.
[0082] The CPU 401 accesses the RAM 403 and other units as
necessary, and exercises overall control of all the blocks
in the PC 400 while performing various computation processes. The
hard disk drive (HDD) 402 is an auxiliary storage which permanently
stores the OS executed by the CPU 401 as well as programs and
various parameters, and which records and reads out
information. The RAM 403 is used, for example, as a work area for
the CPU 401, and temporarily stores the OS, various programs which
are being executed, and various data which is to be processed, such
as the combined image that is obtained after the superimposing
process and that is a feature of the present invention.
[0083] The image display apparatus 103, an input apparatus 407, the
imaging apparatus 101 which is an external apparatus, and the like
are connected to the data input/output unit 405.
[0084] The image display apparatus 103 is a display device using,
for example, liquid crystal, electro-luminescence (EL), or a
cathode ray tube (CRT). It is assumed that the image display
apparatus 103 is connected as an external apparatus. Alternatively,
it may be assumed that the PC 400 is integrated with an image
display apparatus.
[0085] Examples of the input apparatus 407 include a pointing
device such as a mouse, a keyboard, a touch panel, and other
operation input apparatuses. When the input apparatus 407 includes
a touch panel, the touch panel may be integrated with the image
display apparatus 103.
[0086] The imaging apparatus 101 is image pickup equipment, such as
a microscope apparatus or a virtual slide scanner.
[0087] Display of Combined Boundary Region
[0088] The combined image after the superimposing process which is
display data generated by the superimposing processor 306 included
in the image processing apparatus according to the present
invention and which is to be displayed on the image display
apparatus 103 will be described by using FIGS. 5A and 5B.
[0089] In the image processing apparatus according to the present
invention, combined-image data is generated by combining pieces of
image data obtained by photographing divided portions (FIG. 5A). By
rendering combined boundary regions and superimposing them on the
obtained combined-image data, combined-image data after the
superimposing process in which the combined boundary regions are
distinguished is obtained (FIG. 5B).
[0090] As a method for generating data to be used to display
combined boundary regions and displaying the generated data, for
example, the following methods may be used: a method in which the
data is generated by changing the color information of the combined
boundary regions; a method in which the data is generated by
changing the brightness information of the combined boundary
regions; a method in which the data is generated by displaying
grids in the center regions (center lines) of the combined boundary
regions; a method in which the combined boundary regions are
displayed with markers such as arrows; and a method in which the
rendered combined boundary regions are switched on and off in a
time-division manner so that the superimposition blinks. The method in which the
display data is generated by changing the color of the combined
boundary regions is preferable because the areas for the combined
boundary regions are distinguished from the other regions. The
method in which the display data is generated by changing the
brightness of the combined boundary regions is preferable because
the areas for the combined boundary regions are distinguished from
the other regions and the image data for the combined boundary
regions required in diagnosis can be used.
[0091] Method for Displaying Combined Boundary Regions
[0092] The flow of generation of the combined-boundary-region
display data in the image processing apparatus according to the
present invention will be described using the flowchart in FIG.
6.
[0093] In step S601, in the image processing apparatus 102, pieces
of image data (divided image data) which are obtained by dividing
the region of an imaging target to be photographed into multiple
regions and photographing the obtained regions are obtained from,
for example, the imaging apparatus 101 which is an external
apparatus via the data input/output unit 301, and are transmitted
to the storage holding unit 302.
[0094] In step S602, the position information included in the
divided image data stored in the storage holding unit 302 or the
position information attached to the divided image data as separate
data is ascertained. The position information is information describing
which piece of divided image data corresponds to which photographed
portion of the imaging target.
[0095] In step S603, the combining processor 303 combines the
divided image data on the basis of the ascertained position
information, and generates combined-image data for the imaging
target. Examples of the combining method include connection of
pieces of partial image data, superimposition of pieces of partial
image data, alpha-blending of pieces of partial image data, and
interpolation to combine pieces of partial image data smoothly.
Examples of the method for connecting pieces of image data to be
overlapped include a method in which the pieces of image data are
connected by aligning them on the basis of the position information
of the stage, a method in which the pieces of image data are
connected by associating the corresponding points or the
corresponding lines in the pieces of divided image data, and a method in
which the pieces of image data are connected on the basis of the
position information of the divided image data. Superimposition
means, in a broad sense, that pieces of image data overlap;
examples include cases where portions or the entireties of the
pieces of image data overlap in a region where the image data is
duplicated. Alpha-blending
indicates that two images are combined using a coefficient (.alpha.
value). Examples of the method in which interpolation is performed
to connect pieces of image data smoothly include a process using
zero-order interpolation, a process using linear interpolation, and
a process using higher-degree interpolation. To connect images
smoothly, a process using higher-degree interpolation is
preferable.
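As one illustrative form of the smooth connection described above (a linear-interpolation blend across an overlapping strip; the tile layout, grayscale assumption, and function name are not taken from the embodiment), two horizontally adjacent tiles may be combined as follows:

```python
import numpy as np

def blend_overlap(left, right, overlap):
    # The last `overlap` columns of `left` and the first `overlap`
    # columns of `right` image the same area; blend them with a linear
    # alpha ramp so the seam transitions smoothly (2-D grayscale tiles).
    alpha = np.linspace(1.0, 0.0, overlap)        # weight of the left tile
    seam = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.hstack([left[:, :-overlap], seam, right[:, overlap:]])
```

Higher-degree interpolation would replace the linear ramp with a smoother weighting curve.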
[0096] In step S604, the mode selector 307 selects a method for
displaying the combined boundary regions. The mode selector 307
first selects whether or not the combined boundary regions are to
be displayed. If the combined boundary regions are to be displayed,
the mode selector 307 selects how the display appears. For example,
as the display method, a display mode, such as change of the color
or change of the brightness, is selected.
[0097] In step S605, it is determined whether or not the combined
boundary regions are to be displayed in the combined image. If it
is determined that the combined boundary regions are not to be
displayed in the combined image, the image data is transmitted to
the outside via the data output unit 308 without rendering the
combined boundary regions or superimposing the rendered combined
boundary regions in the superimposing processor 306. If it is
determined that the combined boundary regions are to be displayed
in the combined image, the process proceeds to the next step
S606.
[0098] In step S606, areas for the combined boundary regions are
extracted from the generated combined-image data on the basis of
the position information.
[0099] In step S607, the combined-boundary-region rendering unit
305 generates rendered data for the combined boundary regions
extracted in step S606 by using the display method selected in step
S604. The detail of generation of the combined-boundary-region
rendered data will be described below using another flowchart.
[0100] In step S608, the superimposing processor 306 superimposes
the combined-boundary-region rendered data generated in step S607
on the combined-image data obtained in step S603, and obtains
combined-image data which is obtained after the superimposing
process and in which the combined boundary regions are
distinguished from the other regions. The detail of the
superimposing process will be described below using another
flowchart.
[0101] Rendering Connecting Regions
[0102] FIG. 7 is a flowchart of generation of the rendered data for
the connecting regions which are combined boundary regions. The
flow in the case where display is performed by changing the
brightness or the color of the combined boundary regions will be
described using FIG. 7.
[0103] In step S701, the mode which has been set by the mode
selector 307 is determined. In this step, whether the brightness or
the color is to be changed in rendering combined boundary regions
is determined.
[0104] In step S702, it is determined whether or not the brightness
is to be changed in rendering of the combined boundary regions. If
the brightness is to be changed, the process proceeds to step S703.
If the brightness is not to be changed, the process proceeds to
step S706.
[0105] In step S703, it is determined whether or not the brightness
is to be changed by decreasing the brightness of the combined
boundary regions. If the brightness of the combined boundary
regions is to be relatively decreased compared with that of the
area other than the combined boundary regions, the process proceeds
to step S704. If the brightness of the combined boundary regions is
not to be decreased, that is, the brightness of the area other than
the combined boundary regions is to be changed or the brightness of
the combined boundary regions is to be increased, the process
proceeds to step S705.
[0106] In step S704, rendered data for the combined boundary
regions in the combined image is generated with the brightness
being decreased.
[0107] In step S705, rendered data for the combined boundary
regions is generated with the brightness being not changed or being
increased.
[0108] In step S706, to change the color, the color used when the
combined boundary regions are displayed is set.
[0109] In step S707, rendered data for the combined boundary
regions is generated on the basis of the color which has been set
in step S706.
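The two rendering branches above (decreased brightness in step S704, a set color in step S707) can be sketched as a single function; representing the combined boundary regions as a boolean mask, as well as the parameter names, are assumptions made for illustration:

```python
import numpy as np

def render_boundary(image, boundary_mask, mode="darken",
                    factor=0.5, color=(255.0, 0.0, 0.0)):
    # Generate rendered data for the combined boundary regions:
    # "darken" scales their brightness by `factor` (step S704);
    # "color" paints them with the color set in step S706 (step S707).
    out = image.astype(np.float64).copy()
    if mode == "darken":
        out[boundary_mask] *= factor
    else:
        out[boundary_mask] = color
    return out
```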
[0110] Superimposing Combined-Boundary-Region Rendered Data
[0111] FIG. 8 is a flowchart of superimposition of the
combined-boundary-region rendered data on the combined image. In
FIG. 8, the combined-boundary-region rendered data is superimposed
on the combined-image data. When change of the brightness is
selected as the rendering method for combined boundary regions, one
of the display methods is to decrease the brightness of the
combined image so that the combined boundary regions are
distinguished from the other regions. Decrease in the brightness of
the combined image is advantageous when the combined boundary
regions are to be investigated while the displayed image is being
observed.
[0112] In step S801, the combined-image data which has been
generated through the combination performed by the combining
processor 303 is obtained.
[0113] In step S802, the combined-boundary-region rendered data
generated by the combined-boundary-region rendering unit 305 is
obtained.
[0114] In step S803, it is determined whether or not the brightness
of the combined boundary regions is to be decreased on the basis of
the setting of the rendering method for the combined boundary
regions which has been determined in step S701. If the brightness
of the combined boundary regions is to be relatively decreased
compared with that of the area other than the combined boundary
regions, the process proceeds to step S804. If the brightness of
the combined boundary regions is not to be decreased, that is, the
brightness of the area other than the combined boundary regions is
to be changed, the process proceeds to step S805. In step S805, the
brightness of the area other than the combined boundary regions in
the combined image is decreased.
[0115] In step S804, to decrease the brightness of the combined
boundary regions relatively compared with that of the area other
than the combined boundary regions, the combined-boundary-region
rendered data in which the brightness of the combined boundary
regions is decreased is superimposed on the combined-image data
generated in step S603. Examples of the superimposing process
include a process in which each of the pieces of image data is
subjected to alpha-blending so that a superimposed image is
generated, in addition to a process in which a superimposed image
is generated by overwriting the combined-boundary-region rendered
data on the combined-image data.
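Both superimposing variants mentioned above (overwriting and alpha-blending) reduce to one expression; the following is a sketch under the assumption, made here for illustration, that the combined boundary regions are given as a boolean mask:

```python
import numpy as np

def superimpose(combined, rendered, boundary_mask, alpha=1.0):
    # Superimpose the rendered boundary data on the combined image:
    # alpha = 1.0 simply overwrites; 0 < alpha < 1 alpha-blends, so the
    # boundary appears semitransparent over the original image data.
    out = combined.astype(np.float64).copy()
    out[boundary_mask] = (alpha * rendered[boundary_mask]
                          + (1.0 - alpha) * out[boundary_mask])
    return out
```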
[0116] Display Screen Layout
[0117] FIGS. 9A to 9E describe an example of the case where image
data for display which is generated by the image processing
apparatus 102 according to the present invention is displayed on
the image display apparatus 103.
[0118] FIG. 9A illustrates a layout of the screen of the image
display apparatus 103. In a whole window 901 in the screen, a
display area 902 for image-pickup-target image data which is used
for detailed observation, a thumbnail image 903 for the imaging
target to be observed, and a display setting area 904 are
displayed. These areas may be displayed in the single document
interface in such a manner that the display area in the whole
window 901 is separated into functional areas, or may be displayed
in the multiple document interface in such a manner that each area
is displayed in a separate window. In the display area 902 for
image-pickup-target image data, the image data for the imaging
target which is used for detailed observation is displayed.
Instructions for operation from a user cause movement of the
display area (selection and movement of a partial area which is to
be observed in the entire imaging target), and display of an
enlarged or reduced image due to a change of display magnification.
The thumbnail image 903 indicates the position and the size of the
display area 902 for image-pickup-target image data with respect to
the entire image of the imaging target. In the display setting area
904, for example, a setup button 905 is selected and pressed
through a user instruction from the input apparatus 407, such as a
touch panel or a mouse, which is externally connected, whereby the
display setting can be changed.
[0119] FIG. 9B illustrates a display setting screen which is
displayed as a dialog box when the setup button 905 is selected and
pressed and in which whether or not connecting regions that are
combined boundary regions in the combined image are to be displayed
is selected. In the present embodiment, it is assumed that a setup
button is provided and a setting screen is opened by pressing the
button. Alternatively, a UI in which various detailed settings
illustrated in FIG. 9C can be displayed, selected, and changed may
be provided directly on the display setting area 904. Instead, a
screen may be displayed in which a list for detailed settings
including whether or not combined boundary regions are to be
displayed is integrally displayed.
[0120] FIG. 9C illustrates a display setting screen which is for
connecting regions and which is displayed as a dialog box when the
connecting regions that are combined boundary regions are to be
displayed. In the display setting screen, how the connecting
regions which are the combined boundary regions in the combined
image are displayed is selected or set. Specifically, a display
method for connecting regions which are combined boundary regions
is selected from, for example, the following choices: change of the
color of the connecting regions; change of the brightness of the
connecting regions; and change of the brightness of the combined
image. The selection of change of the color and the selection of
change of the brightness are mutually exclusive. When the color of
connecting regions is to be changed, a color for the connecting
regions can be selected. In the change of color, it is possible to
further provide a list of color samples to allow a user to select a
desired color. In the case where the brightness of the areas for
the combined boundary regions is to be changed, or where the
brightness of the area other than the combined boundary regions is
to be changed, the degree to which the brightness is to be changed
can be set. In change of the brightness, the following cases may be
assumed: a case where an intuitive interface using, for example, a
slider is used; and a case where a numeric value is input to change
the brightness relatively with respect to the current brightness.
In addition, it is possible to set an .alpha. value for images
which are to be superimposed, when the combined boundary regions
are subjected to the superimposing process, so as to set display of
a semitransparent superimposed image.
[0121] FIG. 9D illustrates an exemplary display screen displayed
when the color of combined boundary regions is changed. In this
example, an image is rendered using a particular color as the color
of the cross-shaped connecting region which is a combined boundary
region in four pieces of partial image data, whereby the position
of the connecting region and relationship between the pieces of
partial image data can be recognized. FIG. 9E illustrates an exemplary
display screen in which the combined boundary region is
distinguished from the other regions by decreasing the brightness
of the combined image other than the combined boundary region. In
this example, the brightness of the area other than the combined
boundary region in four pieces of partial image data is decreased,
whereby cells and pieces of tissue in the combined boundary regions
can be intensively observed.
[0122] Change of Display of Combined-Boundary-Region Area
[0123] After the combined boundary regions in the combined-image
data are displayed in a manner desired by a user, the mode for the
display of the combined boundary regions can be further changed
through an instruction from the user. The flow in which the mode
for the display of the combined-boundary-region area (superimposing
process) is changed will be described by using the flowchart in
FIG. 10.
[0124] In step S1001, it is determined whether or not an
instruction from a user to change the display of the combined
boundary regions has been received. If an instruction has been
received, the process proceeds to step S1002. If an instruction has
not been received, the current display is maintained.
[0125] In step S1002, the instruction from the user about the mode
for the display of the combined boundary regions is ascertained.
[0126] In step S1003, image data for the combined boundary regions
is obtained. This process is the same as that in step S606 in FIG.
6.
[0127] In step S1004, the combined-boundary-region rendering unit
305 generates combined-boundary-region rendered data by using the
display method determined in step S1002. This process of generating
combined-boundary-region rendered data is the same as that in FIG.
7.
[0128] In step S1005, the combined-boundary-region rendered data
generated in step S1004 is superimposed on the combined image
described in step S603 in FIG. 6, and combined-image data which is
obtained after the superimposing process and in which the combined
boundary regions are distinguished from the other regions is
obtained. This superimposing process is the same as that in FIG.
8.
[0129] Thus, the display method for combined boundary regions in an
observation image can be changed depending on an instruction or an
intention of a user. For example, in the case where an image is
observed in the state in which a display method in which the
brightness of combined boundary regions is decreased is selected,
when the area of interest is shifted from an area other than the
combined boundary regions to an area in the combined boundary
regions, the brightness of the combined boundary regions is
returned back to the original value, and instead, the brightness of
the area other than the combined boundary regions is decreased.
Accordingly, a smooth morphological observation of pieces of tissue
and cells can be performed while attention is being paid to the
combined boundary regions.
[0130] In the present embodiment, it is possible to perform the
following sequence of processes: combined-boundary-region rendered
data is superimposed on the entire combined-image data in advance;
a region of the superimposed combined image which is to be
displayed on the image display apparatus 103 is selected; and the
selected region is output on the image display apparatus 103.
Alternatively, when necessary, combined-boundary-region rendered
data corresponding to a region to be displayed on the image display
apparatus 103 can be superimposed and output.
[0131] By distinguishing combined boundary regions in the
observation image from the other regions, the case is prevented in
which it is difficult to perform highly accurate diagnosis using
the combined boundary regions. In particular, in the present
embodiment, the case where a combined boundary region has a certain
amount of connected area obtained through interpolation is assumed.
Accordingly, an observation image is displayed, and at the same
time, the combined boundary regions are distinguished through
change of the brightness thereof, whereby diagnostic imaging can be
performed without hindering the diagnosis process.
Second Embodiment
[0132] An image display system according to a second embodiment of
the present invention will be described using figures.
[0133] In the first embodiment, display data to be displayed by
changing the color or the brightness of combined boundary regions
is generated for combined image data in which, for example,
interpolation is applied to pieces of image data obtained by
photographing divided portions. In the second embodiment, using a
combined image in which images obtained by photographing divided
portions are aligned along one-dimensional connecting regions
(lines), display data to be displayed in such a manner that the
lines for the connecting regions are distinguished is
generated.
[0134] Examples of the image combining method include a method in
which, depending on only the position accuracy of the stage, images
are aligned on the basis of the position information of the stage,
and a method in which the positions of the pixels in obtained
divided image data are changed through a geometric transform such
as an affine transform and in which the images are combined at
ideal positions of connecting regions. Other than components that
are different from those in the first embodiment, the configuration
described in the first embodiment can be used in the second
embodiment.
[0135] System Configuration of Image Processing Apparatus
[0136] FIG. 11 is an overall view of the apparatus configuration of
an image display system according to the second embodiment of the
present invention.
[0137] In FIG. 11, the image display system using an image
processing apparatus according to the present invention includes an
image server 1101, the image processing apparatus 102, and the
image display apparatus 103. The image processing apparatus 102 can
obtain divided images for an imaging target from the image server
1101, and can generate image data to be displayed on the image
display apparatus 103. The image server 1101 and the image
processing apparatus 102 are connected with a LAN cable 1103 which
is a general-purpose I/F via a network 1102. The image server 1101
is a computer including a large-capacity storage device which
stores divided image data captured by the imaging apparatus 101
which is a virtual slide apparatus. The image server 1101 may store
divided images as a group in a local storage connected to the image
server 1101. Alternatively, the image server 1101 may be
constituted by servers (cloud servers) that are separately present
somewhere on the network, and may hold each piece of the divided
image data and its link information separately. It is not
necessary for the divided image data itself to be stored in one
server. The image processing apparatus 102 and the image display
apparatus 103 are similar to those of the image pickup system
according to the first embodiment.
[0138] In the example in FIG. 11, an image processing system is
constituted by three apparatuses: the image server 1101, the
image processing apparatus 102, and the image display apparatus
103. However, the configuration of the present invention is not
limited to this. For example, an image processing apparatus into
which an image display apparatus is integrated may be used, or part
of the function of the image processing apparatus 102 may be
incorporated into the image server 1101. Conversely, the functions
of the image server 1101 and the image processing apparatus 102 may
be divided into smaller functions that are performed by multiple
apparatuses.
[0139] Display of Combined Boundary Regions
[0140] A combined image obtained after the superimposing process,
that is, display data which is generated by the superimposing
processor 306 included in the image processing apparatus according
to the second embodiment and which is displayed on the image
display apparatus 103, will be described using FIG. 12.
[0141] In FIG. 12, pieces of image data obtained by photographing
divided portions are connected after being subjected to, for
example, a coordinate transform. Specifically, a combined image is
generated by aligning the transformed pieces of image data along
their borders. By superimposing rendered data, which corresponds to
lines for connecting regions which are combined boundary regions,
on the connected combined-image data, combined-image data in which
the combined boundary regions are distinguished from other regions
is obtained. The color, the line width, and the line type of a line
for a connecting region may be set. For example, the line type
may be a single line or a multiple line, may be a dotted line, a
dashed line, or a dot-dash line, or may be a combination of these.
Further, display of a line for a connecting region may be switched
on and off in a time-division manner, so that the presence or
absence of the line is indicated by blinking.
[0142] Rendering Connecting Regions
[0143] FIG. 13 is a flowchart of the process in the second
embodiment which corresponds to the process in step S607 in FIG. 6
according to the first embodiment and in which rendered data for
combined boundary regions is generated.
[0144] In steps S1301, S1302, and S1303, the color, the line width,
and the line type, respectively, are selected in accordance with
the selection performed by the mode selector 307.
[0145] In step S1304, the setting values selected in steps S1301,
S1302, and S1303 are reflected so as to generate rendered data for
the lines which correspond to combined boundary regions.
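The flow of steps S1301 to S1304 can be sketched as follows. This is a minimal illustration under stated assumptions: the structure `LineStyle` and the function `render_line_pattern` are hypothetical names, and the dot patterns are illustrative values, none taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class LineStyle:
    # Settings chosen in steps S1301-S1303 (hypothetical structure)
    color: tuple        # RGB color selected in step S1301
    width: int          # line width in pixels selected in step S1302
    line_type: str      # "solid", "dotted", or "dashed" selected in step S1303

def render_line_pattern(length, style):
    """Step S1304: generate per-pixel on/off data along a boundary line
    of the given length, reflecting the selected line type."""
    patterns = {
        "solid":  [1],
        "dotted": [1, 0],
        "dashed": [1, 1, 1, 0, 0],
    }
    pattern = patterns.get(style.line_type, [1])
    return [pattern[i % len(pattern)] for i in range(length)]
```

The returned on/off data, together with the selected color and width, would form the rendered data for the combined boundary regions.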
[0146] Superimposition of Combined-Boundary-Region Rendered
Data
[0147] FIG. 14 is a flowchart of superimposition of
combined-boundary-region rendered data on a combined image
according to the second embodiment. This process flow corresponds
to that in FIG. 8 according to the first embodiment.
[0148] In step S1401, the combined-image data obtained through a
combining process performed by the combining processor 303 is
obtained.
[0149] In step S1402, the combined-boundary-region rendered data
generated by the combined-boundary-region rendering unit 305 is
obtained. The process of generating combined-boundary-region
rendered data is described with reference to FIG. 13.
[0150] In step S1403, the combined-boundary-region rendered data
obtained in step S1402 is superimposed on the combined-image data
obtained in step S1401.
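The superimposing step can be sketched as follows. This is a minimal illustration assuming the combined image is a 2D array of grayscale pixel values and the rendered data is a binary mask of the boundary lines; the function name `superimpose` is hypothetical.

```python
def superimpose(combined, line_mask, line_pixel=255):
    """Sketch of step S1403: wherever the boundary-line mask is set,
    replace the combined-image pixel with the line pixel value,
    leaving all other pixels unchanged."""
    return [
        [line_pixel if line_mask[r][c] else combined[r][c]
         for c in range(len(combined[0]))]
        for r in range(len(combined))
    ]
```

Building a new array rather than writing into `combined` keeps the original combined-image data available for display modes in which the boundary lines are switched off.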
[0151] Screen Layout
[0152] FIG. 15A illustrates an exemplary screen layout used when
image data generated by the image processing apparatus 102 is
displayed on the image display apparatus 103, according to the
second embodiment. The display area has a whole window 1501 in
which a display area 1502 for image-pickup-target image data for
detailed observation, a thumbnail image 1503 for the imaging target
which is to be observed, and a display setting area 1504 are
included. In the display area 1502 for image-pickup-target image
data, in addition to image-pickup-target image data for detailed
observation, a connecting region 1505 in the combined image is
displayed as a line. FIG. 15B illustrates a display setting screen
that is displayed as a dialog box when the setup button is pressed;
on this screen, whether or not the connecting regions which are
combined boundary regions in the combined image are to be displayed
as lines is selected. FIG. 15C illustrates a screen for various
display settings for the lines for connecting regions; this screen
is displayed as a dialog box when the lines for connecting regions
which are combined boundary regions are to be displayed, and on it,
how those lines are to be displayed in the combined image is set.
Specifically, for example, the color, the line width, and the line
type of a line can be selected and set. In the present embodiment,
as in the first embodiment, it is assumed that a setup button is
provided and a setting screen is opened by pressing the
button. However, a UI in which various detailed settings
illustrated in FIG. 15C can be displayed, selected, and changed may
be provided directly on the display setting area 1504.
Alternatively, a screen may be displayed in which a list for
detailed settings including whether or not combined boundary
regions are to be displayed is integrally provided.
[0153] In the present embodiment, an observer can determine that an
unnatural appearance in the image is caused by a position deviation
or a focus deviation between partial images, which inevitably
occur. As a result, the case is prevented in which it is difficult
to perform highly accurate diagnosis using combined boundary
regions.
Third Embodiment
[0154] In a third embodiment, the mode for displaying combined
boundary regions in a combined image is switched by using a display
magnification as a boundary.
[0155] In the third embodiment, as in the first embodiment, on the
precondition that a connecting region has a certain width, a
display magnification is set as a boundary. When the display
magnification is high, combined boundary regions are displayed as
in the first embodiment. When the display magnification becomes
lower, combined boundary regions are displayed as lines as in the
second embodiment.
[0156] For example, consider data in which the width of a
connecting region is 64 pixels at a display magnification of 40
times, which is used for detailed observation. The number of pixels
used to display the connecting region is 64 at a display
magnification (observation magnification in the field of optical
microscopy) of 40 times, and 32 at a display magnification of 20
times. Since the connecting region then has a sufficient width, it
is desirable in terms of observation to display the image for the
connecting region even though its brightness is changed. However,
in a screening process in pathological diagnosis, bird's-eye-view
observation is typically performed at a display magnification of 5
to 10 times. In the above-described case, the number of pixels used
to display the connecting region is then 8 to 16, which is not
sufficient for morphological observation of tissue and cells. As
the display magnification becomes lower, this tendency appears more
noticeably, and the visibility of a connecting region in which the
brightness is changed decreases significantly. To compensate for
this, at a magnification at which the width of a combined boundary
region is not sufficient for observation, it is effective to switch
to the method, described in the second embodiment, in which
combined boundary regions are displayed as lines. In general, it is
desirable to change the mode in which combined boundary regions are
displayed by using as a boundary a magnification of 10 times, at
which screening is performed.
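The magnification-dependent switch described above can be sketched as follows. The function name and the threshold parameter are hypothetical; the 64-pixel width and 40-times base magnification come from the example in the text.

```python
def boundary_display_mode(base_width_px, base_mag, display_mag, threshold_mag=10):
    """Return the on-screen width of a connecting region and the display
    mode ("region" as in the first embodiment, "line" as in the second)
    for a given display magnification."""
    # The displayed width scales linearly with the display magnification.
    width = base_width_px * display_mag / base_mag
    mode = "region" if display_mag > threshold_mag else "line"
    return width, mode
```

At 20 times the 64-pixel region still spans 32 pixels and is shown as a region; at the 5-to-10-times screening magnifications it shrinks to 8 to 16 pixels, so the line mode of the second embodiment is used instead.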
[0157] In the present embodiment, the method for rendering combined
boundary regions is switched in accordance with a display
magnification serving as a boundary, whereby it is possible to
display the connecting regions in a manner suitable for the purpose
of observation for each of the display magnifications.
Other Embodiments
[0158] In the first to third embodiments, a connecting region of
image data equivalent to that which can be visually observed by a
user, such as a microscope image, is described. The present
invention can be applied not only to a connecting region of an
image based on such visual information, but also to a connecting
region of display data obtained by an apparatus, such as a magnetic
resonance imaging apparatus (MRI), an X-ray diagnostic apparatus,
or diagnostic equipment using optical ultrasound, which visualizes
information about a generally invisible object such as an internal
structure of a human body by using various types of means or
principle, achieving similar effects. In particular, unlike a
visible image, such display data is generated from intensity-based
information. Therefore, when a change of contrast or brightness
around a connecting region, image degradation caused by a
correction process, or an error caused by alignment for stitching
occurs, it is very difficult, compared with image information that
has color information and is obtained using, for example, a
microscope, to determine whether such an image change or singular
point is due to the connecting region or indicates an abnormal
state of the diagnosis site. Therefore, it is very
important that a connecting region is clearly presented to a user
so that the user is given an indication that the image area around
the connecting region may have low reliability.
[0159] An object of the present invention may be achieved as
follows. That is, a recording medium (or storage medium) in which
software program codes which achieve some or all of the functions
of the above-described embodiments are recorded is supplied to a
system or an apparatus. Then, a computer (or a CPU or an MPU) in
the system or the apparatus reads out and executes the program
codes stored in the recording medium. In this case, the program
codes themselves which are read out from the recording medium
achieve the functions of the above-described embodiments, and the
recording medium in which the program codes are recorded is
included in the present invention.
[0160] When a computer executes the program codes which are read
out, for example, an operating system (OS) operating on the
computer may execute some or all of the actual processes on the
basis of instructions of the program codes. The case where these
processes achieve the functions of the above-described embodiments
may also be included in the present invention.
[0161] In addition, the program codes which are read out from the
recording medium may be written into a function expansion card
inserted into the computer or a memory included in a function
expansion unit which is connected to the computer. Then, for
example, a CPU included in the function expansion card or the
function expansion unit executes some or all of the actual
processes on the basis of instructions of the program codes, and
such processes achieve the functions of the above-described
embodiments. Such a case is also included in the present
invention.
[0162] In the case where the present invention is applied to the
above-described recording medium, program codes corresponding to
the flowcharts described above are stored in the recording
medium.
[0163] In the preferable image processing apparatus, the preferable
image display system, the preferable image processing method, and
the preferable image processing program according to the present
invention, the case where it is difficult to perform highly
accurate diagnosis using combined boundary regions in a combined
image can be prevented.
[0164] While the present invention has been described with
reference to exemplary embodiments, it is to be understood that the
invention is not limited to the disclosed exemplary embodiments.
The scope of the following claims is to be accorded the broadest
interpretation so as to encompass all such modifications and
equivalent structures and functions.
* * * * *