U.S. patent application number 17/679026 was filed with the patent office on 2022-02-23 and published on 2022-06-09 as publication number 20220180483 for image processing device, image processing method, and program.
This patent application is currently assigned to Sony Group Corporation. The applicant listed for this patent is Sony Group Corporation. The invention is credited to Suguru Aoki, Atsushi Ito, Hideki Oyaizu, Ryuta Satoh, and Takeshi Uemori.
Publication Number | 20220180483 |
Application Number | 17/679026 |
Family ID | 1000006157024 |
Filed Date | 2022-02-23 |
Publication Date | 2022-06-09 |
United States Patent Application | 20220180483 |
Kind Code | A1 |
Inventors | Aoki; Suguru; et al. |
Publication Date | June 9, 2022 |
IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM
Abstract
Image processing methods and apparatus are described. The image
processing method comprises receiving input of a visible-ray image
and an infrared-ray image obtained by photographing a same subject,
estimating, based on the visible-ray image, the infrared-ray image
and motion information, a blur estimate associated with the
visible-ray image, and generating, based on the estimated blur
estimate, a corrected visible-ray image.
Inventors: | Aoki; Suguru (Tokyo, JP); Satoh; Ryuta (Kanagawa, JP); Ito; Atsushi (Kanagawa, JP); Oyaizu; Hideki (Tokyo, JP); Uemori; Takeshi (Tokyo, JP) |
Applicant: | Sony Group Corporation, Tokyo, JP |
Assignee: | Sony Group Corporation, Tokyo, JP |
Family ID: | 1000006157024 |
Appl. No.: | 17/679026 |
Filed: | February 23, 2022 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
16642392 | Feb 27, 2020 | 11288777
PCT/JP2018/032077 | Aug 30, 2018 |
17679026 | |
Current U.S. Class: | 1/1 |
Current CPC Class: | G06T 2207/20201 20130101; G06T 2207/10048 20130101; G06T 5/50 20130101; G06T 5/003 20130101; G06T 2207/10024 20130101 |
International Class: | G06T 5/00 20060101 G06T005/00; G06T 5/50 20060101 G06T005/50 |
Foreign Application Data

Date | Code | Application Number
Sep 5, 2017 | JP | 2017-170035
Claims
1-19. (canceled)
20. An image processing device, comprising: image processing
circuitry configured to: receive input of a first image and a
second image obtained by photographing a same subject, wherein the
first image is a visible-ray image and the second image is an image
captured by detecting at least far-infrared light from the object;
estimate, based on the first image, the second image, and motion
information, a blur estimate associated with the first image; and
generate, based on the blur estimate, a corrected visible-ray
image.
21. The image processing device of claim 20, wherein the second
image is an image captured by detecting far-infrared ray light from
the object.
22. The image processing device of claim 20, further comprising
estimating, based on the first image and the second image, an
image-based blur estimate.
23. The image processing device of claim 22, wherein estimating the
image-based blur estimate comprises: applying each of a plurality
of filters having different blurring characteristics to the second
image to produce a plurality of blurred second images; comparing
the first image to the plurality of blurred second images; and
selecting a filter from among the plurality of filters that
produced the blurred second image having blurring most similar to
the first image.
24. The image processing device of claim 23, wherein the different
blurring characteristics correspond to different point-spread
functions.
25. The image processing device of claim 24, wherein comparing the
first image to the plurality of blurred second images comprises:
calculating correlation values between the first image and each of
the plurality of blurred second images, and wherein selecting a
filter from among the plurality of filters comprises selecting the
filter that produced the blurred second image having a highest
correlation value from among the calculated correlation values.
26. The image processing device of claim 20, wherein the image
processing circuitry is further configured to: estimate, based on
the motion information, a motion-based blur estimate.
27. The image processing device of claim 26, wherein estimating the
motion-based blur estimate comprises: determining, based on the
motion information, a direction and magnitude of blur in the first
image.
28. An image processing method that is performed in an image
processing device, the image processing method comprising:
receiving input of a first image and a second image obtained by
photographing a same subject, wherein the first image is a
visible-ray image and the second image is an image captured by
detecting at least far-infrared light from the object; estimating,
based on the first image, the second image, and motion information,
a blur estimate associated with the first image; and generating,
based on the blur estimate, a corrected visible-ray image.
29. The image processing method of claim 28, wherein the second
image is an image captured by detecting far-infrared ray light from
the object.
30. The image processing method of claim 28, further comprising
estimating, based on the first image and the second image, an
image-based blur estimate.
31. The image processing method of claim 30, wherein estimating the
image-based blur estimate comprises: applying each of a plurality
of filters having different blurring characteristics to the second
image to produce a plurality of blurred second images; comparing
the first image to the plurality of blurred second images; and
selecting a filter from among the plurality of filters that
produced the blurred second image having blurring most similar to
the first image.
32. The image processing method of claim 31, wherein the different
blurring characteristics correspond to different point-spread
functions.
33. The image processing method of claim 32, wherein comparing the
first image to the plurality of blurred second images comprises:
calculating correlation values between the first image and each of
the plurality of blurred second images, and wherein selecting a
filter from among the plurality of filters comprises selecting the
filter that produced the blurred second image having a highest
correlation value from among the calculated correlation values.
34. The image processing method of claim 33, further comprising:
estimating, based on the motion information, a motion-based blur
estimate.
35. The image processing method of claim 34, wherein estimating the
motion-based blur estimate comprises: determining, based on the
motion information, a direction and magnitude of blur in the first
image.
36. A non-transitory computer readable medium encoded with a
plurality of instructions that, when executed by image processing
circuitry of an image processing device, perform an image
processing method, the image processing method comprising:
receiving input of a first image and a second image obtained by
photographing a same subject, wherein the first image is a
visible-ray image and the second image is an image captured by
detecting at least far-infrared light from the object; estimating,
based on the first image, the second image, and motion information,
a blur estimate associated with the first image; and generating,
based on the blur estimate, a corrected visible-ray image.
37. The non-transitory computer readable medium of claim 36,
wherein the second image is an image captured by detecting
far-infrared ray light from the object.
38. The non-transitory computer readable medium of claim 36,
further comprising estimating, based on the first image and the
second image, an image-based blur estimate.
39. The non-transitory computer readable medium of claim 38,
wherein estimating the image-based blur estimate comprises:
applying each of a plurality of filters having different blurring
characteristics to the second image to produce a plurality of
blurred second images; comparing the first image to the plurality
of blurred second images; and selecting a filter from among the
plurality of filters that produced the blurred second image having
blurring most similar to the first image.
40. The non-transitory computer readable medium of claim 39,
wherein the different blurring characteristics correspond to
different point-spread functions.
41. The non-transitory computer readable medium of claim 40,
wherein comparing the first image to the plurality of blurred
second images comprises: calculating correlation values between the
first image and each of the plurality of blurred second images, and
wherein selecting a filter from among the plurality of filters
comprises selecting the filter that produced the blurred second
image having a highest correlation value from among the calculated
correlation values.
42. The non-transitory computer readable medium of claim 41,
further comprising: estimating, based on the motion information, a
motion-based blur estimate.
43. The non-transitory computer readable medium of claim 42,
wherein estimating the motion-based blur estimate comprises:
determining, based on the motion information, a direction and
magnitude of blur in the first image.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit under 35 U.S.C. § 120 as a continuation application of U.S. application Ser. No. 16/642,392, filed on Feb. 27, 2020, which claims the benefit under 35 U.S.C. § 371 as a U.S. National Stage Entry of International Application No. PCT/JP2018/032077, filed in the Japanese Patent Office as a Receiving Office on Aug. 30, 2018, which claims priority to Japanese Patent Application Number JP2017-170035, filed in the Japanese Patent Office on Sep. 5, 2017, each of which applications is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
[0002] The present disclosure relates to an image processing
device, an image processing method, and a program. Specifically,
the present disclosure relates to an image processing device, an
image processing method, and a program for inputting camera motion
information together with a visible light image and an infrared
image to reduce a blur (defocusing) on the visible light image.
BACKGROUND ART
[0003] When a visible light image is captured in a dark environment such as at night, a long exposure time is needed, and as a result, blur (defocusing) due to camera motion or object motion is easily generated.
[0004] A related technology for addressing this problem is, for example, the technique disclosed in Patent Literature 1 (JP 2003-209735 A).
[0005] Patent Literature 1 discloses a technology that analyzes image motion using multiple images continuously captured by a visible light camera, and corrects defocusing on the basis of the motion analysis result.
[0006] However, the configuration described in Patent Literature 1 requires multiple continuously captured images, so the processing cannot be executed for a single still image. Moreover, because the image motion must be analyzed across the multiple continuously captured images, immediate processing of each image cannot be performed.
[0007] Further, the configuration described in Patent Literature 1 discloses defocusing correction only for images captured by a fixed camera, and does not disclose processing for a moving camera. Its defocusing-reduction effect is therefore limited for a moving camera such as an in-vehicle camera.
CITATION LIST
Patent Literature
[0008] [PTL 1] [0009] JP 2003-209735 A
SUMMARY
Technical Problem
[0010] The present disclosure has been made in view of the above-described problems, for example, and is intended to provide an image processing device, an image processing method, and a program that eliminate or reduce blur (defocusing) on a visible light image by using the visible light image and an infrared image, without requiring multiple continuously captured images.
[0011] Further, the present disclosure is intended, for example, to provide an image processing device, an image processing method, and a program that effectively eliminate or reduce blur (defocusing) on a visible light image by processing that takes camera motion into account, even for a moving camera such as an in-vehicle camera.
Solution to Problem
[0012] According to the present disclosure, an image processing
device is provided. The image processing device comprises image
processing circuitry configured to receive input of a visible-ray
image and an infrared-ray image obtained by photographing a same
subject, estimate, based on the visible-ray image, the infrared-ray
image and motion information, a blur estimate associated with the
visible-ray image, and generate, based on the estimated blur
estimate, a corrected visible-ray image.
[0013] According to the present disclosure, an image processing
method performed in an image processing device is provided.
[0014] The image processing method comprises receiving input of a
visible-ray image and an infrared-ray image obtained by
photographing a same subject, estimating, based on the visible-ray
image, the infrared-ray image and motion information, a blur
estimate associated with the visible-ray image, and generating,
based on the estimated blur estimate, a corrected visible-ray
image.
[0015] According to the present disclosure, a non-transitory
computer readable medium is provided. The non-transitory computer
readable medium is encoded with a plurality of instructions that,
when executed by image processing circuitry of an image processing
device, perform an image processing method. The image processing
method comprises receiving input of a visible-ray image and an
infrared-ray image obtained by photographing a same subject,
estimating, based on the visible-ray image, the infrared-ray image
and motion information, a blur estimate associated with the
visible-ray image, and generating, based on the estimated blur
estimate, a corrected visible-ray image.
[0016] Note that the program of the present disclosure can be provided, for example, by a storage medium or a communication medium that supplies program code in a computer-readable form to an information processing device or a computer system capable of executing the program code. By providing the program in a computer-readable form, processing according to the program is implemented on the information processing device or the computer system.
[0017] Other objects, features, and advantageous effects of the
present disclosure will be obvious from more detailed description
based on embodiments of the present disclosure and the attached
drawings as described later. Note that a system in the present
specification is a logical configuration set of multiple devices,
and is not limited to each device configuration in the same
housing.
Advantageous Effects of Invention
[0018] According to the configuration of one embodiment of the
present disclosure, the device and method for executing the image
quality improvement processing of removing or reducing the blur
(defocusing) on the visible light image are implemented.
[0019] Specifically, the visible light image and the far-infrared image captured by simultaneous photographing of the same object, together with the camera motion information, are input. The camera motion-based blur, i.e., the blur (defocusing) on the visible light image due to the camera motion, is estimated. The visible light image, the far-infrared image, and the camera motion-based blur are then used to estimate an integrated filter, i.e., a filter that generates the blur corresponding to the integration of the visible light image-based blur and the camera motion-based blur. An opposite characteristic filter, having characteristics opposite to those of the estimated integrated filter, is applied to the visible light image to generate a corrected visible light image whose blur has been removed or reduced.
[0020] By these types of processing, the device and method for
executing the image quality improvement processing of removing or
reducing the blur (defocusing) on the visible light image are
implemented.
[0021] Note that the advantageous effects described in the present specification are merely examples and are not limiting. Moreover, additional advantageous effects may be provided.
BRIEF DESCRIPTION OF DRAWINGS
[0022] FIG. 1 illustrates a diagram for describing the outline of
processing executed by an image processing device of the present
disclosure.
[0023] FIG. 2 illustrates a diagram for describing a correspondence
between the type of captured image and a light wavelength.
[0024] FIG. 3 illustrates views for describing examples of a
visible light image and a far-infrared image.
[0025] FIG. 4 illustrates diagrams for describing a blur on the
captured image due to camera motion.
[0026] FIG. 5 illustrates diagrams for describing the blur on the
captured image due to the camera motion and influence of object
motion.
[0027] FIG. 6 illustrates views for describing an example of camera
attachment to an automobile and detection of the camera motion.
[0028] FIG. 7 illustrates diagrams for describing configuration and
processing examples of the image processing device of the present
disclosure.
[0029] FIG. 8 illustrates a diagram for describing configuration
and processing examples of the image processing device of the
present disclosure.
[0030] FIG. 9 illustrates a diagram for describing configuration
and processing examples of an image processing device of a first
embodiment of the present disclosure.
[0031] FIG. 10 illustrates diagrams for describing a camera motion
blur map.
[0032] FIG. 11 illustrates a diagram for describing configuration
and processing examples of the image processing device of the first
embodiment of the present disclosure.
[0033] FIG. 12 illustrates a flowchart for describing a sequence of
processing executed by the image processing device of the first
embodiment of the present disclosure.
[0034] FIG. 13 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the first
embodiment of the present disclosure.
[0035] FIG. 14 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the first
embodiment of the present disclosure.
[0036] FIG. 15 illustrates a diagram for describing configuration
and processing examples of an image processing device of a second
embodiment of the present disclosure.
[0037] FIG. 16 illustrates diagrams for describing data stored in a
filter bank pool and an example of a filter bank selection
processing based on camera motion.
[0038] FIG. 17 illustrates a diagram for describing configuration
and processing examples of the image processing device of the
second embodiment of the present disclosure.
[0039] FIG. 18 illustrates diagrams for describing examples of
processing executed by the image processing device of the second
embodiment of the present disclosure.
[0040] FIG. 19 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the
second embodiment of the present disclosure.
[0041] FIG. 20 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the
second embodiment of the present disclosure.
[0042] FIG. 21 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the
second embodiment of the present disclosure.
[0043] FIG. 22 illustrates a diagram for describing configuration
and processing examples of an image processing device of a third
embodiment of the present disclosure.
[0044] FIG. 23 illustrates diagrams for describing examples of
processing executed by the image processing device of the third
embodiment of the present disclosure.
[0045] FIG. 24 illustrates a diagram for describing configuration
and processing examples of the image processing device of the third
embodiment of the present disclosure.
[0046] FIG. 25 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the third
embodiment of the present disclosure.
[0047] FIG. 26 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the third
embodiment of the present disclosure.
[0048] FIG. 27 illustrates a diagram for describing an example of
processing executed by an image processing device of a fourth
embodiment of the present disclosure.
[0049] FIG. 28 illustrates a diagram for describing an example of
the processing executed by the image processing device of the
fourth embodiment of the present disclosure.
[0050] FIG. 29 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the
fourth embodiment of the present disclosure.
[0051] FIG. 30 illustrates a flowchart for describing a sequence of
the processing executed by the image processing device of the
fourth embodiment of the present disclosure.
[0052] FIG. 31 illustrates a diagram for describing a hardware
configuration example of the image processing device.
[0053] FIG. 32 illustrates a diagram for describing a configuration
example of a vehicle control system having the function of the
image processing device of the present disclosure.
DESCRIPTION OF EMBODIMENTS
[0054] Hereinafter, details of an image processing device, an image
processing method, and a program according to an embodiment of the
present disclosure will be described with reference to the
drawings. Note that description will be made in accordance with the
following contents:
[0055] 1. Outline of configuration and processing of image
processing device of present disclosure
[0056] 2. Specific examples of blur generated due to camera
motion
[0057] 3. Example of camera attachment to vehicle
[0058] 4. Outline of configuration and processing of image
processing device of present disclosure
[0059] 5. (First Embodiment) Configuration and processing of image
processing device corresponding to configuration A
[0060] 6. (Second Embodiment) Configuration and processing of image
processing device corresponding to configuration B
[0061] 7. (Third Embodiment) Configuration and processing of image
processing device corresponding to configuration A+C
[0062] 8. (Fourth Embodiment) Configuration and processing of image
processing device corresponding to configuration B+C
[0063] 9. Hardware configuration example of image processing
device
[0064] 10. Configuration example of vehicle control system
including image processing device of present disclosure in
vehicle
[0065] 11. Summary of configuration of present disclosure
[0066] (1. Outline of Configuration and Processing of Image
Processing Device of Present Disclosure)
[0067] First, the outline of the configuration and processing of
the image processing device of the present disclosure will be
described with reference to FIG. 1 and subsequent figures.
[0068] FIG. 1 is a diagram for describing the outline of the
processing executed by the image processing device of the present
disclosure.
[0069] The image processing device of the present disclosure is
configured to execute the processing of receiving a visible light
image and an infrared image captured by photographing of the same
object and receiving camera motion information of cameras having
captured these images, thereby reducing a blur (defocusing) on the
visible light image.
[0070] When the visible light image is captured in a dark environment such as at night, a long exposure time is needed, and as a result, a blur due to camera motion or object motion is easily generated.
[0071] The image processing device of the present disclosure
utilizes a far-infrared image captured by simultaneous
photographing of the same object to reduce the blur on the visible
light image captured under such environment, for example.
[0072] The infrared image is an image in which each pixel value is set according to heat emitted from the object, and it allows detection of, for example, a human body temperature. Thus, a heat-emitting subject such as a person can be photographed even in the dark, and infrared images are used in security cameras and the like.
[0073] Of infrared light, far-infrared light with a long wavelength
exhibits high sensitivity to heat. Thus, even in photographing with
a short exposure time, the object generating heat, such as a
person, can be relatively clearly photographed.
[0074] In the case of capturing the visible light image in the
dark, such as the night-time, the exposure time needs to be long,
and the blur (defocusing) spreads in association with the camera or
object motion.
[0075] However, the far-infrared image can be clearly captured for
the object generating heat, such as a person, for example, even
when photographing is performed with a short exposure time in the
dark.
[0076] The image processing device of the present disclosure exploits this difference in characteristics between the visible light image and the far-infrared image to correct the more heavily blurred visible light image. That is, correction (blur removal) processing is performed using the less blurred infrared image as a reference image, and in this manner, a visible light image whose blur has been eliminated or reduced is generated.
[0077] The outline of the processing executed by the image
processing device of the present disclosure will be described with
reference to FIG. 1.
[0078] As illustrated in FIG. 1, the image processing device of the
present disclosure receives a blurred visible light image 11 and a
blur-less far-infrared image 12, these images being simultaneously
captured by photographing of the same object.
[0079] Further, camera motion information 13 on cameras having
captured these images is also input.
[0080] The image processing device of the present disclosure first
utilizes, at a step S20, the two input images and the camera motion
information to perform visible light image blur estimation.
[0081] Specifically, a point spread function (PSF) as a function
indicating an image defocusing amount is estimated, for
example.
[0082] The PSF is a function indicating the degree of spread about
a pixel value of a certain pixel position, i.e., the defocusing
amount or a defocusing form.
[0083] At the step S20, filters corresponding to various point spread functions (PSFs), i.e., filters for generating blurs (defocusing), are applied to the blur-less far-infrared image 12, thereby generating intentionally-blurred far-infrared images. Each filter-applied far-infrared image is compared with the blurred visible light image 11 (a correlation is calculated).
[0084] On the basis of such comparison processing (correlation
calculation), the filter corresponding to the point spread function
(PSF) for generating a blur (defocusing) similar to that of the
blurred visible light image 11 is selected.
[0085] Note that the filter to be selected at the step S20
corresponds to a filter for generating the blurred visible light
image 11 in the case of applying such a filter to a blur-less
(defocusing-less) visible light image.
[0086] However, no blur-less (defocusing-less) visible light image is available as a captured image, and therefore, the blur-less far-infrared image 12 is utilized as a substitute image.
[0087] That is, by such application to the blur-less far-infrared
image 12, the filter for generating the blur (defocusing) similar
to that on the blurred visible light image 11 is selected, or the
point spread function (PSF) for generating the blur (defocusing)
similar to that on the blurred visible light image 11 is
calculated.
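The filter selection described for the step S20 can be sketched in code. The following is an illustrative reconstruction, not the disclosed implementation: the linear motion-blur PSF bank, the FFT-based circular convolution, and the use of normalized cross-correlation as the comparison measure are all assumptions introduced for this sketch.

```python
import numpy as np

def motion_psf(length, angle_deg, size=15):
    """Build a simple linear motion-blur PSF kernel (illustrative)."""
    psf = np.zeros((size, size))
    c = size // 2
    theta = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2.0, length / 2.0, 4 * size):
        y = int(round(c + t * np.sin(theta)))
        x = int(round(c + t * np.cos(theta)))
        if 0 <= y < size and 0 <= x < size:
            psf[y, x] = 1.0
    return psf / psf.sum()

def apply_psf(image, psf):
    """Blur an image with a PSF via FFT-based circular convolution."""
    kh, kw = psf.shape
    padded = np.zeros(image.shape)
    padded[:kh, :kw] = psf
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))

def ncc(a, b):
    """Normalized cross-correlation between two images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def select_blur_filter(blurred_visible, sharp_fir, psf_bank):
    """Blur the sharp far-infrared image with each candidate PSF and
    return the index of the PSF whose result correlates best with the
    blurred visible light image."""
    scores = [ncc(blurred_visible, apply_psf(sharp_fir, psf)) for psf in psf_bank]
    return int(np.argmax(scores))
```

Under this sketch, the selected PSF plays the role of the point spread function estimated at the step S20.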
[0088] Next, the processing of removing the blur on the visible
light image is performed at a step S40.
[0089] The blur removing processing generates an inverse filter with characteristics opposite to the filter characteristics indicated by the above-described point spread function PSF=p(x, y), and applies the generated inverse filter to the blurred visible light image 11.
[0090] By such inverse filter application processing, the blur
(defocusing) is removed from the blurred visible light image 11,
and therefore, a blur-reduced visible light image 15 is
generated.
[0091] Note that as the visible light image blur removing
processing of the step S40, filter processing, which is called
"deconvolution processing," in a frequency domain is
applicable.
[0092] When the point spread function of the blurred visible light image 11 is PSF=p(x, y), the blurred visible light image 11 is b(x, y), and the blur-less proper visible light image is s(x, y), and the respective Fourier transforms are P(u, v), B(u, v), and S(u, v), the following relational expressions are satisfied:
b(x,y)=p(x,y)*s(x,y)
B(u,v)=P(u,v)×S(u,v)
[0093] where "*" indicates the convolution operation, and
B(u,v)=FT[b(x,y)]
P(u,v)=FT[p(x,y)]
S(u,v)=FT[s(x,y)]
[0094] where FT[ ] denotes the Fourier transform.
[0095] The blur removal task is to calculate the blur-less proper visible light image s(x, y) from the blurred visible light image 11, b(x, y) (equivalently, to calculate S(u, v) from B(u, v)). A filter for performing this processing is called a "deconvolution filter," and the processing of applying such a filter is called "deconvolution processing."
[0096] The deconvolution filter is an inverse filter with
characteristics opposite to filter characteristics indicated by
PSF=p(x, y).
[0097] As described above, at the step S40, an inverse filter with characteristics opposite to the filter characteristics indicated by PSF=p(x, y), which represents the blur form of the blurred visible light image 11 estimated at the step S20, is generated, and the generated inverse filter is applied to the blurred visible light image 11. That is, the "deconvolution processing" is executed to generate, from the blurred visible light image 11, the blur-reduced visible light image 15 whose blur (defocusing) has been removed.
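The frequency-domain relations above suggest a direct implementation of the "deconvolution processing." A naive inverse filter S = B/P blows up wherever P(u, v) ≈ 0, so the sketch below uses a Wiener-style regularized inverse instead; the regularization constant eps and all function names are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

def pad_psf(psf, shape):
    """Embed a small PSF kernel into a full-size array with its
    center at the origin, for FFT-based (circular) filtering."""
    kh, kw = psf.shape
    padded = np.zeros(shape)
    padded[:kh, :kw] = psf
    return np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))

def blur(image, psf):
    """b(x, y) = p(x, y) * s(x, y), computed as B = P x S in the frequency domain."""
    P = np.fft.fft2(pad_psf(psf, image.shape))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * P))

def deconvolve(blurred, psf, eps=1e-6):
    """Recover s(x, y) from b(x, y): S = B conj(P) / (|P|^2 + eps),
    where eps stabilizes frequencies at which P(u, v) is close to zero."""
    P = np.fft.fft2(pad_psf(psf, blurred.shape))
    B = np.fft.fft2(blurred)
    S = B * np.conj(P) / (np.abs(P) ** 2 + eps)
    return np.real(np.fft.ifft2(S))
```

As eps approaches zero and P is nowhere zero, this reduces to the exact inverse filter described above.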
[0098] Next, the visible light image and the infrared image will be
described with reference to FIG. 2.
[0099] As illustrated in FIG. 2, the visible light image covers a wavelength range of about 0.4 μm to 0.7 μm, and is a color image captured by a typical camera, such as an RGB image.
[0100] On the other hand, the infrared image is captured from longer-wavelength light with a wavelength of 0.7 μm or more. An infrared image camera can photograph, for example, a heat-emitting subject such as a person in the dark, and is used in security cameras and the like.
[0101] Note that as illustrated in FIG. 2, the infrared light is divided into:
[0102] near-infrared light with a wavelength of about 0.7 to 1 μm;
[0103] middle-infrared light with a wavelength of about 3 to 5 μm; and
[0104] far-infrared light with a wavelength of about 8 to 14 μm.
[0105] In the embodiments described below, an image processing example utilizing a far-infrared image as a captured image with far-infrared light having a wavelength of about 8 to 14 μm will be mainly described.
[0106] Note that the processing of the present disclosure is not
limited to the far-infrared image, and is also applicable to
processing utilizing other infrared images.
[0107] As described earlier, in the case of capturing the visible
light image in a dark environment such as the night-time, the
exposure time needs to be long, and as a result, blur due to the
camera motion or the object motion is easily generated. On the
other hand, the far-infrared image can clearly capture an object
generating heat, such as a person, even upon photographing with a
short exposure time.
[0108] Specific captured image examples will be illustrated in FIG.
3.
[0109] FIG. 3 illustrates the examples of the visible light image
and the far-infrared image as the captured images upon
photographing at an intersection during the night-time. These two
images are captured in a dark environment, and long exposure is
performed for the visible light image. When (1) the visible light
image and (2) the far-infrared image are compared with each other,
(1) the visible light image shows severe blur (defocusing), and
the persons in it are barely recognizable. However, (2) the
far-infrared image clearly shows the persons.
[0110] This is because the exposure time for the far-infrared image
is short and almost no blur (defocusing) is generated.
[0111] As described above, the image processing device of the
present disclosure uses, as the reference image, the far-infrared
image with almost no blurs to correct the blurred visible light
image, thereby generating the visible light image whose blur has
been removed or reduced.
[0112] (2. Specific Examples of Blur Generated Due to Camera Motion)
Next, specific examples of the blur generated due to motion of the
camera configured to capture the image will be described with
reference to FIG. 4 and subsequent figures.
[0113] The configuration of the present disclosure implements, for
example, a configuration in which a visible light image camera and
a far-infrared image camera are mounted on a vehicle to generate a
blur-less visible light image from images captured by these cameras
and provide the blur-less visible light image to a driver.
[0114] The vehicle moves at high speed, and therefore, the images
are captured while the cameras are also moving at high speed. Thus,
the captured images are blurred (defocused) according to the camera
motion.
[0115] FIG. 4 illustrates diagrams for describing blur forms
corresponding to various types of camera motion, and the direction
and size of the blur generated on the captured image in association
with the following four types of camera motion are indicated by a
vector:
[0116] (1) the camera moves straight forward;
[0117] (2) the camera moves straight backward;
[0118] (3) the camera rotates to the right; and
[0119] (4) the camera rotates upward.
[0120] For example, in the case of "(1) the camera moves straight
forward," the blur is generated as if the image were flowing from a
center portion toward a peripheral portion of the captured image,
as illustrated in FIG. 4(1). The size of the blur is smaller at the
center of the image, and becomes greater toward the periphery of
the image.
[0121] Moreover, in the case of "(2) the camera moves straight
backward," the blur is generated as if the image were flowing from
the peripheral portion toward the center portion of the captured
image, as illustrated in FIG. 4(2). The size of the blur is smaller
at the center of the image, and becomes greater toward the
periphery of the image.
[0122] Further, in the case of "(3) the camera rotates to the
right," the blur is generated as if the image were flowing from the
right to the left of the captured image, as illustrated in the
figure. The size of the blur is substantially uniform across the
image.
[0123] In addition, in the case of "(4) the camera rotates upward,"
the blur is generated as if the image were flowing from the top to
the bottom of the captured image, as illustrated in the figure. The
size of the blur is substantially uniform across the image.
[0124] As described above, the captured image is blurred
(defocused) according to the camera motion.
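The vector fields of FIG. 4 can be sketched as follows. This is a hypothetical illustration, assuming the blur magnitude grows linearly with distance from the image center for straight motion; the scale factors, function names, and coordinate conventions are assumptions, not from the disclosure:

```python
import numpy as np

def blur_field_forward(h, w, scale=0.1):
    """Blur vectors for a camera moving straight forward (FIG. 4(1)):
    the image flows from the center toward the periphery, and the blur
    size grows with the distance from the image center."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # One (dx, dy) blur vector per pixel, pointing radially outward
    return np.stack([(xs - cx) * scale, (ys - cy) * scale], axis=-1)

def blur_field_rotate_right(h, w, size=2.0):
    """Blur vectors for a camera rotating to the right (FIG. 4(3)):
    a substantially uniform right-to-left flow across the image."""
    field = np.zeros((h, w, 2))
    field[..., 0] = -size        # x-component only: right-to-left
    return field
```

Negating the forward field gives the backward-motion case of FIG. 4(2), and swapping the rotation field's components gives the upward-rotation case of FIG. 4(4).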
[0125] Note that the vectors of each image illustrated in FIG. 4
indicate the direction and size of the blur in a case where no
moving object is present on the image captured by the camera.
[0126] In a case where a moving object is present on the image
captured by the camera, the vector indicating the direction and
size of the blur varies according to the object motion.
[0127] FIG. 5 illustrates diagrams of examples of the vector
indicating the direction and size of the blur in a case where the
camera is moving and the object on the image captured by the camera
is also moving.
[0128] FIG. 5 illustrates the following two image examples:
[0129] (1) while the camera is moving straight forward, the object
on the front left side is moving forward at higher speed than the
camera; and
[0130] (2) while the camera is moving straight backward, the object
on the front left side is moving backward at higher speed than the
camera.
[0131] The example of FIG. 5(1) corresponds, for example, to a
situation where a motorbike is passing at high speed on the left
front side of an in-vehicle camera.
[0132] The direction and size of the blur in a case where the
camera is moving straight forward and no moving object is present
are set as described earlier with reference to FIG. 4(1). However,
in a case where the moving object is present, a setting illustrated
in FIG. 5(1) is made.
[0133] A dashed frame illustrated in FIG. 5(1) corresponds to a
vector indicating the direction and size of the blur generated due
to the object moving forward at higher speed than the camera.
[0134] Such a vector points in a direction opposite to those of
vectors in other regions.
[0135] Moreover, the example of FIG. 5(2) is an example of an image
captured by an in-vehicle camera for backward photographing, and
corresponds to a situation where a motorbike is approaching at high
speed from the back left side of a vehicle, for example.
[0136] The direction and size of the blur in a case where the
camera is moving straight backward and no moving object is present
are set as described earlier with reference to FIG. 4(2). However,
in a case where the moving object is present, a setting illustrated
in FIG. 5(2) is made.
[0137] A dashed frame illustrated in FIG. 5(2) corresponds to a
vector indicating the direction and size of the blur generated due
to the object moving backward at higher speed than the camera.
[0138] Such a vector points in a direction opposite to those of
vectors in other regions.
[0139] As described above, in a case where the camera is moving and
the object on the image captured by the camera is also moving, the
direction and size of the blur are determined according to these
types of motion.
[0140] Thus, for eliminating or reducing the blur with high
accuracy, the processing needs to be performed considering both the
camera motion and the object motion.
[0141] (3. Example of Camera Attachment to Vehicle)
[0142] Next, an example of the image processing device configured
to perform the processing by means of the input images captured by
the cameras attached to the vehicle will be described as one
configuration example of the image processing device of the present
disclosure.
[0143] As described above, one configuration example of the image
processing device of the present disclosure implements, for
example, a configuration in which images captured by a visible
light image camera and a far-infrared image camera mounted on a
vehicle are input to generate a blur-less visible light image and
provide the blur-less visible light image to a driver.
[0144] FIG. 6 illustrates views of one example of camera attachment
to the vehicle.
[0145] As illustrated in FIG. 6(a) as an upper view, two cameras
including the visible light camera and the far-infrared camera are
mounted on the vehicle. The images captured by these cameras are
input to the image processing device provided inside the vehicle,
and then, the processing of removing or reducing the blur
(defocusing) of the visible light image is executed.
[0146] The visible light image whose blur (defocusing) has been
removed or reduced is displayed on a display provided at a driver's
seat. Moreover, such a visible light image is output to an
automatic driving controller, and is utilized as information for
automatic driving, such as obstacle detection information, for
example.
[0147] Note that the example illustrated in FIG. 6 shows a setting
in which the cameras perform photographing only in a forward
direction, but this is merely one example. A setting may be made
such that each camera is also set for a backward or lateral
direction of the vehicle to capture images in all directions.
[0148] Note that as described above, the image processing device of
the present disclosure receives the visible light image and the
far-infrared image, as well as receiving the camera motion
information. In this manner, the image processing device implements
high-accuracy blur removing processing.
[0149] For example, the camera is set to move together with the
vehicle, and the camera motion information is acquired from the
camera or a sensor attached to the vehicle.
[0150] The sensor includes, for example, a gyro, an IMU, an
acceleration sensor, an inclination sensor, and the like. Note that
the inertial measurement unit (IMU) is a sensor configured to
detect angles, angular speeds, and accelerations in triaxial
directions.
[0151] As illustrated in FIG. 6(c), the sensor detects a camera
movement direction, a camera movement speed, a camera rotation
radius, and the like, for example, and inputs, as the camera motion
information, these types of information to a data processor of the
image processing device.
[0152] (4. Outline of Configuration and Processing of Image
Processing Device of Present Disclosure)
Next, the outline of the configuration and processing of the image
processing device of the present disclosure will be described with
reference to FIG. 7 and subsequent figures.
[0153] As described earlier with reference to FIG. 1, the image
processing device of the present disclosure receives the visible
light image and the far-infrared image captured by simultaneous
photographing of the same object and the camera motion information
of the cameras having captured these images, thereby executing the
processing of eliminating or reducing the blur of the visible light
image. Multiple configuration examples of the image processing
device configured to execute such processing will be described with
reference to FIGS. 7 and 8.
[0154] FIGS. 7 and 8 illustrate the following multiple
configuration examples of the image processing device of the
present disclosure:
[0155] (Configuration A) a configuration example where camera
motion-based blur estimation information based on the camera motion
information is, after image-based blur estimation using the visible
light image and the far-infrared image, utilized to execute final
integrated blur estimation;
[0156] (Configuration B) a configuration example where image-based
blur estimation using the visible light image and the far-infrared
image and camera motion-based blur estimation based on the camera
motion information are executed in combination to execute
integrated blur estimation;
[0157] (Configuration A+C) the configuration A plus a configuration
example where the degree of reliability of a filter (blur)
estimation result is calculated to perform the blur removing
processing according to the degree of reliability; and
[0158] (Configuration B+C) the configuration B plus a configuration
example where the degree of reliability of the filter (blur)
estimation result is calculated to perform the blur removing
processing according to the degree of reliability.
[0159] The image processing device of the present disclosure
includes various configuration examples as illustrated in FIGS. 7
and 8.
[0160] Specific configuration and processing of each of these
configuration examples will be described later, but the outline of
the processing according to these four types of configurations will
be first described.
[0161] (Configuration A) The configuration example where the camera
motion-based blur estimation information based on the camera motion
information is, after image-based blur estimation using the visible
light image and the far-infrared image, utilized to execute final
integrated blur estimation.
[0162] In this configuration A, two processing steps described with
reference to FIG. 1, i.e., the blur estimation processing of the
step S20 and the blur removing processing of the step S40, are
performed.
[0163] Note that in the configuration A illustrated in FIG. 7, the
blur estimation processing of the step S20 described with reference
to FIG. 1 includes the following three steps:
[0164] (Step S21) estimation of image-based blur (Et);
[0165] (Step S22) estimation of camera motion-based blur (Ec);
and
[0166] (Step S23) estimation of integrated blur (Eall).
[0167] The meaning of each reference character used in FIG. 7, the
other drawings, and the following description is as follows:
[0168] Et: an image-based blur as a visible light image blur
estimated on the basis of the visible light image and the
far-infrared image, specifically defocusing form information such
as a point spread function (PSF);
[0169] Ec: a camera motion-based blur as a visible light image blur
estimated on the basis of motion of the camera having captured the
image (the visible light image and the far-infrared image),
specifically defocusing form information such as a PSF; and
[0170] Eall: an integrated blur as an integrated visible light
image blur of the image-based blur (Et) and the camera motion-based
blur (Ec), specifically defocusing form information such as a
PSF.
[0171] Further,
[0172] a filter for generating an image blur (defocusing) similar
to the image-based blur (Et) or a filter coefficient forming the
filter is taken as an image-based filter (Etf),
[0173] a filter for generating an image blur (defocusing) similar
to the camera motion-based blur (Ec) or a filter coefficient
forming the filter is taken as a camera motion-based filter (Ecf),
and
[0174] a filter for generating an image blur (defocusing) similar
to the integrated blur (Eall) or a filter coefficient forming the
filter is taken as an integrated filter (Eallf).
[0175] Note that each blur and each filter can be
acquired/calculated/applied for each unit block as a divided image
region.
[0176] Moreover, from a single blur form, a single filter for
generating such a blur is uniquely determined.
[0177] Thus, the processing of comparing the image-based filter
(Etf) in a certain form with the camera motion-based blur (Ec) to
determine whether or not the blur form generated by application of
the image-based filter (Etf) is coincident with or similar to the
camera motion-based blur (Ec) can be performed, for example. That
is, the processing of determining the degree of similarity between
the blur and the filter is also performed.
[0178] In the embodiments described later, such processing is also
performed.
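The degree of similarity between a blur form and the blur generated by a filter can be sketched as a normalized correlation between the two PSF arrays. This helper and its name are illustrative assumptions, not from the disclosure:

```python
import numpy as np

def psf_similarity(psf_a, psf_b):
    """Degree of similarity between two blur forms (PSFs), e.g. the blur
    generated by an image-based filter (Etf) versus a camera motion-based
    blur (Ec). Returns 1.0 for identical shapes, ~0.0 for disjoint ones."""
    a = psf_a.ravel() / (np.linalg.norm(psf_a) + 1e-12)
    b = psf_b.ravel() / (np.linalg.norm(psf_b) + 1e-12)
    return float(a @ b)
```

A threshold on this score is one way to decide "coincident or similar" per unit block.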
[0179] The processing executed in the configuration A illustrated
in FIG. 7 will be described.
[0180] First, at the step S21, the blurred visible light image 11
and the blur-less far-infrared image 12 captured by simultaneous
photographing of the same object are input, and then, are compared
with each other. In this manner, the processing of estimating the
image-based blur (Et) on the blurred visible light image 11 is
performed.
[0181] Specifically, the filters corresponding to various point
spread functions (PSFs), i.e., the filters for generating the blurs
(defocusing), are applied to the blur-less far-infrared image 12,
thereby generating the intentionally-blurred far-infrared image.
Such a filter-applied far-infrared image and the blurred visible
light image 11 are compared with each other (the correlation is
calculated).
[0182] On the basis of such comparison processing (correlation
calculation), the filter corresponding to the point spread function
(PSF) for generating the blur (defocusing) similar to that of the
blurred visible light image 11 is selected.
[0183] That is, various filters are applied to the blur-less
far-infrared image 12 to generate the blur, and comparison with the
blurred visible light image 11 is performed. In this manner, the
filter for generating the blur (defocusing) similar to the blur
form of the blurred visible light image 11 is selected, or the
point spread function (PSF) for generating the blur (defocusing)
similar to the blur form of the blurred visible light image 11 is
calculated.
[0184] Note that filter selection is executed for each of
predetermined unit pixel blocks, for example.
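A minimal sketch of the step S21 filter selection, assuming the filter bank holds small PSF kernels and using normalized cross-correlation as the comparison measure. The disclosure performs this per unit pixel block; the sketch operates on whole images for brevity, and all names are illustrative:

```python
import numpy as np

def conv2_same(img, kernel):
    """Circular 2-D convolution via FFT (adequate for this sketch)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel, s=img.shape)))

def estimate_blur_filter(blurred_vis, sharp_fir, filter_bank):
    """Apply each candidate blur filter to the sharp far-infrared image and
    select the one whose output correlates best with the blurred visible
    light image."""
    best_idx, best_corr = -1, -np.inf
    for i, psf in enumerate(filter_bank):
        trial = conv2_same(sharp_fir, psf)
        a = trial - trial.mean()
        b = blurred_vis - blurred_vis.mean()
        corr = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        if corr > best_corr:
            best_idx, best_corr = i, corr
    return best_idx, best_corr

# Toy check: the same synthetic scene stands in for both modalities here.
rng = np.random.default_rng(1)
scene = rng.random((24, 24))
bank = [
    np.array([[1 / 3, 1 / 3, 1 / 3]]),      # horizontal 3-tap blur
    np.array([[1 / 3], [1 / 3], [1 / 3]]),  # vertical 3-tap blur
    np.array([[1.0]]),                      # identity (no blur)
]
blurred = conv2_same(scene, bank[1])        # simulate a vertically blurred image
idx, corr = estimate_blur_filter(blurred, scene, bank)
```

In practice the visible and far-infrared images differ in intensity statistics, which is one reason a normalized correlation measure (rather than a raw difference) is a natural choice.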
[0185] Next, at the step S22, the processing of estimating the
camera motion-based blur (Ec) is performed.
[0186] Specifically, the blur estimated to be generated on the
captured image is estimated according to the camera motion acquired
by the sensor such as an IMU, for example.
[0187] For example, the vector indicating the direction and size of
the blur as described earlier with reference to FIG. 4 is
estimated. That is, the blur generated on the image according to
various types of camera motion is estimated for each unit block of
the image.
[0188] Next, at the step S23, the integrated blur (Eall) of the
image-based blur (Et) estimated at the step S21 and the camera
motion-based blur (Ec) estimated at the step S22 is estimated.
[0189] Such integration processing is executed for each unit block
as the divided image region.
[0190] By such integration processing, the blur is estimated for
each unit block of the visible light image 11, considering even the
camera motion.
[0191] When the processing of these steps S21 to S23 ends, the blur
removing processing of the step S40 is subsequently executed.
[0192] At the step S40, the "inverse filter" with the
characteristics opposite to the filter characteristics, i.e., the
PSF characteristics indicating the blur form (Eall) of the blurred
visible light image 11 estimated at the step S23, is calculated or
is selected from an inverse filter bank storing various inverse
filters, and then, the calculated or selected inverse filter is
applied to the blurred visible light image 11.
[0193] Note that as described above, such inverse filter
application processing is called the deconvolution processing. The
deconvolution processing is executed to generate, from the blurred
visible light image 11, the blur-reduced visible light image 15
whose blur (defocusing) has been removed.
[0194] Note that the processing of calculating or selecting and
applying the inverse filter is executed for each of the
predetermined unit pixel blocks.
[0195] (Configuration B) The configuration example where
image-based blur estimation using the visible light image and the
far-infrared image and camera motion-based blur estimation based on
the camera motion information are executed in combination to
execute integrated blur estimation.
[0196] Next, the configuration B will be described.
[0197] A difference between the configuration B and the
configuration A described above is that the blur estimation
processing of the step S20 described with reference to FIG. 1
includes the following two steps:
[0198] (Step S22) estimation of camera motion-based blur (Ec);
and
[0199] (Step S25) estimation of integrated blur (Eall).
[0200] First, at the step S22, the processing of estimating the
camera motion-based blur (Ec) is performed.
[0201] Such processing is processing similar to the processing of
the step S22 in the above-described (configuration A). That is, the
blur generated on the image according to various types of camera
motion is estimated for each unit block of the image.
[0202] Next, at the step S25, the processing of estimating the
integrated blur (Eall) is executed.
[0203] Such processing is for performing processing utilizing the
camera motion-based blur (Ec) having been already acquired at the
step S22 in the processing of estimating the blur of the blurred
visible light image 11 based on the blurred visible light image 11
and the blur-less far-infrared image 12 captured by simultaneous
photographing of the same object.
[0204] Although specific processing will be described later, e.g.,
any of the following two types of processing is executed:
[0205] (1) the processing of selecting, on the basis of the camera
motion-based blur (Ec) having been already acquired at the step
S22, the filter to be applied to the far-infrared image, i.e., the
filter for generating the blur (defocusing); and
[0206] (2) the processing of correcting, on the basis of the camera
motion-based blur (Ec) having been already acquired at the step
S22, the value of correlation between the filter-applied
far-infrared image and the blurred visible light image 11.
[0207] For example, at the step S25, any of these two types of
processing is executed to execute the processing of estimating the
integrated blur (Eall).
[0208] Specific processing will be described later.
[0209] At the step S40, the "inverse filter" with the
characteristics opposite to the filter characteristics, i.e., the
PSF characteristics indicating the blur form (Eall) of the blurred
visible light image 11 estimated at the step S25, is calculated or
is selected from the inverse filter bank storing various inverse
filters, and then, the calculated or selected inverse filter is
applied to the blurred visible light image 11.
[0210] (Configuration A+C) The configuration A plus a configuration
example where the degree of reliability of the filter (blur)
estimation result is calculated to perform the blur removing
processing according to the degree of reliability.
Next, the (configuration A+C) will be described.
[0211] This configuration is a configuration example where the
degree of reliability of the filter (blur) estimation result is, in
addition to the configuration A, calculated to perform the blur
removing processing according to the degree of reliability.
[0212] The (Configuration A+C) is characterized in that the blur
estimation reliability calculation processing is, as illustrated in
a diagram of (A+C) of FIG. 8, executed at a step S31 before the
blur removing processing at the step S40.
[0213] Other configurations are similar to those of the
configuration A.
[0214] In the blur estimation reliability calculation processing at
the step S31, the image-based blur estimation result (Et) of the
step S21 and the camera motion-based blur estimation result (Ec) of
the step S22 are compared with each other, and the degree of
reliability of the estimation result of the integrated blur (Eall)
generated at the step S23 is determined to be higher in the case of
a higher degree of coincidence between the two blur estimation
results.
[0215] The reliability information generated at the step S31 is
utilized in the blur removing processing of the step S40. That is,
the strength of the inverse filter applied in the blur removing
processing of the step S40 is adjusted by the reliability
information calculated at the step S31.
[0216] Specifically, in a case where the degree of reliability of
the estimation result of the integrated blur (Eall) generated at
the step S23 is low, the processing of decreasing the strength of
the inverse filter applied in the blur removing processing of the
step S40 is performed. Note that calculation of the degree of
reliability is executed for each of the predetermined unit pixel
blocks, for example.
[0217] By such processing, the inverse filter can be applied
according to the degree of reliability of the filter (blur)
estimation result.
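The reliability-dependent adjustment of the inverse-filter strength can be sketched as a blend between the input image and the fully deblurred image. The linear blending rule and the names below are assumptions for illustration, not the disclosed implementation:

```python
import numpy as np

def apply_inverse_with_reliability(blurred, deblurred, reliability):
    """Adjust the inverse-filter strength by the blur-estimation reliability:
    low reliability keeps the output close to the input image, high
    reliability uses the fully deblurred result. `reliability` holds values
    in [0, 1] per pixel (e.g. a per-block map upsampled to image size)."""
    r = np.clip(reliability, 0.0, 1.0)
    return r * deblurred + (1.0 - r) * blurred
```

With a per-block reliability map, this realizes "decreasing the strength of the inverse filter" exactly where the integrated blur estimate is least trustworthy.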
[0218] (Configuration B+C) The configuration B plus a configuration
example where the degree of reliability of the filter (blur)
estimation result is calculated to perform the blur removing
processing according to the degree of reliability.
Next, the (configuration B+C) will be described.
[0219] This configuration is a configuration example where the
degree of reliability of the filter (blur) estimation result is, in
addition to the configuration B, calculated to perform the blur
removing processing according to the degree of reliability.
[0220] The (Configuration B+C) is characterized in that the blur
estimation reliability calculation processing is, as illustrated in
a diagram of (B+C) of FIG. 8, executed at a step S32 before the
blur removing processing at the step S40.
[0221] Other configurations are similar to those of the
configuration B.
[0222] In the blur estimation reliability calculation processing at
the step S32, the camera motion-based blur estimation result (Ec)
of the step S22 and the integrated blur estimation result (Eall) of
the step S25 are compared with each other, and the degree of
reliability of the estimation result of the integrated blur (Eall)
generated at the step S25 is determined to be higher in the case of
a higher degree of coincidence between the two blur estimation
results.
[0223] The reliability information generated at the step S32 is
utilized in the blur removing processing of the step S40. That is,
the strength of the inverse filter applied in the blur removing
processing of the step S40 is adjusted by the reliability
information calculated at the step S32.
[0224] Specifically, in a case where the degree of reliability of
the estimation result of the integrated blur (Eall) generated at
the step S25 is low, the processing of decreasing the strength of
the inverse filter applied in the blur removing processing of the
step S40 is performed. Note that calculation of the degree of
reliability is executed for each of the predetermined unit pixel
blocks, for example.
[0225] By such processing, the inverse filter can be applied
according to the degree of reliability of the filter (blur)
estimation result.
[0226] Note that specific configuration examples or processing
examples will be described later.
[0227] (5. (First Embodiment) Configuration and Processing of Image
Processing Device Corresponding to Configuration A)
[0228] Next, specific configuration and processing of an image
processing device corresponding to the configuration A described
with reference to FIG. 7 will be described as a first embodiment of
the image processing device of the present disclosure.
[0229] As described earlier with reference to FIG. 7, the
configuration A is a configuration in which camera motion-based
blur estimation information based on camera motion information is,
after image-based blur estimation using a visible light image and a
far-infrared image, utilized to execute final integrated blur
estimation.
[0230] Note that the processing form of the integrated blur (Eall)
estimation processing of the configuration A includes the following
two types:
[0231] (A1) a configuration in which an integrated blur (Eall) to
be applied or selected is switched between an image-based blur (Et)
and a camera motion-based blur (Ec) according to the amount of
object motion; and
[0232] (A2) a configuration in which the weighted average of the
image-based blur (Et) and the camera motion-based blur (Ec) is
utilized as the integrated blur (Eall) to change a weighted average
form according to the amount of object motion.
[0233] As described earlier with reference to FIG. 4, the camera
motion-based blur (Ec) is for estimation of the blur form on the
image on the basis of only the camera motion. In a case where an
object on the captured image is in motion as described earlier with
reference to FIG. 5, the blur generated on the image is set
depending on the object motion.
[0234] Thus, in the case of much motion of the object on the
captured image, if the camera motion-based blur (Ec) is directly
applied, blur estimation is erroneously performed. The
above-described (A1) and (A2) are two types of methods for solving
such a problem.
[0235] Note that the amount of object motion is, for example,
determined by the following input environment information:
[0236] (a) map information: map information stored in advance in a
storage unit or map information input via a network determines that
much object motion is present in an area with many
vehicles/pedestrians, such as an urban area, and that little object
motion is present in a mountain area or the like;
[0237] (b) time information: time information acquired in the image
processing device or an external device or via a network determines
the amount of object motion according to a period of time, and for
example, determines that much object motion is present in the
daytime and that little object motion is present at night;
[0238] (c) traffic information: traffic information is input via a
network, and much object motion is determined in a situation where
a road is crowded and little object motion in a situation where the
road is not crowded; and
[0239] (d) a position (block) on the image: the amount of object
motion is defined in advance for each unit region (each unit block)
on the image, and for example, much object motion is determined for
the blocks in the horizontal direction and little object motion for
the blocks in an upper-to-lower direction.
[0240] The image processing device with the configuration A
receives, for example, the above-described environment information
(a) to (d), thereby executing the integrated blur (Eall) estimation
processing.
[0241] In the above-described configuration (A1), the
above-described environment information (a) to (d) is, for example,
input to determine the amount of object motion. In a case where it
is determined that the object motion is equal to or greater than a
given threshold, the image-based blur (Et) is applied as the
integrated blur (Eall). In a case where the object motion is less
than the threshold, the camera motion-based blur (Ec) is
selected.
[0242] Moreover, in the above-described configuration (A2), the
above-described environment information (a) to (d) is, for example,
input to determine the amount of object motion. The form of the
weighted average of the image-based blur (Et) and the camera
motion-based blur (Ec) is changed according to the determined
amount of object motion, and a weighted average result is taken as
the integrated blur (Eall).
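The weighted average of configuration (A2) can be sketched as follows, assuming both blurs are represented as PSF kernels of equal size and that the amount of object motion has already been mapped to a weight in [0, 1] (an assumption for illustration; the names are not from the disclosure):

```python
import numpy as np

def integrate_blur(psf_et, psf_ec, object_motion):
    """Weighted average of the image-based blur (Et) and the camera
    motion-based blur (Ec): much object motion favors Et, little object
    motion favors Ec. `object_motion` is a weight in [0, 1]."""
    w = float(np.clip(object_motion, 0.0, 1.0))
    eall = w * psf_et + (1.0 - w) * psf_ec
    return eall / eall.sum()     # keep the integrated PSF normalized
```

Setting the weight to 0 or 1 recovers the hard switching of configuration (A1) as a special case.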
[0243] Hereinafter, the configuration of the image processing
device configured to execute these two types of processing will be
sequentially described.
[0244] (A1) The configuration in which the integrated blur (Eall)
to be applied or selected is switched between the image-based blur
(Et) and the camera motion-based blur (Ec) according to the amount
of object motion.
[0245] The above-described configuration (A1) will be first
described with reference to FIG. 9.
[0246] FIG. 9 is a diagram for describing the configuration and
processing of the image processing device corresponding to the
configuration A1.
[0247] The image processing device A1, 20-A1 illustrated in FIG. 9
has a visible light image input unit 21, a far-infrared image input
unit 22, a camera motion information input unit 23, an image-based
blur estimator 30, a camera motion-based blur estimator 40, an
integrated blur estimator 50, a blur remover 60, a filter bank 35,
a camera motion blur map storage unit 45, and an environment
information storage unit/input unit 55.
[0248] Further, the image-based blur estimator 30 has a filter
processor 31, a correlation computer 32, and an image-based filter
(Etf) determinator 33.
[0249] The camera motion-based blur estimator 40 has a camera
motion blur map acquirer 41.
[0250] The integrated blur estimator 50 has an object motion
determinator 51 and an integrated filter (Eallf) determinator
52.
[0251] Moreover, the blur remover 60 has an inverse filter
calculator 61 and an inverse filter processor 62.
[0252] The visible light image input unit 21 inputs a
pre-correction visible light image 25 to the image-based blur
estimator 30 and the blur remover 60.
[0253] Moreover, the far-infrared image input unit 22 inputs a
far-infrared image 26 to the image-based blur estimator 30.
Further, the camera motion information input unit 23 inputs the
camera motion information acquired by a motion sensor such as an
IMU, for example, to the camera motion-based blur estimator 40.
[0254] The pre-correction visible light image 25 and the
far-infrared image 26 input from the visible light image input unit
21 and the far-infrared image input unit 22 are images captured by
simultaneous photographing of the same object. These images are,
for example, images captured in the dark, and the pre-correction
visible light image 25 input from the visible light image input
unit 21 is blurred (defocused) due to long exposure.
[0255] On the other hand, the far-infrared image 26 input from the
far-infrared image input unit 22 is an image exposed for a short
period of time, and is an image with little blur (defocusing).
[0256] Note that both the pre-correction visible light image 25
and the far-infrared image 26 are images with W×H pixels,
i.e., W pixels in the horizontal direction and H pixels in the
vertical direction. In the figure, the pre-correction visible
light image 25 and the far-infrared image 26 are illustrated as a
pre-correction visible light image [W*H] 25 and a far-infrared
image [W*H] 26.
[0257] Moreover, [W_B*H_B] illustrated in the figure
indicates a single block region as a divided image region.
[0258] The number of blocks per image frame is N.
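The block division described above can be sketched as follows. The concrete image and block sizes are illustrative assumptions only, since W, H, W_B, and H_B are left unspecified here.

```python
# Hypothetical sketch of the block division above: a W x H image is split
# into non-overlapping W_B x H_B unit blocks, giving N blocks per frame.
# The concrete sizes used below are assumptions for illustration only.

def block_grid(w, h, wb, hb):
    """Return top-left corners of the N = (w // wb) * (h // hb) unit blocks."""
    return [(x, y)
            for y in range(0, h - hb + 1, hb)
            for x in range(0, w - wb + 1, wb)]

blocks = block_grid(640, 480, 32, 32)  # e.g., a VGA frame with 32x32 blocks
N = len(blocks)                        # 20 x 15 = 300 blocks
```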
[0259] Next, the processing executed by the image-based blur
estimator 30 will be described.
[0260] The filter processor 31 of the image-based blur estimator 30
sequentially applies, to the far-infrared image 26, various filters
(blur (defocusing) generation filters) stored in the filter bank
35. That is, various forms of blur are intentionally generated on
the far-infrared image 26.
[0261] In the filter bank 35, many blur generation filters for
different blur (defocusing) sizes and directions are stored. That
is, many filters corresponding to various PSFs are stored.
[0262] A far-infrared image intentionally blurred by application of
the filters to the far-infrared image 26 by the filter processor 31
of the image-based blur estimator 30 is output to the correlation
computer 32.
[0263] The correlation computer 32 calculates a correlation between
the far-infrared image intentionally blurred by filter application
and the pre-correction visible light image 25.
[0264] Note that the filter application processing and the
correlation calculation processing executed by the filter processor
31 and the correlation computer 32 are executed for each
corresponding unit block of the N block regions of the
pre-correction visible light image 25 and the N block regions of
the far-infrared image 26.
[0265] The filter processor 31 sequentially applies, for each of
the N blocks of the far-infrared image 26, various filters (blur
(defocusing) generation filters) stored in the filter bank 35.
[0266] For each of the N blocks of the far-infrared image 26, the
correlation computer 32 calculates the correlation between a result
obtained by sequential application of various filters (blur
(defocusing) generation filters) stored in the filter bank 35 and
the pre-correction visible light image 25, and outputs the
correlation value corresponding to each filter for each of the N
blocks to the image-based filter (Etf) determinator 33.
[0267] For each block, the image-based filter (Etf) determinator 33
selects, from the data input from the correlation computer 32,
i.e., the correlation values corresponding to each filter for each
of the N blocks, the filter with the highest correlation value for
that block.
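The search performed by the filter processor 31, the correlation computer 32, and the image-based filter (Etf) determinator 33 for one block can be sketched as below. The FFT-based circular convolution and the normalized cross-correlation measure are illustrative assumptions; the application does not specify the correlation metric or the convolution boundary handling.

```python
import numpy as np

# Sketch, for one block, of: blur the far-infrared block with each bank
# filter, correlate the result with the (blurred) visible light block, and
# keep the best-matching filter. Circular convolution and normalized
# cross-correlation are assumptions for illustration.

def blur(block, kernel):
    """Circularly convolve a block with a small, centered blur kernel."""
    kh, kw = kernel.shape
    pad = np.zeros_like(block)
    pad[:kh, :kw] = kernel
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # center kernel
    return np.real(np.fft.ifft2(np.fft.fft2(block) * np.fft.fft2(pad)))

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two blocks."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def best_filter_id(fir_block, visible_block, bank):
    """ID of the bank filter whose blurred far-infrared block best matches
    the blurred visible light block, i.e. the estimated image-based filter."""
    return int(np.argmax([ncc(blur(fir_block, k), visible_block) for k in bank]))
```

A simple box-blur bank such as `[np.ones((n, n)) / n**2 for n in (1, 3, 5)]` would stand in for the PSF variations stored in the filter bank 35.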
[0268] The N filters selected for each of the N blocks by the
image-based filter (Etf) determinator 33 are input to the
integrated filter (Eallf) determinator 52 of the integrated blur
estimator 50.
[0269] The integrated filter (Eallf) determinator 52 of the
integrated blur estimator 50 receives the image-based filter (Etf)
corresponding to the block with the highest correlation from the
image-based filter (Etf) determinator 33, and further receives a
camera motion blur map corresponding to the camera motion from the
camera motion blur map acquirer 41 of the camera motion-based blur
estimator 40.
[0270] The camera motion blur map acquirer 41 of the camera
motion-based blur estimator 40 acquires, on the basis of the camera
motion information input from the camera motion information input
unit 23, a single camera motion blur map from blur maps stored
corresponding to various types of camera motion in the camera
motion blur map storage unit 45, and inputs such a camera motion
blur map to the integrated filter (Eallf) determinator 52 of the
integrated blur estimator 50.
[0271] Examples of the blur maps stored corresponding to various
types of camera motion in the camera motion blur map storage unit
45 are illustrated in FIG. 10.
[0272] FIG. 10 illustrates diagrams of settings of a vector
indicating the direction and size of the blur generated on the
captured image according to the camera motion similar to that
described earlier with reference to FIG. 4.
[0273] Each image is divided into blocks with a predetermined size.
Note that although illustrated in a simple manner in the figure,
the vector indicating the direction and size of the blur is set for
each unit block.
[0274] Data with the above-described settings of the vector
indicating the direction and size of the blur according to the
camera motion for each unit block as the divided image region will
be referred to as a camera motion blur map. Note that the block is
the same block as that for which the correlation calculation
processing in the correlation computer 32 of the image-based blur
estimator 30 is executed, and is also the unit for which the
integrated filter (Eallf) is generated in the integrated blur
estimator 50.
[0275] FIG. 10 illustrates the examples of the blur maps
corresponding to the following four types of camera motion:
[0276] (1) a camera motion blur map in the case of moving a camera
straight forward;
[0277] (2) a camera motion blur map in the case of moving the
camera straight backward;
[0278] (3) a camera motion blur map in the case of rotating the
camera to the right; and
[0279] (4) a camera motion blur map in the case of rotating the
camera upward.
[0280] Note that in addition to the above, blur maps corresponding
to various other types of camera motion are also stored in the
camera motion blur map storage unit 45.
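One of the stored maps, the straight-forward-motion map of FIG. 10, could be constructed along the following lines: blur vectors point radially outward from the image center (the focus of expansion) and grow with distance. The grid size and scale are assumed parameters; the maps in the camera motion blur map storage unit 45 would be precomputed per motion type.

```python
# Illustrative construction of one camera motion blur map (cf. FIG. 10):
# per-block (dx, dy) blur vectors for a camera moving straight forward.
# Grid size and scale are assumptions, not values from the application.

def forward_motion_blur_map(n_x, n_y, scale=0.2):
    """Per-block (dx, dy) blur vectors for an n_x-by-n_y grid of unit blocks."""
    cx, cy = (n_x - 1) / 2.0, (n_y - 1) / 2.0
    return [[(scale * (x - cx), scale * (y - cy)) for x in range(n_x)]
            for y in range(n_y)]
```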
[0281] The camera motion blur map acquirer 41 of the camera
motion-based blur estimator 40 selects, on the basis of the camera
motion information input from the camera motion information input
unit 23, a blur map corresponding to a camera motion setting
coincident with or closest to the detected camera motion, and
inputs the selected blur map to the integrated filter (Eallf)
determinator 52 of the integrated blur estimator 50.
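The selection made by the camera motion blur map acquirer 41 can be sketched as a nearest-neighbor lookup over the stored maps. The five-component motion parameterization and the placeholder map values are assumptions; the application only states that the map coincident with or closest to the detected motion is chosen.

```python
import math

# Hedged sketch of the camera motion blur map acquirer 41: pick the stored
# blur map whose associated camera motion is closest to the detected motion.
# The (vx, vy, vz, yaw_rate, pitch_rate) keys and string placeholders for
# the maps are illustrative assumptions.

BLUR_MAP_STORE = {
    (0.0, 0.0, 1.0, 0.0, 0.0): "forward",       # moving straight forward
    (0.0, 0.0, -1.0, 0.0, 0.0): "backward",     # moving straight backward
    (0.0, 0.0, 0.0, 1.0, 0.0): "rotate_right",  # rotating to the right
    (0.0, 0.0, 0.0, 0.0, 1.0): "rotate_up",     # rotating upward
}

def acquire_blur_map(motion, store=BLUR_MAP_STORE):
    """Return the map whose camera-motion key is nearest to `motion`."""
    return store[min(store, key=lambda m: math.dist(m, motion))]
```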
[0282] The integrated filter (Eallf) determinator 52 of the
integrated blur estimator 50 determines, on the basis of object
motion amount determination information input from the object
motion determinator 51, i.e., determination information regarding
the amount of object motion, whether the integrated filter (Eallf)
is set to the image-based filter (Etf) or the camera motion-based
filter (Ecf) based on the camera motion-based blur (Ec).
[0283] Note that such determination processing is executed for each
unit block.
[0284] The object motion determinator 51 acquires the environment
information from the environment information storage unit/input
unit 55, and for each block of the image, determines whether the
block includes much object motion or little object motion.
[0285] As described earlier, the environment information includes,
for example, the map information, the time information, the traffic
information, and the information regarding the block position on
the image.
[0286] For each unit block, the object motion determinator 51
executes a pre-defined algorithm to determine, on the basis of
various types of environment information described above, whether
the block includes much object motion or little object motion, and
outputs such block-by-block determination information to the
integrated filter (Eallf) determinator 52.
[0287] The integrated filter (Eallf) determinator 52 performs the
following processing on the basis of the block-by-block
determination information input from the object motion determinator
51, i.e., the determination information regarding whether the block
includes much object motion or little object motion.
[0288] For the block with much object motion, the image-based
filter (Etf) input from the image-based blur estimator 30 is set as
the integrated filter (Eallf).
[0289] On the other hand, for the block with little object motion,
the camera motion-based filter (Ecf) determined on the basis of the
block-by-block camera motion-based blur (Ec) included in the camera
motion blur map input from the camera motion-based blur estimator
40 is set as the integrated filter (Eallf).
[0290] As described above, the integrated filter (Eallf)
determinator 52 performs, for each unit block, the filter selection
processing of taking either the image-based filter (Etf) or the
camera motion-based filter (Ecf) as the integrated filter
(Eallf).
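The per-block switch can be sketched as below. Representing the object motion determination as a score compared against a threshold is an illustrative assumption; the application leaves the pre-defined determination algorithm unspecified.

```python
# Minimal sketch of the configuration-A1 switch in the integrated filter
# (Eallf) determinator 52: per block, take the image-based filter Etf when
# much object motion is flagged, otherwise the camera motion-based filter
# Ecf. The threshold is an assumed stand-in for the object motion
# determinator 51.

MOTION_THRESHOLD = 0.5  # assumed cut-off between "much" and "little" motion

def select_integrated_filters(etf, ecf, motion_scores, thr=MOTION_THRESHOLD):
    """Per-block selection; etf, ecf, and motion_scores each have length N."""
    return [e if m > thr else c for e, c, m in zip(etf, ecf, motion_scores)]
```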
[0291] The block-by-block integrated filter (Eallf) determined by
the integrated filter (Eallf) determinator 52 is input to the
inverse filter calculator 61 of the blur remover 60.
[0292] The inverse filter calculator 61 of the blur remover 60
receives the block-by-block integrated filter (Eallf) determined by
the integrated filter (Eallf) determinator 52, thereby generating
an inverse filter with characteristics opposite to those of each
block-by-block integrated filter (Eallf).
[0293] That is, the inverse filter calculator 61 generates a
deconvolution filter, and outputs the deconvolution filter to the
inverse filter processor 62.
[0294] The deconvolution filter is a filter with characteristics
opposite to those of each block-by-block integrated filter
(Eallf).
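The application does not specify how the inverse filter calculator 61 forms its deconvolution filter; a regularized frequency-domain inverse (Wiener-style) is one common realization, sketched here under that assumption.

```python
import numpy as np

# Assumed sketch of the inverse filter calculator 61 and inverse filter
# processor 62: build a regularized frequency-domain inverse of a blur
# kernel and apply it to a block. The circular-convolution model and the
# eps regularization are illustrative choices.

def circular_blur(block, kernel):
    """Circular convolution of a block with a centered blur kernel."""
    kh, kw = kernel.shape
    pad = np.zeros_like(block)
    pad[:kh, :kw] = kernel
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(block) * np.fft.fft2(pad)))

def deconvolve_block(blurred, kernel, eps=1e-3):
    """Apply an inverse filter with characteristics opposite to `kernel`;
    eps regularizes frequencies where the blur response is near zero."""
    kh, kw = kernel.shape
    pad = np.zeros_like(blurred)
    pad[:kh, :kw] = kernel
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    K = np.fft.fft2(pad)
    inv = np.conj(K) / (np.abs(K) ** 2 + eps)  # regularized inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * inv))
```

Without the eps term, a plain inverse 1/K amplifies noise wherever the blur suppresses a frequency to near zero, which is why some regularization is typically needed in practice.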
[0295] The inverse filter processor 62 of the blur remover 60
receives the inverse filter corresponding to each block from the
inverse filter calculator 61, and applies the input inverse filter
to a corresponding block of the pre-correction visible light image
25 input from the visible light image input unit 21.
[0296] That is, the inverse filter processor 62 applies, to the
corresponding block of the pre-correction visible light image 25,
the inverse filter with the characteristics opposite to those of
either the image-based filter (Etf) or the camera motion-based
filter (Ecf), whichever has been determined as the block-by-block
filter by the integrated filter (Eallf) determinator 52 of the
integrated blur estimator 50.
[0297] When the inverse filter application processing is completed
for all of the N blocks of the pre-correction visible light image
25 input from the visible light image input unit 21, the resultant
image is output as a post-correction visible light image 27.
[0298] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
is generated from the pre-correction visible light image 25, and is
output.
[0299] (A2) The configuration in which the weighted average of the
image-based blur (Et) and the camera motion-based blur (Ec) is
utilized as the integrated blur (Eall) to change the weighted
average form according to the amount of object motion:
[0300] the above-described configuration (A2) will be described
next with reference to FIG. 11.
[0301] FIG. 11 is a diagram for describing the configuration and
processing of the image processing device corresponding to the
configuration A2.
[0302] The image processing device A2, 20-A2 illustrated in FIG. 11
has the visible light image input unit 21, the far-infrared image
input unit 22, the camera motion information input unit 23, the
image-based blur estimator 30, the camera motion-based blur
estimator 40, the integrated blur estimator 50, the blur remover
60, the filter bank 35, the camera motion blur map storage unit 45,
and the environment information storage unit/input unit 55.
[0303] Further, the image-based blur estimator 30 has the filter
processor 31, the correlation computer 32, and the image-based
filter (Etf) determinator 33.
[0304] The camera motion-based blur estimator 40 has the camera
motion blur map acquirer 41.
[0305] The integrated blur estimator 50 has a weight calculator 53
and an integrated filter (Eallf) calculator 54.
[0306] Moreover, the blur remover 60 has the inverse filter
calculator 61 and the inverse filter processor 62.
[0307] The only difference from the image processing device A1,
20-A1 corresponding to the configuration A1 described earlier with
reference to FIG. 9 is the configuration of the integrated blur
estimator 50, i.e., the point that the integrated blur estimator
50 has the weight calculator 53 and the integrated filter (Eallf)
calculator 54.
[0308] Other configurations are similar to those of the image
processing device A1, 20-A1 corresponding to the configuration A1
described earlier with reference to FIG. 9, and similar processing
is also applied.
[0309] Hereinafter, the processing of the weight calculator 53 and
the integrated filter (Eallf) calculator 54 of the integrated blur
estimator 50 in the image processing device A2, 20-A2 illustrated
in FIG. 11 will be described.
[0310] The integrated filter (Eallf) calculator 54 of the
integrated blur estimator 50 receives the image-based filter (Etf)
corresponding to the block with the highest correlation from the
image-based filter (Etf) determinator 33, and further receives the
camera motion blur map corresponding to the camera motion from the
camera motion blur map acquirer 41 of the camera motion-based blur
estimator 40.
[0311] The camera motion blur map acquirer 41 of the camera
motion-based blur estimator 40 acquires, on the basis of the camera
motion information input from the camera motion information input
unit 23, a single camera motion blur map from the blur maps stored
corresponding to various types of camera motion in the camera
motion blur map storage unit 45, and inputs such a camera motion
blur map to the integrated filter (Eallf) calculator 54 of the
integrated blur estimator 50.
[0312] The integrated filter (Eallf) calculator 54 calculates, from
the block-by-block camera motion blur (Ec) acquired from the camera
motion blur map, the camera motion filter (Ecf) as a filter
corresponding to such a blur, i.e., a filter for generating such a
blur.
[0313] Further, the integrated filter (Eallf) calculator 54 uses
weight information input from the weight calculator 53 to perform
weighted averaging for the image-based filter (Etf) and the camera
motion-based filter (Ecf), thereby calculating the integrated
filter (Eallf).
[0314] Specifically, the integrated filter (Eallf) is calculated
using, e.g., weight coefficients α, β, according to the following
calculation expression:
Eallf=α(Etf)+β(Ecf)
[0315] Note that the above-described expression corresponds to the
expression for calculating a filter coefficient of each filter.
[0316] Moreover, the processing of calculating the integrated
filter (Eallf) is executed for each unit block.
[0317] The weight coefficients α, β are values from 0 to
1, and are input from the weight calculator 53.
[0318] The weight calculator 53 acquires the environment
information from the environment information storage unit/input
unit 55, and calculates the weight coefficients α, β for
each block of the image.
[0319] As described earlier, the environment information includes,
for example, the map information, the time information, the traffic
information, and the information regarding the block position on
the image.
[0320] The weight calculator 53 executes a pre-defined algorithm to
calculate, on the basis of various types of environment information
as described above, the weight coefficients α, β for
each unit block.
[0321] Specifically, for the block with much object motion, the
weight coefficient α is set to a greater value. That is, the
weight coefficients are set so that the integrated filter (Eallf)
is calculated with a high contribution rate of the image-based
filter (Etf).
[0322] On the other hand, for the block with little object motion,
the weight coefficient β is set to a greater value. That is,
the weight coefficients are set so that the integrated filter
(Eallf) is calculated with a high contribution rate of the camera
motion-based filter (Ecf).
[0323] The integrated filter (Eallf) calculator 54 uses the
block-by-block weight information input from the weight calculator
53, i.e., the above-described block-by-block weight
coefficients α, β, to perform weighted averaging for the
image-based filter (Etf) and the camera motion-based filter (Ecf),
thereby calculating the integrated filter (Eallf).
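The blending Eallf = α(Etf) + β(Ecf) can be sketched as a per-coefficient weighted average of two kernels. Zero-padding the smaller kernel onto a common support is an assumption; the application gives only the blending expression.

```python
import numpy as np

# Sketch of the configuration-A2 weighted averaging performed by the
# integrated filter (Eallf) calculator 54. Centering both kernels on one
# common support before blending is an illustrative choice.

def blend_filters(etf, ecf, alpha, beta):
    """Weighted average of two blur kernels (filter coefficient arrays)."""
    h = max(etf.shape[0], ecf.shape[0])
    w = max(etf.shape[1], ecf.shape[1])
    out = np.zeros((h, w))
    for k, wgt in ((etf, alpha), (ecf, beta)):
        oy, ox = (h - k.shape[0]) // 2, (w - k.shape[1]) // 2
        out[oy:oy + k.shape[0], ox:ox + k.shape[1]] += wgt * k
    return out
```

Note that when α + β = 1 and both kernels sum to 1, the blended kernel also sums to 1, so the integrated filter preserves overall block brightness.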
[0324] As a result, for the block with much object motion, the
integrated filter (Eallf) with a high contribution rate of the
image-based filter (Etf) input from the image-based blur estimator
30 is calculated.
[0325] On the other hand, for the block with little object motion,
the integrated filter (Eallf) with a high contribution rate of the
camera motion-based filter (Ecf) determined on the basis of the
block-by-block camera motion-based blur (Ec) included in the camera
motion blur map input from the camera motion-based blur estimator
40 is calculated.
[0326] As described above, the integrated filter (Eallf) calculator
54 calculates, for each unit block, the weighted average of the
image-based filter (Etf) and the camera motion-based filter (Ecf),
thereby performing the processing of calculating the integrated
filter (Eallf).
[0327] The block-by-block integrated filter (Eallf) calculated by
the integrated filter (Eallf) calculator 54 is input to the inverse
filter calculator 61 of the blur remover 60.
[0328] The inverse filter calculator 61 of the blur remover 60
receives the block-by-block integrated filter (Eallf) calculated by
the integrated filter (Eallf) calculator 54, thereby generating
the inverse filter with the characteristics opposite to those of
each block-by-block integrated filter (Eallf).
[0329] The inverse filter processor 62 of the blur remover 60
receives the inverse filter corresponding to each block from the
inverse filter calculator 61, and applies the input inverse filter
to the corresponding block of the pre-correction visible light
image 25 input from the visible light image input unit 21.
[0330] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
is generated from the pre-correction visible light image 25, and is
output.
[0331] Next, a sequence of the processing executed by the image
processing device corresponding to the configuration A described
with reference to FIGS. 9 and 11 will be described with reference
to flowcharts illustrated in FIG. 12 and subsequent figures.
[0332] Note that the processing according to the flowcharts
illustrated in FIG. 12 and subsequent figures is, for example,
processing executable according to a program stored in the storage
unit of the image processing device, and can be executed under the
control of a controller (the data processor) including a CPU with a
program execution function and the like.
[0333] First, a sequence of the processing executed by the image
processing device-A1 configured such that (A1) the integrated blur
(Eall) to be applied or selected is switched between the
image-based blur (Et) and the camera motion-based blur (Ec)
according to the amount of object motion as described with
reference to FIG. 9 will be described with reference to the
flowchart of FIG. 12. Hereinafter, processing of each step of a
flow illustrated in FIG. 12 will be sequentially described.
[0334] (Step S101)
[0335] First, at a step S101, the visible light image as a
correction target is acquired.
[0336] This processing is performed by the visible light image
input unit 21 in the image processing device illustrated in FIG. 9.
Specifically, this processing is the processing of acquiring an
image captured by a visible light image camera, for example.
[0337] (Step S102)
[0338] Next, at a step S102, the far-infrared image to be utilized
as a reference image is acquired.
[0339] This processing is executed by the far-infrared image input
unit 22 in the image processing device illustrated in FIG. 9.
Specifically, this processing is the processing of acquiring an
image captured by a far-infrared image camera, for example.
[0340] Note that the visible light image and the far-infrared image
acquired at the step S101 and the step S102 are images captured by
simultaneous photographing of the same object.
[0341] These images are, for example, images captured in the dark,
and the visible light image is blurred (defocused) due to long
exposure. On the other hand, the far-infrared image is an image
exposed for a short period of time, and is an image with little
blur (defocusing).
[0342] (Step S103)
[0343] Next, at a step S103, the camera motion information is
acquired.
[0344] This processing is performed by the camera motion
information input unit 23 in the image processing device
illustrated in FIG. 9. Specifically, the camera motion information
is acquired and input by the sensor such as an IMU, for
example.
[0345] (Step S104)
[0346] Next, at a step S104, the camera motion blur map is acquired
on the basis of the camera motion information input at the step
S103.
[0347] This processing is processing executed by the camera motion
blur map acquirer 41 of the camera motion-based blur estimator 40
in the image processing device illustrated in FIG. 9.
[0348] The camera motion blur map acquirer 41 of the camera
motion-based blur estimator 40 acquires, on the basis of the camera
motion information input from the camera motion information input
unit 23, a single camera motion blur map from the blur maps stored
corresponding to various types of camera motion in the camera
motion blur map storage unit 45, and inputs the camera motion blur
map to the integrated filter (Eallf) determinator 52 of the
integrated blur estimator 50.
[0349] (Step S105)
[0350] Processing from a subsequent step S105 to a step S107 is
loop processing (loop 1) sequentially and repeatedly executed for
all blocks as the divided regions set for the visible light image
and the far-infrared image.
[0351] Note that the number of blocks is N.
[0352] (Step S106)
[0353] At the step S106, the image-based filter (Etf) is acquired
for each unit block.
[0354] This processing is processing executed by the image-based
blur estimator 30 in the image processing device illustrated in
FIG. 9.
[0355] Details of the processing of the step S106 will be described
with reference to a flow illustrated in FIG. 13. For this
processing of acquiring the image-based filter (Etf) for each unit
block at the step S106, the visible light image acquired at the
step S101 and the far-infrared image acquired at the step S102 are
utilized. Thus, FIG. 13 also illustrates the step S101 and the step
S102.
[0356] The detailed processing of the step S106 is processing of
steps S121 to S126 illustrated in FIG. 13.
[0357] Hereinafter, this processing will be sequentially
described.
[0358] (Step S121)
[0359] The processing from the step S121 to the step S125 is loop
processing (loop 1b) sequentially and repeatedly executed for all
filter IDs corresponding to all filters stored in the filter bank
35.
[0360] (Step S122)
[0361] At the step S122, the filter (coefficient) is acquired.
[0362] The processing of the steps S122 to S123 is processing
executed by the filter processor 31 of the image-based blur
estimator 30 illustrated in FIG. 9. The filter processor 31
sequentially acquires, from the filter bank 35, the filter (blur
(defocusing) generation filter) to be applied to each block of the
far-infrared image.
[0363] Note that the data sequentially acquired from the filter
bank 35 may be either the filter itself or the filter coefficient
as data forming the filter.
[0364] (Step S123)
[0365] Next, at the step S123, the filter acquired at the step S122
is applied to a single block of the far-infrared image, i.e., a
block currently selected as a processing target. This processing is
filter processing for intentionally blurring the far-infrared
image.
[0366] (Step S124) Next, at the step S124, the correlation value
between the block of the far-infrared image as a filter application
result at the step S123 and a corresponding block of the visible
light image is calculated.
[0367] This processing is processing executed by the correlation
computer 32 of the image-based blur estimator 30 illustrated in
FIG. 9.
[0368] The correlation computer 32 calculates the correlation
between the far-infrared image intentionally blurred by filter
application and the visible light image.
[0369] (Step S125)
[0370] The step S125 is an end point of the loop 1b of the steps
S121 to S125.
[0371] That is, the processing of the steps S122 to S124 is
sequentially and repeatedly executed for all filter IDs
corresponding to all filters stored in the filter bank 35.
[0372] (Step S126)
[0373] When the processing of the loop 1b of the steps S121 to S125
for a single block is completed, the processing proceeds to the
step S126.
[0374] That is, for a single block, when the processing of
calculating the correlation values corresponding to all filters
stored in the filter bank 35 ends, the processing proceeds to the
step S126.
[0375] The processing of the step S126 is processing executed by
the image-based filter (Etf) determinator 33 of the image-based
blur estimator 30 illustrated in FIG. 9.
[0376] At the step S126, the image-based filter (Etf) determinator
33 selects, for the block for which the processing of the loop 1b
of the steps S121 to S125 has been completed, the filter ID with
the highest correlation value among the correlation values
corresponding to all filters stored in the filter bank 35.
[0377] The processing of the step S106 of the flow illustrated in
FIG. 12 includes the processing of these steps S121 to S126.
Referring back to FIG. 12, processing after the processing of the
step S106 will be described.
[0378] (Step S107)
[0379] The step S107 is an end point of the loop 1 of the steps
S105 to S107.
[0380] That is, the processing of the step S106 is sequentially and
repeatedly executed for all blocks as the divided regions set for
the visible light image and the far-infrared image.
[0381] When this loop processing (loop 1) is completed, the
image-based filter (Etf) with the highest correlation value is
determined for all of the N blocks.
[0382] (Step S108)
[0383] Next, at a step S108, the environment information upon
capturing of the image is acquired.
[0384] This processing is executed utilizing the environment
information storage unit/input unit 55 of the image processing
device illustrated in FIG. 9.
[0385] As described earlier, the environment information includes,
for example, the map information, the time information, the traffic
information, and the information regarding the block position on
the image.
[0386] (Step S109)
[0387] Processing from a subsequent step S109 to a step S112 is
loop processing (loop 2) sequentially and repeatedly executed for
all blocks as the divided regions set for the visible light image
and the far-infrared image.
[0388] Note that the number of blocks is N.
[0389] (Step S110)
[0390] The processing of the steps S110 to S111 is processing
executed by the integrated blur estimator 50 illustrated in FIG.
9.
[0391] First, at the step S110, the object motion determinator 51
of the integrated blur estimator 50 executes the pre-defined
algorithm to determine whether the block includes much object
motion or little object motion for each unit block on the basis of
various types of environment information as described above. The
determination information for each unit block is output to the
integrated filter (Eallf) determinator 52.
[0392] (Step S111)
[0393] Next, at the step S111, the integrated filter (Eallf)
determinator 52 performs, for each unit block, the filter selection
processing of taking either the image-based filter (Etf) or the
camera motion-based filter (Ecf) as the integrated filter
(Eallf).
[0394] Specifically, for the block with much object motion, the
image-based filter (Etf) input from the image-based blur estimator
30 is set as the integrated filter (Eallf).
[0395] On the other hand, for the block with little object motion,
the camera motion-based filter (Ecf) determined on the basis of the
block-by-block camera motion-based blur (Ec) included in the camera
motion blur map input from the camera motion-based blur estimator
40 is set as the integrated filter (Eallf).
[0396] (Step S112)
[0397] The step S112 is an end point of the loop 2 of the steps
S109 to S112.
[0398] That is, the processing of the steps S110 to S111 is
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image and the far-infrared
image.
[0399] When this loop processing (loop 2) is completed, the
integrated filter (Eallf) is determined for all of the N
blocks.
[0400] (Step S113)
[0401] Processing from a subsequent step S113 to a step S116 is
loop processing (loop 3) sequentially and repeatedly executed for
all blocks as the divided regions set for the visible light image
and the far-infrared image.
[0402] Note that the number of blocks is N.
[0403] (Step S114)
[0404] The processing of the steps S114 to S115 is processing
executed by the blur remover 60 illustrated in FIG. 9.
[0405] First, at the step S114, the inverse filter calculator 61 of
the blur remover 60 receives the block-by-block integrated filter
(Eallf) determined by the integrated filter (Eallf) determinator
52, thereby calculating the inverse filter with the characteristics
opposite to those of each block-by-block integrated filter
(Eallf).
[0406] (Step S115)
[0407] Next, at the step S115, the inverse filter processor 62 of
the blur remover 60 applies the inverse filter calculated at the
step S114 to the processing target block of the visible light
image.
[0408] (Step S116)
[0409] The step S116 is an end point of the loop 3 of the steps
S113 to S116.
[0410] That is, the processing of the steps S114 to S115 is
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image as a correction target
image.
[0411] When the inverse filter application processing is completed
for all of the N blocks of the visible light image, the resultant
image is output as the post-correction visible light image.
[0412] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
as illustrated in FIG. 9 is generated from the visible light image
as the input image at the step S101, i.e., the pre-correction
visible light image 25 illustrated in FIG. 9, and is output.
[0413] Next, a sequence of the processing executed by the image
processing device-A2 configured such that (A2) the weighted average
of the image-based blur (Et) and the camera motion-based blur (Ec)
is utilized as the integrated blur (Eall) to change the weighted
average form according to the amount of object motion as described
with reference to FIG. 11 will be described with reference to a
flowchart of FIG. 14.
[0414] Note that a difference between a flow illustrated in FIG. 14
and the flow illustrated in FIG. 12 for describing the processing
corresponding to the configuration A1 is only steps S110b to S111b
of the flow illustrated in FIG. 14. That is, the difference is that
the step S110 of the flow illustrated in FIG. 12 is substituted
with the step S110b of the flow illustrated in FIG. 14 and the step
S111 of the flow illustrated in FIG. 12 is substituted with the
step S111b of the flow illustrated in FIG. 14.
[0415] Other types of processing are processing similar to each
type of processing of the flow illustrated in FIG. 12, and
therefore, description thereof will not be repeated. Hereinafter,
the processing of the above-described steps S110b to S111b will be
described.
[0416] (Step S110b)
[0417] The processing from the subsequent step S109 to the step
S112 is the loop processing (loop 2) sequentially and repeatedly
executed for all blocks as the divided regions set for the visible
light image and the far-infrared image. Note that the number of
blocks is N.
[0418] The processing of the steps S110b to S111b is processing
executed by the integrated blur estimator 50 illustrated in FIG.
11.
[0419] First, at the step S110b, the weight calculator 53 of the
integrated blur estimator 50 acquires the environment information
from the environment information storage unit/input unit 55, and
calculates the weight information for each block of the image.
[0420] As described earlier, the environment information includes,
for example, the map information, the time information, the traffic
information, and the information regarding the block position on
the image.
[0421] For example, the expression for calculating the weighted
average of the image-based filter (Etf) and the camera motion-based
filter (Ecf), i.e., the expression for calculating the integrated
filter (Eallf) by means of the weight coefficients α and β, is
Eallf = α(Etf) + β(Ecf).
[0422] The weight information is each value of the weight
coefficients α and β in the above-described expression.
[0423] The weight calculator 53 of the integrated blur estimator 50
executes the pre-defined algorithm to calculate the weight
coefficients α and β for each unit block on the basis of various
types of environment information as described above.
[0424] Specifically, for the block with much object motion, the
weight coefficient α is set to a greater value.
[0425] On the other hand, for the block with little object motion,
the weight coefficient β is set to a greater value.
[0426] (Step S111b)
[0427] Next, at the step S111b, the integrated filter (Eallf)
calculator 54 uses the block-by-block weight information input from
the weight calculator 53, i.e., the weight coefficients α and β for
each unit block as described above, to perform weighted averaging
of the image-based filter (Etf) and the camera motion-based filter
(Ecf), thereby calculating the integrated filter (Eallf).
[0428] Specifically, for the block with much object motion, the
integrated filter (Eallf) with the setting of a high contribution
rate of the image-based filter (Etf) is calculated.
[0429] On the other hand, for the block with little object motion,
the integrated filter (Eallf) with the setting of a high
contribution rate of the camera motion-based filter (Ecf) is
calculated.
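The block-by-block weighted averaging above can be made concrete with a small sketch. The 3×3 kernels and the α, β values are hypothetical, chosen only to illustrate how the contribution rates shift with the amount of object motion:

```python
import numpy as np

def integrated_filter(etf, ecf, alpha, beta):
    """Weighted average of the image-based filter (Etf) and the
    camera motion-based filter (Ecf): Eallf = alpha*Etf + beta*Ecf."""
    return alpha * etf + beta * ecf

# Hypothetical 3x3 PSF kernels estimated for one block.
etf = np.array([[0, 0, 0], [1/3, 1/3, 1/3], [0, 0, 0]])   # horizontal blur
ecf = np.array([[0, 1/3, 0], [0, 1/3, 0], [0, 1/3, 0]])   # vertical blur

# Block with much object motion: high contribution of the image-based filter.
eallf_moving = integrated_filter(etf, ecf, alpha=0.8, beta=0.2)
# Block with little object motion: high contribution of the camera-motion filter.
eallf_static = integrated_filter(etf, ecf, alpha=0.2, beta=0.8)
```

Choosing α + β = 1 keeps the averaged kernel normalized, so the integrated filter neither brightens nor darkens the block it models.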
[0430] At the steps S109 to S112, the processing of the steps S110b
to S111b is sequentially and repeatedly executed for all blocks as
the divided regions set for the visible light image and the
far-infrared image.
[0431] When this loop processing (loop 2) is completed, the
integrated filter (Eallf) is determined for all of the N
blocks.
[0432] At the steps subsequent to the step S113, calculation of the
inverse filter of the integrated filter (Eallf) and application
processing of the inverse filter to the correction target visible
light image are performed in the blur remover 60.
[0433] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
as illustrated in FIG. 11 is generated from the visible light image
as the input image at the step S101, i.e., the pre-correction
visible light image 25 illustrated in FIG. 11, and is output.
[0434] (6. (Second Embodiment) Configuration and Processing of
Image Processing Device Corresponding to Configuration B)
[0435] Next, specific configuration and processing of an image
processing device corresponding to the configuration B described
with reference to FIG. 7 will be described as a second embodiment
of the image processing device of the present disclosure.
[0436] As described earlier with reference to FIG. 7, the
configuration B is a configuration in which image-based blur
estimation using a visible light image and a far-infrared image and
camera motion-based blur estimation based on camera motion
information are executed in combination to execute integrated blur
estimation.
[0437] Note that either of the following two types of processing is
executed in the processing of generating an integrated blur (Eall)
in the configuration B:
[0438] (B1) the processing of selecting a filter to be applied to
the far-infrared image, i.e., a filter for generating a blur
(defocusing), on the basis of a camera motion-based blur (Ec);
and
[0439] (B2) the processing of correcting a correlation value
between a filter-applied far-infrared image and the blurred visible
light image on the basis of the camera motion-based blur (Ec).
[0440] (B1) The processing of selecting the filter to be applied to
the far-infrared image, i.e., the filter for generating the blur
(defocusing), on the basis of the camera motion-based blur
(Ec):
[0441] the configuration of the image processing device-B1
configured to execute the above-described processing will be first
described with reference to FIG. 15.
[0442] FIG. 15 is a diagram for describing the configuration and
processing of the image processing device-B1.
[0443] The image processing device B1, 20-B1 illustrated in FIG. 15
has a visible light image input unit 21, a far-infrared image input
unit 22, a camera motion information input unit 23, a blur
estimator 30b, a camera motion-based blur estimator 40, a blur
remover 60, a filter bank pool 36, and a camera motion blur map
storage unit 45.
[0444] Further, the blur estimator 30b has a filter processor 31, a
correlation computer 32, a filter bank selector 34, and an
integrated filter (Eallf) determinator 37.
[0445] The camera motion-based blur estimator 40 has a camera
motion blur map acquirer 41.
[0446] Moreover, the blur remover 60 has an inverse filter
calculator 61 and an inverse filter processor 62.
[0447] The visible light image input unit 21 inputs a
pre-correction visible light image 25 to the blur estimator 30b and
the blur remover 60.
[0448] Moreover, the far-infrared image input unit 22 inputs a
far-infrared image 26 to the blur estimator 30b.
[0449] Further, the camera motion information input unit 23 inputs
camera motion information acquired by a motion sensor such as an
IMU, for example, to the camera motion-based blur estimator 40.
[0450] The pre-correction visible light image 25 and the
far-infrared image 26 input from the visible light image input unit
21 and the far-infrared image input unit 22 are images captured by
simultaneous photographing of the same object. These images are,
for example, images captured in the dark, and the pre-correction
visible light image 25 input from the visible light image input
unit 21 is blurred (defocused) due to long exposure.
[0451] On the other hand, the far-infrared image 26 input from the
far-infrared image input unit 22 is an image exposed for a short
period of time, and is an image with little blur (defocusing).
[0452] Note that each of the pre-correction visible light image 25
and the far-infrared image 26 is an image with W × H pixels,
i.e., W pixels in a crosswise direction and H pixels in a
lengthwise direction. In the figure, the pre-correction visible
light image 25 and the far-infrared image 26 are illustrated as a
pre-correction visible light image [W*H] 25 and a far-infrared
image [W*H] 26.
[0453] Moreover, [W_B*H_B] illustrated in the figure
indicates a single block region as a divided image region.
[0454] The number of blocks per image frame is N.
[0455] Next, the processing executed by the blur estimator 30b will
be described.
[0456] The filter processor 31 of the blur estimator 30b
sequentially applies, to the far-infrared image 26, various filters
(blur (defocusing) generation filters) stored in the filter bank 35
selected from the filter bank pool 36 by the filter bank selector
34. That is, various forms of blur are intentionally generated on
the far-infrared image 26.
[0457] In the filter bank 35, multiple different blur (defocusing)
generation filters for sizes and directions within specific ranges
are stored. That is, many filters corresponding to various PSFs are
stored.
[0458] The filter bank 35 is one filter bank selected from the
filter bank pool 36 by the filter bank selector 34.
[0459] The filter bank selector 34 receives a camera motion blur
map corresponding to camera motion from the camera motion blur map
acquirer 41 of the camera motion-based blur estimator 40, and
selects, from the filter bank pool 36, a filter bank storing
filters corresponding to blurs with directions and sizes similar to
those of a blur set to the input camera motion blur map.
[0460] Note that such filter bank selection processing is executed
for each unit block as the divided image region.
[0461] Examples of the filter banks stored in the filter bank pool
36 and an example of the processing of selecting the filter bank
based on the camera motion blur map will be described with
reference to FIG. 16.
[0462] FIG. 16(1) illustrates the examples of the filter banks
stored in the filter bank pool 36.
[0463] The filter banks stored in the filter bank pool 36 include,
for example, the following filter banks as illustrated in FIG.
16(1):
[0464] (a1) a filter bank corresponding to a horizontal blur;
[0465] (a2) a filter bank corresponding to a long horizontal
blur;
[0466] (b1) a filter bank corresponding to a vertical blur; and
[0467] (c) a filter bank corresponding to a blur in a diagonal
direction toward the upper right side.
[0468] As described above, many filter banks storing multiple
filters corresponding to multiple blurs with similar settings,
i.e., multiple filters with similar coefficient settings for
generating similar blurs, are stored in the filter bank pool
36.
[0469] The filter bank selector 34 receives the camera motion blur
map corresponding to the camera motion from the camera motion blur
map acquirer 41 of the camera motion-based blur estimator 40, and
selects, from the filter bank pool 36, the filter bank storing the
filters corresponding to the blurs with the directions and sizes
similar to those of the blur set to the input camera motion blur
map.
[0470] FIG. 16(2) is the example of the camera motion blur map
corresponding to the camera motion input from the camera motion
blur map acquirer 41 of the camera motion-based blur estimator 40
to the filter bank selector 34.
[0471] This camera motion blur map is a blur map acquired on the
basis of determination that a camera is moving straight
forward.
[0472] The filter bank selector 34 receives the camera motion blur
map, and for each block of the input camera motion blur map,
selects the filter bank from the filter bank pool 36, the filter
bank storing the filters corresponding to the blurs with the
directions and the sizes similar to those of the blur set to the
block.
[0473] In the examples illustrated in FIG. 16, the filter banks
each selected for four blocks indicated by thick dashed frames are
illustrated.
[0474] The center block at a left end portion is a block for which
a camera motion-based blur (Ec) elongated in the crosswise
direction is set.
[0475] The filter bank selector 34 selects, as the filter bank for
this block, (a2) the filter bank corresponding to the long
horizontal blur, such a filter bank storing the filters for
generating blurs similar to such a horizontally-elongated blur.
[0476] The third center block from the left is a block for which a
camera motion-based blur (Ec) in the crosswise direction is
set.
[0477] The filter bank selector 34 selects, as the filter bank for
this block, (a1) the filter bank corresponding to the horizontal
blur, such a filter bank storing the filters for generating blurs
similar to such a horizontal blur.
[0478] The block at the fifth position from the left and the second
position from the top is a block for which a camera motion-based
blur (Ec) in the lengthwise direction is set. The filter bank
selector 34 selects, as the filter bank for this block, (b1) the
filter bank corresponding to the vertical blur, such a filter bank
storing the filters for generating blurs similar to such a vertical
blur.
[0479] The block at the fourth position from the right and the
third position from the top is a block for which a camera
motion-based blur (Ec) in the diagonal direction toward the upper
right side is set.
[0480] The filter bank selector 34 selects, as the filter bank for
this block, (c) the filter bank corresponding to the blur in the
diagonal direction toward the upper right side, such a filter bank
storing the filters for generating blurs similar to such a blur in
the diagonal direction toward the upper right side.
[0481] As described above, the filter bank selector 34 receives the
camera motion blur map corresponding to the camera motion from the
camera motion blur map acquirer 41, and for each block of the input
camera motion blur map, selects the filter bank from the filter
bank pool 36, the filter bank storing the filters corresponding to
the blurs with the directions and sizes similar to those of the
blur set to the block.
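The selection logic can be illustrated with a toy sketch in which each filter bank is summarized by the direction and length of the blurs its filters generate; the bank names and blur vectors below are hypothetical stand-ins for the pool of FIG. 16, and the nearest-vector rule is an assumed similarity measure:

```python
import numpy as np

# Hypothetical filter-bank pool: each bank is summarized by a
# representative blur vector (dx, dy), with image y growing downward.
FILTER_BANK_POOL = {
    "horizontal":      np.array([8.0, 0.0]),
    "horizontal_long": np.array([20.0, 0.0]),
    "vertical":        np.array([0.0, 8.0]),
    "diag_up_right":   np.array([6.0, -6.0]),
}

def select_filter_bank(block_blur_vec):
    """Pick the bank whose representative blur is closest in direction
    and size to the camera motion-based blur (Ec) set to this block."""
    return min(FILTER_BANK_POOL,
               key=lambda name: np.linalg.norm(FILTER_BANK_POOL[name]
                                               - block_blur_vec))

# A long horizontal camera-motion blur selects the long-horizontal bank.
print(select_filter_bank(np.array([18.0, 1.0])))   # -> horizontal_long
```

Running this per block reproduces the behavior of FIG. 16: each block of the camera motion blur map independently picks the bank whose filters generate the most similar blurs.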
[0482] The filter bank n, 35 illustrated in FIG. 15 is a filter
bank selected for a certain block.
[0483] The selected filter bank 35 is input to the filter processor
31.
[0484] The filter processor 31 of the blur estimator 30b
sequentially applies, to each block of the far-infrared image 26,
the filters stored in the filter bank 35 selected by the filter
bank selector 34.
[0485] The far-infrared image intentionally blurred by filter
application is output to the correlation computer 32.
[0486] The correlation computer 32 calculates the correlation
between the far-infrared image intentionally blurred by filter
application and the pre-correction visible light image 25.
[0487] Note that the filter application processing and the
correlation calculation processing executed by the filter processor
31 and the correlation computer 32 are executed for a corresponding
unit block of the N block regions of the pre-correction visible
light image 25 and the N block regions of the far-infrared image
26.
[0488] The filter processor 31 sequentially applies, for each of
the N blocks of the far-infrared image 26, various filters (blur
(defocusing) generation filters) stored in the filter bank 35.
[0489] For each of the N blocks of the far-infrared image 26, the
correlation computer 32 calculates the correlation between a result
obtained by sequential application of various filters (blur
(defocusing) generation filters) stored in the filter bank 35 and
the pre-correction visible light image 25, and outputs the
correlation value corresponding to each filter for each of the N
blocks to the integrated filter (Eallf) determinator 37.
[0490] For each block, the integrated filter (Eallf) determinator
37 selects, from the input data from the correlation computer 32,
i.e., the correlation values corresponding to each filter for each
of the N blocks, the filter with the highest correlation value.
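A minimal sketch of this per-block filter sweep, assuming circular convolution and a zero-mean normalized correlation measure (the text does not fix the correlation formula), might look like:

```python
import numpy as np

def blur_with(block, kernel):
    """Circular convolution of a block with a blur (PSF) kernel."""
    K = np.fft.fft2(kernel, s=block.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(block) * K))

def correlation(a, b):
    """Zero-mean normalized correlation between two blocks."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_filter(fir_block, visible_block, filter_bank):
    """Apply every candidate blur filter to the far-infrared block and
    keep the one whose result correlates best with the blurred visible
    block, as done by the filter processor 31, correlation computer 32,
    and integrated filter (Eallf) determinator 37."""
    scores = [correlation(blur_with(fir_block, k), visible_block)
              for k in filter_bank]
    return int(np.argmax(scores)), scores

# Synthetic check: blur the 'far-infrared' block with filter 1, treat the
# result as the 'visible' block, and confirm the sweep recovers filter 1.
rng = np.random.default_rng(1)
fir = rng.random((16, 16))
bank = [np.ones((1, 3)) / 3.0, np.ones((1, 7)) / 7.0, np.ones((5, 1)) / 5.0]
visible = blur_with(fir, bank[1])
idx, scores = best_filter(fir, visible, bank)
print(idx)   # -> 1
```

The winning filter's PSF is then taken as the blur estimate for that block, and its inverse drives the blur remover 60.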
[0491] The N filters selected for each of the N blocks by the
integrated filter (Eallf) determinator 37 are input to the inverse
filter calculator 61 of the blur remover 60.
[0492] The inverse filter calculator 61 of the blur remover 60
receives the block-by-block integrated filter (Eallf) determined by
the integrated filter (Eallf) determinator 37, thereby generating
an inverse filter with characteristics opposite to those of each
block-by-block integrated filter (Eallf).
[0493] That is, a deconvolution filter is generated, and is output
to the inverse filter processor 62.
[0494] The deconvolution filter is a filter with characteristics
opposite to those of each block-by-block integrated filter
(Eallf).
[0495] The inverse filter processor 62 of the blur remover 60
receives the inverse filter corresponding to each block from the
inverse filter calculator 61, thereby applying the input inverse
filter to a corresponding block of the pre-correction visible light
image 25 input from the visible light image input unit 21.
[0496] That is, the inverse filter processor 62 applies, to the
corresponding block of the pre-correction visible light image 25,
the inverse filter with the characteristics opposite to those of
the block-by-block integrated filter (Eallf) determined by the
integrated filter (Eallf) determinator 37.
[0497] When the inverse filter application processing is completed
for all of the N blocks of the pre-correction visible light image
25 input from the visible light image input unit 21, the resultant
image is output as a post-correction visible light image 27.
[0498] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
is generated from the pre-correction visible light image 25, and is
output.
[0499] (B2) The processing of correcting the correlation value
between the filter-applied far-infrared image and the blurred
visible light image on the basis of the camera motion-based blur
(Ec):
[0500] the configuration of the image processing device-B2
configured to execute the above-described processing will be
described next with reference to FIG. 17.
[0501] FIG. 17 is a diagram for describing the configuration and
processing of the image processing device-B2.
[0502] The image processing device B2, 20-B2 illustrated in FIG. 17
has the visible light image input unit 21, the far-infrared image
input unit 22, the camera motion information input unit 23, the
blur estimator 30b, the camera motion-based blur estimator 40, the
blur remover 60, and the camera motion blur map storage unit
45.
[0503] Further, the blur estimator 30b has the filter processor 31,
the correlation computer 32, the integrated filter (Eallf)
determinator 37, a filter (blur) comparator 38, and a correlation
corrector 39.
[0504] The camera motion-based blur estimator 40 has the camera
motion blur map acquirer 41.
[0505] Moreover, the blur remover 60 has the inverse filter
calculator 61 and the inverse filter processor 62.
[0506] The differences from the image processing device B1, 20-B1
described earlier with reference to FIG. 15 are that no filter bank
pool 36 is provided and that the configuration of the blur
estimator 30b differs.
[0507] The blur estimator 30b of the image processing device B2,
20-B2 illustrated in FIG. 17 does not have the filter bank selector
34 described earlier with reference to FIG. 15, and has the filter
(blur) comparator 38 and the correlation corrector 39.
[0508] Hereinafter, processing executed by the filter (blur)
comparator 38 and the correlation corrector 39 will be mainly
described.
[0509] The filter (blur) comparator 38 receives each of the
following types of information.
[0510] The blur (defocusing) generation filter to be applied to the
far-infrared image 26 in the filter processor 31 is input from the
filter bank 35.
[0511] Further, the camera motion blur map corresponding to the
camera motion is input from the camera motion blur map acquirer 41
of the camera motion-based blur estimator 40.
[0512] The filter (blur) comparator 38 determines the degree of
similarity between these two filters (blurs) to calculate a
correction coefficient (0 to 1) according to the degree of
similarity, and outputs the correction coefficient to the
correlation corrector 39.
[0513] A greater (closer to 1) correction coefficient is set for a
higher degree of similarity between the two filters (blurs), and a
smaller (closer to 0) correction coefficient is set for a lower
degree of similarity.
[0514] The correlation corrector 39 multiplies the correction
coefficient input from the filter (blur) comparator 38 by the
correlation value (the correlation value between the pre-correction
visible light image 25 and the corresponding block of the
filter-applied far-infrared image) input from the correlation
computer 32, thereby calculating a corrected correlation value and
outputting the corrected correlation value to the integrated filter
determinator 37.
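A toy sketch of this correction step follows. Each filter's generated blur and the camera motion-based blur are represented here as 2-D direction vectors, and the cosine of the angle between them is used as the correction coefficient; both the vector representation and the cosine measure are assumptions for illustration, since the text only requires a coefficient in [0, 1] that grows with filter (blur) similarity:

```python
import numpy as np

def similarity_coefficient(filter_blur_vec, camera_blur_vec):
    """Correction coefficient in [0, 1]: near 1 for aligned blurs,
    near 0 for orthogonal ones (hypothetical similarity measure)."""
    cos = abs(np.dot(filter_blur_vec, camera_blur_vec)) / (
        np.linalg.norm(filter_blur_vec) * np.linalg.norm(camera_blur_vec))
    return float(cos)

camera_blur = np.array([1.0, 1.0])     # diagonal toward the lower right
filters = {"horizontal":       np.array([1.0, 0.0]),
           "diag_lower_right": np.array([1.0, 1.0])}
correlations = {"horizontal": 0.5, "diag_lower_right": 0.4}

# Corrected correlation = raw correlation x correction coefficient.
corrected = {name: correlations[name] * similarity_coefficient(v, camera_blur)
             for name, v in filters.items()}
# The diagonal filter now scores highest, despite its lower raw correlation.
```

This mirrors the intent of FIG. 18: a filter whose blur disagrees with the camera motion is penalized even if its raw correlation happens to be high.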
[0515] A specific example of a series of processing described above
will be described with reference to FIG. 18.
[0516] FIG. 18 illustrates each of the following types of
information:
[0517] (1) a filter F1 acquired from the filter bank 35;
[0518] (2) a correlation value calculated by the correlation
computer 32;
[0519] (3) a camera motion-based blur;
[0520] (4) a calculated correction coefficient based on the degree
of filter (blur) similarity; and
[0521] (5) a corrected correlation value calculated by the
correlation corrector 39.
[0522] The filter F1 acquired from the filter bank 35 as
illustrated in FIG. 18(1) is the blur (defocusing) generation
filter to be applied to the far-infrared image 26 in the filter
processor 31 of the blur estimator 30b illustrated in FIG. 17.
[0523] The filter (blur) comparator 38 receives such a filter.
Although only three types are illustrated in FIG. 18, many filters
are, in addition to above, sequentially applied to the far-infrared
image 26 in the filter processor 31.
[0524] Note that such filter application processing is executed for
each unit block.
[0525] The correlation value calculated by the correlation computer
32 as illustrated in FIG. 18(2) is the correlation value calculated
in the correlation computer 32 of the blur estimator 30b
illustrated in FIG. 17.
[0526] That is, such a correlation value is the correlation value
between the pre-correction visible light image 25 and the
corresponding block of the filter-applied far-infrared image.
[0527] Correlation Value=0.5;
[0528] Correlation Value=0.4; and
[0529] Correlation Value=0.1:
[0530] the figure illustrates an example where these correlation
values are calculated in this order from the top as three types of
correlation values corresponding to each of three types of filters
applied to the far-infrared image 26.
[0531] The camera motion-based blur in FIG. 18(3) is the blur
acquired from the camera motion blur map input from the camera
motion blur map acquirer 41 of the camera motion-based blur
estimator 40 and corresponding to the camera motion.
[0532] Such a blur is the camera motion-based blur for the block
corresponding to a correlation value calculation target block in
the correlation computer 32, and is acquired from the camera motion
blur map.
[0533] In the example illustrated in FIG. 18(3), the camera
motion-based blur (Ec) extending in a diagonal direction toward the
lower right side is shown.
[0534] The filter (blur) comparator 38 compares the two filters
(blurs) of the filter F1 acquired from the filter bank 35 as
illustrated in FIG. 18(1) and the camera motion-based blur (Ec) as
illustrated in FIG. 18(3).
[0535] Note that the filter F1 is the blur generation filter, and
it is determined whether or not the blur generated by the filter F1
acquired from the filter bank 35 as illustrated in FIG. 18(1) is
similar to the camera motion-based blur (Ec) illustrated in FIG.
18(3).
[0536] The filter (blur) comparator 38 determines the degree of
similarity between these two filters (blurs), thereby calculating
the correction coefficient (0 to 1) according to the degree of
similarity and outputting the correction coefficient to the
correlation corrector 39.
[0537] The correction coefficient calculated based on the degree of
filter (blur) similarity as illustrated in FIG. 18(4) is the
above-described correction coefficient.
[0538] Correction Coefficient=0.3;
[0539] Correction Coefficient=0.9; and
[0540] Correction Coefficient=0.9:
[0541] the example illustrated in FIG. 18 shows an example where
these correction coefficients are calculated in this order from
the top as three types of correction coefficients corresponding to
each of three types of filters applied to the far-infrared image
26.
[0542] A greater (closer to 1) correction coefficient is set for a
higher degree of similarity between the two filters (blurs). In the
example illustrated in FIG. 18, the camera motion-based blur (Ec)
illustrated in FIG. 18(3) is the blur in the diagonal direction
toward the lower right side.
[0543] Of the three types of filters illustrated as the filter F1
acquired from the filter bank 35 in FIG. 18(1), the first filter
corresponding to a blur in a crosswise direction is determined as
having a low degree of similarity to the camera motion-based blur
(Ec) in the diagonal direction toward the lower right side, and
therefore, Correction Coefficient=0.3 is calculated.
[0544] On the other hand, the filter corresponding to the second
blur in the direction toward the lower right side and the filter
corresponding to the third blur in a lengthwise direction are
determined as having a high degree of similarity to the camera
motion-based blur (Ec) in the diagonal direction toward the lower
right side as illustrated in FIG. 18(3), and Correction
Coefficient=0.9 is calculated.
[0545] The filter (blur) comparator 38 outputs these correction
coefficients to the correlation corrector 39.
[0546] The correlation corrector 39 multiplies the correction
coefficient input from the filter (blur) comparator 38 by the
correlation value input from the correlation computer 32, i.e., the
correlation value calculated by the correlation computer 32 as
illustrated in FIG. 18(2), thereby calculating the corrected
correlation value.
[0547] Such a corrected correlation value is as illustrated in FIG.
18(5).
[0548] Corrected Correlation Value = 0.5 × 0.3 = 0.15;
[0549] Corrected Correlation Value = 0.4 × 0.9 = 0.36; and
[0550] Corrected Correlation Value = 0.1 × 0.9 = 0.09:
[0551] the example illustrated in FIG. 18(5) shows an example where
these corrected correlation values are calculated in this order
from the top as three types of corrected correlation values
corresponding to each of three types of filters applied to the
far-infrared image 26.
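The arithmetic of FIG. 18 can be verified directly; once the raw correlations are multiplied by the correction coefficients, the second filter wins:

```python
# (correlation value, correction coefficient) for the three filters
# of FIG. 18, taken in order from the top.
candidates = [(0.5, 0.3), (0.4, 0.9), (0.1, 0.9)]
corrected = [round(c * k, 2) for c, k in candidates]
print(corrected)                       # [0.15, 0.36, 0.09]
best = corrected.index(max(corrected))
print(best)                            # -> 1, i.e., the second filter
```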
[0552] The corrected correlation values calculated by the
correlation corrector 39 are output to the integrated filter
determinator 37.
[0553] For each block, the integrated filter determinator 37
selects, from the input data from the correlation corrector 39,
i.e., the corrected correlation values corresponding to each filter
for each of the N blocks, the filter with the highest corrected
correlation value.
[0554] The N filters selected for each of the N blocks by the
integrated filter (Eallf) determinator 37 are input to the inverse
filter calculator 61 of the blur remover 60.
[0555] In the example illustrated in FIG. 18, the second entry is
an entry with the highest corrected correlation value as
illustrated in FIG. 18(5).
[0556] In this case, the filter for the second entry, i.e., the
second filter indicated as the filter F1 acquired from the filter
bank in FIG. 18(1), is determined as the integrated filter
(Eallf).
[0557] The integrated filter (Eallf) determined by the integrated
filter (Eallf) determinator 37 is input to the inverse filter
calculator 61 of the blur remover 60.
[0558] The inverse filter calculator 61 of the blur remover 60
receives the block-by-block integrated filter (Eallf), thereby
generating the inverse filter with the characteristics opposite to
those of each block-by-block integrated filter (Eallf).
[0559] That is, the deconvolution filter is generated, and is
output to the inverse filter processor 62.
[0560] The deconvolution filter is a filter with characteristics
opposite to those of each block-by-block integrated filter
(Eallf).
[0561] The inverse filter processor 62 of the blur remover 60
receives the inverse filter corresponding to each block from the
inverse filter calculator 61, thereby applying the input inverse
filter to the corresponding block of the pre-correction visible
light image 25 input from the visible light image input unit
21.
[0562] That is, the inverse filter processor 62 applies, to the
corresponding block of the pre-correction visible light image 25,
the inverse filter with the characteristics opposite to those of
the block-by-block integrated filter (Eallf) determined by the
integrated filter (Eallf) determinator 37.
[0563] When the inverse filter application processing is completed
for all of the N blocks of the pre-correction visible light image
25 input from the visible light image input unit 21, the resultant
image is output as the post-correction visible light image 27.
[0564] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
is generated from the pre-correction visible light image 25, and is
output.
[0565] Next, a sequence of the processing executed by the image
processing device corresponding to the configuration B described
with reference to FIGS. 15 and 17 will be described with reference
to flowcharts of FIG. 19 and subsequent figures.
[0566] (B1) The processing of selecting the filter to be applied to
the far-infrared image, i.e., the filter for generating the blur
(defocusing), on the basis of the camera motion-based blur
(Ec):
[0567] a sequence of the processing executed by the image
processing device-B1 configured to execute the above-described
processing as described with reference to FIG. 15 will be first
described with reference to the flowchart illustrated in FIG.
19.
[0568] Note that in a flow illustrated in FIG. 19, the same step
number is used to represent the step of executing processing
similar to that executed by the image processing device-A1
corresponding to the configuration A described earlier with
reference to FIG. 12.
[0569] Hereinafter, the processing of each step of the flow
illustrated in FIG. 19 will be sequentially described.
[0570] (Step S101)
First, at the step S101, the visible light image
as a correction target is acquired.
[0571] This processing is performed by the visible light image
input unit 21 in the image processing device illustrated in FIG.
15. Specifically, this processing is the processing of acquiring an
image captured by a visible light image camera, for example.
[0572] (Step S102)
[0573] Next, at the step S102, the far-infrared image to be
utilized as a reference image is acquired.
[0574] This processing is executed by the far-infrared image input
unit 22 in the image processing device illustrated in FIG. 15.
Specifically, this processing is the processing of acquiring an
image captured by a far-infrared image camera, for example.
[0575] Note that the visible light image and the far-infrared image
acquired at the step S101 and the step S102 are images captured by
simultaneous photographing of the same object.
[0576] These images are, for example, images captured in the dark,
and the visible light image is blurred (defocused) due to long
exposure. On the other hand, the far-infrared image is an image
exposed for a short period of time, and is an image with little
blur (defocusing).
[0577] (Step S103)
[0578] Next, at the step S103, the camera motion information is
acquired.
[0579] This processing is performed by the camera motion
information input unit 23 in the image processing device
illustrated in FIG. 15. Specifically, the camera motion information
is acquired and input by the sensor such as an IMU, for
example.
[0580] (Step S104)
[0581] Next, at the step S104, the camera motion blur map is
acquired on the basis of the camera motion information input at the
step S103.
[0582] This processing is processing executed by the camera motion
blur map acquirer 41 of the camera motion-based blur estimator 40
in the image processing device illustrated in FIG. 15.
[0583] The camera motion blur map acquirer 41 of the camera
motion-based blur estimator 40 acquires, on the basis of the camera
motion information input from the camera motion information input
unit 23, a single camera motion blur map from the blur maps stored
corresponding to various types of camera motion in the camera
motion blur map storage unit 45, and inputs the camera motion blur
map to the filter bank selector 34 of the blur estimator 30b.
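The block-map lookup performed by the camera motion blur map acquirer 41 can be sketched as follows, a minimal Python illustration in which the stored map keys, the map contents, and the IMU thresholds are all hypothetical (the disclosure only states that a single map is selected from maps stored per camera-motion type):

```python
import numpy as np

# Hypothetical pre-stored blur maps, one per quantized camera-motion type.
# Each map holds, per unit block, a motion-blur vector (dx, dy) in pixels.
BLUR_MAP_STORAGE = {
    "pan_right": np.tile(np.array([4.0, 0.0]), (8, 8, 1)),  # horizontal blur
    "tilt_down": np.tile(np.array([0.0, 4.0]), (8, 8, 1)),  # vertical blur
    "still":     np.zeros((8, 8, 2)),                       # no blur
}

def acquire_camera_motion_blur_map(angular_velocity):
    """Pick the stored blur map whose motion type best matches the
    IMU angular-velocity reading (wx, wy); a simplified stand-in for
    the camera motion blur map acquirer 41."""
    wx, wy = angular_velocity
    if abs(wx) < 0.1 and abs(wy) < 0.1:
        return BLUR_MAP_STORAGE["still"]
    if abs(wx) >= abs(wy):
        return BLUR_MAP_STORAGE["pan_right"]
    return BLUR_MAP_STORAGE["tilt_down"]
```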
[0584] (Step S105)
[0585] The processing from the subsequent step S105 to the step
S107 is the loop processing (loop 1) sequentially and repeatedly
executed for all blocks as divided regions set for the visible
light image and the far-infrared image. Note that the number of
blocks is N.
[0586] (Step S105b)
[0587] The processing of the step S105b is processing executed by
the filter bank selector 34 of the blur estimator 30b illustrated
in FIG. 15.
[0588] The filter bank selector 34 receives the camera motion blur
map corresponding to the camera motion from the camera motion blur
map acquirer 41 of the camera motion-based blur estimator 40,
thereby selecting, from the filter bank pool 36, the filter bank
storing the filters corresponding to the blurs with the directions
and sizes similar to those of the blur set to the input camera
motion blur map.
[0589] Note that such filter bank selection processing is executed
for each unit block as the divided image region.
[0590] (Step S106b)
[0591] Next, at the step S106b, the processing of determining the
integrated filter (Eallf) is executed.
[0592] Such processing of the step S106b is processing executed by
the filter processor 31, the correlation computer 32, and the
integrated filter determinator 37 of the blur estimator 30b
illustrated in FIG. 15.
[0593] A sequence of the processing of the step S106b is the same
as the processing of the step S106 described earlier with reference
to FIG. 13, i.e., the processing of the steps S121 to S126
illustrated in FIG. 13.
[0594] Note that in the processing of the steps S121 to S126
described with reference to FIG. 13, the output filter is the
image-based filter (Etf). At the step S106b of the flow of FIG. 19,
however, the integrated filter determinator 37 illustrated in FIG.
15 performs, at the step S126 illustrated in FIG. 13, the processing
of selecting the filter with the maximum correlation value as the
integrated filter (Eallf).
[0595] (Step S107) The step S107 is the end point of the loop 1 of
the steps S105 to S107.
[0596] That is, the processing of the steps S105b to S106b is
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image and the far-infrared
image.
[0597] When this loop processing (loop 1) is completed, the
integrated filter (Eallf) with the highest correlation value is
determined for all of the N blocks.
[0598] (Step S113)
[0599] The processing from the subsequent step S113 to the step
S116 is the loop processing (loop 3) sequentially and repeatedly
executed for all blocks as the divided regions set for the visible
light image and the far-infrared image. Note that the number of
blocks is N.
[0600] (Step S114)
[0601] The processing of the steps S114 to S115 is processing
executed by the blur remover 60 illustrated in FIG. 15.
[0602] First, at the step S114, the inverse filter calculator 61 of
the blur remover 60 receives the block-by-block integrated filter
(Eallf) determined by the integrated filter (Eallf) determinator
37, thereby calculating the inverse filter with the characteristics
opposite to those of each block-by-block integrated filter
(Eallf).
[0603] (Step S115)
[0604] Next, at the step S115, the inverse filter processor 62 of
the blur remover 60 applies the inverse filter calculated at the
step S114 to the processing target block of the visible light
image.
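The inverse filter calculation and application of the steps S114 to S115 can be sketched as follows. The disclosure does not fix a deconvolution method, so a regularized frequency-domain (Wiener-style) inverse is assumed here, with the zero-padding scheme and the `eps` regularizer as illustrative choices:

```python
import numpy as np

def apply_inverse_filter(block, blur_kernel, eps=1e-2):
    """Deconvolve one image block with the inverse of the estimated
    integrated filter (Eallf). A regularized frequency-domain inverse
    is one common, numerically stable choice."""
    h, w = block.shape
    # Zero-pad the kernel to the block size and move its center to (0, 0).
    pad = np.zeros((h, w))
    kh, kw = blur_kernel.shape
    pad[:kh, :kw] = blur_kernel
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(pad)
    B = np.fft.fft2(block)
    # Regularized inverse: H* / (|H|^2 + eps) instead of a bare 1 / H,
    # which would blow up wherever H is close to zero.
    restored = np.fft.ifft2(B * np.conj(H) / (np.abs(H) ** 2 + eps))
    return np.real(restored)
```

Blurring a block with a known kernel and then applying this function brings it measurably closer to the original, which is the effect the blur remover 60 targets.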
[0605] (Step S116) The step S116 is the end point of the loop 3 of
the steps S113 to S116.
[0606] That is, the processing of the steps S114 to S115 is
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image as the correction target
image.
[0607] When the inverse filter application processing is completed
for all of the N blocks of the visible light image, the resultant
image is output as the post-correction visible light image.
[0608] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
as illustrated in FIG. 15 is generated from the visible light image
as the input image at the step S101, i.e., the pre-correction
visible light image 25 illustrated in FIG. 15, and is output.
[0609] (B2) The processing of correcting the correlation value
between the filter-applied far-infrared image and the blurred
visible light image on the basis of the camera motion-based blur
(Ec):
[0610] a sequence of the processing executed by the image
processing device-B2 configured to execute the above-described
processing as described with reference to FIG. 17 will be described
with reference to a flowchart of FIG. 20.
[0611] The flow illustrated in FIG. 20 is different from the flow of
FIG. 19 in that the processing of the steps S105b to S106b described
with reference to FIG. 19 is substituted with a step S106c
illustrated in the flow of FIG. 20.
[0612] Other types of processing are similar to those described
with reference to the flow illustrated in FIG. 19, and therefore,
description thereof will not be repeated. The processing of the
step S106c illustrated in the flow of FIG. 20 will be mainly
described.
[0613] (Step S106c)
[0614] The step S106c is executed as the loop processing (loop 1)
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image and the far-infrared
image.
[0615] The step S106c is the processing of calculating the
block-by-block integrated filter (Eallf) by the blur estimator 30b
illustrated in FIG. 17.
[0616] A detailed sequence of the processing of the step S106c will
be described with reference to a flow illustrated in FIG. 21.
[0617] For the processing of calculating each block-by-block
integrated filter (Eallf) at the step S106c, the visible light
image acquired at the step S101, the far-infrared image acquired at
the step S102, and the camera motion blur map acquired at the steps
S103 to S104 are utilized, and therefore, FIG. 21 also illustrates
the steps S101 to S104.
[0618] The detailed processing of the step S106c is the processing
of the steps S121 to S126 illustrated in FIG. 21. Hereinafter, such
processing will be sequentially described.
[0619] (Step S121)
[0620] The processing from the step S121 to the step S125 is the
loop processing (loop 1b) sequentially and repeatedly executed for
all filter IDs corresponding to all filters stored in the filter
bank 35.
[0621] (Step S122)
[0622] At the step S122, the filter (coefficient) is acquired. The
processing of the steps S122 to S123 is processing executed by the
filter processor 31 of the blur estimator 30b illustrated in FIG.
17. The filter processor 31 sequentially acquires, from the filter
bank 35, the filter (blur (defocusing) generation filter) applied
to each block of the far-infrared image.
[0623] Note that data sequentially acquired from the filter bank 35
may be any of the filter itself or the filter coefficient as data
forming the filter.
[0624] (Step S123)
[0625] Next, at the step S123, the filter acquired at the step S122
is applied to a single block of the far-infrared image, i.e., a
block currently selected as a processing target.
[0626] This processing is filter processing for intentionally
blurring the far-infrared image.
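The intentional blurring of the step S123 can be sketched as a plain 2D convolution of one far-infrared block with a candidate filter from the filter bank; the zero-padded "same" convolution is an assumed detail:

```python
import numpy as np

def blur_block(fir_block, kernel):
    """Step S123 sketch: intentionally blur one far-infrared block by
    convolving it with a candidate blur-generation filter
    (zero-padded 'same' convolution, kernel flipped for true
    convolution)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(fir_block, ((ph, ph), (pw, pw)))
    out = np.zeros(fir_block.shape, dtype=float)
    for i in range(fir_block.shape[0]):
        for j in range(fir_block.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel[::-1, ::-1])
    return out
```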
[0627] (Step S124)
[0628] Next, at the step S124, the correlation value between the
block of the far-infrared image as a filter application result at
the step S123 and the corresponding block of the visible light
image is calculated.
[0629] This processing is processing executed by the correlation
computer 32 of the blur estimator 30b illustrated in FIG. 17.
[0630] The correlation computer 32 calculates the correlation
between the far-infrared image intentionally blurred by filter
application and the visible light image.
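The correlation computation of the step S124 can be sketched as follows; the disclosure does not name a specific correlation measure, so zero-mean normalized cross-correlation (ZNCC) is assumed here as one plausible choice for the correlation computer 32:

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation between two same-sized
    blocks, returning a value in [-1, 1]."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```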
[0631] (Step S125)
[0632] The step S125 is the end point of the loop 1b of the steps
S121 to S125.
[0633] That is, the processing of the steps S122 to S124 is
sequentially and repeatedly executed for all filter IDs
corresponding to all filters stored in the filter bank 35.
[0634] (Step S125c)
[0635] When the processing of the loop 1b of the steps S121 to S125
is completed for a single block, the processing proceeds to the
step S125c.
[0636] That is, when the processing of calculating the correlation
values corresponding to all filters stored in the filter bank 35
ends for a single block, the processing proceeds to the step
S125c.
[0637] The processing of the step S125c is processing executed by
the filter (blur) comparator 38 and the correlation corrector 39 of
the blur estimator 30b illustrated in FIG. 17.
[0638] The filter (blur) comparator 38 receives each of the
following types of information.
[0639] The blur (defocusing) generation filter to be applied to the
far-infrared image 26 in the filter processor 31 is input from the
filter bank 35.
[0640] Further, the camera motion blur map corresponding to the
camera motion is input from the camera motion blur map acquirer 41
of the camera motion-based blur estimator 40.
[0641] The filter (blur) comparator 38 determines the degree of
similarity between these two filters (blurs) to calculate the
correction coefficient (0 to 1) according to the degree of
similarity, and outputs the correction coefficient to the
correlation corrector 39.
[0642] A greater (closer to 1) correction coefficient is set for a
higher degree of similarity between the two filters (blurs), and a
smaller (closer to 0) correction coefficient is set for a lower
degree of similarity.
[0643] The correlation corrector 39 multiplies the correction
coefficient input from the filter (blur) comparator 38 by the
correlation value (the correlation value between the pre-correction
visible light image 25 and the corresponding block of the
filter-applied far-infrared image) input from the correlation
computer 32, thereby calculating the corrected correlation value
and outputting the corrected correlation value to the integrated
filter determinator 37.
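The chain of the filter (blur) comparator 38, the correlation corrector 39, and the integrated filter (Eallf) determinator 37 described above can be sketched as follows; the normalized-inner-product similarity metric is an assumption, since the disclosure only states a degree of similarity mapped to a correction coefficient of 0 to 1:

```python
import numpy as np

def kernel_similarity(f1, f2):
    """Filter (blur) comparator 38 sketch: correction coefficient in
    [0, 1] from the normalized inner product of two same-sized blur
    kernels (an assumed metric)."""
    a, b = f1.ravel(), f2.ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

def pick_integrated_filter(candidates, correlations, camera_motion_kernel):
    """Steps S125c and S126 sketch: scale each candidate's correlation
    value by its similarity to the camera-motion blur, then keep the
    filter with the highest corrected correlation."""
    corrected = [kernel_similarity(k, camera_motion_kernel) * c
                 for k, c in zip(candidates, correlations)]
    return candidates[int(np.argmax(corrected))]
```

A candidate whose blur disagrees with the camera motion is thus penalized even if its raw correlation is slightly higher.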
[0644] (Step S126)
[0645] The processing of the step S126 is processing executed by
the integrated filter (Eallf) determinator 37 of the blur estimator
30b illustrated in FIG. 17.
[0646] At the step S126, the integrated filter (Eallf) determinator
37 determines, as the integrated filter (Eallf) corresponding to
the block, the filter with the highest correlation value among the
corrected correlation values calculated corresponding to the blocks
by the correlation corrector 39 at the step S125c.
[0647] The integrated filter (Eallf) determined corresponding to
the block by the integrated filter (Eallf) determinator 37 is input
to the inverse filter calculator 61 of the blur remover 60
illustrated in FIG. 17.
[0648] (Step S107)
[0649] The step S107 is the end point of the loop 1 of the steps
S105 to S107.
[0650] That is, the processing of the step S106c is
sequentially and repeatedly executed for all blocks as the divided
regions set for the visible light image and the far-infrared
image.
[0651] When this loop processing (the loop 1) is completed, the
integrated filter (Eallf) is determined for all of the N
blocks.
[0652] Next, the processing of the steps S113 to S116 illustrated
in FIG. 20 is executed.
[0653] That is, the inverse filter calculator 61 of the blur
remover 60 receives the block-by-block integrated filter (Eallf)
determined by the integrated filter (Eallf) determinator 37,
thereby calculating the inverse filter with the characteristics
opposite to those of each block-by-block integrated filter (Eallf).
The inverse filter processor 62 of the blur remover 60 applies the
inverse filter to the processing target block of the visible light
image.
[0654] When the inverse filter application processing is completed
for all of the N blocks of the visible light image, the resultant
image is output as the post-correction visible light image.
[0655] By these types of processing, the post-correction visible
light image 27 whose blur (defocusing) has been removed or reduced
as illustrated in FIG. 17 is generated from the visible light image
as the input image at the step S101, i.e., the pre-correction
visible light image 25 illustrated in FIG. 17, and is output.
[0656] (7. (Third Embodiment) Configuration and processing of image
processing device corresponding to configuration A+C)
[0657] Next, the configuration and processing of an image
processing device with the configuration (A+C) described with
reference to FIG. 8, i.e., an image processing device configured,
with the configuration A as a basic configuration, to further
calculate the degree of reliability of a filter (blur) estimation
result to perform blur removing processing according to the degree
of reliability, will be described as a third embodiment of the
image processing device of the present disclosure.
[0658] The configuration A is a configuration in which an
integrated filter (Eallf) of an image-based filter (Etf) and a
camera motion-based filter (Ecf) is estimated to apply an inverse
filter with characteristics opposite to those of the integrated
filter (Eallf) to a blurred visible light image.
[0659] The configuration A+C described below as the third
embodiment is a configuration in which the degree of reliability of
the image-based filter (Etf) or the camera motion-based filter
(Ecf) estimated in the configuration A is calculated for the blur
removing processing according to the degree of reliability.
[0660] Note that for the configuration A, the following two types
of processing forms of the processing of estimating the integrated
blur (Eall) are provided as described earlier with reference to
FIGS. 9 to 14:
[0661] (A1) a configuration in which the integrated blur (Eall) to
be applied or selected is switched between the image-based blur
(Et) and the camera motion-based blur (Ec) according to the amount
of object motion; and
[0662] (A2) a configuration in which the weighted average of the
image-based blur (Et) and the camera motion-based blur (Ec) is
utilized as the integrated blur (Eall) to change a weighted average
form according to the amount of object motion.
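The weighted-average form of the configuration (A2) can be sketched as follows; the rule mapping the amount of object motion to the weight is an assumption (an IMU cannot observe object motion, so larger object motion is taken to favor the image-based estimate):

```python
import numpy as np

def integrated_blur(image_kernel, motion_kernel, object_motion):
    """Configuration (A2) sketch: blend the image-based blur (Et) and
    the camera motion-based blur (Ec) into the integrated blur (Eall)
    by a weighted average whose weight follows the (normalized)
    amount of object motion."""
    w = min(max(object_motion, 0.0), 1.0)  # clamp to [0, 1]
    return w * image_kernel + (1.0 - w) * motion_kernel
```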
[0663] Hereinafter, the configuration C, i.e., the configuration
and processing of the image processing device configured to
calculate the degree of reliability of the filter (blur) estimation
result to perform the blur removing processing according to the
degree of reliability, will be described in association with each
of these configurations A1, A2.
[0664] Configuration A1+C; and
[0665] Configuration A2+C:
[0666] in other words, these two configuration examples and the two
corresponding types of processing will be described with reference
to FIG. 22 and subsequent figures.
[0667] (A1) The configuration in which the integrated blur (Eall)
to be applied or selected is switched between the image-based blur
(Et) and the camera motion-based blur (Ec) according to the amount
of object motion:
[0668] the configuration A1+C, i.e., the configuration of the image
processing device configured to calculate the degree of reliability
of the filter (blur) estimation result to perform the blur removing
processing according to the degree of reliability in the
above-described configuration, will be described with reference to
FIG. 22.
[0669] The image processing device A1+C, 20-A1C illustrated in FIG.
22 is configured such that a reliability calculator 70 is added to
the configuration of the image processing device A1, 20-A1
described earlier with reference to FIG. 9 and an inverse filter
corrector 63 is further added to the blur remover 60.
[0670] Other configurations are the same as those of the image
processing device A1, 20-A1 illustrated in FIG. 9.
[0671] Processing executed by the reliability calculator 70 and the
inverse filter corrector 63 will be described.
[0672] The reliability calculator 70 has a filter (blur) comparator
71.
[0673] The filter (blur) comparator 71 of the reliability
calculator 70 receives the following two pieces of data:
[0674] (1) the image-based filter (Etf) generated by the
image-based blur estimator 30; and
[0675] (2) the camera motion blur map acquired by the camera
motion-based blur estimator 40.
[0676] The image-based filter (Etf) generated by the image-based
blur estimator 30 is a filter for each unit block as a divided
image region.
[0677] The camera motion blur map acquired by the camera
motion-based blur estimator 40 is data having, for each unit block,
blur information corresponding to the camera motion as described
earlier with reference to FIG. 10.
[0678] The filter (blur) comparator 71 of the reliability
calculator 70 compares the block-by-block image-based filter (Etf)
generated by the image-based blur estimator 30 and the camera
motion-based blur (Ec) of the block in the camera motion blur map
corresponding to such a block.
[0679] Note that the filter as a comparison target at this point is
a blur generation filter. For example, a filter coefficient and a
function (point spread function, PSF) indicating the blur can be
directly compared with each other, and such comparison allows
determination of the similarity between the filter and the blur.
[0680] Note that the processing of converting the camera
motion-based blur (Ec) into the camera motion-based filter (Ecf) to
compare the camera motion-based filter (Ecf) with the image-based
filter (Etf) may be performed, or the processing of converting the
image-based filter (Etf) into the image-based blur (Et) to compare
the image-based blur (Et) with the camera motion-based blur (Ec)
may be performed.
[0681] In either case, the processing of comparing the image-based
blur (Et) or the image-based filter (Etf) with the camera
motion-based blur (Ec) or the camera motion-based filter (Ecf) is
performed between the corresponding blocks positioned at the same
position on the image.
[0682] In a case where the degree of similarity between the two
filters (blurs) as such a comparison processing result is high, the
value of the degree of reliability is set high. On the other hand,
in a case where the degree of similarity between the two filters
(blurs) is low, the value of the degree of reliability is set
low.
[0683] Note that the degree of reliability is, for example, a value
in the range of Reliability Degree = 0 to 1.
[0684] FIG. 23 illustrates examples of the reliability calculation
processing executed by the filter (blur) comparator 71 of the
reliability calculator 70.
[0685] FIG. 23 illustrates two examples as the examples of the
reliability calculation processing executed by the filter (blur)
comparator 71.
[0686] (First Example) is an example where the image-based filter
(blur) generated by the image-based blur estimator 30 is a
horizontally-elongated filter (blur) and the camera motion-based
filter (blur) acquired from the block in the camera motion blur map
is a filter (blur) in a diagonal direction toward the lower right
side.
[0687] In this case, it is determined that the degree of similarity
between these two filters (blurs) is low, and the degree of
reliability is set to a low value of 0.1, for example.
[0688] (Second Example) is an example where the image-based filter
(blur) generated by the image-based blur estimator 30 is a filter
(blur) in the diagonal direction toward the lower right side and
the camera motion-based filter (blur) acquired from the block in
the camera motion blur map is a filter (blur) in the diagonal
direction toward the lower right side.
[0689] In this case, it is determined that the degree of similarity
between these two filters (blurs) is high, and the degree of
reliability is set to a high value of 0.9, for example.
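The reliability calculation of the filter (blur) comparator 71, illustrated by the two examples above, can be sketched as follows; representing each blur as a 2D vector and combining cosine similarity with a magnitude ratio is an assumed concrete metric, since the disclosure only specifies a degree of similarity mapped to a reliability of 0 to 1:

```python
import numpy as np

def blur_reliability(image_blur_vec, motion_blur_vec):
    """Filter (blur) comparator 71 sketch: reliability in [0, 1] from
    the agreement in direction and magnitude of the image-based and
    camera motion-based blur vectors."""
    a = np.asarray(image_blur_vec, dtype=float)
    b = np.asarray(motion_blur_vec, dtype=float)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na == 0 or nb == 0:
        # Both blurs absent: fully consistent; only one absent: not.
        return 1.0 if na == nb else 0.0
    direction = max(0.0, float(np.dot(a, b) / (na * nb)))  # clamp opposite directions to 0
    magnitude = min(na, nb) / max(na, nb)
    return direction * magnitude
```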
[0690] The filter (blur) comparator 71 of the reliability
calculator 70 outputs the calculated degree of reliability to the
inverse filter corrector 63 of the blur remover 60. The inverse
filter corrector 63 adjusts, according to the degree of reliability
input from the reliability calculator 70, the strength of the
inverse filter applied in the inverse filter processor 62.
[0691] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0692] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
an inverse filter application effect.
[0693] Specifically, in a case where the degree of reliability of
the correlation value calculated by the reliability calculator 70
is, for example, set as Correlation Value Reliability Degree
α = 1 (High Reliability) to 0 (Low Reliability), the inverse
filter corrector 63 multiplies the coefficient set to the inverse
filter calculated by the inverse filter calculator 61 by the degree
α of reliability to generate a corrected inverse filter,
thereby outputting the corrected inverse filter to the inverse
filter processor 62.
[0694] The inverse filter processor 62 applies the corrected
inverse filter input from the inverse filter corrector 63 to the
processing target block of the pre-correction visible light image
25.
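The inverse filter correction of the inverse filter corrector 63 can be sketched as follows; blending with an identity (delta) kernel, rather than a bare multiplication of every coefficient by α, is an assumed reading that attenuates the deblurring effect while preserving overall image brightness:

```python
import numpy as np

def correct_inverse_filter(inverse_kernel, alpha):
    """Inverse filter corrector 63 sketch: attenuate the deblurring
    strength by the reliability alpha in [0, 1]. At alpha = 1 the
    inverse filter is applied as calculated; at alpha = 0 the block
    passes through unchanged."""
    identity = np.zeros_like(inverse_kernel)
    identity[inverse_kernel.shape[0] // 2, inverse_kernel.shape[1] // 2] = 1.0
    return alpha * inverse_kernel + (1.0 - alpha) * identity
```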
[0695] Note that the reliability calculation processing of the
reliability calculator 70, the inverse filter correction processing
of the inverse filter corrector 63, and the inverse filter
application processing of the inverse filter processor 62 are
executed as processing for each unit block.
[0696] As described above, in the present embodiment, the inverse
filter application processing according to the degree of
reliability of the estimated filter (blur) is implemented. The
inverse filter application effect can be enhanced for the block
with a high degree of reliability, and can be suppressed low for
the block with a low degree of reliability. Effective blur
elimination processing can be performed according to the degree of
reliability.
[0697] (A2) The configuration in which the weighted average of the
image-based blur (Et) and the camera motion-based blur (Ec) is
utilized as the integrated blur (Eall) to change the weighted
average form according to the amount of object motion:
[0698] the configuration A2+C, i.e., the configuration of the image
processing device configured to calculate the degree of reliability
of the filter (blur) estimation result to perform the blur removing
processing according to the degree of reliability in the
above-described configuration, will be described with reference to
FIG. 24.
[0699] The image processing device A2+C, 20-A2C illustrated in FIG.
24 is configured such that the reliability calculator 70 is added
to the configuration of the image processing device A2, 20-A2
described earlier with reference to FIG. 11 and the inverse filter
corrector 63 is further added to the blur remover 60.
[0700] Other configurations are the same as those of the image
processing device A2, 20-A2 illustrated in FIG. 11.
[0701] The processing executed by the reliability calculator 70 and
the inverse filter corrector 63 is processing similar to that
described regarding the above-described image processing device
A1+C, 20-A1C with reference to FIGS. 22 and 23.
[0702] That is, the reliability calculator 70 performs, between the
corresponding blocks positioned at the same position on the image,
the processing of comparing the image-based blur (Et) or the
image-based filter (Etf) with the camera motion-based blur (Ec) or
the camera motion-based filter (Ecf).
[0703] In a case where the degree of similarity between the two
filters (blurs) as the result of such comparison processing is
high, the value of the degree of reliability is set high. On the
other hand, in a case where the degree of similarity is low, the
value of the degree of reliability is set low.
[0704] The filter (blur) comparator 71 of the reliability
calculator 70 outputs the calculated degree of reliability to the
inverse filter corrector 63 of the blur remover 60. The inverse
filter corrector 63 adjusts, according to the degree of reliability
input from the reliability calculator 70, the strength of the
inverse filter applied in the inverse filter processor 62.
[0705] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0706] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
the inverse filter application effect.
[0707] As described above, in the present embodiment, the inverse
filter application processing according to the degree of
reliability of the estimated filter (blur) is implemented. The
inverse filter application effect can be enhanced for the block
with a high degree of reliability, and can be suppressed low for
the block with a low degree of reliability. The effective blur
elimination processing can be performed according to the degree of
reliability.
[0708] Next, a sequence of the processing in the third embodiment
described with reference to FIGS. 22 to 24, i.e., in the image
processing device with the configuration (A1+C) or the
configuration (A2+C), will be described with reference to
flowcharts illustrated in FIG. 25 and a subsequent figure.
[0709] (A1) The configuration in which the integrated blur (Eall)
to be applied or selected is switched between the image-based blur
(Et) and the camera motion-based blur (Ec) according to the amount
of object motion:
[0710] the flowchart illustrated in FIG. 25 is a flowchart for
describing a sequence of the processing executed by the
configuration A1+C described earlier with reference to FIG. 22,
i.e., the image processing device configured to calculate the
degree of reliability of the filter (blur) estimation result to
perform the blur removing processing according to the degree of
reliability in the above-described configuration.
[0711] The flowchart illustrated in FIG. 25 is a flowchart in which
the processing of a step S112b and a step S114b is added to the
processing of the steps S101 to S116 of the processing flow
executed by the image processing device corresponding to the
"configuration A1" of FIG. 9 as described earlier with reference to
FIG. 12.
[0712] Other processing than the step S112b and the step S114b as
additional processing is similar to the processing of the steps
S101 to S116 of the flow described with reference to FIG. 12, and
therefore, description thereof will not be repeated. Only the
processing of the step S112b and the step S114b as the additional
processing will be described.
[0713] (Step S112b)
[0714] The step S112b is the reliability calculation processing,
and is repeatedly executed for each unit block.
[0715] The filter (blur) comparator 71 of the reliability
calculator 70 illustrated in FIG. 22 performs, between the
corresponding blocks positioned at the same position on the image,
the processing of comparing the image-based blur (Et) or the
image-based filter (Etf) with the camera motion-based blur (Ec) or
the camera motion-based filter (Ecf). In a case where the degree of
similarity between the two filters (blurs) as the result of such
comparison processing is high, the value of the degree of
reliability is set high. In a case where the degree of similarity
is low, the value of the degree of reliability is set low. When the
reliability calculation processing is completed for all blocks, the
processing proceeds to the step S113.
[0716] (Step S114b)
[0717] Next, the processing of the step S114b as the processing of
another additional step of the flow illustrated in FIG. 25 will be
described.
[0718] The step S114b is processing executed in the inverse filter
corrector 63 of the blur remover 60 illustrated in FIG. 22. The
inverse filter corrector 63 adjusts, according to the degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor
62.
[0719] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0720] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
the inverse filter application effect.
[0721] (A2) The configuration in which the weighted average of the
image-based blur (Et) and the camera motion-based blur (Ec) is
utilized as the integrated blur (Eall) to change the weighted
average form according to the amount of object motion:
[0722] a sequence of the processing executed by the configuration
A2+C described earlier with reference to FIG. 24, i.e., the image
processing device configured to calculate the degree of reliability
of the filter (blur) estimation result to perform the blur removing
processing according to the degree of reliability in the
above-described configuration, will be described with reference to
the flowchart illustrated in FIG. 26.
[0723] The flowchart illustrated in FIG. 26 is a flowchart in which the
processing of the step S112b and the step S114b is added to the
processing of the steps S101 to S116 of the processing flow
executed by the image processing device corresponding to the
"configuration A2" of FIG. 11 as described earlier with reference
to FIG. 14.
[0724] Other processing than the step S112b and the step S114b as
the additional processing is similar to the processing of the steps
S101 to S116 of the flow described with reference to FIG. 14.
[0725] Moreover, the processing of the step S112b and the step
S114b as the additional processing is processing similar to the
processing of the step S112b and the step S114b as the processing
of the image processing device of the configuration A1+C as
described with reference to the flowchart of FIG. 25.
[0726] That is, at the step S112b, the degree of reliability is
calculated on the basis of the degree of similarity between the
image-based filter (blur) and the camera motion-based filter
(blur).
[0727] Further, at the step S114b, the inverse filter correction
processing of controlling the applied strength of the inverse
filter according to the degree of reliability is performed.
[0728] By these types of processing, the inverse filter application
processing according to the degree of reliability of the estimated
filter (blur) is implemented. The inverse filter application effect
can be enhanced for a block with a high degree of reliability and
kept low for a block with a low degree of reliability, so that
effective blur elimination processing can be performed according to
the degree of reliability.
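The per-block reliability calculation at step S112b can be sketched as follows. This is a hypothetical illustration, not the implementation stated in the specification: the function name, the choice of cosine similarity as the degree of similarity between the two filter (blur) kernels, and the 2D-list kernel representation are all assumptions.

```python
import math

# Hypothetical sketch of the step S112b reliability calculation: the
# degree of reliability for a block is derived from the degree of
# similarity between the image-based filter (blur) kernel and the
# camera motion-based filter (blur) kernel for the corresponding block
# at the same position on the image. Cosine similarity is an assumed
# similarity measure; the specification does not fix a particular one.
def block_reliability(image_based_kernel, motion_based_kernel):
    """Return a degree of reliability in [0, 1] for one block."""
    a = [v for row in image_based_kernel for v in row]
    b = [v for row in motion_based_kernel for v in row]
    dot = sum(x * y for x, y in zip(a, b))
    denom = (math.sqrt(sum(x * x for x in a))
             * math.sqrt(sum(y * y for y in b)))
    if denom == 0.0:
        return 0.0  # degenerate (all-zero) kernel: treat as unreliable
    # Similar kernels -> value near 1 (reliability set high);
    # dissimilar kernels -> value near 0 (reliability set low).
    return max(0.0, min(1.0, dot / denom))
```

Under these assumptions, identical kernels yield a reliability of 1.0 and non-overlapping blur directions yield 0.0, matching the rule that the reliability value is set high when the degree of similarity is high and low otherwise.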
(8. (Fourth Embodiment) Configuration and Processing of Image
Processing Device Corresponding to Configuration B+C)
[0729] Next, the configuration and processing of an image
processing device with the configuration (B+C) described with
reference to FIG. 8, i.e., an image processing device configured,
with the configuration B as a basic configuration, to further
calculate the degree of reliability of a filter (blur) estimation
result to perform blur removing processing according to the degree
of reliability, will be described as a fourth embodiment of the
image processing device of the present disclosure.
[0730] The configuration B is the configuration described earlier
with reference to FIGS. 15 to 21, the configuration B being a
configuration in which image-based blur estimation using a visible
light image and a far-infrared image and camera motion-based blur
estimation based on camera motion information are executed in
combination to execute integrated blur estimation.
[0731] The configuration B+C as the fourth embodiment described
below is a configuration in which the degree of reliability of an
image-based filter (Etf) or a camera motion-based filter (Ecf)
estimated in the configuration B is calculated to perform the blur
removing processing according to the degree of reliability.
[0732] Note that as described earlier with reference to FIGS. 15 to
21, in the processing of generating an integrated blur (Eall) in
the configuration B, any of the following two types of processing
is executed:
[0733] (B1) the processing of selecting a filter to be applied to
the far-infrared image, i.e., a filter for generating a blur
(defocusing), on the basis of a camera motion-based blur (Ec) (FIG.
15); and
[0734] (B2) the processing of correcting a correlation value
between a filter-applied far-infrared image and the blurred visible
light image on the basis of the camera motion-based blur (Ec) (FIG.
17).
[0735] Hereinafter, the configuration C, i.e., the configuration
and processing of the image processing device configured to
calculate the degree of reliability of the filter (blur) estimation
result to perform the blur removing processing according to the
degree of reliability, will be described in association with each
of these configurations B1, B2.
[0736] Configuration B1+C; and
[0737] Configuration B2+C:
[0738] In other words, these two configuration examples and the
corresponding two types of processing will be described with
reference to FIG. 27 and the subsequent figures.
[0739] First, the configuration B1+C, i.e., the configuration of
the image processing device configured to calculate the degree of
reliability of the filter (blur) estimation result to perform the
blur removing processing according to the degree of reliability in
(B1) the device configured to select the filter to be applied to
the far-infrared image, i.e., the filter for generating the blur
(defocusing), on the basis of the camera motion-based blur (Ec)
will be described with reference to FIG. 27.
[0740] The image processing device B1+C, 20-B1C illustrated in FIG.
27 is configured such that a reliability calculator 70 is added to
the configuration of the image processing device B1, 20-B1
described earlier with reference to FIG. 15 and an inverse filter
corrector 63 is further added to a blur remover 60.
[0741] Other configurations are the same as those of the image
processing device B1, 20-B1 illustrated in FIG. 15.
[0742] Processing executed by the reliability calculator 70 and the
inverse filter corrector 63 is processing substantially similar to
the processing executed by the image processing device A1+C, 20-A1C
described earlier with reference to FIGS. 22 and 23.
[0743] Note that the reliability calculator 70 of the image
processing device A1+C, 20-A1C described earlier with reference to
FIGS. 22 and 23 executes the processing of comparing the
image-based filter (blur) and the camera motion-based filter
(blur), but the reliability calculator 70 of the image processing
device B1+C, 20-B1C illustrated in FIG. 27 executes the processing
of comparing an integrated filter (blur) and the camera
motion-based filter (blur).
[0744] As illustrated in FIG. 27, the reliability calculator 70
receives the integrated filter generated by a blur estimator 30b
and a camera motion blur map output from a camera motion-based blur
estimator 40, thereby determining the degree of similarity between
these filters (blurs).
[0745] That is, the reliability calculator 70 performs, between
corresponding blocks positioned at the same position on the image,
the processing of comparing the integrated filter (blur) and the
camera motion-based filter (blur).
[0746] In a case where the degree of similarity between the two
filters (blurs) as the result of such comparison processing is
high, the value of the degree of reliability is set high.
[0747] In a case where the degree of similarity is low, the value
of the degree of reliability is set low.
[0748] A filter (blur) comparator 71 of the reliability calculator
70 outputs the calculated degree of reliability to the inverse
filter corrector 63 of the blur remover 60.
[0749] The inverse filter corrector 63 adjusts, according to the
degree of reliability input from the reliability calculator 70, the
strength of an inverse filter applied in an inverse filter
processor 62.
[0750] For example, in the case of a high degree of reliability
input from the reliability calculator 70, a coefficient set to the
inverse filter calculated by an inverse filter calculator 61 is
directly utilized without a decrease in the strength of the inverse
filter applied in the inverse filter processor 62. That is, the
inverse filter calculated by the inverse filter calculator 61 is
directly applied to a processing target block of a pre-correction
visible light image 25.
[0751] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
an inverse filter application effect.
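The strength adjustment performed by the inverse filter corrector 63 can be sketched as a blend between the computed inverse filter and an identity (pass-through) kernel, weighted by the degree of reliability. This is a hedged illustration: the blending scheme, the function name, and the 2D-list kernel layout are assumptions rather than the specification's stated coefficient adjustment.

```python
# Hypothetical sketch of the inverse filter corrector 63: scale the
# applied strength of the inverse filter by the degree of reliability.
# reliability == 1.0 applies the computed inverse filter directly;
# reliability == 0.0 suppresses correction (identity kernel, i.e., the
# pre-correction visible light image block passes through unchanged).
def adjust_inverse_filter(inverse_kernel, reliability):
    h, w = len(inverse_kernel), len(inverse_kernel[0])
    cy, cx = h // 2, w // 2  # center tap of the (odd-sized) kernel
    adjusted = []
    for y in range(h):
        row = []
        for x in range(w):
            identity = 1.0 if (y == cy and x == cx) else 0.0
            row.append(reliability * inverse_kernel[y][x]
                       + (1.0 - reliability) * identity)
        adjusted.append(row)
    return adjusted
```

With a sharpening-style inverse kernel, for example, a reliability of 0.5 moves every coefficient halfway toward the identity kernel, lowering the inverse filter application effect for a low-reliability block as described above.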
[0752] As described above, in the present embodiment, the inverse
filter application processing according to the degree of reliability
of the estimated filter (blur) is also implemented. The inverse
filter application effect can be enhanced for a block with a high
degree of reliability and kept low for a block with a low degree of
reliability, so that effective blur elimination processing can be
performed according to the degree of reliability.
[0753] Next, the configuration B2+C, i.e., the configuration of the
image processing device configured to calculate the degree of
reliability of the filter (blur) estimation result to perform the
blur removing processing according to the degree of reliability in
(B2) the device configured to correct the correlation value between
the filter-applied far-infrared image and the blurred visible light
image on the basis of the camera motion-based blur (Ec) will be
described with reference to FIG. 28.
[0754] The image processing device B2+C, 20-B2C illustrated in FIG.
28 is configured such that the reliability calculator 70 is added
to the configuration of the image processing device B2, 20-B2
described earlier with reference to FIG. 17 and the inverse filter
corrector 63 is further added to the blur remover 60.
[0755] Other configurations are the same as those of the image
processing device B2, 20-B2 illustrated in FIG. 17.
[0756] The processing executed by the reliability calculator 70 and
the inverse filter corrector 63 is processing similar to the
processing executed by the image processing device B1+C, 20-B1C
described earlier with reference to FIG. 27.
[0757] That is, the reliability calculator 70 of the image
processing device B2+C, 20-B2C illustrated in FIG. 28 performs,
between the corresponding blocks positioned at the same position on
the image, the processing of comparing the integrated filter (blur)
and the camera motion-based filter (blur).
[0758] In a case where the degree of similarity between the two
filters (blurs) as the result of such comparison processing is
high, the value of the degree of reliability is set high.
[0759] In a case where the degree of similarity is low, the value
of the degree of reliability is set low.
[0760] The filter (blur) comparator 71 of the reliability
calculator 70 illustrated in FIG. 28 outputs the calculated degree
of reliability to the inverse filter corrector 63 of the blur
remover 60.
[0761] The inverse filter corrector 63 adjusts, according to the
degree of reliability input from the reliability calculator 70, the
strength of the inverse filter applied in the inverse filter
processor 62.
[0762] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0763] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
the inverse filter application effect.
[0764] As described above, in the present embodiment, the inverse
filter application processing according to the degree of reliability
of the estimated filter (blur) is also implemented. The inverse
filter application effect can be enhanced for a block with a high
degree of reliability and kept low for a block with a low degree of
reliability, so that effective blur elimination processing can be
performed according to the degree of reliability.
[0765] Next, a sequence of the processing in the fourth embodiment
described with reference to FIGS. 27 to 28, i.e., the image
processing device with the configuration (B1+C) and the
configuration (B2+C), will be described with reference to
flowcharts illustrated in FIG. 29 and a subsequent figure.
[0766] The flowchart illustrated in FIG. 29 is a flowchart for
describing a sequence of the processing executed by the
configuration B1+C described earlier with reference to FIG. 27,
i.e., the image processing device configured to calculate the
degree of reliability of the filter (blur) estimation result to
perform the blur removing processing according to the degree of
reliability in (B1) the device configured to select the filter to
be applied to the far-infrared image, i.e., the filter for
generating the blur (defocusing), on the basis of the camera
motion-based blur (Ec).
[0767] The flowchart illustrated in FIG. 29 is obtained by adding the
processing of steps S112b and S114b to the processing of the steps
S101 to S116 of the flow executed by the image processing device
corresponding to the "configuration B1" of FIG. 15, described earlier
with reference to FIG. 19.
[0768] The processing other than the added steps S112b and S114b is
similar to the processing of the steps S101 to S116 of the flow
described with reference to FIG. 19.
[0769] Moreover, the added processing of steps S112b and S114b is
similar to the processing of the steps S112b and S114b executed by
the image processing device of the configuration A1+C, described with
reference to the flowchart of FIG. 25.
[0770] Note that at the step S112b executed by the image processing
device with the configuration A1+C, described with reference to the
flowchart of FIG. 25, the data compared upon calculation of the
degree of reliability are the image-based filter (blur) and the
camera motion-based filter (blur), whereas in the configuration B1+C,
the data compared at the step S112b are the integrated filter (blur)
and the camera motion-based filter (blur).
[0771] The only difference is the point described above.
[0772] At the step S112b in the flow illustrated in FIG. 29, the
processing of comparing the integrated filter (blur) and the camera
motion-based filter (blur) is performed between the corresponding
blocks at the same position on the image. In a case where the
degree of similarity between the two filters (blurs) as the result
of such comparison processing is high, the value of the degree of
reliability is set high. In a case where the degree of similarity
is low, the value of the degree of reliability is set low.
[0773] The filter (blur) comparator 71 of the reliability
calculator 70 outputs the calculated degree of reliability to the
inverse filter corrector 63 of the blur remover 60. The inverse
filter corrector 63 adjusts, according to the degree of reliability
input from the reliability calculator 70, the strength of the
inverse filter applied in the inverse filter processor 62.
[0774] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0775] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
the inverse filter application effect.
[0776] As described above, in the present embodiment, the inverse
filter application processing according to the degree of reliability
of the estimated filter (blur) is also implemented. The inverse
filter application effect can be enhanced for a block with a high
degree of reliability and kept low for a block with a low degree of
reliability, so that effective blur elimination processing can be
performed according to the degree of reliability.
[0777] Next, a sequence of the processing executed by the
configuration B2+C described with reference to FIG. 28, i.e., the
image processing device configured to calculate the degree of
reliability of the filter (blur) estimation result to perform the
blur removing processing according to the degree of reliability in
(B2) the device configured to correct the correlation value between
the filter-applied far-infrared image and the blurred visible light
image on the basis of the camera motion-based blur (Ec), will be
described with reference to the flowchart illustrated in FIG.
30.
[0778] The flowchart illustrated in FIG. 30 is obtained by adding the
processing of steps S112b and S114b to the processing of the steps
S101 to S116 of the flow executed by the image processing device
corresponding to the "configuration B2" of FIG. 17, described earlier
with reference to FIG. 20.
[0779] The processing other than the added steps S112b and S114b is
similar to the processing of the steps S101 to S116 of the flow
described with reference to FIG. 20.
[0780] Moreover, the added processing of steps S112b and S114b is
similar to the processing of the steps S112b and S114b executed by
the image processing device of the configuration B1+C, described
above with reference to FIG. 29.
[0781] At the step S112b in the flow illustrated in FIG. 30, the
processing of comparing the integrated filter (blur) and the camera
motion-based filter (blur) is performed between the corresponding
blocks at the same position on the image. In a case where the
degree of similarity between the two filters (blurs) as the result
of such comparison processing is high, the value of the degree of
reliability is set high. In a case where the degree of similarity
is low, the value of the degree of reliability is set low.
[0782] The filter (blur) comparator 71 of the reliability
calculator 70 outputs the calculated degree of reliability to the
inverse filter corrector 63 of the blur remover 60. The inverse
filter corrector 63 adjusts, according to the degree of reliability
input from the reliability calculator 70, the strength of the
inverse filter applied in the inverse filter processor 62.
[0783] For example, in the case of a high degree of reliability
input from the reliability calculator 70, the coefficient set to
the inverse filter calculated by the inverse filter calculator 61
is directly utilized without a decrease in the strength of the
inverse filter applied in the inverse filter processor 62. That is,
the inverse filter calculated by the inverse filter calculator 61
is directly applied to the processing target block of the
pre-correction visible light image 25.
[0784] On the other hand, in the case of a low degree of
reliability input from the reliability calculator 70, the strength
of the inverse filter applied in the inverse filter processor 62 is
decreased. That is, the coefficient set to the inverse filter
calculated by the inverse filter calculator 61 is adjusted to lower
the inverse filter application effect.
[0785] As described above, in the present embodiment, the inverse
filter application processing according to the degree of reliability
of the estimated filter (blur) is also implemented. The inverse
filter application effect can be enhanced for a block with a high
degree of reliability and kept low for a block with a low degree of
reliability, so that effective blur elimination processing can be
performed according to the degree of reliability.
[0786] (9. Hardware Configuration Example of Image Processing
Device)
[0787] Next, a hardware configuration example of the image
processing device will be described with reference to FIG. 31. FIG.
31 is a diagram of the hardware configuration example of the image
processing device configured to execute the processing of the
present disclosure.
[0788] A central processing unit (CPU) 81 functions as a controller
or a data processor configured to execute various types of
processing according to a program stored in a read only memory
(ROM) 82 or a storage unit 88. For example, the CPU 81 executes the
processing according to the sequence described in the
above-described embodiments.
[0789] A random access memory (RAM) 83 stores, e.g., the program to
be executed by the CPU 81 and associated data. The CPU 81, the ROM
82, and the RAM 83 are connected together via a bus 84.
[0790] The CPU 81 is connected to an input/output interface 85 via
the bus 84. An input unit 86, which receives an image captured by an
imager 95 including a visible light camera, an infrared
(far-infrared) camera, and the like, and which also includes various
user-operable switches, a keyboard, a mouse, a microphone, and the
like, and an output unit 87 configured to output data to a display
96, a speaker, and the like are connected to the input/output
interface 85. The CPU 81 executes various types of processing in
response to an instruction input from the input unit 86, and outputs
a processing result to, e.g., the output unit 87.
[0791] The storage unit 88 connected to the input/output interface
85 includes, for example, a hard drive and the like, and stores the
program to be executed by the CPU 81 and various types of data. A
communication unit 89 functions as a transmitter/receiver for data
communication via Wi-Fi communication, Bluetooth (registered
trademark) (BT) communication, or other networks such as the
Internet and a local area network, thereby communicating with an
external device.
[0792] A drive 90 connected to the input/output interface 85 drives
a removable medium 91 such as a magnetic disc, an optical disc, a
magneto-optical disc, or a semiconductor memory such as a memory
card, thereby executing data recording or reading.
[0793] (10. Configuration Example of Vehicle Control System
Including Image Processing Device of Present Disclosure in
Vehicle)
[0794] Next, one configuration example of a vehicle control system
including, in a vehicle, the above-described image processing
device of the present disclosure will be described.
[0795] FIG. 32 is a schematic block diagram of a functional
configuration example of a vehicle control system 100 including the
image processing device configured to execute the above-described
processing.
[0796] Note that the above-described image processing device of the
present disclosure corresponds to part of configurations of a
detector 131, a data acquirer 102, an output controller 105, and an
output unit 106 of the vehicle control system 100 illustrated in
FIG. 32.
[0797] The processing executed by the above-described image
processing device of the present disclosure is mainly executed by
an outer-vehicle information detector 141 of the detector 131 of
the vehicle control system 100 illustrated in FIG. 32.
[0798] The data acquirer 102 of the vehicle control system 100
illustrated in FIG. 32 includes a visible light camera, an infrared
(far-infrared) camera, and a sensor such as an IMU, and the
detector 131 receives images captured by these cameras and vehicle
motion information (=the camera motion information) to execute the
above-described processing. Note that a processing result is, for
example, displayed on a display forming the output unit 106 of the
vehicle control system 100 illustrated in FIG. 32, and is checked
by a user (a driver).
[0799] Hereinafter, the configuration of the vehicle control system
100 illustrated in FIG. 32 will be described.
[0800] Note that in a case where the vehicle provided with the
vehicle control system 100 is distinguished from other vehicles,
such a vehicle will be hereinafter described as a subject car or a
subject vehicle.
[0801] The vehicle control system 100 includes an input unit 101,
the data acquirer 102, a communication unit 103, in-vehicle
equipment 104, the output controller 105, the output unit 106, a
drive system controller 107, a drive system 108, a body system
controller 109, a body system 110, a storage unit 111, and an
automatic driving controller 112. The input unit 101, the data
acquirer 102, the communication unit 103, the output controller
105, the drive system controller 107, the body system controller
109, the storage unit 111, and the automatic driving controller 112
are connected together via a communication network 121. The
communication network 121 includes, for example, an in-vehicle
communication network, a bus, or the like in accordance with an
optional standard such as a controller area network (CAN), a local
interconnect network (LIN), a local area network (LAN), or FlexRay
(registered trademark). Note that each unit of the vehicle control
system 100 may be directly connected without the communication
network 121.
[0802] Note that in a case where each unit of the vehicle control
system 100 performs communication via the communication network 121,
description of the communication network 121 will be omitted
hereinafter. For example, in a case where the input unit 101 and the
automatic driving controller 112 communicate with each other via the
communication network 121, this is merely described as the input
unit 101 and the automatic driving controller 112 communicating with
each other.
[0803] The input unit 101 includes a device used for inputting
various types of data, instructions, and the like by a passenger.
For example, the input unit 101 includes an operation device such
as a touch panel, a button, a microphone, a switch, and a lever, and
an operation device allowing input by methods other than manual
operation, such as voice or gesture. Moreover, the
input unit 101 may be, for example, a remote control device
utilizing infrared light or other radio waves, or external
connection equipment such as mobile equipment or wearable equipment
compatible with operation of the vehicle control system 100. The
input unit 101 generates an input signal on the basis of, e.g., the
data or instruction input by the passenger, and supplies the input
signal to each unit of the vehicle control system 100.
[0804] The data acquirer 102 includes, for example, various sensors
configured to acquire data used for the processing of the vehicle
control system 100, and supplies the acquired data to each unit of
the vehicle control system 100.
[0805] For example, the data acquirer 102 includes various sensors
for detecting the state of the subject car and the like.
Specifically, the data acquirer 102 includes, for example, a gyro
sensor, an acceleration sensor, an inertial measurement unit (IMU),
and a sensor for detecting an accelerator pedal operation amount, a
brake pedal operation amount, a steering wheel steering angle, the
number of rotations of an engine, the number of rotations of a
motor, wheel rotation speed, and the like.
[0806] Moreover, the data acquirer 102 includes, for example,
various sensors for detecting information regarding the outside of
the subject car. Specifically, the data acquirer 102 includes, for
example, an imaging device such as a time-of-flight (ToF) camera, a
visible light camera, a stereo camera, a monocular camera, an
infrared (far-infrared) camera, and other cameras. Further, the
data acquirer 102 includes, for example, an environment sensor for
detecting, e.g., weather or meteorological phenomena, and a
peripheral information detection sensor for detecting an object
around the subject car. The environment sensor includes, for
example, a raindrop sensor, a fog sensor, a solar irradiation
sensor, a snow sensor, and the like. The peripheral information
detection sensor includes, for example, an ultrasonic sensor, a
radar, a light detection and ranging/laser imaging detection and
ranging (LiDAR) sensor, a sonar, and the like.
[0807] In addition, the data acquirer 102 includes, for example,
various sensors for detecting the current position of the subject
car. Specifically, the data acquirer 102 includes, for example, a
global navigation satellite system (GNSS) receiver configured to
receive a GNSS signal from a GNSS satellite and the like.
[0808] Moreover, the data acquirer 102 includes, for example,
various sensors for detecting in-vehicle information. Specifically,
the data acquirer 102 includes, for example, an imaging device
configured to image the driver, a biological sensor configured to
detect the driver's biological information, a microphone configured
to collect audio in the vehicle interior, and the like. The biological
sensor is, for example, provided at a seating surface, a steering
wheel, or the like, thereby detecting the biological information of
the passenger seated on a seat or the driver holding the steering
wheel.
[0809] The communication unit 103 communicates with, e.g., the
in-vehicle equipment 104 and various types of equipment, servers,
and base stations outside the vehicle, thereby transmitting the
data supplied from each unit of the vehicle control system 100 or
supplying the received data to each unit of the vehicle control
system 100. Note that a communication protocol supported by the
communication unit 103 is not specifically limited, and the
communication unit 103 can support multiple types of communication
protocols.
[0810] For example, the communication unit 103 performs wireless
communication with the in-vehicle equipment 104 via a wireless LAN,
the Bluetooth (registered trademark), near field communication
(NFC), a wireless USB (WUSB), or the like. Moreover, the
communication unit 103 performs, for example, wired communication
with the in-vehicle equipment 104 via a not-shown connection
terminal (and a cable as necessary) by a universal serial bus
(USB), a high-definition multimedia interface (HDMI) (registered
trademark), a mobile high-definition link (MHL), or the like.
[0811] Further, the communication unit 103 communicates, for
example, with equipment (e.g., an application server or a control
server) present on an external network (e.g., the Internet, a cloud
network, or a network unique to a business operator) via a base
station or an access point. Moreover, the communication unit 103
uses, for example, a peer-to-peer (P2P) technology to communicate
with a terminal (e.g., a terminal of a pedestrian or a store or a
machine type communication (MTC) terminal) present near the subject
car. Further, the communication unit 103 performs, for example, V2X
communication such as vehicle-to-vehicle communication,
vehicle-to-infrastructure communication, vehicle-to-home
communication, or vehicle-to-pedestrian communication. Moreover,
the communication unit 103 includes, for example, a beacon receiver
to receive a radio wave or an electromagnetic wave transmitted
from, e.g., a wireless station placed on a road, thereby acquiring
information such as a current position, traffic jam, traffic
regulation, or required time.
[0812] The in-vehicle equipment 104 includes, for example, mobile
equipment or wearable equipment of the passenger, information
equipment installed in or attached to the subject car, a navigation
device configured to search for a path to an optional destination,
and the like.
[0813] The output controller 105 controls output of various types
of information to the passenger of the subject car or the outside
of the subject car. For example, the output controller 105
generates an output signal containing at least one of visual
information (e.g., image data) or audio information (e.g., audio
data), and supplies the output signal to the output unit 106. In
this manner, the output controller 105 controls output of the
visual information and the audio information from the output unit
106. Specifically, the output controller 105 synthesizes, for
example, image data captured by different imaging devices of the
data acquirer 102 to generate, e.g., a bird's-eye image or a
panoramic image, thereby supplying the output signal containing the
generated image to the output unit 106. Moreover, the output
controller 105 generates, for example, audio data containing, e.g.,
a warning tone or a warning message against a risk such as
collision, contact, or entrance into a danger area, thereby
supplying the output signal containing the generated audio data to
the output unit 106.
[0814] The output unit 106 includes a device configured to output
the visual information or the audio information to the passenger of
the subject car or the outside of the subject car. For example, the
output unit 106 includes a display device, an instrument panel, an
audio speaker, headphones, a wearable device such as a spectacle
display attached to the passenger, a projector, and a lamp. The
display device provided at the output unit 106 may
be not only a device having a typical display, but also a device
configured to display the visual information in the field of view
of the driver, such as a head-up display, a transmissive display,
and a device with an augmented reality (AR) display function, for
example.
[0815] The drive system controller 107 generates various control
signals and supplies these signals to the drive system 108, thereby
controlling the drive system 108. Moreover, the drive system
controller 107 supplies, as necessary, the control signal to each
unit other than the drive system 108, thereby notifying such a unit
of a control state of the drive system 108, for example.
[0816] The drive system 108 includes various devices in connection
with a drive system of the subject car. For example, the drive
system 108 includes a drive force generation device for generating
drive force of an internal-combustion engine, a drive motor, or the
like, a drive force transmission mechanism for transmitting drive
force to a wheel, a steering mechanism configured to adjust a
rudder angle, a braking device configured to generate braking
force, an antilock brake system (ABS), an electronic stability
control (ESC), an electrically-assisted power steering device, and
the like.
[0817] The body system controller 109 generates various control
signals and supplies these signals to the body system 110, thereby
controlling the body system 110. Moreover, the body system
controller 109 supplies, as necessary, the control signal to each
unit other than the body system 110, thereby notifying a control state
of the body system 110, for example.
[0818] The body system 110 includes various devices of a body
system installed on a vehicle body. For example, the body system
110 includes a keyless entry system, a smart key system, a power
window device, a power seat, a steering wheel, an air-conditioning
device, various lamps (e.g., a head lamp, a back lamp, a brake
lamp, an indicator, a fog lamp, and the like), and the like.
[0819] For example, the storage unit 111 includes a magnetic
storage device, a semiconductor storage device, an optical storage
device, and a magneto-optical storage device, such as a read only
memory (ROM), a random access memory (RAM), a hard disk drive
(HDD), and the like. The storage unit 111 stores, e.g., various
programs or data used by each unit of the vehicle control system
100. For example, the storage unit 111 stores map data including,
e.g., a three-dimensional high-accuracy map such as a dynamic map,
a global map having lower accuracy than the high-accuracy map and
covering a broad area, and a local map containing information
around the subject car.
[0820] The automatic driving controller 112 performs control
regarding automatic driving such as autonomous running or drive
assist. Specifically, the automatic driving controller 112
performs, for example, cooperative control for the purpose of
implementing an advanced driver assistance system (ADAS) function
including, e.g., collision avoidance or impact attenuation of the
subject car, follow-up running based on an inter-vehicular
distance, vehicle speed maintaining running, subject car collision
warning, or subject car lane deviation warning. Moreover, the
automatic driving controller 112 performs, for example, cooperative
control for the purpose of, e.g., automatic driving for autonomous
running regardless of operation of the driver. The automatic
driving controller 112 includes the detector 131, a self-location
estimator 132, a situation analyzer 133, a planner 134, and an
operation controller 135.
[0821] The detector 131 detects various types of information
necessary for control of automatic driving. The detector 131
includes the outer-vehicle information detector 141, an in-vehicle
information detector 142, and a vehicle state detector 143.
[0822] The outer-vehicle information detector 141 performs the
processing of detecting information outside the subject car on the
basis of the data or signal from each unit of the vehicle control
system 100. For example, the outer-vehicle information detector 141
performs the processing of detecting, recognizing, and tracking the
object around the subject car and the processing of detecting a
distance to the object. The object as a detection target includes,
for example, a vehicle, a person, an obstacle, a structure, a road,
a traffic light, a traffic sign, a road indication, and the like.
Moreover, the outer-vehicle information detector 141 performs, for
example, the processing of detecting environment around the subject
car. The surrounding environment as the detection target includes,
for example, weather, an air temperature, a humidity, brightness, a
road condition, and the like. The outer-vehicle information
detector 141 supplies data indicating a detection processing result
to the self-location estimator 132, a map analyzer 151, a traffic
rule recognizer 152, and a situation recognizer 153 of the
situation analyzer 133, and an emergency situation avoider 171 of
the operation controller 135, for example.
[0823] The in-vehicle information detector 142 performs the
processing of detecting information in the vehicle on the basis of
the data or signal from each unit of the vehicle control system
100. For example, the in-vehicle information detector 142 performs
the processing of authenticating and recognizing the driver, the
processing of detecting the state of the driver, the processing of
detecting the passenger, the processing of detecting environment in
the vehicle, and the like. The state of the driver as the detection
target includes, for example, a physical condition, the degree of
consciousness, the degree of concentration, the degree of fatigue,
the line of sight, and the like. The in-vehicle environment as the
detection target includes, for example, an air temperature, a
humidity, brightness, smell, and the like. The in-vehicle
information detector 142 supplies data indicating a detection
processing result to the situation recognizer 153 of the situation
analyzer 133 and the emergency situation avoider 171 of the
operation controller 135, for example.
[0824] The vehicle state detector 143 performs the processing of
detecting the state of the subject car on the basis of the data or
signal from each unit of the vehicle control system 100. The state
of the subject car as the detection target includes, for example, a
speed, an acceleration, a rudder angle, the presence/absence and
contents of an abnormality, a driving operation state, the position
and inclination of the power seat, a door locking state, other
in-vehicle equipment states, and the like. The vehicle state
detector 143 supplies data indicating a detection processing result
to the situation recognizer 153 of the situation analyzer 133 and
the emergency situation avoider 171 of the operation controller
135, for example.
[0825] The self-location estimator 132 performs the processing of
estimating the position, posture, and the like of the subject car
on the basis of the data or signal from each unit of the vehicle
control system 100, such as the outer-vehicle information detector
141 and the situation recognizer 153 of the situation analyzer 133.
Moreover, the self-location estimator 132 generates, as necessary,
a local map (hereinafter referred to as a self-location estimation
map) used for estimation of the self-location. The self-location
estimation map is, for example, a high-accuracy map using a
technology such as simultaneous localization and mapping (SLAM).
The self-location estimator 132 supplies data indicating an
estimation processing result to the map analyzer 151, the traffic
rule recognizer 152, and the situation recognizer 153 of the
situation analyzer 133, for example. Moreover, the self-location
estimator 132 stores the self-location estimation map in the
storage unit 111.
[0826] The situation analyzer 133 performs the processing of
analyzing a subject car situation or a surrounding situation. The
situation analyzer 133 includes the map analyzer 151, the traffic
rule recognizer 152, the situation recognizer 153, and a situation
predictor 154.
[0827] As necessary, the map analyzer 151 uses the data or signal
from each unit of the vehicle control system 100, such as the
self-location estimator 132 and the outer-vehicle information
detector 141, to perform the processing of analyzing various maps
stored in the storage unit 111, thereby building a map containing
information necessary for automatic driving processing. The map
analyzer 151 supplies the built map to the traffic rule recognizer
152, the situation recognizer 153, the situation predictor 154, and
a route planner 161, an action planner 162, and an operation planner
163 of the planner 134, for example.
[0828] The traffic rule recognizer 152 performs the processing of
recognizing a traffic rule around the subject car on the basis of
the data or signal from each unit of the vehicle control system
100, such as the self-location estimator 132, the outer-vehicle
information detector 141, and the map analyzer 151. By such
recognition processing, the position and state of the traffic light
around the subject car, the contents of the traffic regulation
around the subject car, a drivable lane, and the like are
recognized, for example. The traffic rule recognizer 152 supplies
data indicating a recognition processing result to the situation
predictor 154, for example.
[0829] The situation recognizer 153 performs the processing of
recognizing a situation regarding the subject car on the basis of
the data or signal from each unit of the vehicle control system
100, such as the self-location estimator 132, the outer-vehicle
information detector 141, the in-vehicle information detector 142,
the vehicle state detector 143, and the map analyzer 151. For
example, the situation recognizer 153 performs the processing of
recognizing, e.g., a subject car situation, a subject car
surrounding situation, and a subject car driver situation.
Moreover, the situation recognizer 153 generates, as necessary, a
local map (hereinafter referred to as a "situation recognition
map") used for recognition of the subject car surrounding
situation. The situation recognition map is, for example, an
occupancy grid map.
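As a minimal illustration of the occupancy grid map mentioned above (the class name, log-odds update rule, and cell resolution are assumptions for the sketch, not part of the disclosure), each cell can hold a belief that the corresponding area around the subject car is occupied:

```python
import numpy as np

class OccupancyGrid:
    """Minimal occupancy-grid sketch: each cell stores log-odds of
    being occupied; observations push the belief up or down."""

    def __init__(self, size, resolution):
        self.resolution = resolution
        self.logodds = np.zeros((size, size))  # 0.0 == unknown (p = 0.5)

    def update(self, x, y, occupied, weight=0.4):
        # Convert metric coordinates to a cell index and apply the
        # (assumed) additive log-odds update.
        i, j = int(x / self.resolution), int(y / self.resolution)
        self.logodds[i, j] += weight if occupied else -weight

    def probability(self, x, y):
        # Logistic transform back to an occupancy probability.
        i, j = int(x / self.resolution), int(y / self.resolution)
        return 1.0 / (1.0 + np.exp(-self.logodds[i, j]))
```

Repeated "occupied" observations of the same cell raise its probability above 0.5, while "free" observations lower it, which is the behavior the situation recognizer relies on when accumulating detections over time.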
[0830] The subject car situation as a recognition target includes,
for example, the position, posture, and motion (e.g., a speed, an
acceleration, a movement direction, and the like) of the subject
car and the presence/absence and contents of an abnormality, for
example. The subject car surrounding situation as the recognition
target includes, for example, the type and position of a
surrounding stationary object, the type, position, and motion
(e.g., a speed, an acceleration, a movement direction, and the
like) of a surrounding animal body, a surrounding road
configuration, a surrounding road surface condition, surrounding
weather, a surrounding air temperature, a surrounding humidity,
surrounding brightness, and the like. The state of the driver as the
recognition target includes, for example, a physical condition, the
degree of consciousness, the degree of concentration, the degree of
fatigue, eye movement, and driving operation.
[0831] The situation recognizer 153 supplies data (as necessary,
containing the situation recognition map) indicating a recognition
processing result to the self-location estimator 132 and the
situation predictor 154, for example. Moreover, the situation
recognizer 153 stores the situation recognition map in the storage
unit 111.
[0832] The situation predictor 154 performs the processing of
predicting the situation regarding the subject car on the basis of
the data or signal from each unit of the vehicle control system
100, such as the map analyzer 151, the traffic rule recognizer 152,
and the situation recognizer 153. For example, the situation
predictor 154 performs the processing of predicting the subject car
situation, the subject car surrounding situation, the driver
situation, and the like.
[0833] The subject car situation as a prediction target includes,
for example, subject car behavior, occurrence of an abnormality, a
drivable distance, and the like. The subject car surrounding
situation as the prediction target includes, for example, behavior
of an animal body around the subject car, a change in the state of
the traffic light, a change in environment such as weather, and the
like. The driver situation as the prediction target includes, for
example, behavior, a physical condition, and the like of the
driver.
[0834] The situation predictor 154 supplies, together with data
from the traffic rule recognizer 152 and the situation recognizer
153, data indicating a prediction processing result to the route
planner 161, the action planner 162, and the operation planner 163
of the planner 134, for example.
[0835] The route planner 161 plans the route to the destination on
the basis of the data or signal from each unit of the vehicle
control system 100, such as the map analyzer 151 and the situation
predictor 154. For example, the route planner 161 sets a route to a
destination specified from a current position on the basis of the
global map. Moreover, for example, the route planner 161 changes
the route as necessary on the basis of a situation such as a traffic
jam, an accident, traffic regulation, or construction, the
physical condition of the driver, and the like. The route planner
161 supplies data indicating the planned route to the action
planner 162, for example.
[0836] On the basis of the data or signal from each unit of the
vehicle control system 100, such as the map analyzer 151 and the
situation predictor 154, the action planner 162 plans subject car
action for safe running within planned time along the route planned
by the route planner 161. For example, the action planner 162
performs planning for starting, stopping, a travelling direction
(e.g., advancing, retreating, left turn, right turn, a direction
change, and the like), a running lane, a running speed, and
overtaking. The action planner 162 supplies data indicating the
planned subject car action to the operation planner 163, for
example.
[0837] The operation planner 163 plans subject car operation for
implementing the action planned by the action planner 162 on the
basis of the data or signal from each unit of the vehicle control
system 100, such as the map analyzer 151 and the situation
predictor 154. For example, the operation planner 163 performs
planning for acceleration, deceleration, a running path, and the
like. The operation planner 163 supplies data indicating the
planned subject car operation to an acceleration/deceleration
controller 172 and a direction controller 173 of the operation
controller 135, for example.
[0838] The operation controller 135 performs control of the subject
car operation. The operation controller 135 includes the emergency
situation avoider 171, the acceleration/deceleration controller
172, and the direction controller 173.
[0839] On the basis of detection results of the outer-vehicle
information detector 141, the in-vehicle information detector 142,
and the vehicle state detector 143, the emergency situation avoider
171 performs the processing of detecting an emergency situation
such as collision, contact, entrance into a danger area, a driver
abnormality, and a vehicle abnormality. In the case of detecting
occurrence of the emergency situation, the emergency situation
avoider 171 plans the subject car operation for avoiding the
emergency situation, such as sudden stop and sharp turn.
[0840] The emergency situation avoider 171 supplies data indicating
the planned subject car operation to the acceleration/deceleration
controller 172 and the direction controller 173, for example.
[0841] The acceleration/deceleration controller 172 performs
acceleration/deceleration control for implementing the subject car
operation planned by the operation planner 163 or the emergency
situation avoider 171. For example, the acceleration/deceleration
controller 172 computes a control target value of the drive force
generation device or the braking device for implementing planned
acceleration, deceleration, or sudden stop, thereby supplying a
control instruction indicating the computed control target value to
the drive system controller 107.
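As a toy illustration of such a control target computation (the simple F = m·a model and all parameter names are assumptions, not the disclosed method), the braking force needed for a planned deceleration can be computed as:

```python
def braking_force_target(mass_kg, planned_decel_mps2):
    """Sketch of a control target value for the braking device:
    Newton's second law gives the force (in newtons) required to
    achieve the planned deceleration for a vehicle of the given mass.
    A real controller would also account for grade, drag, and
    actuator limits."""
    return mass_kg * planned_decel_mps2
```

For example, a 1500 kg vehicle decelerating at 3 m/s² requires roughly 4500 N of braking force, which would be encoded in the control instruction to the drive system controller 107.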
[0842] The direction controller 173 performs direction control for
implementing the subject car operation planned by the operation
planner 163 or the emergency situation avoider 171. For example,
the direction controller 173 computes a control target value of the
steering mechanism for implementing the running path or sharp turn
planned by the operation planner 163 or the emergency situation
avoider 171, thereby supplying a control instruction indicating the
computer control target value to the drive system controller
107.
[0843] (11. Summary of Configuration of Present Disclosure)
[0844] The embodiments of the present disclosure have been
described above in detail with reference to specific examples.
However, it is obvious that modification or substitution can be
made to the embodiments by those skilled in the art without
departing from the gist of the present disclosure. That is, the
present disclosure has been disclosed in the form of examples, and
shall not be interpreted in a limited way. For determining the gist
of the present disclosure, the claims need to be referred to.
[0845] Note that the technology disclosed in the present
specification can employ the following configurations.
[0846] (1)
[0847] An image processing device, comprising: [0848] image
processing circuitry configured to: [0849] receive input of a
visible-ray image and an infrared-ray image obtained by
photographing a same subject; [0850] estimate, based on the
visible-ray image, the infrared-ray image and motion information, a
blur estimate associated with the visible-ray image; and [0851]
generate, based on the estimated blur estimate, a corrected
visible-ray image.
[0852] (2)
[0853] The image processing device of (1), wherein the infrared-ray
image is a far-infrared-ray image.
[0854] (3)
[0855] The image processing device of (1), wherein estimating the
blur estimate associated with the visible-ray image comprises:
[0856] estimating, based on the visible-ray image and the
infrared-ray image, an image-based blur estimate; [0857]
estimating, based on the motion information, a motion-based blur
estimate; and [0858] estimating the blur estimate associated with
the visible-ray image based on the image-based blur estimate and
the motion-based blur estimate.
[0859] (4)
[0860] The image processing device of (3), wherein estimating the
image-based blur estimate comprises: [0861] applying each of a
plurality of filters having different blurring characteristics to
the infrared-ray image to produce a plurality of blurred
infrared-ray images; [0862] comparing the visible-ray image to the
plurality of blurred infrared-ray images; and [0863] selecting a
filter from among the plurality of filters that produced a blurred
infrared-ray image having blurring most similar to the visible-ray
image.
[0864] (5)
[0865] The image processing device of (4), wherein the different
blurring characteristics correspond to different point-spread
functions.
[0866] (6)
[0867] The image processing device of (4), wherein comparing the
visible-ray image to the plurality of blurred infrared-ray images
comprises: [0868] calculating correlation values between the
visible-ray image and each of the plurality of blurred infrared-ray
images, and [0869] wherein selecting a filter from among the
plurality of filters comprises selecting the filter that produced
the blurred infrared-ray image having a highest correlation value
from among the calculated correlation values.
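The filter selection of configurations (4) through (6) can be sketched as follows. This is a non-authoritative illustration: the box-blur PSF family, tap sizes, and use of Pearson correlation are assumptions; the disclosure only requires a plurality of filters with different blurring characteristics and a similarity comparison.

```python
import numpy as np

def box_blur(img, k):
    """Blur with a separable k-tap box PSF (one assumed family of
    blurring characteristics; any set of point-spread functions
    could be substituted)."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def select_filter(visible, infrared, taps=(1, 3, 5, 7)):
    """Apply each candidate filter to the infrared-ray image, correlate
    the result with the visible-ray image, and keep the filter whose
    blurred infrared-ray image is most similar to the visible-ray
    image (highest correlation value)."""
    best_k, best_corr = None, -np.inf
    for k in taps:
        blurred = box_blur(infrared.astype(float), k)
        # Pearson correlation between the blurred IR and visible images
        corr = np.corrcoef(blurred.ravel(), visible.ravel())[0, 1]
        if corr > best_corr:
            best_k, best_corr = k, corr
    return best_k, best_corr
```

If the visible-ray image really is a blurred version of the scene the infrared-ray image captured sharply, the winning filter's characteristic approximates the blur on the visible-ray image, which is what later stages invert.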
[0870] (7)
[0871] The image processing device of (3), wherein estimating the
motion-based blur estimate comprises: [0872] determining, based on
the motion information, a direction and magnitude of blur in the
visible-ray image.
[0873] (8)
[0874] The image processing device of (7), wherein estimating the
motion-based blur estimate further comprises:
[0875] specifying, within the visible-ray image, a plurality of
image blocks, each of which corresponds to a portion of the
visible-ray image, and
[0876] wherein determining the direction and magnitude of blur in
the visible-ray image comprises determining the direction and
magnitude of blur in the visible-ray image for each image block of
the plurality of image blocks.
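The per-block, motion-based estimation of configurations (7) and (8) can be illustrated with a deliberately simplified motion model. Here pure roll about the optical axis is assumed (a real system would use full camera motion information): each block's blur is tangential to the rotation, with magnitude equal to the arc length swept during the exposure.

```python
import math

def motion_blur_per_block(width, height, block, roll_rate, exposure):
    """For each image block, estimate the direction and magnitude of
    blur induced by camera roll (rad/s) over the exposure time (s).
    Blocks farther from the image center sweep longer arcs and thus
    receive larger blur magnitudes. All names are assumptions for
    this sketch."""
    cx, cy = width / 2.0, height / 2.0
    blurs = {}
    for by in range(0, height, block):
        for bx in range(0, width, block):
            # Offset of the block center from the rotation center
            px, py = bx + block / 2.0 - cx, by + block / 2.0 - cy
            r = math.hypot(px, py)
            direction = math.atan2(px, -py) if r else 0.0  # tangent to the rotation
            magnitude = abs(roll_rate) * exposure * r      # pixels swept
            blurs[(bx, by)] = (direction, magnitude)
    return blurs
```

The blockwise structure matters because motion-induced blur is generally not uniform across the frame; under roll, corner blocks blur far more than central ones.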
[0877] (9)
[0878] The image processing device of (3), wherein estimating the
blur estimate associated with the visible-ray image based on the
image-based blur estimate and the motion-based blur estimate
comprises selecting as the blur estimate associated with the
visible-ray image the image-based blur estimate or the motion-based
blur estimate.
[0879] (10)
[0880] The image processing device of (9), wherein selecting as the
blur estimate associated with the visible-ray image the image-based
blur estimate or the motion-based blur estimate comprises: [0881]
determining an amount of object motion for an object in the
visible-ray image; [0882] selecting the image-based blur estimate
as the blur estimate associated with the visible-ray image when the
determined amount of object motion is greater than a threshold
value; and [0883] selecting the motion-based blur estimate as the
blur estimate associated with the visible-ray image when the
determined amount of object motion is less than the threshold
value.
[0884] (11)
[0885] The image processing device of (10), wherein determining the
amount of object motion comprises determining the amount of object
motion based on environment information, wherein the environment
information includes one or more of map information, time
information, and traffic information.
[0886] (12)
[0887] The image processing device of (10), wherein determining the
amount of object motion comprises determining the amount of object
motion for one or more portions of the visible-ray image.
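The thresholded selection of configurations (10) and (12) can be sketched per region as below (the region keys, estimate representation, and dictionary interface are assumptions for the sketch). The rationale: where an object moves substantially, camera motion alone no longer explains the blur, so the image-based estimate is preferred; elsewhere the motion-based estimate is used.

```python
def select_blur_estimates(image_based, motion_based, object_motion, threshold):
    """For each region of the visible-ray image, pick the image-based
    blur estimate when the determined amount of object motion exceeds
    the threshold, and the motion-based blur estimate otherwise."""
    return {
        region: image_based[region] if amount > threshold else motion_based[region]
        for region, amount in object_motion.items()
    }
```

Per configuration (11), the object-motion amounts themselves could be informed by environment information such as map, time, or traffic data rather than image analysis alone.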
[0888] (13)
[0889] The image processing device of (3), wherein estimating the
blur estimate associated with the visible-ray image based on the
image-based blur estimate and the motion-based blur estimate
comprises combining the image-based blur estimate and the
motion-based blur estimate.
[0890] (14)
[0891] The image processing device of (1), wherein estimating the
blur estimate associated with the visible-ray image comprises:
[0892] estimating, based on the motion information, a motion-based
blur estimate; and [0893] estimating the blur estimate associated
with the visible-ray image based on the visible-ray image, the
infrared-ray image, and the motion-based blur estimate.
[0894] (15)
[0895] The image processing device of (14), wherein estimating the
blur estimate associated with the visible-ray image based on the
visible-ray image, the infrared-ray image, and the motion-based
blur estimate comprises:
[0896] selecting, based on the motion-based blur estimate, a filter
from among a plurality of filters having different blurring
characteristics.
[0897] (16)
[0898] The image processing device of (14), wherein estimating the
blur estimate associated with the visible-ray image comprises:
[0899] applying each of a plurality of filters having different
blurring characteristics to the infrared-ray image to produce a
plurality of blurred infrared-ray images; [0900] calculating
correlation values between the visible-ray image and each of the
plurality of blurred infrared-ray images; and [0901] selecting a
filter from among the plurality of filters, wherein the selection
of the filter is based on the calculated correlation values and the
motion-based blur estimate.
[0902] (17)
[0903] The image processing device of (1), wherein estimating the
blur estimate associated with the visible-ray image comprises
selecting a filter from among a plurality of filters having
different blurring characteristics, and [0904] wherein generating
the corrected visible-ray image comprises applying to the
visible-ray image an inverse characteristic to a characteristic of
the selected filter.
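Applying an inverse characteristic as in configuration (17) can be sketched in the frequency domain. This is an assumed implementation: the small regularization constant (a Wiener-style term) is not part of the disclosure but keeps the division stable where the filter's spectrum is near zero.

```python
import numpy as np

def apply_inverse_filter(visible, psf, eps=1e-2):
    """Deblur the visible-ray image by applying the inverse
    characteristic of the selected filter (given here as its PSF).
    The regularized inverse conj(H) / (|H|^2 + eps) approximates 1/H
    while avoiding division by near-zero spectral values."""
    H = np.fft.fft2(psf, s=visible.shape)   # filter spectrum, zero-padded
    V = np.fft.fft2(visible)
    restored = np.fft.ifft2(V * np.conj(H) / (np.abs(H) ** 2 + eps))
    return np.real(restored)
```

With a well-conditioned PSF and small eps, an image blurred by that PSF is recovered almost exactly; in practice eps trades residual blur against noise amplification.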
[0905] (18)
[0906] An image processing method that is performed in an image
processing device, the image processing method comprising: [0907]
receiving input of a visible-ray image and an infrared-ray image
obtained by photographing a same subject; [0908] estimating, based
on the visible-ray image, the infrared-ray image and motion
information, a blur estimate associated with the visible-ray image;
and [0909] generating, based on the estimated blur estimate, a
corrected visible-ray image.
[0910] (19)
[0911] A non-transitory computer readable medium encoded with a
plurality of instructions that, when executed by image processing
circuitry of an image processing device, perform an image
processing method, the image processing method comprising: [0912]
receiving input of a visible-ray image and an infrared-ray image
obtained by photographing a same subject; [0913] estimating, based
on the visible-ray image, the infrared-ray image and motion
information, a blur estimate associated with the visible-ray image;
and [0914] generating, based on the estimated blur estimate, a
corrected visible-ray image.
[0915] Moreover, a series of processing described in the
specification can be executed by hardware, software, or a combined
configuration thereof. In the case of executing the processing by
the software, a program recording a processing sequence can be
installed and executed in a memory in a computer incorporated in
dedicated hardware, or a program can be installed and executed in a
versatile computer configured to execute various types of
processing. For example, the program can be recorded in advance in
a recording medium. The program can not only be installed in the
computer from the recording medium, but can also be received via a
network such as a local area network (LAN) or the Internet. Then,
the program can be installed in a recording medium such as a
built-in hard drive.
[0916] Note that various types of processing described in the
specification may not only be executed in chronological order as
described, but may also be executed in parallel or separately
according to a processing capacity of a device configured to
execute the processing or as necessary. Moreover, the system in the
present specification is a logical configuration set of multiple
devices, and is not limited to each device configuration in the
same housing.
INDUSTRIAL APPLICABILITY
[0917] As described above, according to the configuration of one
embodiment of the present disclosure, the device and method for
executing the image quality improvement processing of removing or
reducing a blur (defocusing) on a visible light image are
implemented.
[0918] Specifically, a visible light image and a far-infrared image
captured by simultaneous photographing of the same object and
camera motion information are input; a camera motion-based blur as
a blur (defocusing) on the visible light image due to camera motion
is estimated; the visible light image, the far-infrared image, and
the camera motion-based blur are utilized to estimate an integrated
filter as a filter for generating a blur corresponding to an
integrated blur of a visible light image-based blur and the camera
motion-based blur; and an opposite characteristic filter with
characteristics opposite to those of the estimated integrated
filter is applied to the visible light image to generate a
corrected visible light image whose blur has been removed or
reduced.
[0919] By these types of processing, the device and method for
executing the image quality improvement processing of removing or
reducing the blur (defocusing) on the visible light image are
implemented.
REFERENCE SIGNS LIST
[0920] 11 Blurred visible light image [0921] 12 Blur-less
far-infrared image [0922] 13 Camera motion information [0923] 15
Blur-reduced visible light image [0924] 21 Visible light image
input unit [0925] 22 Far-infrared image input unit [0926] 23 Camera
motion information input unit [0927] 25 Pre-correction visible
light image [0928] 26 Far-infrared image [0929] 27 Post-correction
visible light image [0930] 30 Image-based blur estimator [0931] 30b
Blur estimator [0932] 31 Filter processor [0933] 32 Correlation
computer [0934] 33 Image-based filter determinator [0935] 34 Filter
bank selector [0936] 35 Filter bank [0937] 37 Integrated filter
determinator [0938] 38 Filter (blur) comparator [0939] 39
Correlation corrector [0940] 40 Camera motion-based blur estimator
[0941] 41 Camera motion blur map acquirer [0942] 45 Camera motion
blur map storage unit [0943] 50 Integrated blur estimator [0944] 51
Object motion determinator [0945] 52 Integrated filter determinator
[0946] 55 Environment information storage unit/input unit [0947] 60
Blur remover [0948] 61 Inverse filter calculator [0949] 62 Inverse
filter processor [0950] 63 Inverse filter corrector [0951] 70
Reliability calculator [0952] 71 Filter (blur) comparator [0953] 81
CPU [0954] 82 ROM [0955] 83 RAM [0956] 84 Bus [0957] 85
Input/output interface [0958] 86 Input unit [0959] 87 Output unit
[0960] 88 Storage unit [0961] 89 Communication unit [0962] 90 Drive
[0963] 91 Removable medium [0964] 95 Imager (camera) [0965] 96
Display [0966] 100 Vehicle control system [0967] 101 Input unit
[0968] 102 Data acquirer [0969] 103 Communication unit [0970] 104
In-vehicle equipment [0971] 105 Output controller [0972] 106 Output
unit [0973] 107 Drive system controller [0974] 108 Drive system
[0975] 109 Body system controller [0976] 110 Body system [0977] 111
Storage unit [0978] 112 Automatic driving controller [0979] 131
Detector [0980] 132 Self-location estimator [0981] 133 Situation
analyzer [0982] 134 Planner [0983] 135 Operation controller [0984]
141 Outer-vehicle information detector [0985] 142 In-vehicle
information detector [0986] 143 Vehicle state detector [0987] 151
Map analyzer [0988] 152 Traffic rule recognizer [0989] 153
Situation recognizer [0990] 154 Situation predictor [0991] 161
Route planner [0992] 162 Action planner [0993] 163 Operation
planner [0994] 171 Emergency situation avoider [0995] 172
Acceleration/deceleration controller [0996] 173 Direction
controller [0997] 201 Display
* * * * *