U.S. patent application number 09/877002 was published by the patent
office on 2002-01-03 as publication 20020001409 for an interpolation
processing apparatus and recording medium having an interpolation
processing program recorded therein. This patent application is
currently assigned to NIKON CORPORATION. Invention is credited to
Chen, Zhe-Hong and Ishiga, Kenichi.
United States Patent Application 20020001409
Kind Code: A1
Chen, Zhe-Hong; et al.
January 3, 2002

Application Number: 09/877002
Publication Number: 20020001409
Family ID: 26581442
Publication Date: 2002-01-03

Interpolation processing apparatus and recording medium having
interpolation processing program recorded therein
Abstract

A first interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first through nth (n ≥ 2) color components
and include color information corresponding to a single color
component provided at each pixel, to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, includes: an interpolation value calculation section that
uses color information at pixels located in a local area containing
an interpolation target pixel to undergo interpolation processing
to calculate an interpolation value including at least (1) local
average information of the first color component with regard to the
interpolation target pixel and (2) local curvature information
corresponding to at least two color components with regard to the
interpolation target pixel.
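In its simplest one-dimensional form, the combination of terms (1)
and (2) reduces to adding a second-derivative correction taken from
another color plane to a plain average of the first color component.
The sketch below assumes a Bayer-type row in which green is the first
color component and red is measured at the interpolation target
pixel; the layout and the function names are illustrative and not
taken from the application.

```python
# Sketch of the abstract's idea: the missing first color component
# (green here) is estimated as a local average of adjacent green
# samples plus a curvature (second-derivative) correction from the
# red plane measured at and around the target pixel. The 1-D layout
# and names are illustrative assumptions.

def interpolate_green_1d(g_left, g_right, r_left2, r_center, r_right2):
    """Estimate green at a red pixel from a horizontal neighborhood.

    g_left, g_right   : green samples adjacent to the target pixel
    r_left2, r_right2 : red samples two pixels away on either side
    r_center          : red sample at the target pixel
    """
    local_average = (g_left + g_right) / 2.0                      # term (1)
    local_curvature = (2 * r_center - r_left2 - r_right2) / 4.0   # term (2)
    return local_average + local_curvature

# In a flat region the red curvature is zero and the estimate is the
# plain average; where the red plane bends, the estimate follows it.
print(interpolate_green_1d(10, 14, 20, 24, 28))  # -> 12.0
```

In flat regions the curvature term vanishes and the estimate falls
back to the plain average; across intensity changes, the bend in the
second color plane pulls the estimate along with it.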
Inventors: Chen, Zhe-Hong (Tokyo, JP); Ishiga, Kenichi
(Kawasaki-shi, JP)

Correspondence Address: Oliff & Berridge PLC, P.O. Box 19928,
Alexandria, VA 22320, US

Assignee: NIKON CORPORATION, 2-3, Marunouchi 3-chome, Chiyoda-ku,
JP 100-8331

Family ID: 26581442
Appl. No.: 09/877002
Filed: June 11, 2001
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number
09/877002          | Jun 11, 2001 |
PCT/JP00/09040     | Dec 20, 2000 |
Current U.S. Class: 382/167; 348/E9.01
Current CPC Class: G06T 5/20 20130101; G06T 3/4007 20130101; H04N
9/04557 20180801; H04N 9/04515 20180801; G06T 3/4015 20130101
Class at Publication: 382/167
International Class: G06K 009/00
Foreign Application Data

Date         | Code | Application Number
Dec 21, 1999 | JP   | 11-363007
Jul 6, 2000  | JP   | 2000-204768
Claims
What is claimed is:
1. An interpolation processing apparatus that engages in processing
on image data which are provided in a colorimetric system
constituted of first through nth (n ≥ 2) color components and
include color information corresponding to a single color component
provided at each pixel to determine an interpolation value
equivalent to color information corresponding to the first color
component for a pixel at which the first color component is
missing, comprising: an interpolation value calculation section
that uses color information at pixels located in a local area
containing an interpolation target pixel to undergo interpolation
processing to calculate an interpolation value including at least
(1) local average information of the first color component with
regard to the interpolation target pixel and (2) local curvature
information corresponding to at least two color components with
regard to the interpolation target pixel.
2. An interpolation processing apparatus according to claim 1,
wherein: said interpolation value calculation section calculates,
as said local curvature information corresponding to at least two
color components, (1) local curvature information based upon a
color component matching a color component at the interpolation
target pixel and (2) local curvature information based upon a color
component other than the color component at the interpolation
target pixel.
3. An interpolation processing apparatus that engages in processing
on image data which are provided in a colorimetric system
constituted of first through nth (n ≥ 2) color components and
include color information corresponding to a single color component
provided at each pixel to determine an interpolation value
equivalent to color information corresponding to the first color
component for a pixel at which the first color component is
missing, comprising: an interpolation value calculation section
that uses color information at pixels located in a local area
containing an interpolation target pixel to undergo interpolation
processing to calculate an interpolation value including at least
(1) local average information of the first color component with
regard to the interpolation target pixel and (2) local curvature
information based upon a color component other than a color
component at the interpolation target pixel.
4. An interpolation processing apparatus that engages in processing
on image data which are provided in a colorimetric system
constituted of first through nth (n ≥ 2) color components and
include color information corresponding to a single color component
provided at each pixel to determine an interpolation value
equivalent to color information corresponding to the first color
component for a pixel at which the first color component is
missing, comprising: an interpolation value calculation section
that uses color information at pixels located in a local area
containing an interpolation target pixel to undergo interpolation
processing to calculate an interpolation value including at least
(1) local average information of the first color component with
regard to the interpolation target pixel and (2) local curvature
information corresponding to the first color component with respect
to the interpolation target pixel.
5. An interpolation processing apparatus according to claim 1,
further comprising: a first similarity judgment section that judges
degrees of similarity to the interpolation target pixel along at
least two directions in which pixels with color information
corresponding to the first color component are connected with the
interpolation target pixel; and a second similarity judgment
section that judges degrees of similarity to the interpolation
target pixel along at least two directions other than the
directions in which the degrees of similarity are judged by said
first similarity judgment section, wherein: said interpolation
value calculation section selects a direction along which pixels
having color information to be used to calculate said local average
information of the first color component are set based upon results
of a judgment made by said first similarity judgment section; (1)
said interpolation value calculation section selects a direction
along which pixels having color information to be used to calculate
said local curvature information are set based upon results of the
judgment made by said first similarity judgment section if said
local curvature information is "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said first similarity judgment section"; and (2) said
interpolation value calculation section selects a direction along
which pixels having color information to be used to calculate said
local curvature information are set based upon results of a
judgment made by said second similarity judgment section if said
local curvature information is "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said second similarity judgment section."
6. An interpolation processing apparatus that engages in processing
on image data which are provided in a colorimetric system
constituted of first through nth (n ≥ 2) color components and
include color information corresponding to a single color component
provided at each pixel to determine an interpolation value
equivalent to color information corresponding to the first color
component for a pixel at which the first color component is
missing, comprising: an interpolation value calculation section
that calculates an interpolation value including at least two
terms, i.e., a first term and a second term, by using color
information at pixels set in a local area containing an
interpolation target pixel to undergo interpolation processing; a
first similarity judgment section that judges degrees of
similarity to the interpolation target pixel along at least two
directions in which pixels having color information corresponding
to the first color component are connected to the interpolation
target pixel; and a second similarity judgment section that judges
degrees of similarity to the interpolation target pixel along at
least two directions other than the directions in which the degrees
of similarity are judged by said first similarity judgment section,
wherein: said interpolation value calculation section selects a
direction along which pixels having color information to be used to
calculate said first term are set based upon results of a judgment
made by said first similarity judgment section and selects a
direction along which pixels having color information to be used to
calculate said second term are set based upon results of a judgment
made by said second similarity judgment section.
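A minimal sketch of how the two judgment sections in claims 5 and 6
might steer the calculation: the first judgment compares vertical
against horizontal similarity (the directions connecting
first-color-component pixels to the target), the second compares the
two diagonals, and each term draws its pixels from the direction its
own judgment favors. The similarity measure, labels, and threshold
below are assumptions, not the patent's definitions.

```python
# Sketch of the two-stage direction selection in claims 5 and 6.
# Similarity degrees are non-negative numbers where smaller means
# more similar; the threshold rule mirrors the "roughly equal"
# judgment of claim 11. All concrete values are illustrative.

def judge_direction(sim_a, sim_b, labels, threshold=1.0):
    """Return the favored direction, or 'both' when roughly equal."""
    if abs(sim_a - sim_b) < threshold:
        return "both"
    return labels[0] if sim_a < sim_b else labels[1]

# First similarity judgment section: vertical vs. horizontal.
first = judge_direction(0.8, 3.1, ("vertical", "horizontal"))
# Second similarity judgment section: the two diagonal directions.
second = judge_direction(2.0, 2.4, ("diag45", "diag135"))
print(first, second)  # -> vertical both
```

The first term would then be computed from pixels along `first`, and
the second term from pixels along `second` (or symmetrically when the
result is "both").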
7. An interpolation processing apparatus according to claim 6,
wherein said interpolation value calculation section: calculates a
term containing (a) local average information of the first color
component with regard to the interpolation target pixel and (b)
local curvature information constituted of a single color component
and manifesting directionality along a direction in which degrees
of similarity are judged by said first similarity judgment section,
as said first term; and calculates a term containing local
curvature information constituted of a single color component and
manifesting directionality along a direction in which degrees of
similarity are judged by said second similarity judgment section,
as said second term.
8. An interpolation processing apparatus according to claim 5,
wherein: when image data are provided in a colorimetric system
constituted of first through third color components with the first
color component achieving a higher spatial frequency than the
second color component and the third color component, the first
color component set in a checker-board pattern, the second color
component and the third color component each set in a line sequence
between pixels at which color information corresponding to the
first color component is present and information corresponding to
the second color component present at the interpolation target
pixel; said first similarity judgment section calculates similarity
degrees manifested by the interpolation target pixel along two
directions, i.e., a vertical direction and a horizontal direction,
in which pixels with color information corresponding to the first
color component that are closest to the interpolation target pixel
are connected to the interpolation target pixel and makes a
judgment with regard to degrees of similarity manifested by the
interpolation target pixel along the vertical direction and the
horizontal direction based upon a difference between said
similarity degrees; said second similarity judgment section
calculates similarity degrees manifested by the interpolation
target pixel along two diagonal directions in which pixels with
color information corresponding to the third color component that
are closest to the interpolation target pixel are connected to the
interpolation target pixel and makes a judgment with regard to
degrees of similarity manifested by the interpolation target pixel
along the two diagonal directions based upon a difference between
said similarity degrees; and said interpolation value calculation
section selects at least either the second color component or the
first color component based upon which said "local curvature
information constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said first similarity judgment section" is provided and
selects at least either the second color component or the third
color component based upon which said "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said second similarity judgment section" is provided.
9. An interpolation processing apparatus according to claim 8,
wherein: when said local curvature information is "local curvature
information based upon a color component other than the color
component at the interpolation target pixel", said interpolation
value calculation section selects the first color component or the
third color component to which said local curvature information is
to correspond in conformance to the degrees of similarity judged by
said second similarity judgment section.
10. An interpolation processing apparatus according to claim 9,
wherein: said interpolation value calculation section calculates
local curvature information based upon the first color component if
said second similarity judgment section judges that roughly equal
degrees of similarity manifest along the two diagonal directions
and calculates local curvature information based upon the third
color component if said second similarity judgment section judges
that a higher degree of similarity manifests along one of the two
diagonal directions compared to the other diagonal direction.
11. An interpolation processing apparatus according to claim 8,
wherein: said first similarity judgment section judges that roughly
equal degrees of similarity manifest along the vertical direction
and the horizontal direction if a difference between the similarity
degrees along the vertical direction and the horizontal direction
is smaller than a specific threshold value; and said second
similarity judgment section judges that roughly equal degrees of
similarity manifest along the two diagonal directions if a
difference between the similarity degrees along the two diagonal
directions is smaller than a specific threshold value.
12. An interpolation processing apparatus according to claim 8,
wherein: said first similarity judgment section calculates the
similarity degrees along the vertical direction and the horizontal
direction by using color information corresponding to a plurality
of color components for a single interpolation target pixel; and
said second similarity judgment section calculates the similarity
degrees along the two diagonal directions by using color
information corresponding to a plurality of color components for a
single interpolation target pixel.
13. An interpolation processing apparatus according to claim 12,
wherein: said second similarity judgment section calculates a
similarity degree manifesting along each of the two diagonal
directions through weighted addition of: (1) a similarity degree
component constituted of color information corresponding to the
first color component alone; (2) a similarity degree component
constituted of color information corresponding to the second color
component alone; (3) a similarity degree component constituted of
color information corresponding to the third color component alone;
and (4) a similarity degree component constituted of color
information corresponding to the second color component and the
third color component.
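The weighted addition of claim 13 can be sketched as follows,
assuming each component is an average of absolute differences sampled
along one diagonal; the weights and the sample pairs are illustrative
assumptions, not values from the patent.

```python
# Sketch of claim 13: the diagonal similarity degree is a weighted
# sum of four components, each built from absolute differences within
# one color plane (G, R, B) or across the R and B planes. Smaller
# result = more similar. Weights and samples are illustrative.

def diagonal_similarity(g_pairs, r_pairs, b_pairs, rb_pairs,
                        weights=(2.0, 1.0, 1.0, 2.0)):
    """Each *_pairs argument lists (value_a, value_b) samples taken
    along one diagonal direction."""
    components = [
        sum(abs(a - b) for a, b in pairs) / max(len(pairs), 1)
        for pairs in (g_pairs, r_pairs, b_pairs, rb_pairs)
    ]
    return sum(w * c for w, c in zip(weights, components))

sim = diagonal_similarity(
    g_pairs=[(100, 102), (98, 101)],   # (1) first (green) component only
    r_pairs=[(60, 64)],                # (2) second (red) component only
    b_pairs=[(40, 41)],                # (3) third (blue) component only
    rb_pairs=[(60, 41), (64, 40)],     # (4) red/blue cross component
)
print(sim)  # -> 53.0
```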
14. An interpolation processing apparatus according to claim 8,
wherein: said first similarity judgment section calculates
similarity degrees along the vertical direction and the horizontal
direction for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the vertical
direction and the horizontal direction based upon differences in
similarity degrees manifesting at nearby pixels as well as at the
interpolation target pixel; and said second similarity judgment
section calculates similarity degrees along the two diagonal
directions for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the two diagonal
directions based upon differences in similarity degrees manifesting
at nearby pixels as well as at the interpolation target pixel.
15. An interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first through nth (n ≥ 2) color components
and include color information corresponding to a single color
component provided at each pixel to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, comprising: a first term calculation section that
calculates a first term representing average information of the
first color component with regard to an interpolation target pixel
to undergo interpolation processing by using color information
corresponding to color components at pixels set in a local area
containing the interpolation target pixel; a second term
calculation section that calculates a second term representing
local curvature information based upon a color component matching
the color component at the interpolation target pixel with regard
to the interpolation target pixel by using color information
corresponding to color components at pixels set in a local area
containing the interpolation target pixel; and an interpolation
value calculation section that calculates an interpolation value by
adding said second term multiplied by a weighting coefficient
constituted of color information corresponding to a plurality of
color components at pixels in the local area containing the
interpolation target pixel to said first term.
16. An interpolation processing apparatus according to claim 15,
wherein: said interpolation value calculation section uses color
information corresponding to a plurality of color components
provided at the interpolation target pixel and at a plurality of
pixels set along a predetermined direction relative to the
interpolation target pixel to ascertain inclinations manifesting in
color information corresponding to the individual color components
along the direction and calculates said weighting coefficient in
conformance to a correlation manifesting among the inclinations in
the color information corresponding to the individual color
components.
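A sketch of how the weighting coefficient of claims 15 and 16 might
act: the second (curvature) term is scaled by a coefficient derived
from the correlation between the inclinations (signed slopes) of the
individual color planes along a chosen direction. The sign-agreement
rule below is an illustrative assumption, not the patent's formula.

```python
# Sketch of claims 15-16: interpolation value = first (average) term
# plus the second (curvature) term scaled by a weighting coefficient
# derived from the correlation of the color planes' inclinations.
# The specific rule (full weight on sign agreement, zero otherwise)
# is an assumption for illustration.

def inclination(value_near, value_far):
    """Signed slope of one color plane along the chosen direction."""
    return value_far - value_near

def weighted_interpolation(first_term, second_term, slope_a, slope_b):
    # Apply the full curvature correction only when both color planes
    # slope the same way; a conflict suggests the correction would be
    # unreliable at that pixel.
    weight = 1.0 if slope_a * slope_b > 0 else 0.0
    return first_term + weight * second_term

slopes_agree = weighted_interpolation(
    12.0, 3.0, inclination(90, 94), inclination(60, 63))
slopes_conflict = weighted_interpolation(
    12.0, 3.0, inclination(90, 94), inclination(63, 60))
print(slopes_agree, slopes_conflict)  # -> 15.0 12.0
```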
17. An interpolation processing apparatus that implements
processing for supplementing a color component value at a pixel at
which information corresponding to a color component is missing in
image data provided in a colorimetric system constituted of a
luminance component and the color component, with the luminance
component having a higher spatial frequency than the color
component and the luminance component present both at pixels having
information corresponding to the color component and at pixels
lacking information corresponding to the color component,
comprising: a hue value calculation section that calculates hue
values at a plurality of pixels located near an interpolation
target pixel to undergo interpolation processing and having both
the luminance component and the color component by using luminance
component values and color component values at the individual
pixels; a hue value interpolation section that calculates a hue
value at the interpolation target pixel by using a median of the
hue values at the plurality of pixels calculated by said hue value
calculation section; and a color conversion section that
interpolates a color component at the interpolation target pixel by
using the luminance component at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
by said hue value interpolation section to a color component.
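The three sections of claim 17 can be sketched end to end, assuming
green is the luminance component, red is the missing color component,
and hue is modeled as the difference R - G (the difference form and
the neighbor set are assumptions for illustration).

```python
# Sketch of claim 17: hue values are computed at nearby pixels that
# carry both the luminance (G) and color (R) components, their median
# becomes the target pixel's hue, and the target's own luminance
# converts that hue back to a color component value. The difference-
# based hue model is an illustrative assumption.

from statistics import median

def interpolate_red(g_center, neighbors):
    """neighbors: (g, r) pairs at pixels having both components."""
    hues = [r - g for g, r in neighbors]   # hue value calculation section
    hue_center = median(hues)              # hue value interpolation section
    return g_center + hue_center           # color conversion section

# The outlier hue (140 - 98 = 42) is rejected by the median.
print(interpolate_red(100, [(90, 110), (95, 118), (102, 125), (98, 140)]))
# -> 123.0
```

Using a median rather than a mean is what gives this scheme its
robustness to isolated outlier hues near color edges.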
18. An interpolation processing apparatus that implements
processing for supplementing a luminance component at a pixel at
which information corresponding to a luminance component is missing
and supplementing a color component at a pixel at which information
corresponding to a color component is missing, on image data
provided in a colorimetric system constituted of the luminance
component and the color component, with the luminance component
having a higher spatial frequency than the color component and a
given pixel having only information corresponding to either the
luminance component or the color component, comprising: a luminance
component interpolation section that interpolates a luminance
component at a luminance component interpolation target pixel to
undergo luminance component interpolation processing by using at
least either "similarity manifesting between the luminance
component interpolation target pixel and a pixel near the luminance
component interpolation target pixel" or "a plurality of color
components within a local area containing the luminance component
interpolation target pixel"; a hue value calculation section that
calculates hue values at a plurality of pixels located near an
interpolation target pixel to undergo color component interpolation
processing, having color component values and having luminance
component values interpolated by said luminance component
interpolation section, by using the luminance component values and
color component values at the individual pixels; a hue value
interpolation section that calculates a hue value for the
interpolation target pixel by using a median of the hue values at
the plurality of pixels calculated by said hue value calculation
section; and a color conversion section that interpolates a color
component value for the interpolation target pixel by using the
luminance component value at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
by said hue value interpolation section to a color component
value.
19. An interpolation processing apparatus according to claim 17,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the red color component at pixels near the
interpolation target pixel if the green color component is present
but the red color component is missing at the interpolation target
pixel and calculates a hue value for the interpolation target pixel
by using a median of hue values containing the blue color component
at pixels near the interpolation target pixel if the green color
component is present but the blue color component is missing at the
interpolation target pixel.
20. An interpolation processing apparatus according to claim 17,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the red color component at pixels set near the
interpolation target pixel if the blue color component is present
but the red color component is missing at the interpolation target
pixel.
21. An interpolation processing apparatus according to claim 17,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the blue color component at pixels set near the
interpolation target pixel if the red color component is present
but the blue color component is missing at the interpolation target
pixel.
22. An interpolation processing apparatus according to claim 17,
with a color component missing at the interpolation target pixel
present at only one pixel among four pixels set symmetrically along
the vertical direction and the horizontal direction, wherein said
hue value interpolation section comprises: a first hue value
interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of diagonally adjacent pixels if the hue values of the
plurality of diagonally adjacent pixels adjacent to the
interpolation target pixel along diagonal directions have been
calculated by said hue value calculation section; and a second hue
value interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of vertically and horizontally adjacent pixels if the hue
values of the plurality of vertically and horizontally adjacent
pixels adjacent to the interpolation target pixel in the vertical
direction and the horizontal direction have been calculated by said
hue value calculation section or said first hue value interpolation
unit.
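The two units of claim 22 form a two-pass scheme on a Bayer-like
layout: at a pixel whose diagonal neighbors already have hue values
(e.g. a blue pixel surrounded diagonally by red pixels with known
R - G hues), the first unit fills the hue from a diagonal median;
after that pass, every vertical/horizontal neighbor of the remaining
pixels carries a hue, so the second unit takes their median. The grid
contents below are illustrative.

```python
# Sketch of the first and second hue value interpolation units of
# claim 22. Hue values are plain numbers here; which neighbors are
# diagonal vs. vertical/horizontal follows the Bayer geometry
# described in the claim. Concrete values are illustrative.

from statistics import median

def first_pass_hue(diagonal_hues):
    """First unit: median over diagonally adjacent pixels whose hues
    were produced by the hue value calculation section."""
    return median(diagonal_hues)

def second_pass_hue(four_neighbor_hues):
    """Second unit: median over vertically/horizontally adjacent
    pixels, available once the first pass has run."""
    return median(four_neighbor_hues)

h_at_blue = first_pass_hue([12, 15, 14, 40])           # diagonal neighbors
h_at_green = second_pass_hue([13, h_at_blue, 16, 18])  # mixes measured + filled
print(h_at_blue, h_at_green)  # -> 14.5 15.25
```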
23. A recording medium having recorded therein an interpolation
processing program to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first through nth (n ≥ 2) color components with color
information corresponding to a single color component present at
each pixel, said interpolation processing program comprising: an
interpolation value calculation step in which an interpolation
value including at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo interpolation processing and (2) local curvature
information corresponding to at least two color components with
regard to the interpolation target pixel, is calculated by using
color information provided at pixels set within a local area
containing the interpolation target pixel.
24. A recording medium having recorded therein an interpolation
processing program to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first through nth (n ≥ 2) color components with color
information corresponding to a single color component present at
each pixel, said interpolation processing program comprising: an
interpolation value calculation step in which an interpolation
value including at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo the interpolation processing; and (2) local curvature
information based upon a color component other than a color
component at the interpolation target pixel, is calculated by using
color information provided at pixels set within a local area
containing the interpolation target pixel.
25. A recording medium having recorded therein an interpolation
processing program to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first through nth (n ≥ 2) color components with color
information corresponding to a single color component present at
each pixel, said interpolation processing program comprising: an
interpolation value calculation step in which an interpolation
value including at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo the interpolation processing, and (2) local curvature
information corresponding to the first color component with respect
to the interpolation target pixel, is calculated by using color
information provided at pixels set within a local area containing
the interpolation target pixel.
26. A recording medium having recorded therein an interpolation
processing program to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first through nth (n ≥ 2) color components with color
information corresponding to a single color component present at
each pixel, said interpolation processing program comprising: an
interpolation value calculation step in which an interpolation
value including at least two terms, i.e., a first term and a second
term is calculated by using color information at pixels set within
a local area containing an interpolation target pixel to undergo
interpolation processing; a first similarity judgment step in which
degrees of similarity to the interpolation target pixel are judged
along at least two directions in which pixels having color
information corresponding to the first color component are
connected with the interpolation target pixel; and a second
similarity judgment step in which degrees of similarity to the
interpolation target pixel are judged along at least two directions
other than the directions along which the degrees of similarity are
judged in said first similarity judgment step, wherein: in said
interpolation value calculation step, a direction in which pixels
having color information to be used to calculate said first term
are set is selected based upon results of a judgment made in said
first similarity judgment step and a direction in which pixels
having color information to be used to calculate said second term
are set is selected based upon results of a judgment made in said
second similarity judgment step.
27. A recording medium having recorded therein an interpolation
processing program to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first through nth (n ≥ 2) color components with color
information corresponding to a single color component present at
each pixel, said interpolation processing program comprising: a
first term calculation step in which a first term representing
average information of the first color component with regard to an
interpolation target pixel to undergo interpolation processing is
calculated by using color information corresponding to a color
component at pixels set within a local area containing the
interpolation target pixel; a second term calculation step in which
a second term representing local curvature information based upon a
color component matching the color component at the interpolation
target pixel is calculated with regard to the interpolation target
pixel by using color information corresponding to a color component
at pixels set within a local area containing the interpolation
target pixel; and an interpolation value calculation step in which
an interpolation value is calculated by adding said second term
multiplied by a weighting coefficient constituted of color
information corresponding to a plurality of color components
provided at pixels set within a local area containing the
interpolation target pixel to the first term.
28. A recording medium having recorded therein an interpolation
processing program for implementing on a computer processing for
supplementing a color component value at a pixel at which
information corresponding to a color component is missing, on image
data provided in a colorimetric system constituted of a luminance
component and the color component, with the luminance component
having a higher spatial frequency than the color component and the
luminance component present both at pixels having information
corresponding to the color component and at pixels lacking
information corresponding to the color component; said
interpolation processing program comprising: a hue value
calculation step in which hue values for a plurality of pixels near
an interpolation target pixel to undergo interpolation processing
and having information corresponding to both the luminance
component and the color component are calculated by using luminance
component values and color component values at the individual
pixels; a hue value interpolation step in which a hue value for the
interpolation target pixel is calculated by using a median of the
hue values at the plurality of pixels calculated in the hue value
calculation step; and a color conversion step in which a color
component value at the interpolation target pixel is interpolated
by using a value indicated by the luminance component present at
the interpolation target pixel to convert the hue value of the
interpolation target pixel calculated in the hue value
interpolation step to a color component value.
29. A recording medium having recorded therein an interpolation
processing program for implementing on a computer processing for
supplementing a luminance component value at a pixel at which
information corresponding to a luminance component is missing and a
color component value at a pixel at which information corresponding
to a color component is missing, on image data provided in a
colorimetric system constituted of the luminance component and the
color component, with the luminance component having a higher
spatial frequency than the color component and information
corresponding to either the luminance component or the color
component present at each pixel, said interpolation processing
program comprising: a luminance component interpolation step in
which a luminance component value is interpolated for a luminance
component interpolation target pixel to undergo luminance component
interpolation processing by using at least either "similarity
between the luminance component interpolation target pixel and a
pixel near the luminance component interpolation target pixel" or
"information corresponding to a plurality of color components
within a local area containing the luminance component
interpolation target pixel"; a hue value calculation step in which
hue values at a plurality of pixels located near an interpolation
target pixel to undergo color component interpolation processing,
having color component values and having luminance component values
interpolated in said luminance component interpolation step are
calculated by using the luminance component values and color
component values at the individual pixels; a hue value
interpolation step in which a hue value for the interpolation
target pixel is calculated by using a median of the hue values at
the plurality of pixels calculated in the hue value calculation
step; and a color conversion step in which a color component value
is interpolated for the interpolation target pixel by using the
luminance component value at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
in said hue value interpolation step to a color component
value.
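The hue value calculation, hue value interpolation, and color conversion steps recited in claims 28 and 29 may be sketched as follows, assuming, purely for illustration, that the hue value is defined as the color-difference (color component value minus luminance component value); the claims themselves do not fix this particular formula.

```python
# Illustrative sketch of the hue-median steps of claims 28/29,
# assuming hue = color - luminance (an illustrative choice).

def median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def interpolate_color(luminance_at_target, nearby_pixels):
    """nearby_pixels: (luminance, color) pairs at pixels near the
    interpolation target that have both components available."""
    # Hue value calculation step: hue at each nearby pixel.
    hues = [color - lum for lum, color in nearby_pixels]
    # Hue value interpolation step: median of the nearby hue values.
    hue_at_target = median(hues)
    # Color conversion step: convert the hue back to a color component
    # value using the luminance present at the target pixel.
    return luminance_at_target + hue_at_target
```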
30. An interpolation processing apparatus according to claim 2,
further comprising: a first similarity judgment section that judges
degrees of similarity to the interpolation target pixel along at
least two directions in which pixels with color information
corresponding to the first color component are connected with the
interpolation target pixel; and a second similarity judgment
section that judges degrees of similarity to the interpolation
target pixel along at least two directions other than the
directions in which the degrees of similarity are judged by said
first similarity judgment section, wherein: said interpolation
value calculation section selects a direction along which pixels
having color information to be used to calculate said local average
information of the first color component are set based upon results
of a judgment made by said first similarity judgment section; (1)
said interpolation value calculation section selects a direction
along which pixels having color information to be used to calculate
said local curvature information are set based upon results of the
judgment made by said first similarity judgment section if said
local curvature information is "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said first similarity judgment section"; and (2) said
interpolation value calculation section selects a direction along
which pixels having color information to be used to calculate said
local curvature information are set based upon results of a
judgment made by said second similarity judgment section if said
local curvature information is "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said second similarity judgment section."
31. An interpolation processing apparatus according to claim 30,
wherein: when image data are provided in a colorimetric system
constituted of first through third color components with the first
color component achieving a higher spatial frequency than the
second color component and the third color component, the first
color component set in a checker-board pattern, the second color
component and the third color component each set in a line sequence
between pixels at which color information corresponding to the
first color component is present and information corresponding to
the second color component present at the interpolation target
pixel; said first similarity judgment section calculates similarity
degrees manifested by the interpolation target pixel along two
directions, i.e., a vertical direction and a horizontal direction,
in which pixels with color information corresponding to the first
color component that are closest to the interpolation target pixel
are connected to the interpolation target pixel and makes a
judgment with regard to degrees of similarity manifested by the
interpolation target pixel along the vertical direction and the
horizontal direction based upon a difference between said
similarity degrees; said second similarity judgment section
calculates similarity degrees manifested by the interpolation
target pixel along two diagonal directions in which pixels with
color information corresponding to the third color component that
are closest to the interpolation target pixel are connected to the
interpolation target pixel and makes a judgment with regard to
degrees of similarity manifested by the interpolation target pixel
along the two diagonal directions based upon a difference between
said similarity degrees; and said interpolation value calculation
section selects at least either the second color component or the
first color component based upon which said "local curvature
information constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said first similarity judgment section" is provided and
selects at least either the second color component or the third
color component based upon which said "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said second similarity judgment section" is provided.
32. An interpolation processing apparatus according to claim 31,
wherein: when said local curvature information is "local curvature
information based upon a color component other than the color
component at the interpolation target pixel", said interpolation
value calculation section selects the first color component or the
third color component to which said local curvature information is
to correspond in conformance to the degrees of similarity judged by
said second similarity judgment section.
33. An interpolation processing apparatus according to claim 32,
wherein: said interpolation value calculation section calculates
local curvature information based upon the first color component if
said second similarity judgment section judges that roughly equal
degrees of similarity manifest along the two diagonal directions
and calculates local curvature information based upon the third
color component if said second similarity judgment section judges
that a higher degree of similarity manifests along one of the two
diagonal directions compared to the other diagonal direction.
34. An interpolation processing apparatus according to claim 31,
wherein: said first similarity judgment section judges that roughly
equal degrees of similarity manifest along the vertical direction
and the horizontal direction if a difference between the similarity
degrees along the vertical direction and the horizontal direction
is smaller than a specific threshold value; and said second
similarity judgment section judges that roughly equal degrees of
similarity manifest along the two diagonal directions if a
difference between the similarity degrees along the two diagonal
directions is smaller than a specific threshold value.
35. An interpolation processing apparatus according to claim 31,
wherein: said first similarity judgment section calculates the
similarity degrees along the vertical direction and the horizontal
direction by using color information corresponding to a plurality
of color components for a single interpolation target pixel; and
said second similarity judgment section calculates the similarity
degrees along the two diagonal directions by using color
information corresponding to a plurality of color components for a
single interpolation target pixel.
36. An interpolation processing apparatus according to claim 35,
wherein: said second similarity judgment section calculates a
similarity degree manifesting along each of the two diagonal
directions through weighted addition of: (1) a similarity degree
component constituted of color information corresponding to the
first color component alone; (2) a similarity degree component
constituted of color information corresponding to the second color
component alone; (3) a similarity degree component constituted of
color information corresponding to the third color component alone;
and (4) a similarity degree component constituted of color
information corresponding to the second color component and the
third color component.
37. An interpolation processing apparatus according to claim 31,
wherein: said first similarity judgment section calculates
similarity degrees along the vertical direction and the horizontal
direction for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the vertical
direction and the horizontal direction based upon differences in
similarity degrees manifesting at nearby pixels as well as at the
interpolation target pixel; and said second similarity judgment
section calculates similarity degrees along the two diagonal
directions for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the two diagonal
directions based upon differences in similarity degrees manifesting
at nearby pixels as well as at the interpolation target pixel.
38. An interpolation processing apparatus according to claim 3,
further comprising: a first similarity judgment section
that judges degrees of similarity to the interpolation target pixel
along at least two directions in which pixels with color
information corresponding to the first color component are
connected with the interpolation target pixel; and a second
similarity judgment section that judges degrees of similarity to
the interpolation target pixel along at least two directions other
than the directions in which the degrees of similarity are judged
by said first similarity judgment section, wherein: said
interpolation value calculation section selects a direction along
which pixels having color information to be used to calculate said
local average information of the first color component are set
based upon results of a judgment made by said first similarity
judgment section; (1) said interpolation value calculation section
selects a direction along which pixels having color information to
be used to calculate said local curvature information are set based
upon results of the judgment made by said first similarity judgment
section if said local curvature information is "local curvature
information constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said first similarity judgment section"; and (2) said
interpolation value calculation section selects a direction along
which pixels having color information to be used to calculate said
local curvature information are set based upon results of a
judgment made by said second similarity judgment section if said
local curvature information is "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by said second similarity judgment section."
39. An interpolation processing apparatus according to claim 18,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the red color component at pixels near the
interpolation target pixel if the green color component is present
but the red color component is missing at the interpolation target
pixel and calculates a hue value for the interpolation target pixel
by using a median of hue values containing the blue color component
at pixels near the interpolation target pixel if the green color
component is present but the blue color component is missing at the
interpolation target pixel.
40. An interpolation processing apparatus according to claim 18,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the red color component at pixels set near the
interpolation target pixel if the blue color component is present
but the red color component is missing at the interpolation target
pixel.
41. An interpolation processing apparatus according to claim 18,
wherein: when the luminance component in the image data corresponds
to a green color component and the color component in the image
data corresponds to a red color component and a blue color
component, said hue value interpolation section calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the blue color component at pixels set near the
interpolation target pixel if the red color component is present
but the blue color component is missing at the interpolation target
pixel.
42. An interpolation processing apparatus according to claim 18,
with a color component missing at the interpolation target pixel
present at only one pixel among four pixels set symmetrically along
the vertical direction and the horizontal direction, wherein said
hue value interpolation section comprises: a first hue value
interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of diagonally adjacent pixels if the hue values of the
plurality of diagonally adjacent pixels adjacent to the
interpolation target pixel along diagonal directions have been
calculated by said hue value calculation section; and a second hue
value interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of vertically and horizontally adjacent pixels if the hue
values of the plurality of vertically and horizontally adjacent
pixels adjacent to the interpolation target pixel in the vertical
direction and the horizontal direction have been calculated by said
hue value calculation section or said first hue value interpolation
unit.
Description
[0001] This application is a continuation of International
Application No. PCT/JP00/09040 filed Dec. 20, 2000.
INCORPORATION BY REFERENCE
[0002] The disclosures of the following applications are herein
incorporated by reference: Japanese Patent Application No.
H11-363007 filed Dec. 21, 1999; Japanese Patent Application No.
2000-204768 filed Jul. 6, 2000; and International Application No.
PCT/JP00/09040 filed Dec. 20, 2000.
BACKGROUND OF THE INVENTION
[0003] 1. Field of the Invention
[0004] The present invention relates to an interpolation processing
apparatus that engages in interpolation processing on color image
data to supplement a color component and a luminance component
missing at pixels, and to a computer-readable recording medium
having recorded therein an interpolation processing program for
achieving the interpolation processing on a computer.
[0005] 2. Description of the Related Art
[0006] Some electronic cameras generate color image data by
employing an image-capturing sensor having color filters in three
colors (R, G and B: red, green and blue) provided at specific
positions (e.g., a Bayer array). In such an electronic camera in
which the individual pixels at the image-capturing sensor each
output color information corresponding to only a single color
component, it is necessary to implement interpolation processing to
obtain color information corresponding to all the color components
for each pixel.
[0007] In an interpolation processing method proposed in the prior
art, spatial similarity manifested by an interpolation target pixel
undergoing the interpolation processing is judged and an
interpolation value is calculated by using the color information
output from pixels positioned along the direction in which a high
degree of similarity is manifested.
[0008] For instance, in the art disclosed in U.S. Pat. No.
5,629,734, a green color interpolation value G5 for the
interpolation target pixel is calculated through one formula among
formula 1 through formula 3 when the color information
corresponding to individual pixels is provided as shown below, with
A5 representing the color information at the interpolation target
pixel (a pixel with the green color component missing), A1, A3, A7
and A9 representing color information from pixels provided with
color filters in the same color as the color of the filter at the
interpolation target pixel and G2, G4, G6 and G8 representing color
information from pixels provided with green color filters.
[0009] If a marked similarity manifests along the horizontal
direction, the green color interpolation value G5 for the
interpolation target pixel is calculated through:
G5=(G4+G6)/2+(-A3+2A5-A7)/4 (formula 1).
[0010] If a marked similarity manifests along the vertical
direction, the green color interpolation value G5 for the
interpolation target pixel is calculated through:
G5=(G2+G8)/2+(-A1+2A5-A9)/4 (formula 2).
[0011] If roughly equal degrees of similarity manifest along the
horizontal direction and the vertical direction, the green color
interpolation value G5 for the interpolation target pixel is
calculated through:
G5=(G2+G4+G6+G8)/4+(-A1-A3+4A5-A7-A9)/8 (formula 3).
[0012] It is to be noted that in order to simplify the subsequent
explanation, the first terms ((G4+G6)/2, (G2+G8)/2) in formulae 1
and 2 are each referred to as a primary term and that the second
terms ((-A3+2A5-A7)/4, (-A1+2A5-A9)/4) in formulae 1 and 2 are each
referred to as a correctional term.
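The three prior-art formulas quoted above may be sketched together as follows; the direction selection is assumed to come from a separate similarity judgment, and the argument names follow the patent's A1..A9 / G2..G8 pixel layout.

```python
# Sketch of formulas 1-3 of U.S. Pat. No. 5,629,734 as quoted above.
# The similarity direction is assumed to be judged elsewhere.

def green_value(direction, A1, A3, A5, A7, A9, G2, G4, G6, G8):
    if direction == "horizontal":
        # formula 1: primary term plus correctional term
        return (G4 + G6) / 2 + (-A3 + 2 * A5 - A7) / 4
    if direction == "vertical":
        # formula 2
        return (G2 + G8) / 2 + (-A1 + 2 * A5 - A9) / 4
    # roughly equal similarity along both directions: formula 3
    return (G2 + G4 + G6 + G8) / 4 + (-A1 - A3 + 4 * A5 - A7 - A9) / 8
```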
[0013] In U.S. Pat. No. 5,629,734, assuming that the image data
undergoing the interpolation processing manifest marked similarity
along the horizontal direction with A3, G4, A5, G6 and A7 provided
as indicated with ● marks in FIG. 17, A4 representing
the average of A3 and A5, and A6 representing the average of A5 and
A7, the value of the correctional term in formula 1 is equivalent
to the vector quantity (α in FIG. 17) representing the
difference between A5 and the average of A4 and A6. In addition,
the green color interpolation value G5 is equivalent to a value
achieved by correcting the average of the values indicated by the
color information from pixels adjacent along the horizontal
direction (corresponds to the value of the primary term in formula
1) by α.
[0014] In other words, in the art disclosed in U.S. Pat. No.
5,629,734, a green color interpolation value is calculated by
assuming that the color difference between the green color
component and the color component (the red color component or the
blue color component) at the interpolation target pixel is constant
((A4-G4), (A5-G5) and (A6-G6) in FIG. 17 match) and correcting the
average of the values indicated by the color information from the
pixels that are adjacent along the direction in which a high degree
of similarity is manifested with color information corresponding to
the same color component as that of the interpolation target
pixel.
[0015] Optical systems such as lenses are known to manifest
magnification chromatic aberration. For instance, if there is
magnification chromatic aberration at the photographic lens of an
electronic camera having an image-capturing sensor provided with
color filters in three colors, i.e., R, G and B, arranged in a
Bayer array, images corresponding to the red color component and
the blue color component are formed at positions slightly offset
from the position at which the image corresponding to the green
color component is formed, as shown in FIGS. 18B and 18C.
[0016] If the photographic lens is free of any magnification
chromatic aberration and color information corresponding to the
individual pixels is provided as indicated by the ● marks in FIG.
19A (the image data undergoing the interpolation processing manifest
marked similarity along the horizontal direction, the color
information corresponding to the green color component indicates a
constant value, and the values indicated by the color information
corresponding to the red color component and the color information
corresponding to the blue color component both change gently in the
vicinity of the interpolation target pixel (the pixel at which A5 is
present)), the value of the correctional term in formula 1 is 0 and,
as a result, the average of G4 and G6 (the primary term) is directly
used as the green color interpolation value G5 without correction.
[0017] However, when A3, A5 and A7 each represent color information
corresponding to the red color component and each set of color
information corresponding to the red color component is offset by
one pixel to the right due to a magnification chromatic aberration
at the photographic lens, the color information from the individual
pixels undergoes a change as shown in FIG. 19B. Consequently, the
value of the correctional term in formula 1 is not 0 and the
primary term is over-corrected (hereafter referred to as an
"over-correction") in such a case, resulting in the green color
interpolation value G5 that should be similar to the values
indicated by G4 and G6 becoming larger than the G4 and G6 values
(hereafter this phenomenon is referred to as an "overshoot"). If,
on the other hand, A3, A5 and A7 each represent color information
corresponding to the blue color component and each set of color
information corresponding to the blue color component is offset by
one pixel to the left due to a magnification chromatic aberration,
the color information from the individual pixels undergoes a change
as shown in FIG. 19C. Thus, the value of the correctional term in
formula 1 is not 0, resulting in the green color interpolation
value G5 that should be similar to the G4 and G6 values becoming
smaller than those corresponding to G4 and G6 (hereafter this
phenomenon is referred to as an "undershoot") through an
over-correction.
[0018] In other words, the art disclosed in U.S. Pat. No. 5,629,734
poses a problem in that color artifacts occur in the color image
obtained through the interpolation processing due to a
magnification chromatic aberration.
[0019] An over-correction also occurs at a color boundary where the
color difference changes, as well as when there is a magnification
chromatic aberration. For instance, when the color information
corresponding to the individual pixels is provided as indicated by
the ● marks in FIGS. 20A and 20B (i.e., when the color information
corresponding to the green color component is constant and the
values indicated by the color information corresponding to the red
color component or the blue color component change drastically near
the interpolation target pixel (the pixel at which A5 is present)),
the value of the correctional term in formula 1 is not 0, and an
overshoot or an undershoot occurs due to an over-correction with
regard to the green color interpolation value G5, which should be
similar to the values indicated by G4 and G6.
[0020] Thus, at a color boundary where the color difference
changes, a color artifact occurs as a result of interpolation
processing even if there is no magnification chromatic aberration.
It is to be noted that such a color artifact as that described
above may occur when calculating a red color interpolation value or
a blue color interpolation value as well as when calculating a
green color interpolation value.
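The over-correction described above can be illustrated numerically with formula 1. The red-component profile below contains a gentle edge and the green component is constant; all values are hypothetical and chosen only to make the effect visible, and which shift direction produces an overshoot versus an undershoot depends on the orientation of the edge in the actual image.

```python
# Numeric illustration of over-correction under a one-pixel
# displacement of the red component (cf. FIGS. 19A-19C).

def g5(g4, g6, a3, a5, a7):
    # formula 1 of U.S. Pat. No. 5,629,734 (marked horizontal similarity)
    return (g4 + g6) / 2 + (-a3 + 2 * a5 - a7) / 4

# One image row of red values containing a gentle edge (hypothetical).
red = [10, 10, 10, 10, 10, 30, 50, 50, 50]
green = 100  # constant green level

# No aberration: A3, A5, A7 sample positions 3, 5, 7; the correctional
# term cancels and G5 equals the green average.
aligned = g5(green, green, red[3], red[5], red[7])

# Red image displaced one pixel: the samples straddle the edge
# asymmetrically, the correctional term no longer cancels, and G5
# departs from the constant green level in one direction or the other.
shift_one_way = g5(green, green, red[2], red[4], red[6])
shift_other_way = g5(green, green, red[4], red[6], red[8])
```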
SUMMARY OF THE INVENTION
[0021] An object of the present invention is to provide an
interpolation processing apparatus capable of preventing occurrence
of color artifacts and a recording medium having recorded therein
an interpolation processing program with which occurrence of color
artifacts can be prevented.
[0022] More specifically, an object of the present invention is to
suppress the occurrence of color artifacts by reducing the problems
of the prior art while retaining the advantages of the
interpolation processing in the prior art and reducing the degree
of the adverse effect of magnification chromatic aberration.
[0023] In order to achieve the object described above, a first
interpolation processing apparatus according to the present
invention that engages in processing on image data which are
provided in a colorimetric system constituted of first through nth
(n≥2) color components and include color information
corresponding to a single color component provided at each pixel to
determine an interpolation value equivalent to color information
corresponding to the first color component for a pixel at which the
first color component is missing, comprises: an interpolation value
calculation section that uses color information at pixels located
in a local area containing an interpolation target pixel to undergo
interpolation processing to calculate an interpolation value
including, at least (1) local average information of the first
color component with regard to the interpolation target pixel and
(2) local curvature information corresponding to at least two color
components with regard to the interpolation target pixel.
[0024] Namely, the first interpolation processing apparatus
calculates an interpolation value by correcting the "local average
information of the first color component with regard to the
interpolation target pixel" with the "local curvature information
corresponding to at least two color components with regard to the
interpolation target pixel." It is to be noted that in the
explanation of the first interpolation processing apparatus, the
"local average information of the first color component with regard
to the interpolation target pixel" may be the average of the values
indicated by the color information corresponding to the first color
component present in the local area containing the interpolation
target pixel or a value within the range of the values indicated by
the color information corresponding to the first color component in
the local area containing the interpolation target pixel. In
addition, the "local curvature information corresponding to at
least two color components with regard to the interpolation target
pixel" refers to information that indicates how the color
information corresponding to at least two color components in the
local area containing the interpolation target pixel changes. In
other words, the local curvature information corresponding to a
given color component is information that indicates the degree of
change in the rate of change occurring with regard to the color
component in the local area, and when the values corresponding to
each color component are plotted and rendered as a curve (or a
polygonal line), the information indicates the curvature and the
degree of change in the curvature (this definition applies in the
subsequent description). The information, which may be obtained by
calculating a quadratic differential or a higher differential of
the color component, indicates a value reflecting structural
information with regard to fluctuations in the values corresponding
to the color component. When the information is rendered in a
polygonal line, it indicates changes in the inclinations of the
individual line segments.
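Per the definition above, local curvature information for one color component may be sketched as a second difference. The three-sample form and the 1/4 scaling below mirror the correctional term (-A3+2A5-A7)/4 discussed earlier; the sampling stride is otherwise an illustrative assumption.

```python
# Local curvature information of a color component as a second
# difference (illustrative sampling and scaling).

def local_curvature(left, center, right):
    """Zero when the color component changes linearly across the
    local area; non-zero where the rate of change itself changes
    (a bend in the plotted polygonal line)."""
    return (2 * center - left - right) / 4
```

For a linear ramp such as (10, 20, 30) this yields 0, while a bend such as (10, 30, 30) yields 5, reflecting the change in the inclinations of the two line segments.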
[0025] In a second interpolation processing apparatus, based upon
the first interpolation processing apparatus, the interpolation
value calculation section calculates, as the local curvature
information corresponding to at least two color components, (1)
local curvature information based upon a color component matching a
color component at the interpolation target pixel and (2) local
curvature information based upon a color component other than the
color component at the interpolation target pixel.
[0026] Namely, in the second interpolation processing apparatus,
the interpolation value is calculated by correcting the "local
average information of the first color component with regard to the
interpolation target pixel" with the "local curvature information
based upon a color component matching the color component at the
interpolation target pixel" and the "local curvature information
based upon a color component other than the color component at the
interpolation target pixel."
[0027] It is to be noted that in the explanation of the second
interpolation processing apparatus, the "local curvature
information based upon a color component matching the color
component at the interpolation target pixel (or a color component
other than the color component at the interpolation target pixel)"
refers to information that indicates how the color information
corresponding to the color component matching the color component
at the interpolation target pixel (or a color component other than
the color component at the interpolation target pixel) in the local
area containing the interpolation target pixel changes and is
represented as a value reflecting the structural information
regarding the fluctuations obtained through calculation of a
quadratic differential or a higher differential of the color
component.
[0028] A third interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first through nth (n≥2) color components
and include color information corresponding to a single color
component provided at each pixel to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, comprises: an interpolation value calculation section that
uses color information at pixels located in a local area containing
an interpolation target pixel to undergo interpolation processing
to calculate an interpolation value including at least: (1) local
average information of the first color component with regard to the
interpolation target pixel and (2) local curvature information
based upon a color component other than a color component at the
interpolation target pixel.
[0029] Namely, in the third interpolation processing apparatus, the
interpolation value is calculated by correcting the "local average
information of the first color component with regard to the
interpolation target pixel" with the "local curvature information
based upon a color component other than the color component at the
interpolation target pixel." It is to be noted that in the
explanation of the third interpolation processing apparatus, the
"local curvature information based upon a color component other
than the color component at the interpolation target pixel" refers
to information that indicates how the color information
corresponding to the color component other than the color component
at the interpolation target pixel in the local area containing the
interpolation target pixel changes and is represented as a value
reflecting the structural information regarding the fluctuations
obtained through calculation of a quadratic differential or a
higher differential of the color component.
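As a concrete sketch of this "local average corrected by local curvature" pattern, consider interpolating the first color component (say, green) at a pixel carrying another color component (say, red) in a Bayer-type array. The coefficients below follow common demosaicing practice and are assumptions, not values recited in the claims:

```python
def interpolate_with_curvature(g_above, g_below, r_center, r_above, r_below):
    """Local average of the first color component corrected by local
    curvature information based upon a color component other than the
    one at the interpolation target pixel."""
    local_average = (g_above + g_below) / 2
    curvature_correction = (2 * r_center - r_above - r_below) / 4
    return local_average + curvature_correction
```

When the red samples change linearly the correction vanishes and the result is the plain green average; where red bends, the green estimate is pulled in the same direction.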
[0030] A fourth interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first.about.nth (n.gtoreq.2) color components
and include color information corresponding to a single color
component provided at each pixel to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, comprises: an interpolation value calculation section that
uses color information at pixels located in a local area containing
an interpolation target pixel to undergo interpolation processing
to calculate an interpolation value including at least: (1) local
average information of the first color component with regard to the
interpolation target pixel and (2) local curvature information
corresponding to the first color component with respect to the
interpolation target pixel.
[0031] Namely, in the fourth interpolation processing apparatus,
the interpolation value is calculated by correcting the "local
average information of the first color component with regard to the
interpolation target pixel" with the "local curvature information
corresponding to the first color component with regard to the
interpolation target pixel."
[0032] It is to be noted that in the explanation of the fourth
interpolation processing apparatus, the "local curvature
information corresponding to the first color component with respect
to the interpolation target pixel" refers to information that
indicates how the color information corresponding to the first
color component in the local area containing the interpolation
target pixel changes and is represented as a value reflecting the
structural information regarding the fluctuations obtained through
calculation of a quadratic differential or a higher differential of
the color component.
[0033] A fifth interpolation processing apparatus achieves that in
the first through third interpolation processing apparatuses, a first
similarity judgment section that judges degrees of similarity to
the interpolation target pixel along at least two directions in
which pixels with color information corresponding to the first
color component are connected with the interpolation target pixel;
and a second similarity judgment section that judges degrees of
similarity to the interpolation target pixel along at least two
directions other than the directions in which the degrees of
similarity are judged by the first similarity judgment section, are
further provided, and: the interpolation value calculation section
selects a direction along which pixels having color information to
be used to calculate the local average information of the first
color component are set based upon results of a judgment made by
the first similarity judgment section; (1) the interpolation value
calculation section selects a direction along which pixels having
color information to be used to calculate the local curvature
information are set based upon results of the judgment made by the
first similarity judgment section if the local curvature
information is "local curvature information constituted of a single
color component and manifesting directionality along a direction in
which degrees of similarity are judged by the first similarity
judgment section"; and (2) the interpolation value calculation
section selects a direction along which pixels having color
information to be used to calculate the local curvature information
are set based upon results of a judgment made by the second
similarity judgment section if the local curvature information is
"local curvature information constituted of a single color
component and manifesting directionality along a direction in which
degrees of similarity are judged by the second similarity judgment
section."
[0034] Namely, in the fifth interpolation processing apparatus, any
of the "local curvature information corresponding to at least two
color components with regard to the interpolation target pixel" in
the first interpolation processing apparatus and the "local
curvature information based upon a color component matching the
color component at the interpolation target pixel" and the "local
curvature information based upon a color component other than the
color component at the interpolation target pixel" in the second or the
third interpolation processing apparatus that constitutes "local
curvature information constituted of a single color component and
manifesting directionality along a direction in which degrees of
similarity are judged by the first similarity judgment section" is
calculated by using color information at pixels present along a
direction selected based upon the results of the judgment made by
the first similarity judgment section, whereas any of the
information listed above that constitutes "local curvature
information constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the second similarity judgment section" is calculated by
using color information at pixels present along a direction
selected based upon the results of the judgment made by the second
similarity judgment section.
[0035] It is to be noted that the details of the directions along
which the various types of local curvature information manifest
directionality are to be defined in the "Best Mode For Carrying Out
The Invention".
[0036] As described above, the color information used to calculate
local curvature information can be selected in correspondence to
degrees of similarity to the interpolation target pixel in the
fifth interpolation processing apparatus. In addition, the color
information used to calculate the local average information of the
first color component, too, can be selected in correspondence to
the degrees of similarity to the interpolation target pixel.
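The direction selection described above might be sketched as follows, assuming the common convention that a smaller similarity degree value indicates stronger similarity (both the convention and the threshold value are assumptions):

```python
def select_direction(degree_vertical, degree_horizontal, threshold=1.0):
    """Judge similarity along two directions from their similarity
    degrees; report both directions when the degrees are roughly
    equal, otherwise the direction with the stronger similarity."""
    if abs(degree_vertical - degree_horizontal) < threshold:
        return "both"
    return "vertical" if degree_vertical < degree_horizontal else "horizontal"
```

The same pattern applies to the second similarity judgment section with the two diagonal directions substituted for vertical and horizontal.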
[0037] A sixth interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first.about.nth (n.gtoreq.2) color components
and include color information corresponding to a single color
component provided at each pixel to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, comprises: an interpolation value calculation section that
calculates an interpolation value including at least two terms,
i.e., a first term and a second term by using color information at
pixels set in a local area containing an interpolation target pixel
to undergo interpolation processing; a first similarity judgement
section that judges degrees of similarity to the interpolation
target pixel along at least two directions in which pixels having
color information corresponding to the first color component are
connected to the interpolation target pixel; and a second
similarity judgment section that judges degrees of similarity to
the interpolation target pixel along at least two directions other
than the directions in which the degrees of similarity are judged
by the first similarity judgment section, wherein: the
interpolation value calculation section selects a direction along
which pixels having color information to be used to calculate the
first term are set based upon results of a judgment made by the
first similarity judgment section and selects a direction along
which pixels having color information to be used to calculate the
second term are set based upon results of a judgment made by the
second similarity judgment section.
[0038] Namely, in the sixth interpolation processing apparatus, in
which the directions in which similarity is judged by the second
similarity judgment section are different from the directions along
which similarity is judged by the first similarity judgment
section, the processing can be performed by using color information
from pixels set along more directions including the direction along
which color information used to calculate the second term is
provided as well as the direction along which color information
used to calculate the first term is provided.
[0039] Thus, the interpolation value can be calculated by using
color information at pixels located along a finely differentiated
plurality of directions in the sixth interpolation processing
apparatus. In addition, the first term and the second term can be
calculated by using color information at pixels set along the
direction in which a high degree of similarity is manifested or
through weighted synthesis of color information from pixels located
along a plurality of directions, which is performed in
correspondence to varying degrees of similarity, in the sixth
interpolation processing apparatus.
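The weighted synthesis mentioned above can be sketched as follows; the inverse-degree weighting is one common choice and is an assumption here, with smaller similarity degrees again taken to mean stronger similarity:

```python
def weighted_synthesis(value_v, value_h, degree_v, degree_h, eps=1e-6):
    """Blend color information taken along two directions, weighting
    each direction inversely to its similarity degree so that the
    direction manifesting the higher degree of similarity dominates."""
    w_v = 1.0 / (degree_v + eps)
    w_h = 1.0 / (degree_h + eps)
    return (w_v * value_v + w_h * value_h) / (w_v + w_h)
```

With equal degrees the result is the plain average; with a much smaller degree along one direction, that direction's value dominates.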
[0040] A seventh interpolation processing apparatus achieves that
in the sixth interpolation processing apparatus the interpolation
value calculation section: calculates a term containing (a) local
average information of the first color component with regard to the
interpolation target pixel and (b) local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the first similarity judgment section, as the first term;
and calculates a term containing local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the second similarity judgment section, as the second
term.
[0041] Namely, in the seventh interpolation processing apparatus,
the interpolation value is calculated by correcting the "local
average information of the first color component with regard to the
interpolation target pixel" with the "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the first similarity judgment section" and the "local
curvature information constituted of a single color component and
manifesting directionality along a direction in which degrees of
similarity are judged by the second similarity judgment
section."
[0042] It is to be noted that the details of the directions along
which the various types of local curvature information manifest
directionality are to be defined in the "Best Mode For Carrying Out
The Invention".
[0043] An eighth interpolation processing apparatus achieves that
in the fifth or seventh interpolation processing apparatus: when
image data are provided in a colorimetric system constituted of
first.about.third color components with the first color component
achieving a higher spatial frequency than the second color
component and the third color component, the first color component
set in a checker-board pattern, the second color component and the
third color component each set in a line sequence between pixels at
which color information corresponding to the first color component
is present, and information corresponding to the second color
component being present at the interpolation target pixel; the first
similarity judgment section calculates similarity degrees
manifested by the interpolation target pixel along two directions,
i.e., a vertical direction and a horizontal direction, in which
pixels with color information corresponding to the first color
component that are closest to the interpolation target pixel are
connected to the interpolation target pixel and makes a judgment
with regard to degrees of similarity manifested by the
interpolation target pixel along the vertical direction and the
horizontal direction based upon a difference between the similarity
degrees; the second similarity judgment section calculates
similarity degrees manifested by the interpolation target pixel
along two diagonal directions in which pixels with color
information corresponding to the third color component that are
closest to the interpolation target pixel are connected to the
interpolation target pixel and makes a judgment with regard to
degrees of similarity manifested by the interpolation target pixel
along the two diagonal directions based upon a difference between
the similarity degrees; and the interpolation value calculation
section selects at least either the second color component or the
first color component based upon which the "local curvature
information constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the first similarity judgment section" is provided and
selects at least either the second color component or the third
color component based upon which the "local curvature information
constituted of a single color component and manifesting
directionality along a direction in which degrees of similarity are
judged by the second similarity judgment section" is provided.
[0044] The details of the directions along which the various types
of local curvature information manifest directionality are to be
defined in the "Best Mode For Carrying Out The Invention".
[0045] In the eighth interpolation processing apparatus, the color
component manifesting directionality along the two directions,
i.e., the vertical direction and the horizontal direction in which
degrees of similarity are judged by the first similarity judgment
section, includes the second color component and the first color
component, and the color components manifesting directionality along
the two diagonal directions in which degrees of similarity are
judged by the second similarity judgment section include the
second color component and the third color component.
[0046] Thus, the "local curvature information constituted of a
single color component and manifesting directionality along a
direction in which degrees of similarity are judged by the first
similarity judgment section" in the fifth or seventh interpolation
processing apparatus is calculated with respect to at least either
the second color component or the first color component based upon
the results of the judgment made by the first similarity judgment
section and the "local curvature information constituted of a
single color component and manifesting directionality along a
direction in which degrees of similarity are judged by the second
similarity judgment section" is calculated with respect to at least
either the second color component or the third color component
based upon the results of the judgment made by the second
similarity judgment section.
[0047] In addition, in the eighth interpolation processing
apparatus, the similarity manifesting along the diagonal directions
is reflected with a high degree of reliability when calculating the
"local curvature information based upon a color component achieving
similarity along a direction in which degrees of similarity are
judged by the second similarity judgment section."
[0048] A ninth interpolation processing apparatus achieves that in
the eighth interpolation processing apparatus: when the local
curvature information is "local curvature information based upon a
color component other than the color component at the interpolation
target pixel", the interpolation value calculation section selects
the first color component or the third color component to which the
local curvature information is to correspond in conformance to the
degrees of similarity judged by the second similarity judgment
section.
[0049] In the ninth interpolation processing apparatus, the third
color component is present at pixels adjacent to the interpolation
target pixel along the two diagonal directions and the similarity
judged by the second similarity judgment section is the similarity
manifested by the interpolation target pixel along the two diagonal
directions.
[0050] In other words, in the ninth interpolation processing
apparatus, the similarity along the diagonal directions can be
reflected in the "local curvature information based upon a color
component other than the color component at the interpolation
target pixel" by switching the "local curvature information based
upon a color component other than the color component at the
interpolation target pixel" to correspond to the first color
component or the third color component.
[0051] A tenth interpolation processing apparatus achieves that in
the ninth interpolation processing apparatus: the interpolation
value calculation section calculates local curvature information
based upon the first color component if the second similarity
judgment section judges that roughly equal degrees of similarity
manifest along the two diagonal directions and calculates local
curvature information based upon the third color component if the
second similarity judgment section judges that a higher degree of
similarity manifests along one of the two diagonal directions
compared to the other diagonal direction.
[0052] Namely, in the 10th interpolation processing apparatus, the
similarity manifesting along the diagonal directions is reflected
with a high degree of reliability when calculating the "local
curvature information based upon a color component other than the
color component at the interpolation target pixel."
[0053] An 11th interpolation processing apparatus achieves that in
the eighth interpolation processing apparatus: the first similarity
judgment section judges that roughly equal degrees of similarity
manifest along the vertical direction and the horizontal direction
if a difference between the similarity degrees along the vertical
direction and the horizontal direction is smaller than a specific
threshold value; and the second similarity judgment section judges
that roughly equal degrees of similarity manifest along the two
diagonal directions if a difference between the similarity degrees
along the two diagonal directions is smaller than a specific
threshold value.
[0054] As a result, the adverse effect of noise can be reduced in
the judgement of the similarity along the two directions, i.e., the
vertical direction and the horizontal direction and the similarity
manifesting along the two diagonal directions in the 11th
interpolation processing apparatus.
[0055] A 12th interpolation processing apparatus achieves that in
the eighth interpolation processing apparatus: the first similarity
judgment section calculates the similarity degrees along the
vertical direction and the horizontal direction by using color
information corresponding to a plurality of color components for a
single interpolation target pixel; and the second similarity
judgment section calculates the similarity degrees along the two
diagonal directions by using color information corresponding to a
plurality of color components for a single interpolation target
pixel.
[0056] In other words, in the 12th interpolation processing
apparatus, color information corresponding to a plurality of color
components is reflected in the judgement of the similarity
manifesting along the vertical and horizontal directions and the
similarity manifesting along the two diagonal directions.
[0057] A 13th interpolation processing apparatus achieves that in
the twelfth interpolation processing apparatus: the second
similarity judgment section calculates a similarity degree
manifesting along each of the two diagonal directions through
weighted addition of: (1) a similarity degree component constituted
of color information corresponding to the first color component
alone; (2) a similarity degree component constituted of color
information corresponding to the second color component alone; (3)
a similarity degree component constituted of color information
corresponding to the third color component alone; and (4) a
similarity degree component constituted of color information
corresponding to the second color component and the third color
component.
[0058] As a result, color information corresponding to a plurality
of color components is reflected with a high degree of reliability
in the judgement of the similarity manifesting along the two
diagonal directions in the 13th interpolation processing
apparatus.
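The weighted addition of the four similarity degree components might be sketched as below; the equal weights are placeholders for illustration, not values from the application:

```python
def diagonal_similarity(comp_first, comp_second, comp_third, comp_mixed,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    """Similarity degree along one diagonal direction as a weighted
    addition of four components: three same-color difference terms
    (first, second, third color components alone) and one term mixing
    the second and third color components."""
    components = (comp_first, comp_second, comp_third, comp_mixed)
    return sum(w * c for w, c in zip(weights, components))
```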
[0059] A 14th interpolation processing apparatus achieves that in
the eighth interpolation processing apparatus: the first similarity
judgment section calculates
similarity degrees along the vertical direction and the horizontal
direction for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the vertical
direction and the horizontal direction based upon differences in
similarity degrees manifesting at nearby pixels as well as at the
interpolation target pixel; and the second similarity judgment
section calculates similarity degrees along the two diagonal
directions for each pixel and makes a judgment on similarity
manifested by the interpolation target pixel along the two diagonal
directions based upon differences in similarity degrees manifesting
at nearby pixels as well as at the interpolation target pixel.
[0060] Namely, in the 14th interpolation processing apparatus, the
continuity with the nearby pixels is reflected in the judgement of
the similarity manifesting along the vertical and horizontal
directions and the similarity manifesting along the two diagonal
directions.
[0061] A 15th interpolation processing apparatus that engages in
processing on image data which are provided in a colorimetric
system constituted of first.about.nth (n.gtoreq.2) color components
and include color information corresponding to a single color
component provided at each pixel to determine an interpolation
value equivalent to color information corresponding to the first
color component for a pixel at which the first color component is
missing, comprises: a first term calculation section that
calculates a first term representing average information of the
first color component with regard to an interpolation target pixel
to undergo interpolation processing by using color information
corresponding to color components at pixels set in a local area
containing the interpolation target pixel; a second term
calculation section that calculates a second term representing
local curvature information based upon a color component matching
the color component at the interpolation target pixel with regard
to the interpolation target pixel by using color information
corresponding to color components at pixels set in a local area
containing the interpolation target pixel; and an interpolation
value calculation section that calculates an interpolation value by
adding the second term multiplied by a weighting coefficient
constituted of color information corresponding to a plurality of
color components at pixels in the local area containing the
interpolation target pixel to the first term.
[0062] In other words, in the 15th interpolation processing
apparatus, the interpolation value is calculated by correcting the
"average information of the first color component with regard to
the interpolation target pixel" with the "local curvature
information based upon a color component matching the color
component at the interpolation target pixel with regard to the
interpolation target pixel" multiplied by a weighting coefficient
constituted of color information corresponding to a plurality of
color components present at pixels within a local area containing
the interpolation target pixel.
[0063] A 16th interpolation processing apparatus achieves that in
the 15th interpolation processing apparatus: the interpolation
value calculation section uses color information corresponding to a
plurality of color components provided at the interpolation target
pixel and at a plurality of pixels set along a predetermined
direction relative to the interpolation target pixel to ascertain
inclinations manifesting in color information corresponding to the
individual color components along the direction and calculates the
weighting coefficient in conformance to a correlation manifesting
among the inclinations in the color information corresponding to
the individual color components.
[0064] As a result, in the 16th interpolation processing apparatus,
in which the "average information of the first color component with
regard to the interpolation target pixel" is corrected with the
"local curvature information based upon a color component matching
the color component at the interpolation target pixel with regard
to the interpolation target pixel" multiplied by the weighting
coefficient, the weighting coefficient is calculated in
conformance to the correlation among the inclinations of the color
information corresponding to the different color components in the
local area containing the interpolation target pixel.
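One way to map the correlation among inclinations to a weighting coefficient is sketched below; the exact mapping is an illustrative assumption, not the claimed formula:

```python
def curvature_weight(inclination_a, inclination_b):
    """Weighting coefficient from the correlation between the signed
    inclinations of two color components along one direction: 1.0 when
    the inclinations agree in sign, 0.0 when they oppose."""
    if inclination_a == 0 or inclination_b == 0:
        return 0.5  # no usable correlation information
    sign_agreement = (inclination_a * inclination_b) / (
        abs(inclination_a) * abs(inclination_b))  # evaluates to +1.0 or -1.0
    return (1.0 + sign_agreement) / 2.0
```

The curvature correction is then applied at full strength only where the color components rise and fall together, suppressing it across color boundaries.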
[0065] A 17th interpolation processing apparatus that implements
processing for supplementing a color component value at a pixel at
which information corresponding to a color component is missing in
image data provided in a colorimetric system constituted of a
luminance component and the color component, with the luminance
component having a higher spatial frequency than the color
component and the luminance component present both at pixels having
information corresponding to the color component and at pixels
lacking information corresponding to the color component,
comprises: a hue value calculation section that calculates hue
values at a plurality of pixels located near an interpolation
target pixel to undergo interpolation processing and having both
the luminance component and the color component by using luminance
component values and color component values at the individual
pixels; a hue value interpolation section that calculates a hue
value at the interpolation target pixel by using a median of the
hue values at the plurality of pixels calculated by the hue value
calculation section; and a color conversion section that
interpolates a color component at the interpolation target pixel by
using the luminance component at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
by the hue value interpolation section to a color component.
[0066] Namely, in the 17th interpolation processing apparatus, the
hue value of the interpolation target pixel is calculated by using
the median of the hue values of a plurality of pixels located near
the interpolation target pixel.
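A minimal sketch of this median-based hue interpolation, taking hue as the color difference C − G (one common definition; the application may define hue differently, for example as a ratio):

```python
import statistics

def interpolate_via_median_hue(neighbor_green, neighbor_color, target_green):
    """Compute the hue (color difference) at each nearby pixel having
    both components, take the median of those hue values as the target
    pixel's hue, and convert it back to a color component value using
    the target pixel's luminance (green) value."""
    hues = [c - g for c, g in zip(neighbor_color, neighbor_green)]
    return target_green + statistics.median(hues)
```

Because the median rather than the mean is used, a single outlying neighbor (e.g. across an edge) does not contaminate the interpolated hue.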
[0067] An 18th interpolation processing apparatus that implements
processing for supplementing a luminance component at a pixel at
which information corresponding to a luminance component is missing
and supplementing a color component at a pixel at which information
corresponding to a color component is missing, on image data
provided in a colorimetric system constituted of the luminance
component and the color component, with the luminance component
having a higher spatial frequency than the color component and a
given pixel having only information corresponding to either the
luminance component or the color component, comprises: a luminance
component interpolation section that interpolates a luminance
component at a luminance component interpolation target pixel to
undergo luminance component interpolation processing by using at
least either "similarity manifesting between the luminance
component interpolation target pixel and a pixel near the luminance
component interpolation target pixel" or "a plurality of color
components within a local area containing the luminance component
interpolation target pixel"; a hue value calculation section that
calculates hue values at a plurality of pixels located near an
interpolation target pixel to undergo color component interpolation
processing, having color component values and having luminance
component values interpolated by the luminance component
interpolation section, by using the luminance component values and
color component values at the individual pixels; a hue value
interpolation section that calculates a hue value for the
interpolation target pixel by using a median of the hue values at
the plurality of pixels calculated by the hue value calculation
section; and a color conversion section that interpolates a color
component value for the interpolation target pixel by using the
luminance component value at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
by the hue value interpolation section to a color component
value.
[0068] Namely, in the 18th interpolation processing apparatus, the
hue value of the interpolation target pixel is calculated by using
the median of the hue values of a plurality of pixels located near
the interpolation target pixel.
[0069] A 19th interpolation processing apparatus achieves that in
the 17th or 18th interpolation processing apparatus: when the
luminance component in the image data corresponds to a green color
component and the color component in the image data corresponds to
a red color component and a blue color component, the hue value
interpolation section calculates a hue value for the interpolation
target pixel by using a median of hue values containing the red
color component at pixels near the interpolation target pixel if
the green color component is present but the red color component is
missing at the interpolation target pixel and calculates a hue
value for the interpolation target pixel by using a median of hue
values containing the blue color component at pixels near the
interpolation target pixel if the green color component is present
but the blue color component is missing at the interpolation target
pixel.
[0070] In other words, in the 19th interpolation processing
apparatus, the hue value of the interpolation target pixel at which
the green color component is present but the red color component is
missing is calculated by using the median of the hue values
containing the red color component from pixels located near the
interpolation target pixel, whereas the hue value of the
interpolation target pixel at which the green color component is
present but the blue color component is missing is calculated by
using the median of the hue values containing the blue color
component from pixels located near the interpolation target
pixel.
[0071] A 20th interpolation processing apparatus achieves that in
the 17th or 18th interpolation processing apparatus: when the
luminance component in the image data corresponds to a green color
component and the color component in the image data corresponds to
a red color component and a blue color component, the hue value
interpolation section calculates a hue value for the interpolation
target pixel by using a median of hue values containing the red
color component at pixels set near the interpolation target pixel
if the blue color component is present but the red color component
is missing at the interpolation target pixel.
[0072] Namely, in the 20th interpolation processing apparatus, the
hue value of the interpolation target pixel at which the blue color
component is present but the red color component is missing is
calculated by using the median of the hue values containing the red
color component from pixels located near the interpolation target
pixel.
[0073] A 21st interpolation processing apparatus achieves that in
the 17th or 18th interpolation processing apparatus: when the
luminance component in the image data corresponds to a green color
component and the color component in the image data corresponds to
a red color component and a blue color component, the hue value
interpolation section calculates a hue value for the interpolation
target pixel by using a median of hue values containing the blue
color component at pixels set near the interpolation target pixel
if the red color component is present but the blue color component
is missing at the interpolation target pixel.
[0074] Namely, in the 21st interpolation processing apparatus, the
hue value of the interpolation target pixel at which the red color
component is present but the blue color component is missing is
calculated by using the median of the hue values containing the
blue color component from pixels located near the interpolation
target pixel.
[0075] A 22nd interpolation processing apparatus achieves that in
any one of the 17th through 21st interpolation processing
apparatuses: a color component is missing at the interpolation
target pixel, which is present at only one pixel among four pixels
set symmetrically along the vertical direction and the horizontal
direction, and the
hue value interpolation section comprises: a first hue value
interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of diagonally adjacent pixels if the hue values of the
plurality of diagonally adjacent pixels adjacent to the
interpolation target pixel along diagonal directions have been
calculated by the hue value calculation section; and a second hue
value interpolation unit that calculates a hue value for the
interpolation target pixel by using a median of hue values at a
plurality of vertically and horizontally adjacent pixels if the hue
values of the plurality of vertically and horizontally adjacent
pixels adjacent to the interpolation target pixel in the vertical
direction and the horizontal direction have been calculated by the
hue value calculation section or the first hue value interpolation
unit.
[0076] In other words, in the 22nd interpolation processing
apparatus, if the hue values at pixels adjacent along the diagonal
directions are already calculated, the hue value of the
interpolation target pixel is calculated by using the median of the
hue values at the diagonally adjacent pixels, whereas if the hue
values at pixels adjacent in the vertical and horizontal directions
are already calculated, the hue value of the interpolation target
pixel is calculated by using the median of the hue values at the
vertically and horizontally adjacent pixels.
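By way of a non-limiting illustration, the two-unit scheme above may be sketched in Python as follows. The function and data layout are hypothetical (the apparatus does not prescribe any particular representation); `hue` is a 2-D list in which missing entries are simply not read, and the two stages correspond to the first and second hue value interpolation units:

```python
import statistics

def interpolate_hue(hue, i, j, stage):
    """Fill in a missing hue value at pixel (i, j) from already
    calculated neighbors.

    stage 1: median of the four diagonally adjacent hue values
             (first hue value interpolation unit).
    stage 2: median of the four vertically/horizontally adjacent
             hue values, available once stage 1 has been run
             (second hue value interpolation unit).
    """
    if stage == 1:
        neighbors = [hue[i - 1][j - 1], hue[i - 1][j + 1],
                     hue[i + 1][j - 1], hue[i + 1][j + 1]]
    else:
        neighbors = [hue[i][j - 1], hue[i][j + 1],
                     hue[i - 1][j], hue[i + 1][j]]
    return statistics.median(neighbors)
```

With an even number of neighbors, `statistics.median` returns the mean of the two middle values, which matches the usual behavior of a four-point median filter.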
[0077] A first recording medium has an interpolation processing
program recorded therein to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first.about.nth (n.gtoreq.2) color components with color
information corresponding to a single color component present at
each pixel. The interpolation processing program comprises: an
interpolation value calculation step in which an interpolation
value including, at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo interpolation processing and (2) local curvature
information corresponding to at least two color components with
regard to the interpolation target pixel, is calculated by using
color information provided at pixels set within a local area
containing the interpolation target pixel.
[0078] Namely, through the interpolation processing program
recorded at the first recording medium, the interpolation value is
calculated by correcting the "local average information of the
first color component with regard to the interpolation target
pixel" with the "local curvature information corresponding to at
least two color components with regard to the interpolation target
pixel."
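This average-plus-curvature correction can be illustrated with a minimal one-dimensional Python sketch. The formula below is an assumption for illustration only (the recording medium does not fix the exact expression): the green interpolation value is taken as the average of the two vertical green neighbors, corrected by a second-difference (curvature) term measured on the color component `Z` sampled at the interpolation target pixel:

```python
def g_interp_vertical(G, Z, i, j):
    """Hypothetical sketch: local average of the first (green)
    color component corrected with local curvature information.
    G and Z are dicts keyed by (i, j) pixel coordinates."""
    # Local average information of the first color component.
    local_average = (G[(i, j - 1)] + G[(i, j + 1)]) / 2
    # Local curvature information: second difference of the color
    # component present at the interpolation target pixel.
    local_curvature = (2 * Z[(i, j)] - Z[(i, j - 2)] - Z[(i, j + 2)]) / 4
    return local_average + local_curvature
```

When the co-sited color component varies linearly, the curvature term vanishes and the result reduces to the plain local average.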
[0079] A second recording medium has an interpolation processing
program recorded therein to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first.about.nth (n.gtoreq.2) color components with color
information corresponding to a single color component present at
each pixel. The interpolation processing program comprises: an
interpolation value calculation step in which an interpolation
value including, at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo the interpolation processing; and (2) local curvature
information based upon a color component other than a color
component at the interpolation target pixel, is calculated by using
color information provided at pixels set within a local area
containing the interpolation target pixel.
[0080] Namely, through the interpolation processing program
recorded at the second recording medium, the interpolation value is
calculated by correcting the "local average information of the
first color component with regard to the interpolation target
pixel" with the "local curvature information based upon a color
component other than the color component at the interpolation
target pixel."
[0081] A third recording medium has an interpolation processing
program recorded therein to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first.about.nth (n.gtoreq.2) color components with color
information corresponding to a single color component present at
each pixel. The interpolation processing program comprises: an
interpolation value calculation step in which an interpolation
value including, at least (1) local average information of the
first color component with regard to an interpolation target pixel
to undergo the interpolation processing, and (2) local curvature
information corresponding to the first color component with respect
to the interpolation target pixel, is calculated by using color
information provided at pixels set within a local area containing
the interpolation target pixel.
[0082] Namely, through the interpolation processing program
recorded at the third recording medium, the interpolation value is
calculated by correcting the "local average information of the
first color component with regard to the interpolation target
pixel" with the "local curvature information corresponding to the
first color component with respect to the interpolation target
pixel."
[0083] A fourth recording medium has an interpolation processing
program recorded therein to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first.about.nth (n.gtoreq.2) color components with color
information corresponding to a single color component present at
each pixel. The interpolation processing program comprises: an
interpolation value calculation step in which an interpolation
value including at least two terms, i.e., a first term and a second
term is calculated by using color information at pixels set within
a local area containing an interpolation target pixel to undergo
interpolation processing; a first similarity judgment step in which
degrees of similarity to the interpolation target pixel are judged
along at least two directions in which pixels having color
information corresponding to the first color component are
connected with the interpolation target pixel; and a second
similarity judgment step in which degrees of similarity to the
interpolation target pixel are judged along at least two directions
other than the directions along which the degrees of similarity are
judged in the first similarity judgment step, wherein: in the
interpolation value calculation step, a direction in which pixels
having color information to be used to calculate the first term are
set is selected based upon results of a judgment made in the first
similarity judgment step and a direction in which pixels having
color information to be used to calculate the second term are set
is selected based upon results of a judgment made in the second
similarity judgment step.
[0084] In other words, since the directions along which similarity
is judged in the second similarity judging step are different from
the directions in which similarity is judged in the first
similarity judging step in the interpolation processing program
recorded at the fourth recording medium, color information from
pixels located along more diverse directions including a direction
along which the color information used to calculate the second term
is provided as well as a direction along which the color
information used to calculate the first term is provided can be
used in the processing.
[0085] A fifth recording medium has an interpolation processing
program recorded therein to implement on a computer processing for
determining an interpolation value equivalent to color information
corresponding to a first color component missing at a pixel, on
image data provided in a colorimetric system constituted of
first.about.nth (n.gtoreq.2) color components with color
information corresponding to a single color component present at
each pixel. The interpolation processing program comprises: a first
term calculation step in which a first term representing average
information of the first color component with regard to an
interpolation target pixel to undergo interpolation processing is
calculated by using color information corresponding to a color
component at pixels set within a local area containing the
interpolation target pixel; a second term calculation step in which
a second term representing local curvature information based upon a
color component matching the color component at the interpolation
target pixel is calculated with regard to the interpolation target
pixel by using color information corresponding to a color component
at pixels set within a local area containing the interpolation
target pixel; and an interpolation value calculation step in which
an interpolation value is calculated by adding the second term
multiplied by a weighting coefficient constituted of color
information corresponding to a plurality of color components
provided at pixels set within a local area containing the
interpolation target pixel to the first term.
[0086] Namely, through the interpolation processing program
recorded at the fifth recording medium, the interpolation value is
calculated by correcting the "average information of the first
color component with regard to the interpolation target pixel" with
the "local curvature information based upon a color component
matching the color component at the interpolation target pixel with
regard to the interpolation target pixel" multiplied by a weighting
coefficient constituted of color information corresponding to a
plurality of color components at the interpolation target pixel and
at pixels located in the local area containing the interpolation
target pixel.
[0087] A sixth recording medium has an interpolation processing
program recorded therein for implementing on a computer processing
supplementing a color component value at a pixel at which
information corresponding to a color component is missing, on image
data provided in a colorimetric system constituted of a luminance
component and the color component, with the luminance component
having a higher spatial frequency than the color component and the
luminance component present both at pixels having information
corresponding to the color component and at pixels lacking
information corresponding to the color component. The interpolation
processing program comprises: a hue value calculation step in which
hue values for a plurality of pixels near an interpolation target
pixel to undergo interpolation processing and having information
corresponding to both the luminance component and the color
component are calculated by using luminance component values and
color component values at the individual pixels; a hue value
interpolation step in which a hue value for the interpolation
target pixel is calculated by using a median of the hue values at
the plurality of pixels calculated in the hue value calculation
step; and a color conversion step in which a color component value
at the interpolation target pixel is interpolated by using a value
indicated by the luminance component present at the interpolation
target pixel to convert the hue value of the interpolation target
pixel calculated in the hue value interpolation step to a color
component value.
[0088] Namely, through the interpolation processing program
recorded at the sixth recording medium, the hue value of the
interpolation target pixel is calculated by using the median of the
hue values at a plurality of pixels present near the interpolation
target pixel.
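The three steps recorded at the sixth recording medium (hue value calculation, median-based hue value interpolation, and color conversion) can be sketched in Python as follows. A difference-based hue definition, hue = C - Y, is assumed here purely for illustration; the actual hue formula is not fixed by this description, and all names are hypothetical:

```python
import statistics

def interpolate_color_at(Y, C, i, j, neighbor_offsets):
    """Sketch of the hue-median interpolation pipeline.
    Y (luminance) is present at every pixel; C (color component)
    is known only at the pixels given by neighbor_offsets."""
    # Hue value calculation step: hue at nearby pixels that carry
    # both the luminance and the color component.
    hues = [C[(i + di, j + dj)] - Y[(i + di, j + dj)]
            for di, dj in neighbor_offsets]
    # Hue value interpolation step: median of the nearby hues.
    hue = statistics.median(hues)
    # Color conversion step: convert the hue back to a color
    # component value using the luminance at the target pixel.
    return Y[(i, j)] + hue
```

The median in the middle step is what suppresses outlier hues caused, for instance, by magnification chromatic aberration, as discussed for the corresponding apparatuses above.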
[0089] A seventh recording medium has an interpolation processing
program recorded therein for implementing on a computer processing
for supplementing a luminance component value at a pixel at which
information corresponding to a luminance component is missing and a
color component value at a pixel at which information corresponding
to a color component is missing, on image data provided in a
colorimetric system constituted of the luminance component and the
color component, with the luminance component having a higher
spatial frequency than the color component and information
corresponding to either the luminance component or the color
component present at each pixel. The interpolation processing
program comprises: a luminance component interpolation step in
which a luminance component value is interpolated for a luminance
component interpolation target pixel to undergo luminance component
interpolation processing by using at least either "similarity
between the luminance component interpolation target pixel and a
pixel near the luminance component interpolation target pixel" or
"information corresponding to a plurality of color components
within a local area containing the luminance component
interpolation target pixel"; a hue value calculation step in which
hue values at a plurality of pixels located near an interpolation
target pixel to undergo color component interpolation processing,
having color component values and having luminance component values
interpolated in the luminance component interpolation step are
calculated by using the luminance component values and color
component values at the individual pixels; a hue value
interpolation step in which a hue value for the interpolation
target pixel is calculated by using a median of the hue values at
the plurality of pixels calculated in the hue value calculation
step; and a color conversion step in which a color component value
is interpolated for the interpolation target pixel by using the
luminance component value at the interpolation target pixel to
convert the hue value at the interpolation target pixel calculated
in the hue value interpolation step to a color component value.
[0090] Namely, through the interpolation processing program
recorded at the seventh recording medium, the hue value of the
interpolation target pixel is calculated by using the median of the
hue values at a plurality of pixels present near the interpolation
target pixel.
[0091] It is to be noted that by adopting the interpolation
processing programs at the first.about.seventh recording media, the
first, third, fourth, sixth, 15th, 17th and 19th interpolation
processing apparatuses may be realized on a computer. Likewise, the
second, fifth, seventh.about.14th, 16th, 18th, and 20th.about.22nd
interpolation processing apparatuses may be realized through
interpolation processing programs recorded at recording media.
These interpolation processing programs may be provided to a
computer through a communication line such as the Internet.
BRIEF DESCRIPTION OF THE DRAWINGS
[0092] FIG. 1 is a functional block diagram of an electronic camera
corresponding to first through fifth embodiments;
[0093] FIGS. 2A and 2B show the arrangements of the color
components in the image data adopted in the first embodiment, the
second embodiment and the fourth embodiment;
[0094] FIGS. 3A and 3B show the arrangements of the color
components in the image data adopted in the third embodiment and
the fifth embodiment;
[0095] FIG. 4 is a flowchart (1) of the operation achieved at the
interpolation processing unit in the first embodiment;
[0096] FIG. 5 is a flowchart (2) of the operation achieved at the
interpolation processing unit in the first embodiment;
[0097] FIGS. 6A and 6B illustrate methods of weighted addition of
similarity degree components;
[0098] FIG. 7 shows the directions along which marked similarity
manifests in correspondence to values (HV[i,j], DN[i,j]);
[0099] FIG. 8 shows the positions of the color information used to
calculate the green color interpolation value G[i,j];
[0100] FIGS. 9A and 9B show how the adverse effect of magnification
chromatic aberration is eliminated;
[0101] FIGS. 10A.about.10C illustrate median processing of the prior
art;
[0102] FIGS. 11A and 11B illustrate the median processing operation
achieved in the first embodiment;
[0103] FIGS. 12A and 12B illustrate the ranges of the median
processing implemented in the first embodiment;
[0104] FIG. 13 shows the positions of the color information used to
calculate local curvature information;
[0105] FIG. 14 (continued from FIG. 13) shows the positions of the
color information used to calculate local curvature
information;
[0106] FIG. 15 illustrates the function of the weighting
coefficient in the fourth embodiment;
[0107] FIG. 16 is a functional block diagram of a sixth
embodiment;
[0108] FIG. 17 illustrates an example of the interpolation
processing in the prior art;
[0109] FIGS. 18A.about.18C illustrate the adverse effects of
magnification chromatic aberration;
[0110] FIGS. 19A.about.19C illustrate over-correction occurring due
to magnification chromatic aberration; and
[0111] FIGS. 20A and 20B illustrate the effects of over-correction
occurring at a color boundary.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0112] The following is a detailed explanation of the embodiments
of the present invention, given in reference to the drawings. FIG.
1 is a functional block diagram of the electronic camera
corresponding to the first through fifth embodiments.
[0113] In FIG. 1, an electronic camera 10 comprises a control unit
11, a photographic optical system 12, an image-capturing unit 13,
an A/D conversion unit 14, an image processing unit 15 and a
recording unit 16. The image processing unit 15 is provided with an
interpolation processing unit (e.g., a one-chip microprocessor
dedicated to interpolation processing). The image-capturing unit 13
is provided with an image-capturing sensor (not shown) constituted
by arranging R, G and B color filters in a Bayer array.
[0114] It is to be noted that while FIG. 1 shows only the
interpolation processing unit 17 in the image processing unit 15 to
simplify the illustration, a functional block that engages in other
image processing such as gradation conversion processing may also
be provided in the image processing unit 15.
[0115] In FIG. 1, the control unit 11 is connected to the
image-capturing unit 13, the A/D conversion unit 14, the image
processing unit 15 and the recording unit 16. In addition, an
optical image obtained at the photographic optical system 12 is
formed at the image-capturing sensor in the image-capturing unit
13. An output from the image-capturing unit 13 is quantized at the
A/D conversion unit 14 and is provided to the image processing unit
15 as image data. The image data provided to the image processing
unit 15 undergo interpolation processing at the interpolation
processing unit 17 and after having undergone image compression as
necessary, they are recorded via the recording unit 16. The image
data with the degrees of resolution corresponding to the individual
color components improved through the interpolation processing are
ultimately output as image data in a colorimetric system that
corresponds to the type of device that is connected, such as a
display or a printer.
[0116] FIGS. 2A and 2B show the arrangements of the color
components in the image data adopted in the first embodiment, the
second embodiment and the fourth embodiment, and FIGS. 3A and 3B
show the arrangements of the color components in the image data
adopted in the third embodiment and the fifth embodiment. It is to
be noted that in FIGS. 2A and 2B and in FIGS. 3A and 3B, the
individual color components are indicated as R, G and B, with the
positions of pixels at which the various color components are
present indicated with i and j.
[0117] With [i,j] indicating the coordinates of an interpolation
target pixel to undergo the interpolation processing, FIGS. 2A and
2B each show the arrangement of 7.times.7 pixels with the
interpolation target pixel at the center, whereas FIGS. 3A and 3B
each show the arrangement of 5.times.5 pixels with the
interpolation target pixel at the center. In addition, FIGS. 2A and
3A each show an arrangement of pixels among which a pixel with the
red color component is to undergo the interpolation processing, and
FIGS. 2B and 3B each show an arrangement of pixels among which a
pixel with the blue color component is to undergo the interpolation
processing.
[0118] In the various embodiments to be detailed later, the
interpolation processing unit 17 first implements the interpolation
processing to supplement the green color interpolation values for
pixels at which the green color component is missing (hereafter
referred to as "G interpolation processing") and then engages in
interpolation processing through which red color interpolation
values and blue color interpolation values are supplemented at
pixels at which the red color component and the blue color
component are missing (hereafter referred to as "RB interpolation
processing"). However, since the interpolation processing
implemented to supplement blue color interpolation values
(hereafter referred to as "B interpolation processing") is
implemented in a manner identical to the manner with which the
interpolation processing for supplementing red color interpolation
values (hereafter referred to as "R interpolation processing") is
implemented, its explanation is omitted.
[0119] In addition, it is assumed that the pixel at coordinates
[i,j] is the interpolation target pixel to undergo the G
interpolation processing, to simplify the subsequent explanation.
Since the green color interpolation value can be calculated through
the G interpolation processing in each of the embodiments explained
below regardless of the color component (red or blue) at the
interpolation target pixel, R and B in FIGS. 2A and 2B and FIGS. 3A
and 3B are all replaced with Z with the color information at the
interpolation target pixel expressed as Z[i,j] and color
information at other pixels also expressed in a similar manner in
the following explanation.
First Embodiment
[0120] FIGS. 4 and 5 present a flowchart of the operation achieved
in the interpolation processing unit 17 in the first embodiment,
with FIG. 4 corresponding to the operation of the interpolation
processing unit 17 during the G interpolation processing and FIG. 5
corresponding to the operation of the interpolation processing unit
17 during the R interpolation processing.
[0121] The explanation of the operation achieved in the first
embodiment given below focuses on the operation of the
interpolation processing unit 17 by referring to FIGS. 4 and 5.
[0122] First, the interpolation processing unit 17 calculates a
similarity degree Cv[i,j] along the vertical direction and a
similarity degree Ch[i,j] along the horizontal direction for an
interpolation target pixel at which the green component is missing
(FIG. 4 S1).
[0123] Now, the details of the processing implemented to calculate
the vertical similarity degree Cv[i,j] and the horizontal
similarity degree Ch[i,j] in the first embodiment are
explained.
[0124] The interpolation processing unit 17 first calculates a
plurality of types of similarity degree components along the
vertical direction and the horizontal direction defined through the
following formulae 10.about.21.
[0125] G-G similarity degree component along vertical
direction:
Cv1[i,j]=.vertline.G[i,j-1]-G[i,j+1].vertline. formula 10
[0126] G-G similarity degree component along horizontal
direction:
Ch1[i,j]=.vertline.G[i-1,j]-G[i+1,j].vertline. formula 11
[0127] B-B (R-R) similarity degree component along vertical
direction:
Cv2[i,j]=(.vertline.Z[i-1,j-1]-Z[i-1,j+1].vertline.+.vertline.Z[i+1,j-1]-Z[i+1,j+1].vertline.)/2 formula 12
[0128] B-B (R-R) similarity degree component along horizontal
direction:
Ch2[i,j]=(.vertline.Z[i-1,j-1]-Z[i+1,j-1].vertline.+.vertline.Z[i-1,j+1]-Z[i+1,j+1].vertline.)/2 formula 13
[0129] R-R (B-B) similarity degree component along vertical
direction:
Cv3[i,j]=(.vertline.Z[i,j-2]-Z[i,j].vertline.+.vertline.Z[i,j+2]-Z[i,j].vertline.)/2 formula 14
[0130] R-R (B-B) similarity degree component along horizontal
direction:
Ch3[i,j]=(.vertline.Z[i-2,j]-Z[i,j].vertline.+.vertline.Z[i+2,j]-Z[i,j].vertline.)/2 formula 15
[0131] G-R (G-B) similarity degree component along vertical
direction:
Cv4[i,j]=(.vertline.G[i,j-1]-Z[i,j].vertline.+.vertline.G[i,j+1]-Z[i,j].vertline.)/2 formula 16
[0132] G-R (G-B) similarity degree component along horizontal
direction:
Ch4[i,j]=(.vertline.G[i-1,j]-Z[i,j].vertline.+.vertline.G[i+1,j]-Z[i,j].vertline.)/2 formula 17
[0133] B-G (R-G) similarity degree component along vertical
direction:
Cv5[i,j]=(.vertline.Z[i-1,j-1]-G[i-1,j].vertline.+.vertline.Z[i-1,j+1]-G[i-1,j].vertline.+.vertline.Z[i+1,j-1]-G[i+1,j].vertline.+.vertline.Z[i+1,j+1]-G[i+1,j].vertline.)/4 formula 18
[0134] B-G (R-G) similarity degree component along horizontal
direction:
Ch5[i,j]=(.vertline.Z[i-1,j-1]-G[i,j-1].vertline.+.vertline.Z[i-1,j+1]-G[i,j+1].vertline.+.vertline.Z[i+1,j-1]-G[i,j-1].vertline.+.vertline.Z[i+1,j+1]-G[i,j+1].vertline.)/4 formula 19
[0135] luminance similarity degree component along vertical
direction:
Cv6[i,j]=(.vertline.Y[i,j-1]-Y[i,j].vertline.+.vertline.Y[i,j+1]-Y[i,j].vertline.)/2 formula 20
[0136] luminance similarity degree component along horizontal
direction:
Ch6[i,j]=(.vertline.Y[i-1,j]-Y[i,j].vertline.+.vertline.Y[i+1,j]-Y[i,j].vertline.)/2 formula 21
[0137] In formulae 20 and 21, Y[i,j] represents a value calculated
through
Y[i,j]=(4.multidot.A[i,j]+2.multidot.(A[i,j-1]+A[i,j+1]+A[i-1,j]+A[i+1,j])+A[i-1,j-1]+A[i-1,j+1]+A[i+1,j-1]+A[i+1,j+1])/16 formula 22
[0138] which is equivalent to the luminance value generated through
filtering processing in which the color information corresponding
to the color components at nearby pixels around the interpolation
target pixel is averaged at a ratio of R:G:B=1:2:1. It is to be
noted that A[i,j] represents an arbitrary set of color information
on the Bayer array which may assume a G value or a Z value
depending upon the position at which the color information is
provided.
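Formula 22 can be transcribed directly into Python. The sketch below assumes `A` is a 2-D list of raw Bayer-array values (each sample carrying whichever of R, G or B is present at that position); the 4:2:1 center/edge/corner weights realize the R:G:B=1:2:1 averaging described above:

```python
def luminance(A, i, j):
    """Luminance value per formula 22: a 3x3 weighted average of
    the raw Bayer samples around (i, j), with weights
    center:edge:corner = 4:2:1 (total 16)."""
    return (4 * A[i][j]
            + 2 * (A[i][j - 1] + A[i][j + 1]
                   + A[i - 1][j] + A[i + 1][j])
            + A[i - 1][j - 1] + A[i - 1][j + 1]
            + A[i + 1][j - 1] + A[i + 1][j + 1]) / 16
```

Because the weights sum to 16, a uniform patch is mapped to its own value, and on a Bayer array the per-color contributions average out in the stated 1:2:1 ratio.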
[0139] Next, the interpolation processing unit 17 performs weighted
addition of the plurality of types of similarity degree components
along each direction by using weighting coefficients a1, a2, a3,
a4, a5 and a6, as expressed in the following formulae 23 and 24, to
calculate a similarity degree Cv0[i,j] along the vertical direction
and a similarity degree Ch0[i,j] along the horizontal direction for
the interpolation target pixel.
Cv0[i,j]=(a1.multidot.Cv1[i,j]+a2.multidot.Cv2[i,j]+a3.multidot.Cv3[i,j]+a4.multidot.Cv4[i,j]+a5.multidot.Cv5[i,j]+a6.multidot.Cv6[i,j])/(a1+a2+a3+a4+a5+a6) formula 23
Ch0[i,j]=(a1.multidot.Ch1[i,j]+a2.multidot.Ch2[i,j]+a3.multidot.Ch3[i,j]+a4.multidot.Ch4[i,j]+a5.multidot.Ch5[i,j]+a6.multidot.Ch6[i,j])/(a1+a2+a3+a4+a5+a6) formula 24
[0140] It is to be noted that the ratio among the weighting
coefficients a1, a2, a3, a4, a5 and a6 in formulae 23 and 24 may
be, for instance, "a1:a2:a3:a4:a5:a6=2:1:1:4:4:12."
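The weighted addition of formulae 23 and 24 amounts to a weighted mean of the six similarity degree components. A minimal Python sketch, using the example ratio above as the default weights (the function name is hypothetical):

```python
def combined_similarity(components, weights=(2, 1, 1, 4, 4, 12)):
    """Weighted mean of the six similarity degree components
    (C*1..C*6) per formulae 23/24, with the example weight ratio
    a1:a2:a3:a4:a5:a6 = 2:1:1:4:4:12."""
    total = sum(a * c for a, c in zip(weights, components))
    return total / sum(weights)
```

The same function serves for both Cv0[i,j] and Ch0[i,j], fed with the vertical or horizontal component values respectively.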
[0141] In the first embodiment, a further improvement is achieved
in the accuracy with which the similarity degrees are calculated by
calculating the similarity degree components along the vertical and
horizontal directions and performing weighted addition of the
similarity degree components for nearby pixels around the
interpolation target pixel as well as for the interpolation target
pixel.
[0142] Namely, the interpolation processing unit 17 performs
weighted addition of the results obtained by implementing weighted
addition of the similarity degree components at the interpolation
target pixel and the nearby pixels (Cv0[i,j], Cv0[i-1, j-1],
Cv0[i-1, j+1], Cv0[i+1,j-1], Cv0[i+1,j+1] and the like), through
either (method 1) or (method 2) detailed below, to obtain a
similarity degree Cv[i,j] along the vertical direction and a
similarity degree Ch[i,j] along the horizontal direction
manifesting at the interpolation target pixel.
[0143] (method 1)
Cv[i,j]=(4.multidot.Cv0[i,j]+Cv0[i-1,j-1]+Cv0[i-1,j+1]+Cv0[i+1,j-1]+Cv0[i+1,j+1])/8 formula 25
Ch[i,j]=(4.multidot.Ch0[i,j]+Ch0[i-1,j-1]+Ch0[i-1,j+1]+Ch0[i+1,j-1]+Ch0[i+1,j+1])/8 formula 26
[0144] (method 2)
Cv[i,j]=(4.multidot.Cv0[i,j]+2.multidot.(Cv0[i-1,j-1]+Cv0[i+1,j-1]+Cv0[i-1,j+1]+Cv0[i+1,j+1])+Cv0[i,j-2]+Cv0[i,j+2]+Cv0[i-2,j]+Cv0[i+2,j])/16 formula 27
Ch[i,j]=(4.multidot.Ch0[i,j]+2.multidot.(Ch0[i-1,j-1]+Ch0[i+1,j-1]+Ch0[i-1,j+1]+Ch0[i+1,j+1])+Ch0[i,j-2]+Ch0[i,j+2]+Ch0[i-2,j]+Ch0[i+2,j])/16 formula 28
[0145] It is to be noted that with (method 1), the weighted
addition of the similarity degree components at the interpolation
target pixel and the nearby pixels is implemented as illustrated in
FIG. 6A, whereas with (method 2), the weighted addition of the
similarity degree components at the interpolation target pixel and
the nearby pixels is implemented as illustrated in FIG. 6B.
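The two neighborhood averaging schemes of formulae 25.about.28 can be sketched together in Python. Here `C0` is a hypothetical dict mapping (i, j) to the per-pixel similarity degree (Cv0 or Ch0); the same function yields Cv[i,j] or Ch[i,j] depending on which values it is given:

```python
def smooth_similarity(C0, i, j, method=1):
    """Weighted addition of per-pixel similarity degrees over the
    interpolation target pixel and its neighbors.

    method 1: 4:1 weights over the pixel and its 4 diagonal
              neighbors, /8 (formulae 25/26, cf. FIG. 6A).
    method 2: 4:2:1 weights over the pixel, its 4 diagonal
              neighbors and 4 pixels two steps away, /16
              (formulae 27/28, cf. FIG. 6B)."""
    if method == 1:
        return (4 * C0[(i, j)]
                + C0[(i - 1, j - 1)] + C0[(i - 1, j + 1)]
                + C0[(i + 1, j - 1)] + C0[(i + 1, j + 1)]) / 8
    return (4 * C0[(i, j)]
            + 2 * (C0[(i - 1, j - 1)] + C0[(i + 1, j - 1)]
                   + C0[(i - 1, j + 1)] + C0[(i + 1, j + 1)])
            + C0[(i, j - 2)] + C0[(i, j + 2)]
            + C0[(i - 2, j)] + C0[(i + 2, j)]) / 16
```

Since both weight sets sum to the divisor, a uniform similarity field passes through unchanged; method 2 simply draws on the wider neighborhood noted below as effective against magnification chromatic aberration.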
[0146] The similarity degree components each calculated by using
color information corresponding to the same color component such as
the G-G similarity degree components, B-B (R-R) similarity degree
component and the R-R (B-B) similarity degree components (hereafter
referred to as "same-color similarity degree components") have been
confirmed through testing to be suitable for use in the evaluation
of similarity manifesting in an image with a low spatial frequency
and a large colored area. The similarity degree components each
calculated by using color information corresponding to different
color components such as the G-R (G-B) similarity degree components
and B-G (R-G) similarity degree components (hereafter referred to
as "different-color similarity degree components") have been
confirmed through testing to be suitable for use in the evaluation
of similarity manifesting in an image with a high spatial frequency
and a large achromatic image area. In addition, the luminance
similarity degree components have been confirmed through testing to
be suitable for use in the evaluation of similarity manifesting in
an image containing both a colored area and an image area with a
fairly high spatial frequency.
[0147] In other words, the evaluation of similarity manifesting in
various types of images can be achieved with a high degree of
accuracy by using similarity degrees obtained through weighted
addition of same-color similarity degree components,
different-color similarity degree components and luminance
similarity degree components.
[0148] In addition, the functions of the three types of similarity
degree components calculated as the same-color similarity degree
components (the G-G similarity degree components, the B-B (R-R)
similarity degree components and the R-R (B-B) similarity degree
components) in the similarity evaluation can be complemented by one
another, and the functions of the two types of similarity degree
components calculated as the different-color similarity degree
components (the G-R (G-B) similarity degree components and the B-G
(R-G) similarity degree components) in the similarity evaluation,
too, can be complemented by each other.
[0149] Furthermore, in the first embodiment, the vertical
similarity degree Cv[i,j] and the horizontal similarity degree
Ch[i,j] are calculated through weighted addition of the results of
weighted addition of similarity degree components at the
interpolation target pixel and the results of weighted addition of
similarity degree components at nearby pixels. Thus, the continuity
between the color information at the interpolation target pixel and
the color information at the pixels located near the interpolation
target pixel is readily reflected in the vertical similarity degree
Cv[i,j] and the horizontal similarity degree Ch[i,j].
[0150] In particular, the vertical similarity degree Cv[i,j] and
the horizontal similarity degree Ch[i,j] calculated through (method
2) reflect color information corresponding to the color components
at pixels over a wide range and thus, are effective in the
similarity evaluation of an image manifesting a pronounced
magnification chromatic aberration.
[0151] It is to be noted that the vertical similarity degree
Cv[i,j] and the horizontal similarity degree Ch[i,j] in the first
embodiment indicate more marked similarity as their values become
smaller.
[0152] After the vertical similarity degree Cv[i,j] and the
horizontal similarity degree Ch[i,j] are calculated as described
above, the interpolation processing unit 17 compares the similarity
along the vertical direction and the similarity along the
horizontal direction manifesting at the interpolation target pixel
(hereafter referred to as the "vertical/horizontal similarity")
based upon the vertical similarity degree Cv[i,j] and the
horizontal similarity degree Ch[i,j] (FIG. 4 S2). Then, it sets one
of the following values for an index HV[i,j] which indicates the
vertical/horizontal similarity based upon the results of the
comparison.
[0153] For instance, if |Cv[i,j]-Ch[i,j]|>T1 and Cv[i,j]<Ch[i,j]
are true with regard to a given threshold value T1, the
interpolation processing unit 17 judges that a more marked
similarity is manifested along the vertical direction than along
the horizontal direction and sets 1 for the index HV[i,j] (FIG. 4
S3); if |Cv[i,j]-Ch[i,j]|>T1 and Cv[i,j]>Ch[i,j] are true, it
judges that a more marked similarity is manifested along the
horizontal direction than along the vertical direction and sets -1
for the index HV[i,j] (FIG. 4 S4); and if |Cv[i,j]-Ch[i,j]| ≤ T1 is
true, it judges that the degrees of similarity manifested along the
vertical direction and along the horizontal direction are
essentially the same and sets 0 for the index HV[i,j] (FIG. 4 S5).
[0154] It is to be noted that the threshold value T1 is used to
prevent an erroneous judgment, caused by noise, that the similarity
along one direction is more marked when the difference between the
vertical similarity degree Cv[i,j] and the horizontal similarity
degree Ch[i,j] is very small. Accordingly,
by setting a high value for the threshold value T1 when processing
a color image with a great deal of noise, an improvement in the
accuracy of the vertical/horizontal similarity judgment is
achieved.
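The decision in steps S2 through S5 reduces to a short comparison; a minimal illustrative sketch (the function name is hypothetical, not from the embodiment):

```python
def vertical_horizontal_index(Cv, Ch, T1):
    """Return the index HV for one pixel (FIG. 4, S2-S5).

    Smaller similarity degrees indicate more marked similarity, so
    Cv < Ch means the vertical direction is the more similar one.
    The threshold T1 absorbs small, noise-induced differences.
    """
    if abs(Cv - Ch) > T1 and Cv < Ch:
        return 1    # marked similarity along the vertical direction
    if abs(Cv - Ch) > T1 and Cv > Ch:
        return -1   # marked similarity along the horizontal direction
    return 0        # essentially the same along both directions
```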
[0155] Next, the interpolation processing unit 17 calculates a
similarity degree C45[i,j] along the diagonal 45.degree. direction
and a similarity degree C135[i,j] along the diagonal 135.degree.
direction for the interpolation target pixel (FIG. 4 S6).
[0156] Now, details of the processing implemented in the first
embodiment to calculate the diagonal 45.degree. similarity degree
C45[i,j] and the diagonal 135.degree. similarity degree C135[i,j]
are explained.
[0157] First, the interpolation processing unit 17 calculates a
plurality of types of similarity degree components along the
diagonal 45.degree. direction and the diagonal 135.degree.
direction as defined in the following formulae 29.about.36;
[0158] G-G similarity degree component along the diagonal
45.degree. direction:
C45_1[i,j]=(|G[i,j-1]-G[i-1,j]|+|G[i+1,j]-G[i,j+1]|)/2 formula 29
[0159] G-G similarity degree component along the diagonal
135.degree. direction:
C135_1[i,j]=(|G[i,j-1]-G[i+1,j]|+|G[i-1,j]-G[i,j+1]|)/2 formula 30
[0160] B-B (R-R) similarity degree component along the diagonal
45.degree. direction:
C45_2[i,j]=|Z[i+1,j-1]-Z[i-1,j+1]| formula 31
[0161] B-B (R-R) similarity degree component along the diagonal
135.degree. direction:
C135_2[i,j]=|Z[i-1,j-1]-Z[i+1,j+1]| formula 32
[0162] R-R (B-B) similarity degree component along the diagonal
45.degree. direction:
C45_3[i,j]=(|Z[i+2,j-2]-Z[i,j]|+|Z[i-2,j+2]-Z[i,j]|)/2 formula 33
[0163] R-R (B-B) similarity degree component along the diagonal
135.degree. direction:
C135_3[i,j]=(|Z[i-2,j-2]-Z[i,j]|+|Z[i+2,j+2]-Z[i,j]|)/2 formula 34
[0164] B-R (R-B) similarity degree component along the diagonal
45.degree. direction:
C45_4[i,j]=(|Z[i+1,j-1]-Z[i,j]|+|Z[i-1,j+1]-Z[i,j]|)/2 formula 35
[0165] B-R (R-B) similarity degree component along the diagonal
135.degree. direction:
C135_4[i,j]=(|Z[i-1,j-1]-Z[i,j]|+|Z[i+1,j+1]-Z[i,j]|)/2 formula 36
[0166] Next, the interpolation processing unit 17 calculates a
similarity degree C45_0[i,j] along the diagonal 45.degree.
direction and a similarity degree C135_0[i,j] along the diagonal
135.degree. direction through weighted addition of the plurality of
types of similarity degree components performed along each of the
two directions by using weighting coefficients b1, b2, b3 and b4,
as expressed in the following formulae 37 and 38.
C45_0[i,j]=(b1·C45_1[i,j]+b2·C45_2[i,j]+b3·C45_3[i,j]+b4·C45_4[i,j])/(b1+b2+b3+b4) formula 37
C135_0[i,j]=(b1·C135_1[i,j]+b2·C135_2[i,j]+b3·C135_3[i,j]+b4·C135_4[i,j])/(b1+b2+b3+b4) formula 38
[0167] It is to be noted that the ratio of the weighting
coefficients b1, b2, b3 and b4 in formulae 37 and 38 may be, for
instance, "b1:b2:b3:b4=2:1:1:2."
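Formulae 29 through 38 taken together can be sketched in a few lines. This is an illustrative Python sketch; the function name is hypothetical, and it assumes the green color information and the red/blue color information are addressed as 2-D arrays G and Z, as in the formulae:

```python
def diagonal_similarity_sums(G, Z, i, j, b=(2, 1, 1, 2)):
    """Compute C45_0[i,j] and C135_0[i,j] per formulae 29-38.

    b holds the weighting coefficients b1..b4 (the 2:1:1:2 ratio
    given in the text as an example).
    """
    # G-G components (formulae 29, 30)
    c45_1 = (abs(G[i][j-1] - G[i-1][j]) + abs(G[i+1][j] - G[i][j+1])) / 2
    c135_1 = (abs(G[i][j-1] - G[i+1][j]) + abs(G[i-1][j] - G[i][j+1])) / 2
    # B-B (R-R) components (formulae 31, 32)
    c45_2 = abs(Z[i+1][j-1] - Z[i-1][j+1])
    c135_2 = abs(Z[i-1][j-1] - Z[i+1][j+1])
    # R-R (B-B) components (formulae 33, 34)
    c45_3 = (abs(Z[i+2][j-2] - Z[i][j]) + abs(Z[i-2][j+2] - Z[i][j])) / 2
    c135_3 = (abs(Z[i-2][j-2] - Z[i][j]) + abs(Z[i+2][j+2] - Z[i][j])) / 2
    # B-R (R-B) components (formulae 35, 36)
    c45_4 = (abs(Z[i+1][j-1] - Z[i][j]) + abs(Z[i-1][j+1] - Z[i][j])) / 2
    c135_4 = (abs(Z[i-1][j-1] - Z[i][j]) + abs(Z[i+1][j+1] - Z[i][j])) / 2
    # weighted addition (formulae 37, 38)
    b1, b2, b3, b4 = b
    denom = b1 + b2 + b3 + b4
    c45_0 = (b1*c45_1 + b2*c45_2 + b3*c45_3 + b4*c45_4) / denom
    c135_0 = (b1*c135_1 + b2*c135_2 + b3*c135_3 + b4*c135_4) / denom
    return c45_0, c135_0
```

In a flat image area every component vanishes, so both sums are zero, i.e. maximal similarity along both diagonals.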
[0168] In the first embodiment, a further improvement is achieved
in the accuracy with which the similarity degrees are calculated by
calculating the similarity degree components along the diagonal
45.degree. direction and the diagonal 135.degree. direction and
performing weighted addition of the similarity degree components
for nearby pixels around the interpolation target pixel as well as
for the interpolation target pixel.
[0169] Namely, the interpolation processing unit 17 performs
weighted addition of the results obtained by implementing weighted
addition of the similarity degree components at the interpolation
target pixel and the nearby pixels (C45_0[i,j], C45_0[i-1,j-1],
C45_0[i-1,j+1], C45_0[i+1,j-1], C45_0[i+1,j+1] and the like)
through either (method 1) or (method 2) detailed below, to obtain a
similarity degree C45[i,j] along the diagonal 45.degree. direction
and a similarity degree C135[i,j] along the diagonal 135.degree.
direction manifesting at the interpolation target pixel (equivalent
to implementing weighted addition of similarity degree components
at the interpolation target pixel and the nearby pixels as
illustrated in FIGS. 6A and 6B).
[0170] (method 1)
C45[i,j]=(4·C45_0[i,j]+C45_0[i-1,j-1]+C45_0[i+1,j-1]+C45_0[i-1,j+1]+C45_0[i+1,j+1])/8 formula 39
C135[i,j]=(4·C135_0[i,j]+C135_0[i-1,j-1]+C135_0[i+1,j-1]+C135_0[i-1,j+1]+C135_0[i+1,j+1])/8 formula 40
[0171] (method 2)
C45[i,j]=(4·C45_0[i,j]+2·(C45_0[i-1,j-1]+C45_0[i+1,j-1]+C45_0[i-1,j+1]+C45_0[i+1,j+1])+C45_0[i,j-2]+C45_0[i,j+2]+C45_0[i-2,j]+C45_0[i+2,j])/16 formula 41
C135[i,j]=(4·C135_0[i,j]+2·(C135_0[i-1,j-1]+C135_0[i+1,j-1]+C135_0[i-1,j+1]+C135_0[i+1,j+1])+C135_0[i,j-2]+C135_0[i,j+2]+C135_0[i-2,j]+C135_0[i+2,j])/16 formula 42
[0172] It is to be noted that, for the diagonal 45.degree.
similarity degree C45[i,j] and the diagonal 135.degree. similarity
degree C135[i,j] thus calculated, the weighted addition of the
plurality of similarity degree components and the consideration of
the similarity degrees at the nearby pixels achieve the same
functions as they do for the vertical similarity degree Cv[i,j] and
the horizontal similarity degree Ch[i,j]. In addition, the diagonal
45.degree. similarity degree C45[i,j] and the diagonal 135.degree.
similarity degree C135[i,j] in the first embodiment indicate more
marked similarity as their values become smaller.
[0173] After the diagonal 45.degree. similarity degree C45[i,j] and
the diagonal 135.degree. similarity degree C135[i,j] are
calculated, the interpolation processing unit 17 compares the
similarity along the diagonal 45.degree. direction and the
similarity along the diagonal 135.degree. direction manifesting at
the interpolation target pixel (hereafter referred to as the
"diagonal similarity") based upon the diagonal 45.degree.
similarity degree C45[i,j] and the diagonal 135.degree. similarity
degree C135[i,j] (FIG. 4 S7). Then, it sets one of the following
values for an index DN[i,j] which indicates the diagonal similarity
based upon the results of the comparison.
[0174] For instance, if |C45[i,j]-C135[i,j]|>T2 and
C45[i,j]<C135[i,j] are true with regard to a given threshold value
T2, the interpolation processing unit 17 judges that a more marked
similarity is manifested along the diagonal 45.degree. direction
than along the diagonal 135.degree. direction and sets 1 for the
index DN[i,j] (FIG. 4 S8); if |C45[i,j]-C135[i,j]|>T2 and
C45[i,j]>C135[i,j] are true, it judges that a more marked
similarity is manifested along the diagonal 135.degree. direction
than along the diagonal 45.degree. direction and sets -1 for the
index DN[i,j] (FIG. 4 S9); and if |C45[i,j]-C135[i,j]| ≤ T2 is
true, it judges that the degrees of similarity manifesting along
the diagonal 45.degree. direction and along the diagonal
135.degree. direction are essentially the same and sets 0 for the
index DN[i,j] (FIG. 4 S10).
[0175] It is to be noted that the threshold value T2 is used to
prevent an erroneous judgment that the similarity along either
direction is more marked from being made due to noise.
[0176] Next, the interpolation processing unit 17 ascertains the
specific values of the index HV[i,j] indicating the
vertical/horizontal similarity and the index DN[i,j] indicating the
diagonal similarity (FIG. 4 S11) and classifies the similarity
manifesting at the interpolation target pixel as one of the
following: case 1 through case 9.
[0177] case 1: (HV[i,j], DN[i,j])=(1, 1): marked similarity
manifesting along the vertical direction and the diagonal
45.degree. direction
[0178] case 2: (HV[i,j], DN[i,j])=(1, 0): marked similarity
manifesting along the vertical direction
[0179] case 3: (HV[i,j], DN[i,j])=(1, -1): marked similarity
manifesting along the vertical direction and the diagonal
135.degree. direction
[0180] case 4: (HV[i,j], DN[i,j])=(0, 1): marked similarity
manifesting along the diagonal 45.degree. direction
[0181] case 5: (HV[i,j], DN[i,j])=(0, 0): marked similarity
manifesting along all the directions or little similarity
manifesting along all the directions
[0182] case 6: (HV[i,j], DN[i,j])=(0, -1): marked similarity
manifesting along the diagonal 135.degree. direction
[0183] case 7: (HV[i,j], DN[i,j])=(-1, 1): marked similarity
manifesting along the horizontal direction and the diagonal
45.degree. direction
[0184] case 8: (HV[i,j], DN[i,j])=(-1, 0): marked similarity
manifesting along the horizontal direction
[0185] case 9: (HV[i,j], DN[i,j])=(-1, -1): marked similarity
manifesting along the horizontal direction and the diagonal
135.degree. direction.
[0186] FIG. 7 illustrates the directions along which marked
similarity manifests, as indicated by the values of HV[i,j] and
DN[i,j].
[0187] In FIG. 7, there is no directional indication that
corresponds to "case 5: (HV[i,j], DN[i,j])=(0, 0)." A marked
similarity manifesting along all the directions or only a slight
similarity manifesting along all the directions as in case 5 means
that the interpolation target pixel is contained within a flat area
or is an isolated point (an image area manifesting a lower degree
of similarity to nearby pixels and having a high spatial
frequency).
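The nine cases can be tabulated directly from the pair (HV[i,j], DN[i,j]); an illustrative sketch (the table and function names are hypothetical):

```python
# Map (HV[i,j], DN[i,j]) to the case number used in the text.
CASE_TABLE = {
    (1, 1): 1,    # vertical and diagonal 45-degree directions
    (1, 0): 2,    # vertical direction
    (1, -1): 3,   # vertical and diagonal 135-degree directions
    (0, 1): 4,    # diagonal 45-degree direction
    (0, 0): 5,    # all directions or none (flat area / isolated point)
    (0, -1): 6,   # diagonal 135-degree direction
    (-1, 1): 7,   # horizontal and diagonal 45-degree directions
    (-1, 0): 8,   # horizontal direction
    (-1, -1): 9,  # horizontal and diagonal 135-degree directions
}

def classify(hv, dn):
    """Return the case number (1-9) for one interpolation target pixel."""
    return CASE_TABLE[(hv, dn)]
```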
[0188] Next, the interpolation processing unit 17 calculates the
green color interpolation value G[i,j] as indicated below based
upon the results of the judgment explained above.
[0189] In case 1, G[i,j]=Gv45[i,j]: FIG. 4 S12
[0190] In case 2, G[i,j]=Gv[i,j]: FIG. 4 S13
[0191] In case 3, G[i,j]=Gv135[i,j]: FIG. 4 S14
[0192] In case 4, G[i,j]=(Gv45[i,j]+Gh45[i,j])/2: FIG. 4 S15
[0193] In case 5, G[i,j]=(Gv[i,j]+Gh[i,j])/2: FIG. 4 S16
[0194] In case 6, G[i,j]=(Gv135[i,j]+Gh135[i,j])/2: FIG. 4 S17
[0195] In case 7, G[i,j]=Gh45[i,j]: FIG. 4 S18
[0196] In case 8, G[i,j]=Gh[i,j]: FIG. 4 S19
[0197] In case 9, G[i,j]=Gh135[i,j]: FIG. 4 S20, with
Gv[i,j]=(G[i,j-1]+G[i,j+1])/2+(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2·G[i-1,j]-G[i-1,j-2]-G[i-1,j+2]+2·G[i+1,j]-G[i+1,j-2]-G[i+1,j+2])/16 formula 43
Gv45[i,j]=(G[i,j-1]+G[i,j+1])/2+(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2·Z[i-1,j+1]-Z[i-1,j-1]-Z[i-1,j+3]+2·Z[i+1,j-1]-Z[i+1,j-3]-Z[i+1,j+1])/16 formula 44
Gv135[i,j]=(G[i,j-1]+G[i,j+1])/2+(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/8+(2·Z[i-1,j-1]-Z[i-1,j-3]-Z[i-1,j+1]+2·Z[i+1,j+1]-Z[i+1,j-1]-Z[i+1,j+3])/16 formula 45
Gh[i,j]=(G[i-1,j]+G[i+1,j])/2+(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2·G[i,j-1]-G[i-2,j-1]-G[i+2,j-1]+2·G[i,j+1]-G[i-2,j+1]-G[i+2,j+1])/16 formula 46
Gh45[i,j]=(G[i-1,j]+G[i+1,j])/2+(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2·Z[i+1,j-1]-Z[i-1,j-1]-Z[i+3,j-1]+2·Z[i-1,j+1]-Z[i-3,j+1]-Z[i+1,j+1])/16 formula 47
Gh135[i,j]=(G[i-1,j]+G[i+1,j])/2+(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2·Z[i-1,j-1]-Z[i-3,j-1]-Z[i+1,j-1]+2·Z[i+1,j+1]-Z[i-1,j+1]-Z[i+3,j+1])/16 formula 48
[0198] FIG. 8 shows the positions of the color information used to
calculate the green color interpolation value G[i,j]. In FIG. 8,
the color information at the circled pixels is used as a
contributing factor in the curvature information that constitutes
the green color interpolation value G[i,j].
[0199] In each of formulae 43.about.48, the first term constitutes
the "local average information of the green color component" and
corresponds to the primary terms in formulae 1 and 2. The second
term represents the "local curvature information based upon a color
component matching a color component at the interpolation target
pixel" and the third term represents the "local curvature
information based upon a color component other than the color
component at the interpolation target pixel." It is to be noted
that the curvature information in the second and third terms is
obtained through quadratic differentiation of the color components.
To explain this point by referring to formula 44: in the second
term of formula 44, the difference between the color information
Z[i,j] and the color information Z[i,j-2] and the difference
between the color information Z[i,j+2] and the color information
Z[i,j] are obtained, and then the difference between these
differences is ascertained. In the third term, the difference
between the color information Z[i-1,j+1] and the color information
Z[i-1,j-1] and the difference between the color information
Z[i-1,j+3] and the color information Z[i-1,j+1] are obtained, with
the difference between these differences then ascertained;
likewise, the difference between the color information Z[i+1,j-1]
and the color information Z[i+1,j-3] and the difference between the
color information Z[i+1,j+1] and the color information Z[i+1,j-1]
are obtained, with the difference between these differences then
ascertained.
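As a concrete instance, formulae 43 and 44 can be written out as follows; an illustrative Python sketch (function names hypothetical) assuming the green color information and the red/blue color information are 2-D arrays G and Z over the Bayer data:

```python
def gv(G, Z, i, j):
    """Formula 43: local G average corrected by two curvature terms."""
    average = (G[i][j-1] + G[i][j+1]) / 2
    # second term: curvature of the color component present at [i,j]
    same_color = (2*Z[i][j] - Z[i][j-2] - Z[i][j+2]) / 8
    # third term: curvature of the green component at adjacent pixels
    other_color = (2*G[i-1][j] - G[i-1][j-2] - G[i-1][j+2]
                   + 2*G[i+1][j] - G[i+1][j-2] - G[i+1][j+2]) / 16
    return average + same_color + other_color

def gv45(G, Z, i, j):
    """Formula 44: the third term uses the opposite color component
    along the diagonal 45-degree direction."""
    average = (G[i][j-1] + G[i][j+1]) / 2
    same_color = (2*Z[i][j] - Z[i][j-2] - Z[i][j+2]) / 8
    other_color = (2*Z[i-1][j+1] - Z[i-1][j-1] - Z[i-1][j+3]
                   + 2*Z[i+1][j-1] - Z[i+1][j-3] - Z[i+1][j+1]) / 16
    return average + same_color + other_color
```

Both curvature terms vanish over flat image data, leaving the interpolation value equal to the local green average, which matches the role of the first term described above.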
[0200] In Gv45[i,j], the "local curvature information based upon a
color component matching the color component at the interpolation
target pixel" is local curvature information with directionality
manifesting along the vertical direction, and the "local curvature
information based upon a color component other than the color
component at the interpolation target pixel" is local curvature
information with directionality manifesting along the vertical
direction and the diagonal 45 direction. In Gv135[i,j], the "local
curvature information based upon a color component matching the
color component at the interpolation target pixel" is local
curvature information with directionality manifesting along the
vertical direction, and the "local curvature information based upon
a color component other than the color component at the
interpolation target pixel" is local curvature information with
directionality manifesting along the vertical direction and the
diagonal 135.degree. direction. In Gh45[i,j], the "local curvature
information based upon a color component matching the color
component at the interpolation target pixel" is local curvature
information with the directionality manifesting along the
horizontal direction, and the "local curvature information based
upon a color component other than the color component at the
interpolation target pixel" is local curvature information with
directionality manifesting along the horizontal direction and the
diagonal 45.degree. direction. In Gh135[i,j], the "local curvature
information based upon a color component matching the color
component at the interpolation target pixel" is local curvature
information with directionality manifesting along the horizontal
direction, and the "local curvature information based upon a color
component other than the color component at the interpolation
target pixel" is local curvature information with directionality
manifesting along the horizontal direction and the diagonal
135.degree. direction.
[0201] In addition, the "local curvature information based upon a
color component matching the color component at the interpolation
target pixel" and the "local curvature information based upon a
color component other than the color component at the interpolation
target pixel" in Gv[i,j] are both local curvature information with
directionality manifesting along the vertical direction, and the
"local curvature information based upon a color component matching
the color component at the interpolation target pixel" and the
"local curvature information based upon a color component other
than the color component at the interpolation target pixel" in
Gh[i,j] are both local curvature information with directionality
manifesting along the horizontal direction.
[0202] In other words, in the first embodiment, the local average
information of the green color component is corrected by using the
"local curvature information based upon a color component matching
the color component at the interpolation target pixel" and the
"local curvature information based upon a color component other
than the color component at the interpolation target pixel."
[0203] For instance, when marked similarity manifests along the
diagonal directions and the green color interpolation value is
calculated by using Gv45[i,j], Gv135[i,j], Gh45[i,j] and Gh135[i,j]
(case 1, case 3, case 4, case 6, case 7 or case 9), the local
average information of the green color component (the primary term)
is corrected by using the local curvature information based upon
the red color component and the local curvature information based
upon the blue color component at phases that are opposite from each
other. In such a case, color information in the individual color
components to be used to calculate the local curvature information
corresponding to each color component is obtained from pixels that
are present on both sides of a line drawn along a direction judged
to manifest marked similarity.
[0204] Thus, even when the color information corresponding to the
red color component and the color information corresponding to the
blue color component are offset relative to the color information
corresponding to the green color component due to magnification
chromatic aberration, as illustrated in FIG. 9A (equivalent to a
drawing achieved by superimposing FIG. 18B on FIG. 18C), the
primary term is corrected in correspondence to the average quantity
of change in the color information corresponding to the red color
component and the color information corresponding to the blue color
component. As a result, by adopting the first embodiment, the
primary term can be corrected for a desired pixel even if there is
magnification chromatic aberration at the photographic optical
system 12, with the overshoot and the undershoot occurring as a
result of the G interpolation processing disclosed in U.S. Pat. No.
5,629,734 canceled out by each other. Consequently, the occurrence
of color artifacts attributable to over correction can be reduced
in the first embodiment.
[0205] It is to be noted that while an overshoot may also occur
when correcting a primary term constituted of color information
corresponding to the blue color component as well as when
correcting a primary term constituted of color information
corresponding to the red color component, overshoot values
corresponding to the individual color components are averaged in
the first embodiment and thus, the average value does not exceed an
overshoot value resulting from the G interpolation processing
disclosed in U.S. Pat. No. 5,629,734. In addition, even if an
undershoot occurs when correcting a primary term constituted of
color information corresponding to the blue color component or
correcting a primary term constituted of color information
corresponding to the red color component, the undershoot value in
the first embodiment never exceeds the undershoot value resulting
from the G interpolation processing disclosed in U.S. Pat. No.
5,629,734.
[0206] In the first embodiment, the image data to undergo the G
interpolation processing are arranged in a Bayer array as shown in
FIGS. 2A and 2B, with the color information corresponding to the
red color component and the color information corresponding to the
blue color component positioned diagonally to each other. Thus, if
color information corresponding to the blue color component is
provided at the interpolation target pixel, for instance, the local
curvature information based upon the red color component to be used
to correct the primary term is calculated by using color
information corresponding to the red color component at pixels
positioned along a diagonal direction along which marked similarity
to the interpolation target pixel manifests. In addition, the green
color interpolation value is calculated by using the color
information at pixels set along a diagonal direction distanced from
the interpolation target pixel, such as Z[i-1,j+3] and Z[i+1,j-3]
in formula 44, Z[i-1,j-3] and Z[i+1,j+3] in formula 45, Z[i+3,j-1]
and Z[i-3,j+1] in formula 47 and Z[i-3,j-1] and Z[i+3,j+1] in
formula 48.
[0207] As a result, in the G interpolation processing in the first
embodiment which requires a highly accurate judgment on the
diagonal similarity, the interpolation processing unit 17 achieves
a high degree of accuracy in the judgment of the diagonal
similarity by using a plurality of sets of color information when
calculating a plurality of types of similarity degree components
along the diagonal 45.degree. direction and the diagonal
135.degree. direction.
[0208] In other words, the accuracy of the interpolation processing
is improved through a highly accurate judgment on the diagonal
similarity in the first embodiment.
[0209] In addition, if marked similarity manifests along the
vertical direction or the horizontal direction and thus the green
interpolation value is calculated using Gv[i,j] or Gh[i,j] (case 2
or case 8), local curvature information based upon the green color
component is used as the "local curvature information based upon a
color component other than the color component at the interpolation
target pixel" and the local average information of the green color
component is corrected by using local curvature information based
upon the red color component or the blue color component and local
curvature information based upon the green color component.
[0210] Under normal circumstances, due to the effect of
magnification chromatic aberration, the red color component, the
green color component and the blue color component may have
relative positional offsets in order of wavelength, i.e. in the
order red, green, blue, with the green color component positioned
between the red color component and the blue color component. Thus,
if color information corresponding to the
red color component is provided at the interpolation target pixel,
the local curvature information based upon the green color
component can be used as a component at a phase opposite from the
phase of the local curvature information based upon the red color
component to reduce the occurrence of color artifacts resulting
from over correction. Also, in the same manner, if color
information corresponding to the blue color component is provided
at the interpolation target pixel, the local curvature information
based upon the green color component can be used as a component at
a phase opposite from the phase of the local curvature information
based upon the blue color component to reduce the occurrence of
color artifacts resulting from over correction.
[0211] In the following explanation of the RB interpolation
processing operation, the RB interpolation processing implemented
in the prior art is first described and then the R interpolation
processing in FIG. 5 in the RB interpolation processing implemented
in the first embodiment is explained (an explanation of the B
interpolation processing is omitted).
[0212] A known example of the RB interpolation processing in the
prior art is linear interpolation processing implemented in a color
difference space in which after calculating color differences at
all the pixels (values each obtained by subtracting the value
indicated by color information corresponding to the green color
component from the value indicated by color information
corresponding to the red color component (or the blue color
component)), one of the three different types of processing
(1).about.(3) described below is implemented on each interpolation
target pixel to calculate the interpolation value.
[0213] (1) If a color component missing at the interpolation target
pixel is present at the two pixels adjacent to the interpolation
target pixel along the vertical direction, the interpolation target
value is calculated as a value achieved by adding the color
information corresponding to the green color component at the
interpolation target pixel to the average of the color differences
at the two pixels.
[0214] (2) If a color component missing at the interpolation target
pixel is present at the two pixels adjacent to the interpolation
target pixel along the horizontal direction, the interpolation
target value is calculated as a value achieved by adding the value
indicated by the color information corresponding to the green color
component at the interpolation target pixel to the average of the
color differences at the two pixels.
[0215] (3) If a color component missing at the interpolation target
pixel is present at the four pixels adjacent to the interpolation
target pixel along the diagonal directions, the interpolation
target value is calculated as a value achieved by adding the value
indicated by the color information corresponding to the green color
component at the interpolation target pixel to the average of the
color differences at the four pixels.
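The three cases above all reduce to averaging color differences over whichever neighbors carry the missing component; an illustrative sketch (names hypothetical), assuming the color-difference plane diff = R - G (or B - G) has already been formed at every pixel carrying that component and that, as in the formulae of this text, [i,j±1] are the vertical neighbors:

```python
def rb_interpolate(diff, G_full, i, j, neighbor_case):
    """Prior-art linear interpolation in a color-difference space.

    diff holds R-G (or B-G) where the component exists; G_full is
    the fully interpolated green plane; neighbor_case selects which
    of the adjacency patterns (1)-(3) applies at pixel [i,j].
    """
    if neighbor_case == 1:    # component present above and below
        d = (diff[i][j-1] + diff[i][j+1]) / 2
    elif neighbor_case == 2:  # component present left and right
        d = (diff[i-1][j] + diff[i+1][j]) / 2
    else:                     # component present at the four diagonals
        d = (diff[i-1][j-1] + diff[i-1][j+1]
             + diff[i+1][j-1] + diff[i+1][j+1]) / 4
    # add the green value back to return to the R (or B) plane
    return G_full[i][j] + d
```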
[0216] In addition, interpolation processing that incorporates
nonlinear median processing, which is more effective in preventing
color artifacts compared to linear processing in a color difference
space is also implemented in the prior art.
[0217] In the art disclosed in U.S. Pat. No. 5,799,113, in which
video signals provided in one of the following colorimetric
systems, RGB, YUV and YCbCr, undergo culled compression at a
resolution of 1/4 to reduce the transmission volume, nonlinear
median processing is implemented to restore the video signals to
the original resolution by interpolating 3-component data at the
culled pixels which have been lost. For instance, if the video
signals are provided in the YCbCr colorimetric system, the
interpolation values for the luminance component Y and the
interpolation values corresponding to the color components Cb and
Cr at the culled pixels marked O, .DELTA. and X in FIGS.
10A.about.10C are calculated through identical arithmetic
processing. It is to be noted that in order to retain the structure
at an edge, the pixel marked X alone is interpolated by using the
median of the values at the four nearby pixels, the
pixels marked O are each interpolated by using the average of the
values at the pixels adjacent along the horizontal direction and
the pixels marked .DELTA. are each interpolated by using the
average of the values at the pixels adjacent along the vertical
direction.
[0218] However, while the interpolation processing implemented as
described above is effective in restoring the image quality in a
dynamic image, it is not suited for processing a still image that
requires high definition. Namely, the art disclosed in U.S. Pat.
No. 5,799,113, in which the luminance component Y and the color
components Cr and Cb are handled in exactly the same way, achieves
only a very low degree of accuracy with regard to the interpolation
values for the luminance component Y which determines the
resolution. In addition, since the luminance component Y is
interpolated by using the median, the likelihood of the image
structure becoming lost is high. Furthermore, there is a concern
that color artifacts may spread when the data are converted to the
RGB colorimetric system.
[0219] In an electronic camera which employs an image-capturing
sensor constituted by arranging R, G and B color filters in a Bayer
array to generate a still image, the interpolation processing on
the green color component which is equivalent to the luminance
component with a high spatial frequency (G interpolation
processing) can be implemented with a very high degree of accuracy
by using similarity manifesting between the interpolation target
pixel and nearby pixels and calculating the interpolation value
using a plurality of color components, as explained earlier. In
such an electronic camera, after implementing high-definition
interpolation processing on the green color component, which most
faithfully reflects the high-frequency information in the image
data, the interpolation processing on the red color component and
the blue color component is achieved through linear interpolation
in color difference spaces relative to the green color component to
reduce color artifacts by reflecting the high-frequency information
in the image data in the red color component and the blue color
component.
[0220] For instance, if the sets of color information at individual
pixels are arranged one-dimensionally in the order of (R1, G2, R3),
the red color interpolation value is calculated through;
R2=(R1+R3)/2+(2·G2-G1-G3)/2 formula 49.
[0221] In the formula, G2 represents color information
corresponding to the green color component in the original image
and G1 and G3 each represent a green color interpolation value
obtained through the G interpolation processing.
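As a numerical check of formula 49, the sketch below (illustrative, not from the patent) shows that the formula is exactly the average of the two color differences R1-G1 and R3-G3 added back onto G2:

```python
def interp_red_1d(r1, r3, g1, g2, g3):
    # Formula 49: R2 = (R1+R3)/2 + (2*G2 - G1 - G3)/2.
    # g2 is the green value at the target pixel; g1 and g3 are the
    # green interpolation values at the neighboring R pixels.
    return (r1 + r3) / 2 + (2 * g2 - g1 - g3) / 2

def interp_red_colordiff(r1, r3, g1, g2, g3):
    # Equivalent form: average the color differences, then add G2 back.
    return ((r1 - g1) + (r3 - g3)) / 2 + g2
```

The algebraic equivalence of the two forms is why this is described as linear interpolation in a color difference space relative to the green color component.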
[0222] However, this RB interpolation processing poses a problem in
that the color artifact is allowed to remain in the vicinity of an
isolated point (an image area manifesting only slight similarity to
nearby pixels and having a high spatial frequency). In the prior
art, this type of color artifact is often eliminated in post
processing, in which a and b hue planes obtained by converting the
image data to the Lab colorimetric system individually undergo
median filtering after the G interpolation processing and the RB
interpolation processing are implemented.
[0223] When applying such a median filter, a 3×3 (=9 points) filter
size achieves hardly any effect, so the filter size must be set over
a large range of 5×5 (=25 points).
[0224] In other words, in the electronic camera described above,
extremely heavy processing must be implemented since both the RB
interpolation processing in the prior art and the median processing
must be performed in the interpolation processing on the red color
component and the blue color component in a still image and also
the filter size must be set over a wide range for the median
processing. Furthermore, the risk of the fine structure in a
colored area (hereafter referred to as a "color structure") being
lost is higher when the filter size in the median processing is
increased.
[0225] Accordingly, in the first embodiment, RB interpolation
processing through which red color and blue color interpolation
values can be calculated quickly with a high degree of accuracy
without allowing any color artifacts to remain in the vicinity of
an isolated point or losing the color structure is proposed. It is
to be noted that the following is an explanation of only the R
interpolation processing in the RB interpolation processing, given
in reference to FIG. 5.
[0226] First, the interpolation processing unit 17 calculates a
color difference that contains the red color component for each
pixel at which color information corresponding to the red color
component is present by subtracting the green color interpolation
value (the value obtained through the G interpolation processing
explained earlier) from the value indicated by the color
information corresponding to the red color component (FIG. 5
S1).
[0227] For instance, the interpolation processing unit 17
calculates a color difference Cr[i,j] containing the red color
component at a pixel at given coordinates [i,j] with color
information corresponding to the red color component as;
Cr[i,j]=R[i,j]-G[i,j] formula 50.
[0228] It is to be noted that in the first embodiment, when the
color differences containing the red color component have been
calculated as described above, the color differences containing the
red color component are set so as to surround pixels at which color
information corresponding to the red color component is missing and
color information corresponding to the blue color component is
present from the four diagonal directions.
[0229] The interpolation processing unit 17 interpolates the color
difference containing the red color component for each of the
pixels surrounded by color differences containing the red color
component from the four diagonal directions (each pixel at which
color information corresponding to the red color component is
missing and color information corresponding to the blue color
component is present in the first embodiment) by using the median
of the color differences containing the red color component at the
pixels set diagonally to the target pixel (FIG. 5 S2).
[0230] Namely, in the first embodiment, the interpolation
processing unit 17 calculates the color difference Cr[m,n] at the
pixel at given coordinates [m,n] surrounded by color differences
containing the red color component from the four diagonal
directions as shown in FIG. 11A through;
Cr[m,n]=median{Cr[m-1,n-1],Cr[m+1,n-1],
Cr[m-1,n+1],Cr[m+1,n+1]} formula 51.
[0231] In the formula, median{} represents a function through which
the median of a plurality of elements is calculated and, if there
are an even number of elements, it takes the average of the two
middle elements.
[0232] In the first embodiment, when the color differences
containing the red color component have been calculated through
formulae 50 and 51, the color differences containing the red color
component are set so as to surround pixels at which color
information corresponding to the red color component and color
information corresponding to the blue color component are both
missing from the four directions; i.e., from above, from below and
from the left and the right.
[0233] The interpolation processing unit 17 interpolates the color
difference containing the red color component for each of the
pixels surrounded by color differences containing the red color
component from the four directions; i.e., from above, from below
and from the left and the right (each pixel at which color
information corresponding to the red color component and color
information corresponding to the blue color component are both
missing in the first embodiment) by using the median of the color
differences containing the red color component at the pixels set
above, below and to the left and the right of the pixel (FIG. 5
S3).
[0234] Namely, in the first embodiment, the interpolation
processing unit 17 calculates the color difference Cr[m,n] at the
pixel at given coordinates [m,n] surrounded by color differences
containing the red color component from the four directions; i.e.,
from above, from below and from the left and the right as shown in
FIG. 11B through; Cr[m,n]=median{Cr[m,n-1],Cr[m-1,n],
Cr[m+1,n],Cr[m,n+1]} formula 52.
[0235] Next, the interpolation processing unit 17 converts the
color difference containing the red color component calculated
through formula 51 or formula 52 for each pixel at which color
information corresponding to the red color component is missing to
a red color interpolation value by using color information
corresponding to the green color component (or the green color
interpolation value) (FIG. 5 S4).
[0236] Namely, the interpolation processing unit 17 calculates the
red color interpolation value R[m,n] for the pixel at given
coordinates [m,n] through;
R[m,n]=Cr[m,n]+G[m,n] formula 53.
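Steps S1 through S4 above (formulae 50 through 53) can be summarized per pixel in a short sketch. This is an illustration under assumed inputs, not the patent's implementation; neighborhood gathering and border handling are omitted.

```python
def color_diff_cr(r, g):
    # S1 (formula 50): color difference at a pixel where R is present.
    return r - g

def median4(values):
    # median{} with four elements: average of the two middle values.
    m = sorted(values)
    return (m[1] + m[2]) / 2

def interp_cr_diagonal(cr_nw, cr_ne, cr_sw, cr_se):
    # S2 (formula 51): pixel surrounded by Cr values from the four
    # diagonal directions (a B-position pixel in the Bayer array).
    return median4([cr_nw, cr_ne, cr_sw, cr_se])

def interp_cr_cross(cr_n, cr_w, cr_e, cr_s):
    # S3 (formula 52): pixel surrounded by Cr values from above,
    # below, left and right.
    return median4([cr_n, cr_w, cr_e, cr_s])

def to_red(cr, g):
    # S4 (formula 53): convert the color difference back to a red value.
    return cr + g
```

Note that the median is taken only on color differences, never on the green (luminance-like) component, which is what preserves the image structure.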
[0237] The median processing described above is implemented on the
color differences representing the hue alone and is not implemented
on the luminance component. In addition, when the pixel marked O in
FIG. 12A is the interpolation target pixel in the R interpolation
processing, the color differences containing the red color
component at the pixels marked X are calculated by using the color
differences Cr over a 3.times.5 range, and thus, the color
difference containing the red color component at the pixel marked O
represents a value which is close to the results of median
processing implemented by weighting the color differences Cr within
the 3.times.5 range. When the pixel marked .DELTA. in FIG. 12B is
the interpolation target pixel, on the other hand, the color
differences containing the red color component at the pixels marked
X are calculated by using the color differences Cr over a 5.times.3
range, and thus, the color difference containing the red color
component at the pixel marked .DELTA. represents a value which is
close to the results of median processing implemented by weighting
the color differences Cr within the 5.times.3 range.
[0238] In other words, in the first embodiment, advantages
substantially similar to those achieved through median processing
implemented over a wide range are achieved while keeping down the
filter size. As a result, by adopting the first embodiment, the
occurrence of color artifacts around an isolated point is reduced
without destroying the color structure. Thus, a great improvement
is achieved in the color artifact reduction effect over the art
disclosed in U.S. Pat. No. 5,799,113.
[0239] In addition, since the color differences at only four points
are each used in the median processing in FIG. 5 S2 and FIG. 5 S3
in the first embodiment, good processing efficiency is achieved and
extremely fast median processing is enabled.
[0240] It is to be noted that while the RB interpolation processing
is implemented after the G interpolation processing in the first
embodiment, RB interpolation processing similar to that in the
embodiment can be implemented without having to perform G
interpolation processing on image data provided in the YCbCr
colorimetric system with Y, Cb and Cr culled at a ratio of 4:2:0
since the luminance component Y is left intact in the image
data.
Second Embodiment
[0241] The following is an explanation of the operation achieved in
the second embodiment.
[0242] It is to be noted that since the RB interpolation processing
in the second embodiment is implemented as in the first embodiment,
its explanation is omitted.
[0243] In the following explanation of the G interpolation
processing, a description of the operating details identical to
those in the first embodiment is omitted. It is to be noted that
the difference between the G interpolation processing in the second
embodiment and the G interpolation processing in the first
embodiment is in the values of Gv[i,j], Gv45[i,j], Gv135[i,j],
Gh[i,j], Gh45[i,j] and Gh135[i,j] used when calculating the green
interpolation value G[i,j]. For this reason, the flowchart of the
operation in the interpolation processing unit 17 during the G
interpolation processing is not provided for the second embodiment.
In addition, while an explanation is given below on an assumption
that the red color component is present at the interpolation target
pixel as shown in FIG. 2A, the second embodiment may be adopted
when implementing processing on an interpolation target pixel at
which the blue color component is present, as shown in FIG. 2B.
[0244] The interpolation processing unit 17 ascertains the degrees
of similarity manifesting at the interpolation target pixel as in
the first embodiment (corresponds to FIG. 4 S1.about.S11) and
classifies the type of the similarity at the interpolation target
pixel as one of cases 1.about.9 explained earlier. Then, the
interpolation processing unit 17 calculates the green color
interpolation value G[i,j] as indicated below.
[0245] In case 1, G[i,j]=Gv45[i,j]
[0246] In case 2, G[i,j]=Gv[i,j]
[0247] In case 3, G[i,j]=Gv135[i,j]
[0248] In case 4, G[i,j]=(Gv45[i,j]+Gh45[i,j])/2
[0249] In case 5, G[i,j]=(Gv[i,j]+Gh[i,j])/2
[0250] In case 6, G[i,j]=(Gv135[i,j]+Gh135[i,j])/2
[0251] In case 7, G[i,j]=Gh45[i,j]
[0252] In case 8, G[i,j]=Gh[i,j]
[0253] In case 9, G[i,j]=Gh135[i,j], with
Gv[i,j]=gv[i,j]+βred·δRv[i,j]+βgreen·δGv[i,j] formula 54
Gv45[i,j]=gv[i,j]+αred·δRv45[i,j]+αgreen·δGv[i,j]+αblue·δBv45[i,j] formula 55
Gv135[i,j]=gv[i,j]+αred·δRv135[i,j]+αgreen·δGv[i,j]+αblue·δBv135[i,j] formula 56
Gh[i,j]=gh[i,j]+βred·δRh[i,j]+βgreen·δGh[i,j] formula 57
Gh45[i,j]=gh[i,j]+αred·δRh45[i,j]+αgreen·δGh[i,j]+αblue·δBh45[i,j] formula 58
Gh135[i,j]=gh[i,j]+αred·δRh135[i,j]+αgreen·δGh[i,j]+αblue·δBh135[i,j] formula 59
[0254] αred, αgreen, αblue, βred and βgreen in formulae 54~59 each
represent a constant which may be 0 or a positive value, and they
satisfy αred+αgreen+αblue=1 and βred+βgreen=1. In the formulae
above, gv[i,j] and gh[i,j] each constitute a term corresponding to
the "local average information of the green color component" and are
equivalent to the primary term in formula 1 or formula 2, and
δRv45[i,j], δRv[i,j], δRv135[i,j], δRh45[i,j], δRh[i,j],
δRh135[i,j], δGv[i,j], δGh[i,j], δBv45[i,j], δBv135[i,j],
δBh45[i,j] and δBh135[i,j] each represent a term corresponding to
the local curvature information in the corresponding color
component.
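The structure shared by formulae 54~59, a local green average corrected by a weighted mix of curvature terms, can be sketched as below. The default weights 0.5:0:0.5 correspond to the ratio αred:αgreen:αblue=1:0:1 cited later as equivalent to the first embodiment; the function name and defaults are illustrative assumptions.

```python
def gv45_candidate(gv_avg, d_rv45, d_gv, d_bv45,
                   a_red=0.5, a_green=0.0, a_blue=0.5):
    # Formula 55 structure: Gv45 = gv + a_red*dRv45 + a_green*dGv
    # + a_blue*dBv45, with the alpha weights (0 or positive)
    # summing to 1.
    assert abs(a_red + a_green + a_blue - 1.0) < 1e-9
    return gv_avg + a_red * d_rv45 + a_green * d_gv + a_blue * d_bv45
```

Changing the weight ratios, as in examples 1 through 5 below, changes which color planes supply the correctional term without altering this overall structure.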
[0255] It is to be noted that the local average information of the
green color component and the local curvature information based
upon the individual color components are calculated as indicated
below, depending upon the direction along which similarity
manifests.
[0256] (local average information of the green color component)
gv[i,j]=(G[i,j-1]+G[i,j+1])/2 formula 60
gh[i,j]=(G[i-1, j]+G[i+1,j])/2 formula 61
Local Curvature Information Based Upon the Red Color Component
δRv45[i,j]=kr1(2·Z[i-2,j+2]-Z[i-2,j]-Z[i-2,j+4])/4+kr2(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/4+kr3(2·Z[i+2,j-2]-Z[i+2,j-4]-Z[i+2,j])/4 formula 62
δRv[i,j]=kr1(2·Z[i-2,j]-Z[i-2,j-2]-Z[i-2,j+2])/4+kr2(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/4+kr3(2·Z[i+2,j]-Z[i+2,j-2]-Z[i+2,j+2])/4 formula 63
δRv135[i,j]=kr1(2·Z[i-2,j-2]-Z[i-2,j-4]-Z[i-2,j])/4+kr2(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/4+kr3(2·Z[i+2,j+2]-Z[i+2,j]-Z[i+2,j+4])/4 formula 64
δRh45[i,j]=kr1(2·Z[i+2,j-2]-Z[i,j-2]-Z[i+4,j-2])/4+kr2(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/4+kr3(2·Z[i-2,j+2]-Z[i-4,j+2]-Z[i,j+2])/4 formula 65
δRh[i,j]=kr1(2·Z[i,j-2]-Z[i-2,j-2]-Z[i+2,j-2])/4+kr2(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/4+kr3(2·Z[i,j+2]-Z[i-2,j+2]-Z[i+2,j+2])/4 formula 66
δRh135[i,j]=kr1(2·Z[i-2,j-2]-Z[i-4,j-2]-Z[i,j-2])/4+kr2(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/4+kr3(2·Z[i+2,j+2]-Z[i,j+2]-Z[i+4,j+2])/4 formula 67,
[0257] with kr1, kr2 and kr3 each representing a constant which may
be 0 or a positive value and satisfying kr1+kr2+kr3=1.
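For instance, formula 63 (the vertical curvature of the red plane) is a weighted sum of three second differences, one per R-bearing row. The sketch below is illustrative; the kr defaults 0.25:0.5:0.25 are simply one admissible choice satisfying kr1+kr2+kr3=1.

```python
def delta_rv(z, i, j, kr1=0.25, kr2=0.5, kr3=0.25):
    # Formula 63: weighted vertical second differences of the red
    # plane, taken on rows i-2, i and i+2. z is indexed z[i][j].
    t1 = (2 * z[i - 2][j] - z[i - 2][j - 2] - z[i - 2][j + 2]) / 4
    t2 = (2 * z[i][j] - z[i][j - 2] - z[i][j + 2]) / 4
    t3 = (2 * z[i + 2][j] - z[i + 2][j - 2] - z[i + 2][j + 2]) / 4
    return kr1 * t1 + kr2 * t2 + kr3 * t3
```

On locally flat data the result is zero, so the term only corrects the primary term where the red plane actually curves.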
Local Curvature Information Based Upon the Green Color
Component
δGv[i,j]=(2·G[i-1,j]-G[i-1,j-2]-G[i-1,j+2]+2·G[i+1,j]-G[i+1,j-2]-G[i+1,j+2])/8 formula 68
δGh[i,j]=(2·G[i,j-1]-G[i-2,j-1]-G[i+2,j-1]+2·G[i,j+1]-G[i-2,j+1]-G[i+2,j+1])/8 formula 69
Local Curvature Information Based Upon the Blue Color Component
δBv45[i,j]=(2·Z[i-1,j+1]-Z[i-1,j-1]-Z[i-1,j+3]+2·Z[i+1,j-1]-Z[i+1,j-3]-Z[i+1,j+1])/8 formula 70
δBv135[i,j]=(2·Z[i-1,j-1]-Z[i-1,j-3]-Z[i-1,j+1]+2·Z[i+1,j+1]-Z[i+1,j-1]-Z[i+1,j+3])/8 formula 71
δBh45[i,j]=(2·Z[i+1,j-1]-Z[i-1,j-1]-Z[i+3,j-1]+2·Z[i-1,j+1]-Z[i-3,j+1]-Z[i+1,j+1])/8 formula 72
δBh135[i,j]=(2·Z[i-1,j-1]-Z[i-3,j-1]-Z[i+1,j-1]+2·Z[i+1,j+1]-Z[i-1,j+1]-Z[i+3,j+1])/8 formula 73
[0258] It is to be noted that FIGS. 13 and 14 show the positions of
the color information used when calculating local curvature
information based upon the individual color components. Namely,
local curvature information corresponding to a given color
component is obtained through weighted addition of the components
of the curvature information calculated by using color information
at the pixels contained within the area enclosed by the oval in
FIG. 13 or 14.
[0259] In other words, δRv45[i,j] is local curvature information
with directionality manifesting along the vertical direction and the
diagonal 45° direction, δRv[i,j] is local curvature information with
directionality manifesting along the vertical direction,
δRv135[i,j] is local curvature information with directionality
manifesting along the vertical direction and the diagonal 135°
direction, δRh45[i,j] is local curvature information with
directionality manifesting along the horizontal direction and the
diagonal 45° direction, δRh[i,j] is local curvature information with
directionality manifesting along the horizontal direction and
δRh135[i,j] is local curvature information with directionality
manifesting along the horizontal direction and the diagonal 135°
direction.
[0260] In addition, δGv[i,j] is local curvature information with
directionality manifesting along the vertical direction and
δGh[i,j] is local curvature information with directionality
manifesting along the horizontal direction.
[0261] δBv45[i,j] is local curvature information with
directionality manifesting along the vertical direction and the
diagonal 45° direction, δBv135[i,j] is local curvature information
with directionality manifesting along the vertical direction and the
diagonal 135° direction, δBh45[i,j] is local curvature information
with directionality manifesting along the horizontal direction and
the diagonal 45° direction and δBh135[i,j] is local curvature
information with directionality manifesting along the horizontal
direction and the diagonal 135° direction.
[0262] It is to be noted that the first embodiment explained
earlier is equivalent to a situation in which the ratios of the
coefficients in formulae 54.about.59 and formulae 62.about.67 in
the second embodiment are set at;
αred:αgreen:αblue=1:0:1, βred:βgreen=1:1 and kr1:kr2:kr3=0:1:0.
[0263] In the second embodiment, G interpolation processing with
varying characteristics and achieving various advantages can be
realized by setting different ratios for these coefficients. The
following is an explanation of typical examples of ratio setting
for the coefficients and features and advantages of the individual
examples.
Example 1
αred:αgreen:αblue=1:0:1, βred:βgreen=1:1 and kr1:kr2:kr3=1:6:1.
[0264] By setting these ratios, the method of calculating the local
curvature information based upon the red color component is changed
from that adopted in the first embodiment, and the local curvature
information based upon the red color component extracted from a
wider range than in the first embodiment is made to undergo mild
low pass filtering as appropriate while taking into consideration
the directionality. Thus, by adopting the settings in example 1,
the overall effect for reducing over correction is improved over
the first embodiment.
Example 2
αred:αgreen:αblue=1:1:0, βred:βgreen=1:1 and kr1:kr2:kr3=0:1:0.
[0265] Through these settings, the local curvature information
based upon the blue color component which prevents an over
correction attributable to the local curvature information based
upon the red color component when similarity manifests along the
diagonal direction in the first embodiment is all substituted with
local curvature information based upon the green color component.
Thus, by adopting the settings in example 2, the need for making a
judgment with regard to similarity manifesting along the diagonal
directions is eliminated to simplify the algorithm and, at the same
time, the extraction of structural information at a sufficient
level is achieved while preventing over correction.
Example 3
αred:αgreen:αblue=0:1:0, βred:βgreen=0:1 and kr1:kr2:kr3=setting not required
[0266] By adopting these settings, the local curvature information
based upon the green color component used to prevent over
correction of the red color component in example 2 is now used as
the main element in the correctional term. The structural
information can be extracted even when the correctional term is
constituted only of the curvature information based upon the green
color component. This means that the curvature information based
upon the green color component, too, contains a great deal of
structural information equivalent to the curvature information
based upon the red color component that passes through the center.
In addition, by adopting the settings in example 3, a correction is
performed with local curvature information based upon the same
color component, i.e., the green color component, as the color
component of the average information constituting the primary term.
Thus, through the settings in example 3, no over correction occurs
and the need for performing a judgment with regard to similarity
manifesting along the diagonal directions is eliminated as in
example 2, resulting in simplification of the algorithm.
Example 4
αred:αgreen:αblue=0:0:1, βred:βgreen=0:1 and kr1:kr2:kr3=setting not required
[0267] The relationship of this example to the first embodiment is
similar to the relationship between example 2 and example 3: by
adopting these settings, the local curvature information based upon
the blue color component, which is used to prevent an over
correction attributable to the local curvature information based
upon the red color component, is now utilized as the main element in
the correctional term. While it is
not possible to prevent over correction attributable to local
curvature information based upon the blue color component through
the settings in example 4, advantages comparable to those in the
first embodiment are achieved with regard to the extractions of the
local structural information.
Example 5
αred:αgreen:αblue=1:1:1, βred:βgreen=1:1 and kr1:kr2:kr3=1:0:1
[0268] These settings represent an example of ratios of the
coefficients effective even when the local curvature information
based upon the red color component that passes through the center
is not used as a measure against over correction occurring at the
settings in example 4. By adopting the settings in example 5, the
degree of over correction attributable to local curvature
information based upon the blue color component can be reduced with
the local curvature information based upon the red color component
obtained from nearby pixels when similarity manifests along the
diagonal directions, while achieving the advantage of extracting
the local structural information as in example 3 and example 4.
Third Embodiment
[0269] The following is an explanation of the operation achieved in
the third embodiment.
[0270] It is to be noted that since the RB interpolation processing
in the third embodiment is implemented as in the first embodiment,
its explanation is omitted. However, in the third embodiment, color
differences containing the red color component are interpolated for
some of the pixels at which color information corresponding to the
green color component is present (equivalent to the pixels provided
with G color filters) through formula 51 presented earlier, and
color differences containing the red color component are
interpolated for the remaining pixels at which color information
corresponding to the green color component is present and for the
pixels at which color information corresponding to the blue color
component is present through formula 52.
[0271] The following is an explanation of the G interpolation
processing.
[0272] Since the closest pixels at which the green color component
(the closest green color component) is present are located along
the horizontal direction relative to the pixel to undergo the G
interpolation processing as shown in FIGS. 3A and 3B in the third
embodiment, it is not necessary to calculate the similarity degrees
or to judge the direction along which similarity manifests as
required in the first embodiment during the G interpolation
processing. However, the calculation of similarity degrees and
judgment with regard to the direction along which similarity
manifests may be performed along the diagonal 45.degree. direction
and the diagonal 135.degree. direction in which the second closest
pixels with green color component (the second closest green color
component) are present.
[0273] In the third embodiment, the interpolation processing unit
17 calculates the green color interpolation value G[i,j] through
the following formula 74 based upon the image data arranged as
shown in FIGS. 3A and 3B.
G[i,j]=(G[i-1,j]+G[i+1,j])/2+(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/8+(2·Z[i,j-1]-Z[i-2,j-1]-Z[i+2,j-1]+2·Z[i,j+1]-Z[i-2,j+1]-Z[i+2,j+1])/16 formula 74
[0274] In formula 74, the first term represents the "local average
information of the green color component", which is equivalent to
the primary terms in formula 1 and formula 2. In addition, while
the second term represents the "local curvature information based
upon a color component matching the color component at the
interpolation target pixel" and the third term represents the
"local curvature information based upon a color component other
than the color component at the interpolation target pixel", the
third term constitutes the "local curvature information based upon
the blue color component" if color information corresponding to the
red color component is present at the interpolation target pixel
(FIG. 3A), whereas the third term constitutes the "local curvature
information based upon the red color component" if color
information corresponding to the blue color component is present at
the interpolation target pixel (FIG. 3B).
[0275] In other words, the local average information of the green
color component (the primary term) is corrected by using the local
curvature information based upon the red color component and the
local curvature information based upon the blue color component at
phases opposite from each other in the third embodiment.
[0276] As a result, as in the first embodiment, even when the color
information corresponding to the red color component and the color
information corresponding to the blue color component are offset
relative to the color information corresponding to the green color
component due to a magnification chromatic aberration, the primary
term is corrected in correspondence to the average change in
quantity in the color information corresponding to the red color
component and the color information corresponding to the blue color
component in the third embodiment (see FIGS. 9A and 9B). Thus, even
when there is a magnification chromatic aberration at the
photographic optical system 12, the primary term can be corrected
for a desired pixel with the overshoot and undershoot occurring in
the G interpolation processing disclosed in U.S. Pat. No. 5,629,734
canceling out each other in the third embodiment. Consequently, the
occurrence of color artifacts due to over correction can be reduced
by adopting the third embodiment.
Fourth Embodiment
[0277] The following is an explanation of the operation achieved in
the fourth embodiment.
[0278] It is to be noted that since the RB interpolation processing
in the fourth embodiment is implemented as in the first embodiment,
its explanation is omitted.
[0279] In the following explanation of the G interpolation
processing, operating details identical to those in the first
embodiment are not explained. It is to be noted that the difference
between the G interpolation processing in the fourth embodiment and
the G interpolation processing in the first embodiment is in the
operation performed after a judgment is made with regard to degrees
of similarity manifested by the interpolation target pixel. For
this reason, a flowchart of the operation performed at the
interpolation processing unit 17 during the G interpolation
processing in the fourth embodiment is not provided.
[0280] The interpolation processing unit 17 ascertains degrees of
similarity manifesting by the interpolation target pixel as in the
first embodiment (corresponds to FIG. 4 S1.about.S11) and
classifies the type of similarity at the interpolation target pixel
as one of case 1.about.9 explained earlier.
[0281] Then, the interpolation processing unit 17 calculates the
inclination Gk[i,j] of the green color component and the
inclination Zk[i,j] of the red color component (or the blue color
component) relative to the direction perpendicular to the direction
judged to manifest marked similarity as indicated below.
[0282] In case 1,
Gk[i,j]=((G[i-1, j]+G[i,j-1])-(G[i,j+1]+G[i+1,j]))/2 formula 75
Zk[i,j]=((Z[i-2,j]+Z[i,j-2])-(Z[i,j+2]+Z[i+2,j]))/2 formula 76
[0283] In case 2,
Gk[i,j]=G[i,j-1]-G[i,j+1] formula 77
Zk[i,j]=Z[i,j-2]-Z[i,j+2] formula 78
[0284] In case 3,
Gk[i,j]=((G[i-1, j]+G[i,j+1])-(G[i,j-1]+G[i+1,j]))/2 formula 79
Zk[i,j]=((Z[i-2,j]+Z[i,j+2])-(Z[i,j-2]+Z[i+2,j]))/2 formula 80
[0285] In case 4, same as in case 1
[0286] In case 5,
Gk[i,j]=1, Zk[i,j]=1
[0287] In case 6, same as in case 3
[0288] In case 7, same as in case 1
[0289] In case 8,
Gk[i,j]=G[i-1,j]-G[i+1,j] formula 81
Zk[i,j]=Z[i-2,j]-Z[i+2,j] formula 82
[0290] In case 9, same as in case 3
[0291] Next, the interpolation processing unit 17 calculates the
green color interpolation value G[i,j] as follows.
[0292] In case 1, G[i,j]=Gvk[i,j]
[0293] In case 2, G[i,j]=Gvk[i,j]
[0294] In case 3, G[i,j]=Gvk[i,j]
[0295] In case 4, G[i,j]=(Gvk[i,j]+Ghk[i,j])/2
[0296] In case 5, G[i,j]=(Gvk[i,j]+Ghk[i,j])/2
[0297] In case 6, G[i,j]=(Gvk[i,j]+Ghk[i,j])/2
[0298] In case 7, G[i,j]=Ghk[i,j]
[0299] In case 8, G[i,j]=Ghk[i,j]
[0300] In case 9, G[i,j]=Ghk[i,j], with
Gvk[i,j]=(G[i,j-1]+G[i,j+1])/2+(Gk[i,j]/Zk[i,j])·(2·Z[i,j]-Z[i,j-2]-Z[i,j+2])/4 formula 83
[0301] and
Ghk[i,j]=(G[i-1,j]+G[i+1,j])/2+(Gk[i,j]/Zk[i,j])·(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/4 formula 84.
[0302] In formulae 83 and 84, the first term represents the "local
average information of the green color component" which is
equivalent to the primary term in formula 1 and formula 2. The
second term is the "local curvature information based upon a color
component matching the color component at the interpolation target
pixel" multiplied by a weighting coefficient (a value indicating
the correlation between the inclination Gk[i,j] of the green color
component and the inclination Zk[i,j] of the red color component
(or the blue color component): Gk[i,j]/Zk[i,j]), and is equivalent
to the correctional term.
[0303] Namely, in the fourth embodiment, the local average
information of the green color component is corrected by using the
"local curvature information based upon a color component matching
the color component at the interpolation target pixel" multiplied
by the weighting coefficient.
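Formula 83 with the case-2 inclinations (formulae 77 and 78) can be sketched as follows. The zero-division guard is an illustrative addition, not part of the patent text.

```python
def gvk(g, z, i, j):
    # Formula 83: vertical green estimate whose curvature term is
    # scaled by the inclination ratio Gk/Zk (formulae 77 and 78).
    gk = g[i][j - 1] - g[i][j + 1]     # inclination of green
    zk = z[i][j - 2] - z[i][j + 2]     # inclination of red (or blue)
    w = gk / zk if zk != 0 else 1.0    # guard against zk=0 (assumption)
    curv = (2 * z[i][j] - z[i][j - 2] - z[i][j + 2]) / 4
    return (g[i][j - 1] + g[i][j + 1]) / 2 + w * curv
```

When gk and zk have opposite signs the weight w is negative, so the correction flips direction; this is precisely the behavior at a color boundary described in paragraphs [0305] through [0307].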
[0304] Now, a problem that arises when calculating the correction
value by simply adding the "local curvature information based upon
a color component matching the color component at the interpolation
target pixel" without being multiplied by the weighting coefficient
to the "local average information of the green color component" is
discussed.
[0305] For instance, when color information corresponding to the
green color component and color information corresponding to the
red color component (or the blue color component) are provided as
indicated by .circle-solid. in FIG. 15 (when the values indicated
by the color information corresponding to the green color component
start to increase at a specific position and the values indicated
by the color information corresponding to the red color component
(or the blue color component) start to decrease at the same
position), the "local curvature information based upon a color
component matching the color component at the interpolation target
pixel" indicates a positive value. Thus, in such a case, if the
"local curvature information based upon a color component matching
the color component at the interpolation target pixel" is added
to the "local average information of the green color component"
without first multiplying it by the weighting coefficient, the
"local average information of the green color component", which
should be corrected along the negative direction, is instead corrected
in the positive direction as indicated by Δ in FIG. 15,
resulting in an overshoot.
[0306] In other words, when color information corresponding to the
green color component and the color information corresponding to
the red color component (or the blue color component) change in
opposite directions from a specific position at a color boundary,
an overshoot or an undershoot occurs if the correctional term is
calculated simply by adding the "local curvature information based
upon a color component matching the color component at the
interpolation target pixel" to the "local average information of
the green color component" without first multiplying the "local
curvature information based upon a color component matching the
color component at the interpolation target pixel" by the weighting
coefficient.
[0307] In the embodiment, if color information corresponding to the
green color component and color information corresponding to the
red color component (or the blue color component) are provided as
indicated by ● in FIG. 15, the sign of the inclination
Gk[i,j] corresponding to the green color component and the sign of
the inclination Zk[i,j] corresponding to the red color component
(or the blue color component) are opposite from each other
resulting in the weighting coefficient being a negative value.
Thus, the "local average information of the green color component"
is corrected in the desired direction as indicated by □
in FIG. 15, preventing an overshoot or an undershoot from
occurring.
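The contrast drawn in paragraphs [0305] through [0307] can be checked with a small numeric sketch. The values below are hypothetical, chosen so that the green color component starts to rise and the red (or blue) color component starts to fall at the interpolation target pixel, as in the situation of FIG. 15:

```python
# Hypothetical 1-D samples around the interpolation target pixel.
G_left, G_right = 10.0, 30.0        # nearest green neighbors (green rising)
Z_m2, Z_0, Z_p2 = 40.0, 40.0, 20.0  # red/blue at offsets -2, 0, +2 (falling)

avg = (G_left + G_right) / 2         # local average information
curv = (2 * Z_0 - Z_m2 - Z_p2) / 4   # local curvature information (positive)
Gk = G_left - G_right                # inclination per formula 81
Zk = Z_m2 - Z_p2                     # inclination per formula 82
w = Gk / Zk                          # weighting coefficient (negative here)

unweighted = avg + curv      # pushed in the wrong (positive) direction
weighted = avg + w * curv    # corrected in the intended (negative) direction
```

With these numbers the unweighted sum overshoots to 25.0, while the weighted correction yields 15.0, moving the estimate in the negative direction as the text describes.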
[0308] Consequently, the occurrence of color artifacts attributable
to over correction at a color boundary is reduced in the fourth
embodiment.
[0309] It is to be noted that while no restrictions are imposed
with regard to the value of the weighting coefficient in the fourth
embodiment, restrictions may be imposed to set the value of the
weighting coefficient within a specific range to ensure that the
correctional term does not become too large.
[0310] For instance, the range for the weighting coefficient may be
set to |Gk[i,j]/Zk[i,j]| ≤ 5.
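One way to impose such a restriction is to clamp the ratio, as sketched below. The zero-Zk guard (treating a flat red/blue inclination as "no correlation") and the default limit of 5 are additions for illustration; the patent text states only the bound itself.

```python
def weighting_coefficient(Gk, Zk, limit=5.0):
    """Gk/Zk clamped to [-limit, +limit] (sketch of paragraph [0310]).

    Returning 0 when Zk is 0 is an assumption; the patent does not
    specify the degenerate case.
    """
    if Zk == 0:
        return 0.0
    w = Gk / Zk
    return max(-limit, min(limit, w))
```

The clamp keeps the correctional term from growing without bound when the red/blue inclination Zk is much smaller than the green inclination Gk.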
Fifth Embodiment
[0311] The following is an explanation of the operation achieved in
the fifth embodiment.
[0312] It is to be noted that since the RB interpolation processing
in the fifth embodiment is performed as in the first embodiment,
its explanation is omitted. However, in the fifth embodiment, color
differences containing the red color component are interpolated for
some of the pixels at which color information corresponding to the
green color component is present through formula 51 and color
differences containing the red color component are interpolated for
the remaining pixels at which color information corresponding to
the green color component is present and for pixels at which color
information corresponding to the blue color component is present
through formula 52, as in the third embodiment.
[0313] Now, the G interpolation processing is explained.
[0314] In the fifth embodiment, the pixels closest to a pixel to
undergo the G interpolation processing at which color information
corresponding to the green color component is present lie along the
horizontal direction, as shown in FIGS. 3A and 3B, so interpolation
processing is achieved in the simplest manner by using the color
information at those horizontally adjacent pixels. Accordingly, the
green color interpolation value G[i,j] is calculated in the fifth
embodiment as in case 8 in the fourth embodiment.
[0315] Namely, the interpolation processing unit 17 calculates the
green color interpolation value G[i,j] through formula 85.
G[i,j]=(G[i-1,j]+G[i+1,j])/2+Gk[i,j]/Zk[i,j]·(2·Z[i,j]-Z[i-2,j]-Z[i+2,j])/4 formula 85,
[0316] with
Gk[i,j]=G[i-1,j]-G[i+1,j] formula 81
[0317] and
Zk[i,j]=Z[i-2,j]-Z[i+2,j] formula 82.
[0318] As indicated above, in the fifth embodiment, the local
average information of the green color component is corrected by
using the "local curvature information based upon a color component
matching the color component at the interpolation target pixel"
multiplied by the weighting coefficient (a value representing the
correlation between the inclination Gk[i,j] corresponding to the
green color component and the inclination Zk[i,j] corresponding to
the red color component (or the blue color component):
Gk[i,j]/Zk[i,j]) as in the fourth embodiment. As a result, the
occurrence of color artifacts attributable to over correction at a
color boundary can be reduced through the fifth embodiment.
[0319] It is to be noted that while an explanation is given above
in reference to the embodiments on an example in which the color
difference is used as a hue in the G interpolation processing and
the RB interpolation processing, G interpolation processing and RB
interpolation processing can be achieved in a similar manner by
using a color ratio or the like as a hue instead of a color
difference.
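The point of paragraph [0319], that either a color difference or a color ratio can serve as the hue, amounts to choosing between an additive and a multiplicative decomposition of the missing component. A minimal sketch (the function names are illustrative, not from the patent):

```python
# Hue as a color difference: Z = G + hue, so interpolating the hue
# plane and adding back G recovers Z.
def hue_diff(Z, G):
    return Z - G

# Hue as a color ratio: Z = G * hue (assumes G != 0).
def hue_ratio(Z, G):
    return Z / G
```

Either representation reconstructs Z exactly at pixels where both components are known; they differ only in how the hue plane behaves when interpolated across a boundary.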
[0320] While an explanation is given above in reference to the
individual embodiments on an example in which the curvature
information based upon each color component is calculated through
quadratic differentiation, the present invention is not limited to
this example, and curvature information may be obtained through
differentiation of a higher order. In other words, any method may
be adopted as long as the degree of change in the rate of change
occurring in each color component is ascertained.
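The quadratic differentiation referred to here is a discrete second difference taken at a stride of two pixels; the correctional term's numerator, 2·Z[i,j]−Z[i−2,j]−Z[i+2,j], is its negation, so it vanishes wherever Z changes at a constant rate. A sketch (the function name is illustrative):

```python
def curvature(z_m2, z_0, z_p2):
    """(2*z0 - z_m2 - z_p2)/4: negated discrete second difference at
    stride 2. Zero wherever the three samples lie on a straight line."""
    return (2 * z_0 - z_m2 - z_p2) / 4
```

A linear ramp such as 10, 20, 30 yields zero curvature, while a flat-then-falling boundary such as 40, 40, 20 yields a positive value, which is exactly the case discussed for FIG. 15.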
Sixth Embodiment
[0321] The following is an explanation of the operation achieved in
the sixth embodiment.
[0322] FIG. 16 is a functional block diagram representing the sixth
embodiment. In FIG. 16, the same reference numbers are assigned to
components achieving identical functions to those in the functional
block diagram in FIG. 1 to preclude the necessity for repeated
explanation of their structures.
[0323] The structure of an electronic camera 20 shown in FIG. 16
differs from that of the electronic camera 10 in FIG. 1 in that a
control unit 21 and an image processing unit 22 in FIG. 16 replace
the control unit 11 and the image processing unit 15 in FIG. 1,
with an interface unit 23 in FIG. 16 provided as an additional
component.
[0324] In addition, in FIG. 16, a personal computer 30 is provided
with a CPU 31, an interface unit 32, a hard disk 33, a memory 34, a
CD-ROM drive device 35 and a communication interface unit 36 with
the CPU 31 connected to the interface unit 32, the hard disk 33,
the memory 34, the CD-ROM drive device 35 and the communication
interface unit 36 via a bus.
[0325] It is to be noted that an interpolation processing program
(an interpolation processing program for executing interpolation
processing similar to that implemented at the interpolation
processing unit 17 in the various embodiments explained earlier)
recorded at a recording medium such as a CD-ROM 37 is pre-installed
at the personal computer 30 via the CD-ROM drive device 35. In
other words, the interpolation processing program is stored at the
hard disk 33 in an execution-ready state.
[0326] The following is an explanation of the operation achieved in
the sixth embodiment, given in reference to FIG. 16.
[0327] First, image data generated as in the electronic camera 10
shown in FIG. 1 are provided to the image processing unit 22 in the
electronic camera 20. The image data undergo image processing
(e.g., gradation conversion processing) other than interpolation
processing at the image processing unit 22, and the image data
having undergone the image processing are then recorded at the
recording unit 16 in an image file format.
[0328] This image file is provided to the personal computer 30 via
the interface unit 23.
[0329] Upon obtaining the image file via the interface unit 32, the
CPU 31 in the personal computer 30 executes the interpolation
processing program. The image data with resolutions corresponding
to the individual color components enhanced through the
interpolation processing then undergo image compression and the
like as necessary, are recorded at the hard disk 33 or the like and
are finally output as data in a calorimetric system corresponding
to the type of individual device connected, such as a display or a
printer.
[0330] Namely, interpolation processing similar to that achieved in
the embodiments explained earlier is implemented on the personal
computer 30 in the sixth embodiment.
[0331] While an explanation is given in reference to the sixth
embodiment on an example in which the interpolation processing
program is provided through a recording medium such as the CD-ROM
37, the recording medium that may be used is not limited to a
CD-ROM and any of various types of recording media including
magnetic tape and a DVD may be used instead.
[0332] In addition, programs may be provided via a transmission
medium such as a communication line 38, a typical example of which
is the Internet. In other words, the programs may first be
converted to signals embodied on a carrier wave over a transmission
medium and then transmitted. The personal computer 30 shown in FIG.
16 has such a function as well.
[0333] The personal computer 30 is provided with the communication
interface unit 36 that connects with the communication line 38. A
server computer 39, which provides the interpolation processing
program, has the interpolation processing program stored at a
recording medium such as an internal hard disk. The communication
line 38 may be a communication line for connection with the
Internet or for personal computer communication, or it may be a
dedicated communication line. The communication line 38 may be a
telephone line or a wireless telephone line for a mobile telephone
or the like.
[0334] It is to be noted that the interpolation processing program
according to the present invention that is executed within the
electronic camera 10 in FIG. 1 is normally installed in a ROM (not
shown) or the like at the time of camera production. However, the
ROM in which the interpolation processing program is installed may
be an overwritable ROM, and the electronic camera may then be
connected to a computer assuming a structure similar to that shown
in FIG. 16, to allow an upgrade program to be provided from a
recording medium such as a CD-ROM via the computer. Furthermore, an
upgrade program may be obtained via the Internet or the like as
described earlier.
* * * * *