U.S. patent application number 12/922817 was filed with the patent office on 2011-01-20 for image quality evaluation system, method and program.
This patent application is currently assigned to NEC CORPORATION. Invention is credited to Toru Yamada.
Application Number: 12/922817
Publication Number: 20110013830
Family ID: 41255116
Publication Date: 2011-01-20

United States Patent Application 20110013830
Kind Code: A1
Yamada; Toru
January 20, 2011
IMAGE QUALITY EVALUATION SYSTEM, METHOD AND PROGRAM
Abstract
Disclosed is a picture quality evaluation system that comprises: a
difference calculation part, which calculates the difference between
data representing a feature value of a pixel set comprising at least
one pixel that constitutes a first image and data representing a
feature value of a pixel set comprising at least one pixel that
constitutes a second image; a main-area-of-focus calculation part,
which uses at least the first image or the second image to determine
a main area of focus having a specific feature, and calculates a
main-area-of-focus degree, which indicates the extent to which an
area is the main area of focus; a difference weighting part, which
weights the difference in the feature value for the pixel set
contained in the main area of focus, based on the main-area-of-focus
degree; and a picture quality value calculation part, which
calculates the picture quality value of the first image based on the
weighted difference from the difference weighting part.
Inventors: Yamada; Toru (Tokyo, JP)
Correspondence Address: SUGHRUE MION, PLLC, 2100 PENNSYLVANIA AVENUE, N.W., SUITE 800, WASHINGTON, DC 20037, US
Assignee: NEC CORPORATION (Tokyo, JP)
Family ID: 41255116
Appl. No.: 12/922817
Filed: April 28, 2009
PCT Filed: April 28, 2009
PCT No.: PCT/JP2009/058385
371 Date: September 15, 2010
Current U.S. Class: 382/165; 382/190
Current CPC Class: G06T 2207/30168 20130101; G06T 7/0002 20130101
Class at Publication: 382/165; 382/190
International Class: G06K 9/46 20060101 G06K009/46; G06K 9/00 20060101 G06K009/00
Foreign Application Data
Date | Code | Application Number
Apr 30, 2008 | JP | 2008-118349
Claims
1. An image quality evaluation system comprising: a difference
calculator that calculates a difference between data representing a
feature of a pixel group comprised of at least one pixel making up
a first image, and data representing a feature of a pixel group
comprised of at least one pixel making up a second image; a
degree-of-focused-area calculator that decides a focused area in
the image having a predetermined feature using at least one of said
first and second images, and calculates a degree of focused area
indicating the degree of being said focused area; a difference
weighting section that applies weighting to the difference in
feature for a pixel group falling within said focused area based on
said degree of focused area; and an image quality value calculator
that calculates an image quality value for said first image based
on the difference weighted by said difference weighting
section.
2. An image quality evaluation system according to claim 1, wherein
said degree-of-focused-area calculator performs decision of a
focused area on a pixel group-by-pixel group basis.
3. An image quality evaluation system according to claim 1, wherein
said degree-of-focused-area calculator decides whether a pixel is a
pixel of interest for said focused area based on at least a pixel
value of said pixel making up said first or second image, and
decides a focused area comprised of at least one or more pixels
based on said pixel of interest.
4. An image quality evaluation system according to claim 3, wherein
said degree-of-focused-area calculator decides a pixel as a pixel
of interest in a case that a pixel value of said pixel falls within
a specific range in a color space.
5. An image quality evaluation system according to claim 4, wherein
said specific range in said color space is a predetermined range
defined by a YCbCr color space represented by a luminance value and
a color difference value.
6. An image quality evaluation system according to claim 4, wherein
said specific range in said color space is a range with a value Y
indicating the luminance ranging 48≤Y≤224, a value Cb
indicating the blue difference ranging 104<Cb<125, and a
value Cr indicating the red difference ranging
135<Cr<171.
7. An image quality evaluation system according to claim 4, wherein
said specific range in said color space is a predetermined range
defined in an RGB color space represented by RGB values indicating
three primary colors, red, blue and green.
8. An image quality evaluation system according to claim 3, wherein
said degree-of-focused-area calculator calculates a degree of
focused area for a pixel group of pixels decided to be said pixel
of interest as one and that for a pixel group of pixels decided not
to be said pixel of interest as zero.
9. An image quality evaluation system according to claim 3, wherein
said degree-of-focused-area calculator accumulates the number of
pixels decided to be said pixel of interest among pixels within
said pixel group, and calculates a degree of focused area for said
pixel group based on a result of said accumulation.
10. An image quality evaluation system according to claim 3,
wherein said degree-of-focused-area calculator accumulates the
number of pixels decided to be a pixel of interest among pixels
within said pixel group and the number of pixels decided to be a
pixel of interest among pixels near said pixel group, and
calculates a degree of focused area for said pixel group based on a
result of the accumulation.
11. An image quality evaluation system according to claim 1,
wherein said data representing a feature for the first image and
that representing a feature for the second image are information on
any one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for at least part of pixels making
up said first and second images.
12. An image quality evaluation system according to claim 1,
wherein said data representing a feature for the first image and
that representing a feature for the second image are an average of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, for pixels
contained in at least part of pixel groups making up said first and
second images.
13. An image quality evaluation system according to claim 1,
wherein said data representing a feature for the first image and
that representing a feature for the second image are an average of
absolute differences within an image group between an average of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, and said
information for each pixel within said pixel group, for pixels
contained in at least part of pixel groups making up said first and
second images.
14. An image quality evaluation system according to claim 1,
wherein said data representing a feature for the first image and
that representing a feature for the second image are a variance of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, for pixels
contained in at least part of pixel groups making up said first and
second images.
15. An image quality evaluation method comprising: calculating a
difference between data representing a feature of a pixel group
comprised of at least one pixel making up a first image, and data
representing a feature of a pixel group comprised of at least one
pixel making up a second image; deciding a focused area in the
image having a predetermined feature using at least one of said
first and second images, and calculating a degree of focused area
indicating the degree of being said focused area; applying
weighting to the difference in feature for a pixel group falling
within said focused area based on said degree of focused area; and
calculating an image quality value for said first image based on
said weighted difference.
16. An image quality evaluation method according to claim 15,
wherein decision of said focused area is performed on a pixel
group-by-pixel group basis.
17. An image quality evaluation method according to claim 15 or 16,
comprising: deciding whether a pixel is a pixel of interest for
said focused area based on at least a pixel value of said pixel
making up said first or second image; and deciding a focused area
comprised of at least one or more pixels based on said pixel of
interest.
18. An image quality evaluation method according to claim 17,
comprising deciding a pixel as a pixel of interest in a case that
the pixel value of said pixel falls within a specific range in a
color space.
19. An image quality evaluation method according to claim 18,
wherein said specific range in said color space is a predetermined
range defined by a YCbCr color space represented by a luminance
value and a color difference value.
20. An image quality evaluation method according to claim 18,
wherein said specific range in said color space is a range with a
value Y indicating the luminance ranging 48≤Y≤224, a
value Cb indicating the blue difference ranging 104<Cb<125,
and a value Cr indicating the red difference ranging
135<Cr<171.
21. An image quality evaluation method according to claim 18,
wherein said specific range in said color space is a predetermined
range defined in an RGB color space represented by RGB values
indicating three primary colors, red, blue and green.
22. An image quality evaluation method according to claim 17,
comprising calculating a degree of focused area for a pixel group
of pixels decided to be said pixel of interest as one and that for
a pixel group of pixels decided not to be said pixel of interest as
zero.
23. An image quality evaluation method according to claim 17,
comprising: accumulating the number of pixels decided to be said
pixel of interest among pixels within a pixel group; and
calculating a degree of focused area for said pixel group based on
a result of said accumulation.
24. An image quality evaluation method according to claim 17,
comprising: accumulating the number of pixels decided to be a pixel
of interest among pixels within a pixel group and the number of
pixels decided to be a pixel of interest among pixels near said
pixel group; and calculating a degree of focused area for said
pixel group based on a result of the accumulation.
25. An image quality evaluation method according to claim 15,
wherein said data representing a feature for the first image and
that representing a feature for the second image are information on
any one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for at least part of pixels making
up said first and second images.
26. An image quality evaluation method according to claim 15,
wherein said data representing a feature for the first image and
that representing a feature for the second image are an average of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, for pixels
contained in at least part of pixel groups making up said first and
second images.
27. An image quality evaluation method according to claim 15,
wherein said data representing a feature for the first image and
that representing a feature for the second image are an average of
absolute differences within an image group between an average of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, and said
information for each pixel within said pixel group, for pixels
contained in at least part of pixel groups making up said first and
second images.
28. An image quality evaluation method according to claim 15,
wherein said data representing a feature for the first image and
that representing a feature for the second image are a variance of
information on any one of a luminance value, a color difference
value and an RGB value, or a combination thereof, for pixels
contained in at least part of pixel groups making up said first and
second images.
29. A non-transitory computer readable storage medium storing a
program causing an information processing apparatus to execute the
processing of: calculating a difference between data representing a
feature of a pixel group comprised of at least one pixel making up
a first image, and data representing a feature of a pixel group
comprised of at least one pixel making up a second image; deciding
a focused area in the image having a predetermined feature using at
least one of said first and second images, and calculating a degree
of focused area indicating the degree of being said focused area;
applying weighting to the difference in feature for a pixel group
falling within said focused area based on said degree of focused
area; and calculating an image quality value for said first image
based on said weighted difference.
30. An image quality evaluation method comprising: calculating a
difference between data representing a feature of a pixel group
comprised of at least one pixel making up a first image, and data
representing a feature of a pixel group comprised of at least one
pixel making up a second image; deciding a focused area in the
image having a predetermined feature on a pixel group-by-pixel
group basis using at least one of said first and second images; and
applying weighting to the difference in feature for a pixel group
falling within said decided focused area.
31. An image quality evaluation method according to claim 30,
wherein the decision of said focused area is, based on at least a
pixel value of the pixel making up said first or second image,
deciding whether or not the pixel value of said pixel falls within
a specific range in a color space, and deciding said pixel group as
a focused area based on the pixel which has been decided to fall
within said specific range.
32. An image quality evaluation method according to claim 31,
wherein the specific range in said color space is a predetermined
range defined by a YCbCr color space represented by a luminance
value and a color difference value.
33. An image quality evaluation method according to claim 31,
wherein the specific range in said color space is a range with a
value Y indicating luminance ranging 48≤Y≤224, a
value Cb indicating a blue difference ranging 104<Cb<125, and
a value Cr indicating a red difference ranging
135<Cr<171.
34. An image quality evaluation method according to claim 31,
wherein the decision of said focused area comprises accumulating the
number of pixels which have been decided to fall within the
specific range in said color space among pixels within said pixel
group and within a pixel group near said pixel group; and in a case
that said number of the pixels is greater than or equal to a
predetermined threshold, deciding said pixel group as a focused
area.
Description
TECHNICAL FIELD
[0001] The present invention relates to an image quality evaluation
system, method, and program.
BACKGROUND ART
[0002] Methods of objectively evaluating the image quality of image
data transmitted via a network or of coded images (each of which
will be referred to as an image hereinbelow) include methods that
use the absolute difference, squared difference, or S/N ratio of
pixel values (luminances, color differences, RGB values, etc.)
between an original image and an image of interest for evaluation. In the
methods, however, a difference between two images is directly
reflected on the objective image quality value without taking
account of human visual properties, and therefore, there has been a
problem that the correlation with a result of subjective evaluation
made by a person via visual evaluation is low.
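The conventional full-reference measures named above can be sketched as follows. This is a minimal illustration, not code from the patent; function names and the 8-bit peak value are assumptions.

```python
# Conventional objective measures between an original image and an
# image under evaluation, both given as equal-length sequences of
# 8-bit pixel values: mean absolute difference, mean squared error,
# and PSNR (a common S/N-ratio formulation).
import math

def mean_abs_diff(original, evaluated):
    """Average absolute pixel difference between two equal-size images."""
    return sum(abs(a - b) for a, b in zip(original, evaluated)) / len(original)

def mse(original, evaluated):
    """Mean squared error between two equal-size images."""
    return sum((a - b) ** 2 for a, b in zip(original, evaluated)) / len(original)

def psnr(original, evaluated, max_value=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    error = mse(original, evaluated)
    if error == 0:
        return float("inf")
    return 10 * math.log10(max_value ** 2 / error)
```

As the passage notes, these measures map pixel differences directly to a score, with no model of where a viewer actually looks.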
[0003] A method relating to the problem is disclosed in Patent
Document 1. The image quality evaluation method as disclosed in
Patent Document 1 improves the correlation with the subjective
evaluation value by applying weighting to the absolute difference,
squared difference, S/N ratio or the like as described above based
on the power of the alternating-current (AC) components for pixel
values while taking account of human visual properties varying with
the spatial frequency. It is known that human vision is less
sensitive to higher-frequency signals, and the weighting is
controlled according to the amount of higher-frequency signal
contained in an image.
[0004] Patent Document 1: JP-3458600B2
DISCLOSURE OF THE INVENTION
Problems to be Solved by the Invention
[0005] The image quality evaluation method disclosed in Patent
Document 1, however, poses a problem that it may still give a low
correlation with a result of subjective evaluation. For example, it
is known that in a result of subjective evaluation, quality
degradation is detected more easily in a video area upon which a
person is likely to focus (which will be referred to as a focused
area hereinbelow) than in other video areas. However, the method
disclosed in Patent Document 1 does not take account of whether an
area is a focused area. Hence, a focused area may be evaluated to
be less degraded when the area contains a larger amount of
high-frequency signals. On the other hand, areas other than the
focused area may be evaluated to be more degraded when the areas
have a smaller amount of high-frequency signals.
[0006] Thus, the present invention has been made in view of such a
problem, and its object is to provide image quality evaluation
system, method and program with which a higher correlation with a
result of subjective evaluation can be obtained.
Means for Solving the Problems
[0007] The present invention for solving the aforementioned problem
is an image quality evaluation system, characterized in comprising:
a difference calculating section for calculating a difference
between data representing a feature of a pixel group comprised of
at least one pixel making up a first image, and data representing a
feature of a pixel group comprised of at least one pixel making up
a second image; a degree-of-focused-area calculating section for
using at least one of said first and second images to decide a
focused area in the image having a predetermined feature, and
calculating a degree of focused area indicating the degree of being
said focused area; a difference weighting section for applying
weighting to the difference in feature for a pixel group falling
within said focused area based on said degree of focused area; and
an image quality value calculating section for calculating an image
quality value for said first image based on the difference weighted
by said difference weighting section.
[0008] The present invention for solving the aforementioned problem
is an image quality evaluation method, characterized in comprising:
calculating a difference between data representing a feature of a
pixel group comprised of at least one pixel making up a first
image, and data representing a feature of a pixel group comprised
of at least one pixel making up a second image; using at least one
of said first and second images to decide a focused area in the
image having a predetermined feature, and calculating a degree of
focused area indicating the degree of being said focused area;
applying weighting to the difference in feature for a pixel group
falling within said focused area based on said degree of focused
area; and calculating an image quality value for said first image
based on said weighted difference.
[0009] The present invention for solving the aforementioned problem
is a program, characterized in causing an information processing
apparatus to execute the processing of: calculating a difference
between data representing a feature of a pixel group comprised of
at least one pixel making up a first image, and data representing a
feature of a pixel group comprised of at least one pixel making up
a second image; using at least one of said first and second images
to decide a focused area in the image having a predetermined
feature, and calculating a degree of focused area indicating the
degree of being said focused area; applying weighting to the
difference in feature for a pixel group falling within said focused
area based on said degree of focused area; and calculating an image
quality value for said first image based on said weighted
difference.
EFFECTS OF THE INVENTION
[0010] The present invention provides a result of image quality
evaluation having a higher correlation with a result of subjective
evaluation.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1 is a block diagram showing a configuration of a first
embodiment of the present invention.
[0012] FIG. 2 is a flow chart for explaining an operation of the
first embodiment of the present invention.
[0013] FIG. 3 is a block diagram showing a configuration of a
second embodiment of the present invention.
[0014] FIG. 4 is a block diagram showing a configuration of a third
embodiment of the present invention.
[0015] FIG. 5 is a diagram for explaining a pixel area examined to
detect a focused area in the third embodiment of the present
invention.
[0016] FIG. 6 is a block diagram showing a computer system, which
is a fourth embodiment of the present invention.
[0017] FIG. 7 is a flow chart for explaining an operation in an
example of the present invention.
EXPLANATION OF SYMBOLS
[0018] 101 Difference calculating section
[0019] 102 Degree-of-focused-area calculating section
[0020] 103 Difference weighting section
[0021] 104 Image quality calculating section
BEST MODES FOR CARRYING OUT THE INVENTION
First Embodiment
[0022] A first embodiment of the present invention will now be
described in detail with reference to the accompanying
drawings.
[0023] Referring to FIG. 1, an image evaluation system in the first
embodiment of the present invention is comprised of a difference
calculating section 101, a degree-of-focused-area calculating
section 102, a difference weighting section 103, and an image
quality calculating section 104.
[0024] The difference calculating section 101 is supplied as input
with data representing a feature of a first image, which is an
image of interest for evaluation, and data representing a feature
of a second image, which is an original image for use as a control
for comparison, for each group of pixels comprised of at least one
pixel (which will be referred to as a pixel group hereinbelow). It
should be noted that the first and second images may be moving
pictures or still images.
[0025] In the present invention, a pixel group refers to a group of
pixels comprised of at least one pixel, and is a concept
encompassing not only a group of a plurality of pixels but also a
group containing only one pixel. An example of a pixel group having
a plurality of pixels may be a block of 16 by 16 pixels.
[0026] An example of data representing a feature may be a pixel
value of a pixel when the pixel group contains one pixel. The pixel
value refers to, for example, information on the luminance value,
color difference value or RGB value, or a combination thereof.
[0027] Another example of data representing a feature may be an
average of the pixel values within a pixel group when the pixel
group is a group of a plurality of pixels. Besides, the data
representing a feature may be a statistical quantity of the AC
components of a pixel group: for example, an average of absolute
differences, obtained by calculating an average of the pixel values
within the pixel group and then averaging the absolute difference
between that average and the pixel value of each pixel within the
group, or a variance of the pixel values within the pixel group.
Moreover, the data representing a feature may be transform
coefficients obtained by applying an orthogonal transform to the
pixel values in a pixel group. While the feature corresponding to a
pixel group may be input to the difference calculating section 101
as described above when the pixel group is a group of a plurality
of pixels, a configuration may also be contemplated in which the
pixel values of the pixels in a pixel group are input to the
difference calculating section 101 and the feature is calculated on
a pixel group-by-pixel group basis at the difference calculating
section 101.
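Assuming a pixel group given as a flat list of luminance values (e.g. a 16 by 16 block), the per-group features mentioned above might be sketched as follows; the function names are illustrative, not from the patent.

```python
# Candidate features for one pixel group: the average pixel value,
# the average of absolute differences from that average (a simple
# AC-component statistic), and the variance of the pixel values.

def block_average(pixels):
    """Average pixel value of the group."""
    return sum(pixels) / len(pixels)

def block_mean_abs_deviation(pixels):
    """Average absolute difference between each pixel and the group average."""
    avg = block_average(pixels)
    return sum(abs(p - avg) for p in pixels) / len(pixels)

def block_variance(pixels):
    """Variance of the pixel values within the group."""
    avg = block_average(pixels)
    return sum((p - avg) ** 2 for p in pixels) / len(pixels)
```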
[0028] The difference calculating section 101 calculates a
differential value in feature between the data representing the
feature for the first image and that representing the feature for
the second image. For example, in a case that the first and second
images are moving picture data, data representing a feature input
to the difference calculating section 101 is a feature of a pixel
group in each frame in the moving picture. The differential value
is calculated, for example, as the absolute difference or squared
difference between the feature of a pixel group at a position in a
frame of the first image and the feature of the pixel group at the
same position, in the frame at the same time, of the second
image.
[0029] It should be noted that the data representing a feature for
the first image or that representing a feature for the second image
is not always the feature of all pixel groups within the image. For
example, when image transmission in a network is involved, values
for only part of pixel groups within an image can be acquired due
to transmission errors or the like in some cases. In other cases,
to reduce the transmission load, the feature of only part of pixel
groups within an image is transmitted in the first place, for
example, in a case that data representing a feature for the second
image is transmitted for every other pixel group. In such cases, a
differential value is calculated for the feature of a pixel group
only in the frame and at the position that can be referred to by
both the first and second images.
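The differential-value computation of the two preceding paragraphs can be sketched as follows (an illustrative sketch; names are not from the patent). Representing each image's features as a mapping from a (frame, block) position to a value also covers the case where only part of the pixel groups can be referred to:

```python
# Differential values between collocated pixel groups. Features are
# dicts mapping a (frame, block) position to a feature value; a
# difference is produced only for positions present in both images,
# mirroring the behavior under transmission errors or subsampled
# feature transmission.

def differential_values(features_first, features_second, squared=False):
    """Absolute (or squared) feature differences at shared positions."""
    diffs = {}
    for pos, f1 in features_first.items():
        if pos in features_second:
            d = f1 - features_second[pos]
            diffs[pos] = d * d if squared else abs(d)
    return diffs
```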
[0030] The degree-of-focused-area calculating section 102 is
supplied with pixel values of at least part of pixels in the first
image as input. Although not shown in FIG. 1, pixel values of at
least part of pixels in the second image may be supplied as input.
Moreover, pixel values of at least part of pixels in the first and
second images may be supplied as input, as described earlier. In
this case, respective degrees of focused area, which will be
discussed later, are calculated for the first and second images,
and the average thereof may be regarded as the degree of focused
area for the current pixel.
[0031] The degree-of-focused-area calculating section 102
identifies a focused area that has a predetermined feature within
an image, and calculates a degree of focused area indicating the
degree of being a focused area.
[0032] As used herein, the predetermined feature refers to, for
example, a specific color or a specific shape (for example,
characters, a face, or a specific building). The focused area is an
area having such a feature in an image.
[0033] In identifying a focused area, first, a pixel value is
looked up on a pixel-by-pixel basis, and a degree of pixel of
interest indicating a degree of how much the current pixel is a
pixel of interest having the predetermined feature is calculated.
As an example, a pixel having a color that resembles a human skin
is defined as pixel of interest. In particular, decision is made as
to whether an input pixel is a pixel of interest depending upon
whether the pixel value of the pixel falls within a specific range
in a color space. Examples of the color spaces include a YCbCr
color space that is represented by luminance value and color
difference values, and an RGB color space represented by RGB values
indicating three primary colors, red, blue and green. It should be
noted that definition of a pixel of interest is not limited to a
color that resembles a human skin, and may be another definition.
For example, in a case that the YCbCr color space is employed,
values of the luminance Y, blue color difference Cb, and red color
difference Cr for each pixel are input as a pixel value to the
degree-of-focused-area calculating section 102.
[0034] Moreover, in a case that the pixel value of an input pixel
has a specific color, the current pixel is decided to be a pixel of
interest.
human figure appearing in an image, so that a color that resembles
a human skin color is defined as the specific color. In particular,
the specific color is defined as such a range in a YCbCr color
space that covers a whole range in an RGB space containing a color
recognized to resemble a skin color via subjective experimentation
(visual evaluation) as represented in the RGB color space.
Therefore, a range broader than the skin color in RGB
representation satisfies the definition. In a case that the
luminance Y ranges 48≤Y≤224, blue difference Cb
ranges 104<Cb<125, and red difference Cr ranges
135<Cr<171 according to subjective experimentation as
described above, it is considered that the color resembles a human
skin color and that pixel is decided to be a pixel of interest. The
range of pixel values is not limited to the range of values
described above, and may include other values.
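The skin-color decision described above reduces to a range test on the (Y, Cb, Cr) value of a pixel. A minimal sketch, using the example range from the text (the function name is illustrative):

```python
# A pixel is decided to be a pixel of interest when its YCbCr value
# falls within the skin-color-like range given in the description:
# 48 <= Y <= 224, 104 < Cb < 125, 135 < Cr < 171.

def is_pixel_of_interest(y, cb, cr):
    """True if the pixel value falls within the example skin-color range."""
    return 48 <= y <= 224 and 104 < cb < 125 and 135 < cr < 171
```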
[0035] Next, based on a result of the decision as to whether each
pixel is a pixel of interest, the degree of pixel of interest for
the pixel that is a pixel of interest is defined as one, and that
for the pixel that is not a pixel of interest is defined as zero.
It should be noted that the degree of pixel of interest is not
limited to only zero and one, which is based upon whether a current
pixel is a pixel of interest or not, and it may be a continuous or
multi-step value. For example, a method of defining a value
obtained by normalizing the distance from a specific point in the
YCbCr color space as the degree of pixel of interest may be
contemplated.
[0036] Next, a focused area is identified based on the calculated
degree of pixel of interest, and a degree of focused area for the
focused area is calculated. A method of calculating the degree of
focused area involves defining a set of pixels having a degree of
pixel of interest of one as a focused area, defining a degree of
focused area for pixels within the area as one, and defining a
degree of focused area for pixels outside the area as zero. It
should be noted that in a case that the degree of pixel of interest
is defined as a continuous or multi-step value, the method may
involve comparing the degree of pixel of interest with a predefined
threshold, defining a set of pixels having a degree of pixel of
interest higher than the predefined threshold as a focused area,
and defining the degree of focused area of pixels within the
focused area as one.
[0037] The degree of focused area may also be a continuous or
multi-step value, and for example, in a case that the degree of
pixel of interest is defined as a continuous or multi-step value,
and a set of pixels having a higher degree of pixel of interest
than a predefined threshold is defined as a focused area, the
degree of focused area may be defined as a continuous or multi-step
value in proportion with the total or average of the degree of
pixel of interest for pixels contained in the focused area.
Moreover, a pixel having a degree of focused area of zero near a
pixel having a degree of focused area of non-zero may be redefined
to have a degree of focused area of non-zero. This is done by
taking account of the fact that while a person watches a focused
area, pixels near the area come into sight.
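Restricting to the binary case described above, the per-group degree of focused area might be sketched as follows; the count threshold is an illustrative parameter, not a value fixed by the patent.

```python
# Degree of focused area for one pixel group, assuming each pixel
# already carries a binary degree of pixel of interest (0 or 1): the
# group gets degree 1 when the number of pixels of interest within it
# reaches a threshold, and degree 0 otherwise.

def degree_of_focused_area(interest_flags, threshold=1):
    """interest_flags: 0/1 degrees of pixel of interest for one group."""
    return 1 if sum(interest_flags) >= threshold else 0
```

A continuous variant, as the text notes, could instead return a value proportional to the total or average degree of pixel of interest within the group.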
[0038] The difference weighting section 103 uses the degree of
focused area calculated at the degree-of-focused-area calculating
section 102 to apply weighting to the differential value for a
pixel group falling within the focused area, and outputs a weighted
differential value.
[0039] In applying weighting to the differential value for a pixel
group, a differential value for a pixel group containing pixels
having a degree of focused area of one is multiplied by A, and that
for a pixel group containing pixels having a degree of focused area
of zero is multiplied by B. A and B are weighting factors, where
A>B. For example, processing is executed such that a
differential value for a pixel group containing pixels having a
degree of focused area of one is multiplied by two, and that for a
pixel group containing pixels having a degree of focused area of
zero is multiplied by one. As another example, in a case that the
degree of focused area for each pixel is calculated as a continuous
value, the weighting factor may be defined such that (weighting
factor)=1+(degree of focused area), or the like.
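The weighting rule of this paragraph can be condensed into a small helper (an illustrative sketch; the defaults A=2 and B=1 follow the example in the text):

```python
def weight_difference(diff, degree, a=2.0, b=1.0):
    """Weight a differential value by the degree of focused area:
    factor A inside the focused area (degree one), factor B outside
    (degree zero), with A > B; for a continuous degree, the factor
    1 + (degree of focused area) is used instead."""
    if degree == 1:
        return diff * a
    if degree == 0:
        return diff * b
    return diff * (1.0 + degree)
```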
[0040] In a case that a pixel group is comprised of a plurality of
pixels, part of the pixel group may not fall within a focused area.
In such a case, a decision is made as to whether a pixel group
falls within a focused area according to a proportion of the area
of the pixel group falling within the focused area, and in a case
that the pixel group is regarded as within the focused area, those
pixels falling outside the focused area are also regarded as within
the focused area and weighting is applied thereto using the
weighting factor similar to that described above.
[0041] The image quality value calculating section 104 outputs an
image quality value for evaluating image quality of the first image
based on the differential value weighted by the difference
weighting section 103. The image quality value is calculated in the
form of, for example, an average of weighted differential values of
the whole first image. The image quality value to be output may be
output directly as an average, or converted into another form such
as the S/N ratio and then output.
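As a sketch of the image quality value calculation (the PSNR-style conversion below is one possible reading of "converted into another form such as the S/N ratio"; the peak value of 255 and the function name are assumptions):

```python
import math

def image_quality_value(weighted_diffs, as_snr=False, peak=255.0):
    """Average the weighted differential values over the whole first
    image; optionally convert the average (treated here as a mean
    squared error) into a PSNR-style value in dB."""
    mean = sum(weighted_diffs) / len(weighted_diffs)
    if not as_snr:
        return mean
    return 10.0 * math.log10(peak * peak / mean)
```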
[0042] Next, an operation in this embodiment will be described with
reference to FIG. 2.
[0043] At Step 11, a difference between data representing a feature
for a first image and data representing a feature for a second
image is calculated.
[0044] At Step 21, a focused area is identified, and a degree of
focused area for a pixel in the focused area is calculated.
[0045] At Step 31, the degree of focused area calculated at Step 21
is used to apply weighting to the difference calculated at Step 11
for each pixel group described above.
[0046] At Step 41, an image quality value for the first image is
calculated based on the difference weighted at Step 31 to evaluate
the first image.
[0047] It should be noted that Steps 11 and 21 may be run in a
temporally reverse order, or in parallel.
[0048] The first embodiment can produce a result of image quality
evaluation having a higher correlation with a result of subjective
evaluation. The reason is that, by calculating the degree of
focused area, the evaluation takes account of whether an area is
one that a person actually focuses upon.
Especially, by defining a color resembling a human skin as the
focused area, the degree of focused area can be calculated to have
a high correlation with a degree of actual focusing by a person.
This is because a person generally tends to focus upon a human
figure in a video, if present.
Second Embodiment
[0049] A second embodiment of the present invention will now be
described in detail with reference to the accompanying
drawings.
[0050] The second embodiment will address a case in which a pixel
group is comprised of one pixel.
[0051] Referring to FIG. 3, an image evaluation system in the
second embodiment of the present invention is comprised of a
difference calculating section 201, a degree-of-focused-area
calculating section 202, a difference weighting section 203, and an
image quality value calculating section 204.
[0052] The difference calculating section 201 is supplied as input
with data representing a feature of a first image, which is an
image of interest for evaluation, and data representing a feature
of a second image, which is an original image for use as a control
for comparison, on a pixel-by-pixel basis. An example of data
representing a feature is a pixel value of a pixel when the pixel
group contains one pixel. The pixel value refers to, for example,
information on the luminance value, color difference value or RGB
value, or a combination thereof.
[0053] The difference calculating section 201 calculates a
differential value between the data representing a feature for the
first image and that representing a feature for the second image.
For example, in a case that the first and second images are moving
picture data, data representing a feature input to the difference
calculating section 201 is a pixel value of each pixel in each
frame in the moving picture. In this case, the differential value
is calculated as an absolute value of a difference, or a squared
difference, between the pixel value of a pixel at a position in a
frame of the first image and the pixel value of the pixel at the
same position in the temporally corresponding frame of the second
image.
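The per-pixel differential value might look like this (a minimal sketch; the function name is illustrative):

```python
def differential_value(pixel1, pixel2, squared=False):
    """Differential value between co-located pixel values of the first
    (evaluated) image and the second (original) image: an absolute
    difference, or a squared difference."""
    d = pixel1 - pixel2
    return d * d if squared else abs(d)
```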
[0054] It should be noted that the data representing a feature for
the first image or that representing a feature for the second image
is not always the pixel value of all pixels within the image. For
example, when image transmission in a network is involved, values
for only part of pixel groups within an image can be acquired due
to transmission errors or the like in some cases. In other cases,
to reduce the transmission load, only part of pixel values within
an image is transmitted in the first place, for example, in a case
that data representing a feature for the second image is
transmitted for every other pixel. In such cases, a differential
value is calculated only in the frame and at the position that can
be referred to by both the first and second images.
[0055] The degree-of-focused-area calculating section 202
calculates a degree of focused area indicating the degree of being
a focused area. Since in this embodiment, a pixel group contains
one pixel, the following description will be made considering the
degree of pixel of interest in the first embodiment described above
directly as the degree of focused area.
[0056] The degree-of-focused-area calculating section 202 first
decides whether a pixel is a pixel of interest for a focused area
from the input pixel value.
[0057] In this embodiment, as an example, a pixel having a color
that resembles a human skin is defined as a pixel of interest. In
particular, a decision is made as to whether an input pixel is a
pixel of interest depending upon whether the pixel value of the
pixel falls within a specific range in a color space. Examples of
the color spaces include a YCbCr color space that is represented by
luminance value and color difference values, and an RGB color space
represented by RGB values indicating three primary colors, red,
green and blue. It should be noted that definition of a pixel of
interest is not limited to a color that resembles a human skin, and
may be another definition. For example, in a case that the YCbCr
color space is employed, values of the luminance Y, blue color
difference Cb, and red color difference Cr for each pixel are input
as a pixel value to the degree-of-focused-area calculating section
202.
[0058] Moreover, in a case that the pixel value of an input pixel
has a specific color, that pixel is decided to be a pixel of
interest. In general, a person tends to focus upon a human
figure appearing in a video, so that a color that resembles a human
skin color is defined as the specific color. In particular, the
specific color is defined as such a range in a YCbCr color space
that covers a whole range in an RGB space containing a color
recognized to resemble a skin color via subjective experimentation
(visual evaluation) as represented in the RGB color space.
Therefore, a range broader than the skin color in RGB
representation satisfies the definition. In a case that the
luminance Y ranges 48≤Y≤224, blue difference Cb
ranges 104<Cb<125, and red difference Cr ranges
135<Cr<171 according to subjective experimentation as
described above, it is considered that the color resembles a human
skin color and that pixel is decided to be a pixel of interest. The
range of pixel values is not limited to the range of values
described above, and may include other values.
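Using the YCbCr range quoted above, the decision might be sketched as follows (the bounds are taken from the text; the function name is an assumption):

```python
def is_pixel_of_interest(y, cb, cr):
    """Decide a pixel of interest: the pixel value must satisfy
    48 <= Y <= 224, 104 < Cb < 125 and 135 < Cr < 171, the range
    found by subjective experimentation to resemble a skin color."""
    return 48 <= y <= 224 and 104 < cb < 125 and 135 < cr < 171
```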
[0059] Next, based on a result of the decision as to whether each
pixel is a pixel of interest, the degree of focused area for the
pixel is calculated and output. For example, the degree of focused
area for the pixel that is a pixel of interest is defined as one,
and that for the pixel that is not a pixel of interest is defined
as zero. It should be noted that the degree of focused area may be
a continuous or multi-step value, instead of zero and one as
described above. Besides, the degree of focused area for a pixel
lying near the pixel of interest may be defined as 0.5. This is
done by taking account of the fact that while a person watches a
focused area, pixels near the pixel come into sight.
[0060] Moreover, the decision of a pixel of interest is not limited
to that relying upon the aforementioned method, and may rely upon
another method by, for example, detecting a viewer's line of
sight.
[0061] The difference weighting section 203 uses the degree of
focused area calculated at the degree-of-focused-area calculating
section 202 to apply weighting to the differential value calculated
at the difference calculating section 201 on a pixel-by-pixel
basis, and outputs a weighted differential value.
[0062] Now an example of weighting when the degree of focused area
is calculated to have a value of 0 or 1 for each pixel at the
degree-of-focused-area calculating section 202 will be described. A
differential value of a pixel calculated to have a degree of
focused area of one is multiplied by A, and that of a pixel
calculated to have a degree of focused area of zero is multiplied
by B. A and B are weighting factors, where A>B. For example,
processing is executed such that a differential value of a pixel
calculated to have a degree of focused area of one is multiplied by
two, and that of a pixel calculated to have a degree of focused
area of zero is multiplied by one.
[0063] As another example, in a case that the degree of focused
area for each pixel is calculated as a continuous value, the
weighting factor may be defined such that (weighting
factor)=1+(degree of focused area).
[0064] The image quality value calculating section 204 outputs an
image quality value for evaluating image quality of the first image
based on the differential value weighted by the difference
weighting section 203. The image quality value is calculated in the
form of, for example, an average of weighted differential values
for the whole first image. The image quality evaluation value to be
output may be output directly as an average, or converted into
another form such as the S/N ratio and then output.
Third Embodiment
[0065] Next, a third embodiment of the present invention will now
be described in detail.
[0066] Referring to FIG. 4, an image evaluation system in the third
embodiment of the present invention is comprised of a difference
calculating section 301, a degree-of-focused-area calculating
section 302, a difference weighting section 303, and an image
quality value calculating section 304.
[0067] The third embodiment is similar to the first embodiment
except that in calculating the degree of focused area for each
pixel group at the degree-of-focused-area calculating section 302,
the pixel group is comprised of not one pixel but a group of a
plurality of pixels and the focused area is identified on a pixel
group-by-pixel group (block-by-block) basis. This embodiment
addresses a case in which the difference calculating section 301 is
also supplied with data representing a feature for the first image
and that representing a feature for the second image for each pixel
group that is the same as the pixel group described above. Other
components are similar to those in the first embodiment, and
accordingly, explanation thereof will be omitted.
[0068] The difference calculating section 301 is supplied as input
with the data representing a feature for the first image and that
representing a feature for the second image. For example, in a case
that an average of pixel values within a pixel group is input, an
absolute value of the difference between the average of pixel values
within a pixel group at a position in a frame of the first image
and the average within the pixel group at the same position in the
temporally corresponding frame of the second image is calculated as
the differential value.
[0069] The degree-of-focused-area calculating section 302 is
supplied with pixel values of at least part of pixels in the first
image, as in the first embodiment, and decides whether each input
pixel is a pixel of interest. The method of decision is similar to
that in the first embodiment, and accordingly, explanation thereof
will be omitted.
[0070] Next, based on a result of the decision as to whether each
pixel is a pixel of interest, a degree of focused area is
calculated and output for each pixel group, which is a group of a
plurality of pixels.
[0071] As an example, a method of calculating a degree of focused
area when a pixel group is a block of 16 by 16 pixels will be
described. First, the number of pixels decided to be a pixel of
interest is accumulated within the pixel group. Then, a calculation
is made such that in a case that the number of pixels is equal to
or greater than a predefined threshold (for example, 128), the
degree of focused area for the pixel group is defined as one;
otherwise, as zero. Besides, the degree of focused area may be
defined such that (degree of focused area)=(the number of pixels
decided to be a pixel of interest)/(the number of pixels within the
pixel group). For example, in a case that the number of pixels
decided to be a pixel of interest is thirty, the degree of focused
area for that pixel group is 30/(16×16).
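Both variants of this calculation, the thresholded degree and the proportional degree, can be sketched together (an illustrative sketch; `block_flags` is assumed to be a 16-by-16 array of 0/1 pixel-of-interest decisions):

```python
def block_degree_of_focused_area(block_flags, threshold=None):
    """Degree of focused area for one pixel group (block).  With a
    threshold (e.g. 128), the degree is one when the accumulated
    number of pixels of interest reaches the threshold and zero
    otherwise; without a threshold it is the ratio
    (pixels of interest) / (pixels within the group)."""
    count = sum(sum(row) for row in block_flags)
    total = len(block_flags) * len(block_flags[0])
    if threshold is not None:
        return 1 if count >= threshold else 0
    return count / total
```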
[0072] In addition to accumulating the number of pixels decided to
be a pixel of interest within a pixel group of interest for
decision, the number of pixels decided to be a pixel of interest
may be accumulated in an extent also including pixel groups near
the pixel group. This is done by taking account of the fact that
while a person watches an image, pixel groups near the pixel group
come into sight. Moreover, from a similar reason, pixel groups
having a degree of focused area of zero lying near a pixel group
having a degree of focused area of non-zero may be modified to have
a degree of focused area of non-zero.
[0073] As an example, a case in which a pixel group is a block of
16 by 16 pixels and a range shown in FIG. 5 represents vicinal
pixel groups will now be described.
[0074] In calculating a degree of focused area for a pixel group of
interest for decision, the number of pixels decided to be a pixel
of interest is accumulated in an extent including the pixel group
of interest for decision and vicinal pixel groups together (a block
of 48 by 48 pixels). In a case that the number of pixels decided to
be a pixel of interest is 55, the degree of focused area for the
pixel group of interest for decision is 55/(48×48). It should
be noted that the method of calculating a degree of focused area is
not limited to that described above, and various methods may be
contemplated, including a method of applying weighting depending
upon whether a pixel group is a pixel group of interest for
decision or a vicinal pixel group.
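The vicinal-group accumulation described above might be sketched as follows (an illustrative sketch; `block_counts[i][j]` is assumed to hold the number of pixels of interest already counted in each 16-by-16 block):

```python
def degree_with_vicinal_groups(block_counts, by, bx):
    """Degree of focused area for the 16-by-16 pixel group at block
    position (by, bx), accumulating pixels of interest over the group
    of interest for decision and its eight vicinal groups, i.e. a
    48-by-48 pixel extent."""
    count = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = by + dy, bx + dx
            if 0 <= ny < len(block_counts) and 0 <= nx < len(block_counts[0]):
                count += block_counts[ny][nx]
    return count / (48 * 48)
```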
[0075] In this embodiment, a value having a higher correlation with
the degree of actual focusing by a person can be calculated by
identifying a focused area for each pixel group that is a group of
a plurality of pixels, and calculating a degree of focused area for
the pixel group. This is because pixels of interest covering an
extent having a certain size are focused more than only one pixel
is. For example, a person tends to focus upon an extent of a
certain size (such as a whole face) of pixels having a pixel value
that resembles a human skin color, rather than upon one pixel
having a pixel value that resembles a human skin color.
[0076] In defining a pixel group, a mode in which a component such
as an object recognizing section is provided to apply object
recognition to an image beforehand and define each object as a
pixel group may be contemplated. By calculating a degree of focused
area on an object-by-object basis, it is possible to calculate a
degree of focused area more accurately.
Fourth Embodiment
[0077] Next, a fourth embodiment in the present invention will be
described with reference to FIG. 6. In the fourth embodiment, the
image quality evaluation system described earlier in connection with
the first embodiment is implemented by a computer system.
[0078] Referring to FIG. 6, the present system is provided with a
program-controlled processor 401. The program-controlled processor
401 is connected with a first image data buffer 402 and a second
image data buffer 403, as well as a program memory 404 for storing
therein required programs. Program modules stored in the program
memory 404 comprise a main program 405, as well as those for a
difference calculation processing module 406, a
degree-of-focused-area calculation processing module 407, a
difference weighting processing module 408, and an image quality
calculation processing module 409. The main program 405 is a
principal program for executing the image quality evaluation
processing. The program modules for the difference calculating
module 406, degree-of-focused-area calculation processing module
407, difference weighting processing module 408, and image quality
calculation processing module 409 are processing modules for
implementing the functions of the difference calculating section
101, degree-of-focused-area calculating section 102, difference
weighting section 103, and image quality calculating section 104
described above, respectively.
Example 1
[0079] An example of the present invention will now be
described.
[0080] The present example is a specific example of the second
embodiment.
[0081] In this example, the second image, which is an original
image, is a moving picture of an SDTV size (720 pixels in a
horizontal direction, 480 pixels in a vertical direction, and 29.97
frames per second). The first image, which is an image of interest
for evaluation, is a moving picture obtained by encoding the moving
picture in an MPEG-2 format at 4 Mbps and decoding the encoded
image.
[0082] FIG. 7 is a flow chart illustrating an operation of this
example.
[0083] At Step 201, one frame of image data for a second image is
input to the difference calculating section 201. At Step 202, one
frame of image data for a first image at the same time as that of
the frame of image data for the second image is input to the
difference calculating section 201. At Step 203, the difference
calculating section 201 extracts pixel values of the image data for
these images at the same position, and calculates an absolute
difference between the pixel values as a differential value.
Assuming that the luminance value Y is 50 at a certain position in
the second image and 52 in the first image, the differential value
is 2.
[0084] At Step 204, the degree-of-focused-area calculating section
202 decides whether the current pixel is a pixel of interest from
the pixel value for the first image extracted at Step 203. In a
case that it is a pixel of interest, the degree of focused area for
that pixel is defined as one; otherwise, as zero. In particular, in
a case that the pixel value falls within a range of a luminance Y
ranging 48≤Y≤224, a blue difference Cb ranging
104<Cb<125, and a red difference Cr ranging 135<Cr<171, the
pixel is decided to be a pixel of interest. For example, in a
case that the first image has a luminance value Y of 52, a blue
difference Cb of 110, and a red difference Cr of 150, the values
fall within the range described above, so that the pixel is decided
to be a pixel of interest and is given a degree of focused area of
one.
[0085] In a case that the degree of focused area is one for the
pixel, the difference weighting section 203 multiplies the
differential value corresponding to the pixel by two, and defines
the resulting value as a weighted differential value at Step 205.
In a case that the degree of focused area is zero for the pixel,
the difference weighting section 203 multiplies the differential
value by one, and defines the resulting value as a weighted
differential value at Step 206.
[0086] At Step 207, to determine a total value of weighted
differential values for the whole image, the weighted differential
value is added to a variable Sum. The initial value of the variable
Sum is zero.
[0087] At Step 208, a check is made as to whether the difference
calculation is completed for all pixels in one frame. In a case
that the calculation is not completed, the process goes back to
Step 203 and similar processing is applied to a pixel for which the
difference calculation is not completed yet.
[0088] In a case that the calculation is completed, a check is made
as to whether the processing is completed for all frames in the
first image at Step 209. In a case that the processing is not
completed, the process goes back to Step 201 and similar processing
is applied to a subsequent frame. In a case that the processing is
completed, the image quality value calculating section 204 outputs
the
total value Sum of the weighted differential values for the whole
image as an image quality value at Step 210. By this operation, the
processing is terminated.
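The whole Example 1 flow (Steps 201 through 210 of FIG. 7) condenses, for the luminance channel, to roughly the following sketch (a hypothetical data layout is assumed: each frame is a list of rows of (Y, Cb, Cr) tuples, and only the first image's chrominance is used for the pixel-of-interest decision, as at Step 204):

```python
def evaluate(frames_first, frames_second):
    """Condensed sketch of the Example 1 flow: per-pixel absolute
    luminance difference, multiplied by two inside the focused area
    (skin-like YCbCr range) and by one outside, accumulated into the
    variable Sum over all frames."""
    total = 0.0  # the variable Sum, initialised to zero (Step 207)
    for f1, f2 in zip(frames_first, frames_second):
        for row1, row2 in zip(f1, f2):
            for (y1, cb1, cr1), (y2, _, _) in zip(row1, row2):
                diff = abs(y1 - y2)  # differential value (Step 203)
                if 48 <= y1 <= 224 and 104 < cb1 < 125 and 135 < cr1 < 171:
                    diff *= 2  # degree of focused area one (Step 205)
                total += diff  # weight one otherwise (Steps 206-207)
    return total  # output as the image quality value (Step 210)
```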
[0089] The 1st mode of the present invention is characterized in
that an image quality evaluation system comprising: a difference
calculating section for calculating a difference between data
representing a feature of a pixel group comprised of at least one
pixel making up a first image, and data representing a feature of a
pixel group comprised of at least one pixel making up a second
image; a degree-of-focused-area calculating section for deciding a
focused area in the image having a predetermined feature using at
least one of said first and second images, and calculating a degree
of focused area indicating the degree of being said focused area; a
difference weighting section for applying weighting to the
difference in feature for a pixel group falling within said focused
area based on said degree of focused area; and an image quality
value calculating section for calculating an image quality value
for said first image based on the difference weighted by said
difference weighting section.
[0090] The 2nd mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section performs decision of a
focused area on a pixel group-by-pixel group basis.
[0091] The 3rd mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section decides whether a pixel
is a pixel of interest for said focused area based on at least a
pixel value of said pixel making up said first or second image, and
decides a focused area comprised of at least one or more pixels
based on said pixel of interest.
[0092] The 4th mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section decides a pixel as a
pixel of interest in a case that a pixel value of said pixel falls
within a specific range in a color space.
[0093] The 5th mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a predetermined range defined by a YCbCr
color space represented by a luminance value and a color difference
value.
[0094] The 6th mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a range with a value Y indicating the
luminance ranging 48≤Y≤224, a value Cb indicating the
blue difference ranging 104<Cb<125, and a value Cr indicating
the red difference ranging 135<Cr<171.
[0095] The 7th mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a predetermined range defined in an RGB
color space represented by RGB values indicating three primary
colors, red, green and blue.
[0096] The 8th mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section calculates a degree of
focused area for a pixel group of pixels decided to be said pixel
of interest as one and that for a pixel group of pixels decided not
to be said pixel of interest as zero.
[0097] The 9th mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section accumulates the number
of pixels decided to be said pixel of interest among pixels within
said pixel group, and calculates a degree of focused area for said
pixel group based on a result of said accumulation.
[0098] The 10th mode of the present invention, in the
above-mentioned modes, is characterized in that said
degree-of-focused-area calculating section accumulates the number
of pixels decided to be a pixel of interest among pixels within
said pixel group and the number of pixels decided to be a pixel of
interest among pixels near said pixel group, and calculates a
degree of focused area for said pixel group based on a result of
the accumulation.
[0099] The 11th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are information on any one of a
luminance value, a color difference value and an RGB value, or a
combination thereof, for at least part of pixels making up said
first and second images.
[0100] The 12th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are an average of information on any
one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for pixels contained in at least
part of pixel groups making up said first and second images.
[0101] The 13th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are an average of absolute differences
within an image group between an average of information on any one
of a luminance value, a color difference value and an RGB value, or
a combination thereof, and said information for each pixel within
said pixel group, for pixels contained in at least part of pixel
groups making up said first and second images.
[0102] The 14th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are a variance of information on any
one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for pixels contained in at least
part of pixel groups making up said first and second images.
[0103] The 15th mode of the present invention is characterized in
that an image quality evaluation method comprising: calculating a
difference between data representing a feature of a pixel group
comprised of at least one pixel making up a first image, and data
representing a feature of a pixel group comprised of at least one
pixel making up a second image; deciding a focused area in the
image having a predetermined feature using at least one of said
first and second images, and calculating a degree of focused area
indicating the degree of being said focused area; applying
weighting to the difference in feature for a pixel group falling
within said focused area based on said degree of focused area; and
calculating an image quality value for said first image based on
said weighted difference.
[0104] The 16th mode of the present invention, in the
above-mentioned modes, is characterized in that decision of said
focused area is performed on a pixel group-by-pixel group
basis.
[0105] The 17th mode of the present invention, in the
above-mentioned modes, is characterized in that the image quality
evaluation method comprising deciding whether a pixel is a pixel of
interest for said focused area based on at least a pixel value of
said pixel making up said first or second image; and deciding a
focused area comprised of at least one or more pixels based on said
pixel of interest.
[0106] The 18th mode of the present invention, in the
above-mentioned modes, is characterized in that the image quality
evaluation method comprising deciding a pixel as a pixel of
interest in a case that the pixel value of said pixel falls within
a specific range in a color space.
[0107] The 19th mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a predetermined range defined by a YCbCr
color space represented by a luminance value and a color difference
value.
[0108] The 20th mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a range with a value Y indicating the
luminance ranging 48≤Y≤224, a value Cb indicating the
blue difference ranging 104<Cb<125, and a value Cr indicating
the red difference ranging 135<Cr<171.
[0109] The 21st mode of the present invention, in the
above-mentioned modes, is characterized in that said specific range
in said color space is a predetermined range defined in an RGB
color space represented by RGB values indicating three primary
colors, red, green and blue.
[0110] The 22nd mode of the present invention, in the
above-mentioned modes, is characterized in that the image quality
evaluation method comprising: calculating a degree of focused area
for a pixel group of pixels decided to be said pixel of interest as
one and that for a pixel group of pixels decided not to be said
pixel of interest as zero.
[0111] The 23rd mode of the present invention, in the
above-mentioned modes, is characterized in that the image quality
evaluation method comprising: accumulating the number of pixels
decided to be said pixel of interest among pixels within a pixel
group; and calculating a degree of focused area for said pixel
group based on a result of said accumulation.
[0112] The 24th mode of the present invention, in the
above-mentioned modes, is characterized in that the image quality
evaluation method comprising: accumulating the number of pixels
decided to be a pixel of interest among pixels within a pixel group
and the number of pixels decided to be a pixel of interest among
pixels near said pixel group; and calculating a degree of focused
area for said pixel group based on a result of the
accumulation.
[0113] The 25th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are information on any one of a
luminance value, a color difference value and an RGB value, or a
combination thereof, for at least part of pixels making up said
first and second images.
[0114] The 26th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are an average of information on any
one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for pixels contained in at least
part of pixel groups making up said first and second images.
[0115] The 27th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are an average of absolute differences
within a pixel group between an average of information on any one
of a luminance value, a color difference value and an RGB value, or
a combination thereof, and said information for each pixel within
said pixel group, for pixels contained in at least part of pixel
groups making up said first and second images.
[0116] The 28th mode of the present invention, in the
above-mentioned modes, is characterized in that said data
representing a feature for the first image and that representing a
feature for the second image are a variance of information on any
one of a luminance value, a color difference value and an RGB
value, or a combination thereof, for pixels contained in at least
part of pixel groups making up said first and second images.
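The per-group feature data of the 26th through 28th modes can be sketched as simple block statistics. A minimal example follows, computed here on luminance values for concreteness (the application equally permits color difference or RGB values, or combinations thereof; the function name is an assumption).

```python
def block_features(luma_block):
    """Candidate feature data for one pixel group:
    - mean luminance over the group        (26th mode)
    - mean absolute deviation from the mean (27th mode)
    - variance of the luminance             (28th mode)"""
    n = len(luma_block)
    mean = sum(luma_block) / n
    mad = sum(abs(v - mean) for v in luma_block) / n
    var = sum((v - mean) ** 2 for v in luma_block) / n
    return mean, mad, var

# Example on a 4-pixel group:
mean, mad, var = block_features([100, 110, 120, 130])
print(mean, mad, var)  # 115.0 10.0 125.0
```

Any one of these three statistics (or an analogous one for color difference or RGB components) would serve as the "data representing a feature" between which the difference is calculated.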
[0117] The 29th mode of the present invention is characterized in
that a program causes an information processing apparatus to
execute the processing of: calculating a difference between data
representing a feature of a pixel group comprised of at least one
pixel making up a first image, and data representing a feature of a
pixel group comprised of at least one pixel making up a second
image; deciding a focused area in the image having a predetermined
feature using at least one of said first and second images, and
calculating a degree of focused area indicating the degree of being
said focused area; applying weighting to the difference in feature
for a pixel group falling within said focused area based on said
degree of focused area; and calculating an image quality value for
said first image based on said weighted difference.
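The processing of the 29th mode can be sketched end to end as follows. This is a hedged illustration only: the names, the per-group organization of the inputs, and the simple sum of weighted absolute differences used as the image quality value are assumptions, not the definitive form claimed by the application.

```python
def image_quality_value(features_1, features_2, focus_degrees):
    """Image quality value for a first image relative to a second:
    per-group feature differences are weighted by the degree of
    focused area for that group and accumulated. A lower value
    indicates the first image is closer to the second."""
    total = 0.0
    for f1, f2, w in zip(features_1, features_2, focus_degrees):
        diff = abs(f1 - f2)  # difference between feature data
        total += w * diff    # weighting by degree of focused area
    return total

# Three pixel groups; the third lies outside the focused area
# (degree 0.0), so its difference does not affect the value.
q = image_quality_value([100, 120, 90], [98, 125, 80],
                        [1.0, 1.0, 0.0])
print(q)  # 2 + 5 + 0 = 7.0
```

With this weighting, degradation inside the focused area dominates the quality value, while degradation in non-focused regions is suppressed, which is the stated purpose of the degree of focused area.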
[0118] Although the present invention has been particularly
described above with reference to the preferred embodiments and
modes thereof, it should be readily apparent to those of ordinary
skill in the art that the present invention is not limited to the
above-mentioned embodiments and modes, and that changes and
modifications in form and detail may be made without departing from
the spirit and scope of the invention.
[0119] This application is based upon and claims the benefit of
priority from Japanese patent application No. 2008-118349, filed on
Apr. 30, 2008, the disclosure of which is incorporated herein in
its entirety by reference.
* * * * *