U.S. patent application number 11/892400 was filed with the patent office on 2008-09-11 for method and apparatus for generating user preference data regarding color characteristic of image and method and apparatus for converting image color preference using the method and apparatus.
This patent application is currently assigned to Samsung Electronics
Co., Ltd. Invention is credited to Won-hee Choi, Young-sik Huh,
Sang-kyun Kim, Seong-deok Lee, Du-sik Park, Ki-won Yoo.
Application Number: 20080219548 (11/892400)
Family ID: 32510712
Filed Date: 2008-09-11

United States Patent Application 20080219548
Kind Code: A1
Huh; Young-sik; et al.
September 11, 2008
Method and apparatus for generating user preference data regarding
color characteristic of image and method and apparatus for
converting image color preference using the method and
apparatus
Abstract
A method and apparatus for generating user preference data
regarding the color characteristic of an image, and a method and
apparatus for converting image color preference using the method
and apparatus, are provided. The method for generating user
preference data comprises (a) obtaining an image color
characteristic value of a preference image and a reference image,
(b) generating {preference value, reference value}, which
corresponds to a pair of the preference value and the reference
value, and (c) generating the pair {preference value, reference
value} as preference meta-data having at least one feature block.
The method for converting image color preference comprises
calculating a color characteristic value with respect to an input
image; generating preference meta-data having at least one feature
block, the feature block comprising a block header including a
feature identifier corresponding to information identifying a
color characteristic and at least one feature descriptor including
the preference value and the reference value; obtaining a target
color characteristic value for the input image using the
calculated color characteristic value of the input image and the
preference meta-data; and converting the color characteristic of
the input image so that the input image has the obtained color
characteristic value.
Inventors: Huh; Young-sik (Gyeonggi-do, KR); Park; Du-sik
(Gyeonggi-do, KR); Lee; Seong-deok (Gyeonggi-do, KR); Yoo; Ki-won
(Seoul, KR); Choi; Won-hee (Gyeongsangbuk-do, KR); Kim; Sang-kyun
(Gyeonggi-do, KR)
Correspondence Address:
BUCHANAN, INGERSOLL & ROONEY PC
POST OFFICE BOX 1404
ALEXANDRIA, VA 22313-1404
US
Assignee: Samsung Electronics Co., Ltd. (Gyeonggi-do, KR)
Family ID: 32510712
Appl. No.: 11/892400
Filed: August 22, 2007
Related U.S. Patent Documents

Application Number    Filing Date     Patent Number
10733388              Dec 12, 2003
11892400
Current U.S. Class: 382/162
Current CPC Class: H04N 21/4854 20130101; H04N 21/4318 20130101;
H04N 5/57 20130101; H04N 9/68 20130101; H04N 9/73 20130101; H04N
9/64 20130101; H04N 21/84 20130101
Class at Publication: 382/162
International Class: G06K 9/36 20060101 G06K009/36
Foreign Application Data

Date            Code    Application Number
Dec 12, 2002    KR      2002-79316
Dec 5, 2003     KR      2003-87993
Claims
1. A method for generating user preference data regarding a color
characteristic of an image, when an image converted to have a color
characteristic that a user prefers with respect to a predetermined
reference image is referred to as a preference image, a color
characteristic value of the preference image is referred to as a
preference value and a color characteristic value of the reference
image is referred to as a reference value, the method comprising:
(a) generating {preference value, reference value} which
corresponds to a pair of the preference value and the reference
value; and (b) generating the pair {preference value, reference
value} as preference meta-data having at least one feature block,
wherein the feature block comprises: a block header including a
feature identifier corresponding to information identifying a color
characteristic; and at least one feature descriptor including the
preference value and the reference value.
2. The method of claim 1, wherein the color characteristic is at
least one of color temperature, brightness, contrast, and
saturation.
3. The method of claim 2, before step (a), further comprising:
providing a plurality of images having different color
characteristic values with respect to a predetermined image; and
setting an image that the user has selected from the plurality of
images as a preference image, setting an original image with
respect to the preference image as a reference image, and
generating {preference image, reference image} which corresponds to
a pair of the preference image and the reference image.
4. The method of claim 2, before step (a), further comprising:
installing a unit for controlling a color characteristic of an
image in an image display device; and setting an image of which
color characteristic is adjusted by a user using the unit for
controlling a color characteristic, as a preference image, setting
an original image of which color characteristic is not adjusted by
the user, as a reference image, and generating {preference image,
reference image} which corresponds to a pair of the preference
image and the reference image.
5. The method of claim 3, wherein the generating {preference image,
reference image} is, when the reference image has a contents
identifier, generating {preference image, reference image, contents
identifier} which corresponds to a combination of the preference
image, the reference image, and contents identifier
information.
6. The method of claim 4, wherein the generating {preference image,
reference image} is, when the reference image has a contents
identifier, generating {preference image, reference image, contents
identifier} which corresponds to a combination of the preference
image, the reference image, and contents identifier
information.
7. The method of claim 2, wherein step (a) is, when the reference
image has a contents identifier and when a color characteristic
value of the preference image is referred to as a preference value
and a color characteristic value of the reference image is referred
to as a reference value, generating {preference value, reference
value, contents identifier} which corresponds to a combination of
the preference value, the reference value, and the contents
identifier.
8. The method of claim 2, wherein a color temperature value is
obtained by the following steps comprising: extracting a highlight
region from an input color image; projecting the highlight region
on a chromaticity coordinate and calculating geometric
representation variables with respect to a shape distributed on the
chromaticity coordinate; estimating a color temperature from the
input color image by perceptive light source estimation; and
selecting geometric representation variables around the estimated
color temperature from the geometric representation variables and
calculating a final color temperature using the selected geometric
representation variables.
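The color-temperature steps of claim 8 can be sketched in code. The claim does not specify its "perceptive light source estimation" or how geometric representation variables are selected, so this sketch substitutes McCamy's CCT approximation applied to the centroid of the highlight chromaticities; the highlight fraction, the function name, and the use of McCamy's formula are illustrative assumptions, not the claimed method.

```python
# Hedged sketch: highlight-based color temperature estimate.
# Substitutes McCamy's CCT approximation for the claimed estimator.

def estimate_color_temperature(pixels, highlight_fraction=0.1):
    """pixels: list of (R, G, B) tuples with 8-bit channel values."""
    # 1. Extract a highlight region: the brightest fraction of pixels
    #    ranked by BT.601 luma.
    ranked = sorted(pixels,
                    key=lambda p: 0.299*p[0] + 0.587*p[1] + 0.114*p[2],
                    reverse=True)
    highlights = ranked[:max(1, int(len(ranked) * highlight_fraction))]

    # 2. Project the highlight region onto CIE xy chromaticity
    #    coordinates (linear sRGB -> XYZ matrix; gamma ignored here).
    xs, ys = [], []
    for r, g, b in highlights:
        X = 0.4124*r + 0.3576*g + 0.1805*b
        Y = 0.2126*r + 0.7152*g + 0.0722*b
        Z = 0.0193*r + 0.1192*g + 0.9505*b
        s = X + Y + Z
        if s > 0:
            xs.append(X / s)
            ys.append(Y / s)
    if not xs:
        return None
    x = sum(xs) / len(xs)  # centroid of the chromaticity distribution
    y = sum(ys) / len(ys)

    # 3. McCamy's approximation maps the chromaticity centroid to a
    #    correlated color temperature in kelvin.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0*n**3 + 3525.0*n**2 + 6823.3*n + 5520.33
```

For a neutral sRGB white image the estimate lands near D65, which is a quick sanity check on the chromaticity projection.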
9. The method of claim 2, wherein a saturation value is obtained by
the following steps comprising: obtaining saturation of each pixel
in a HSV color space from an RGB value of a pixel in the image; and
generating a value obtained by adding saturation of the pixels and
dividing the added saturation by the number of pixels, as a
saturation value.
10. The method of claim 9, wherein the saturation of the pixel is
determined by the following steps comprising: obtaining maximum and
minimum values of the RGB value of the pixel; and when the maximum
value is equal to 0, setting the saturation of a corresponding
pixel to 0, and when the maximum value is not equal to 0, setting a
value obtained by dividing a difference between the maximum value
and the minimum value by the maximum value, as the saturation of a
corresponding pixel.
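The per-pixel rule of claims 9 and 10 is the HSV "S" component: saturation is (max - min) / max over the RGB channels, with black pixels set to 0, and the image value is the mean over all pixels. A minimal sketch, with illustrative function names:

```python
# Sketch of claims 9-10: per-pixel HSV saturation, averaged over the image.

def pixel_saturation(r, g, b):
    mx, mn = max(r, g, b), min(r, g, b)
    # When the maximum channel is 0 the pixel is black: saturation 0.
    return 0.0 if mx == 0 else (mx - mn) / mx

def image_saturation(pixels):
    """pixels: iterable of (R, G, B) tuples; returns mean saturation."""
    pixels = list(pixels)
    return sum(pixel_saturation(*p) for p in pixels) / len(pixels)
```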
11. The method of claim 2, wherein a brightness value is determined
by the following steps comprising: obtaining luminance Y of each
pixel in a YCbCr color space from an RGB value of a pixel in the
image; and generating a value obtained by adding luminance of the
pixels and dividing the added luminance by the number of pixels, as
a brightness value.
12. The method of claim 11, wherein the luminance Y of the pixel is
determined by Y = 0.299×R + 0.587×G + 0.114×B.
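Claims 11 and 12 together define brightness as the mean of the per-pixel BT.601 luma. A minimal sketch, with illustrative function names:

```python
# Sketch of claims 11-12: brightness is the mean per-pixel luminance,
# with luma from the BT.601 weights Y = 0.299R + 0.587G + 0.114B.

def luminance(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def image_brightness(pixels):
    """pixels: iterable of (R, G, B) tuples; returns mean luminance."""
    pixels = list(pixels)
    return sum(luminance(*p) for p in pixels) / len(pixels)
```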
13. The method of claim 2, wherein a contrast value CV is, when Yx
is the luminance of each pixel in the image and NumberOfPixels is
the number of pixels in the image, determined using the equation:
CV = [ Σ_{x ∈ pixels} (Yx - BV)² ] / NumberOfPixels
where BV is the brightness value of the image.
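Claim 13's contrast value can be sketched directly from claim 11's brightness value: reading the claim's equation as the averaged squared deviation of per-pixel luminance about the brightness value BV is an assumption, since the original equation text is garbled in this extraction.

```python
# Sketch of claim 13 under the stated reading: contrast is the mean
# squared deviation of luminance about the brightness value BV.

def contrast_value(pixels):
    """pixels: iterable of (R, G, B) tuples."""
    luma = [0.299*r + 0.587*g + 0.114*b for r, g, b in pixels]
    bv = sum(luma) / len(luma)  # BV: brightness value (mean luminance)
    return sum((y - bv) ** 2 for y in luma) / len(luma)
```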
14. The method of claim 2, wherein step (a) further comprises,
when {preference value, reference value} exists before {preference
value, reference value} in step (a) is generated, comparing the
pair {preference value, reference value} generated in step (a) with
an existing pair {preference value, reference value} and updating
the pair {preference value, reference value}, wherein the updating
is, with respect to one preference value, when the reference value
generated in step (a) is compared with the existing reference value
and is the same as or similar to the existing reference value,
removing the existing reference value.
15. The method of claim 7, wherein step (b) further comprises,
when {preference value, reference value} exists before {preference
value, reference value} in step (b) is generated, comparing the
pair {preference value, reference value} generated in step (b) with
an existing pair {preference value, reference value} and updating
the pair {preference value, reference value}, wherein the updating
is, with respect to one preference value, when the reference value
generated in step (b) is compared with the existing reference value
and is the same as or similar to the existing reference value,
removing the existing reference value.
16. The method of claim 14, wherein the updating comprises, when
quantization levels of the two reference values are different,
converting the value of the higher level into a value of the lower
level and comparing the two, and, when image contents identifiers
are added to the characteristic value pairs, not removing the
existing reference value if the image contents identifiers are
different, even though the two reference values are the same as or
similar to each other.
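The update rule of claims 14 through 16 can be sketched as follows: a new {preference value, reference value} pair supersedes an existing pair whose reference value is the same or similar, unless the two pairs carry different contents identifiers; reference values quantized at different levels are first converted to the lower level for comparison. The similarity tolerance, the bit-shift level conversion, and the dictionary layout are illustrative assumptions.

```python
# Hedged sketch of the claims 14-16 update rule for preference pairs.

def to_level(value, from_bits, to_bits):
    # Convert a value quantized at a higher level down to a lower level
    # so two reference values can be compared on the same scale.
    return value >> (from_bits - to_bits)

def update_pairs(pairs, new_pair, tolerance=1):
    """pairs: list of dicts with 'pref', 'ref', 'bits', optional 'cid'."""
    kept = []
    for old in pairs:
        bits = min(old['bits'], new_pair['bits'])
        old_ref = to_level(old['ref'], old['bits'], bits)
        new_ref = to_level(new_pair['ref'], new_pair['bits'], bits)
        similar = abs(old_ref - new_ref) <= tolerance
        # Pairs tagged with different contents identifiers are kept even
        # when their reference values match (claim 16).
        same_content = old.get('cid') == new_pair.get('cid')
        if similar and same_content:
            continue  # drop the existing pair; the new one supersedes it
        kept.append(old)
    kept.append(new_pair)
    return kept
```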
17. The method of claim 2, wherein the number of the feature blocks
is four, and each of the feature blocks corresponds to one of the
four characteristic values.
18. The method of claim 2, wherein the block header of the feature
block represents color temperature if the value of the feature
identifier is `0`, brightness if the value thereof is `1`, contrast
if the value thereof is `2`, and saturation if the value thereof is
`3`.
19. The method of claim 2, wherein the block header of the feature
block further comprises a number-of-descriptors value indicating
the number of feature descriptors contained in the feature
block.
20. The method of claim 2, wherein the feature descriptor further
comprises: a Bin number indicating a quantization level of the
characteristic value; a contents ID flag indicating the presence of
an image contents identifier; and a contents identifier if the
image contents identifier exists.
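The meta-data layout of claims 17 through 20 can be sketched as a simple binary packing: one feature block per characteristic, each with a block header (feature identifier 0 through 3 plus a number-of-descriptors count) followed by feature descriptors holding the Bin number, the preference and reference values, the contents ID flag, and an optional contents identifier. The field widths and little-endian byte order are illustrative assumptions; the claims do not specify a wire format.

```python
# Hedged sketch of the claims 17-20 feature-block layout.
import struct

# Claim 18: feature identifier 0 = color temperature, 1 = brightness,
# 2 = contrast, 3 = saturation.
FEATURE_IDS = {'color_temperature': 0, 'brightness': 1,
               'contrast': 2, 'saturation': 3}

def pack_feature_block(feature, descriptors):
    """descriptors: list of (bin_number, pref, ref, contents_id_or_None)."""
    # Block header: feature identifier + number-of-descriptors (claim 19).
    out = struct.pack('<BB', FEATURE_IDS[feature], len(descriptors))
    for bin_number, pref, ref, cid in descriptors:
        flag = 1 if cid is not None else 0  # contents ID flag (claim 20)
        out += struct.pack('<BHHB', bin_number, pref, ref, flag)
        if cid is not None:
            out += struct.pack('<I', cid)   # optional contents identifier
    return out
```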
21. An apparatus for generating user preference data regarding a
color characteristic of an image, when an image converted to have a
color characteristic that a user prefers with respect to a
predetermined reference image is referred to as a preference image,
a color characteristic value of the preference image is referred to
as a preference value, and a color characteristic value of the
reference image is referred to as a reference value, the apparatus
comprising: a color characteristic
calculating unit, which obtains an image color characteristic value
of the preference image and the reference image, and generates
{preference value, reference value} which corresponds to a pair of
the preference value and the reference value; and a meta-data
generating unit, which generates the pair {preference value,
reference value} generated in the color characteristic calculating
unit as preference meta-data having at least one feature block,
wherein the feature block comprises: a block header including a
feature identifier corresponding to information identifying a color
characteristic; and at least one feature descriptor including the
preference value and the reference value.
22. The apparatus of claim 21, wherein the color characteristic is
at least one of color temperature, brightness, contrast, and
saturation.
23. The apparatus of claim 21, further comprising a first sample
image obtaining unit, which sets an image that the user has
selected from a plurality of images having different color
characteristic values with respect to a predetermined image as a
preference image, sets
an original image with respect to the preference image as a
reference image, generates {preference image, reference image}
which corresponds to a pair of the preference image and the
reference image, and outputs the pair to the color characteristic
calculating unit.
24. The apparatus of claim 21, further comprising a second sample
image obtaining unit, which, when a unit for controlling a color
characteristic of an image is installed in an image display device,
sets an image of which color characteristic is adjusted by a user
using the unit for controlling a color characteristic, as a
preference image, sets an original image of which color
characteristic is not adjusted by the user, as a reference image,
generates {preference image, reference image} which corresponds to
a pair of the preference image and the reference image, and outputs
the pair to the color characteristic calculating unit.
25. The apparatus of claim 23, wherein the generating {preference
image, reference image} is, when the reference image has a contents
identifier, generating {preference image, reference image, contents
identifier} which corresponds to a combination of the preference
image, the reference image, and contents identifier
information.
26. The apparatus of claim 25, wherein the color characteristic
calculating unit, when the reference image has a contents
identifier, further comprises a contents identifier in the pair
{preference value, reference value} and generates a combination
{preference value, reference value, contents identifier}.
27. The apparatus of claim 22, wherein the color characteristic
calculating unit comprises a color temperature value calculating
portion, which obtains a color temperature value, and wherein the
color temperature value calculating portion comprises: a highlight
detecting part, which extracts a highlight region from an input
color image; a highlight variable calculating part, which projects
the highlight region on a chromaticity coordinate and calculates
geometric representation variables with respect to a shape
distributed on the chromaticity coordinate; a color temperature
estimating part, which estimates a color temperature from the input
color image by perceptive light source estimation; and a color
temperature calculating part, which selects geometric
representation variables around the estimated color temperature
from the geometric representation variables and calculates a final
color temperature using the selected geometric representation
variables.
28. The apparatus of claim 22, wherein the color characteristic
calculating unit comprises a saturation value calculating portion,
which obtains saturation of each pixel in a HSV color space from an
RGB value of a pixel in the image and generates a value obtained by
adding saturation of the pixels and dividing the added saturation
by the number of pixels, as a saturation value, and wherein the
saturation of the pixel is determined by the following steps
comprising: obtaining maximum and minimum values of the RGB value
of the pixel; and when the maximum value is equal to 0, setting the
saturation of a corresponding pixel to 0, and when the maximum
value is not equal to 0, setting a value obtained by dividing a
difference between the maximum value and the minimum value by the
maximum value, as the saturation of a corresponding pixel.
29. The apparatus of claim 22, wherein the color characteristic
calculating unit comprises a brightness value calculating portion,
which obtains luminance Y of each pixel in a YCbCr color space from
an RGB value of a pixel in the image and generates a value obtained
by adding luminance of the pixels and dividing the added luminance
by the number of pixels, as a brightness value, and wherein the
luminance Y of the pixel is determined by
Y = 0.299×R + 0.587×G + 0.114×B.
30. The apparatus of claim 22, wherein the color characteristic
calculating unit comprises a contrast value calculating portion,
which, when Yx is the luminance of each pixel in the image and
NumberOfPixels is the number of pixels in the image, calculates a
contrast value CV determined using the equation:
CV = [ Σ_{x ∈ pixels} (Yx - BV)² ] / NumberOfPixels
where BV is the brightness value of the image.
31. The apparatus of claim 22, further comprising a meta-data
updating unit, which compares the pair {preference value, reference
value} generated in the color characteristic calculating unit with
an existing pair {preference value, reference value}, updates the
pair {preference value, reference value}, and outputs the pair to
the meta-data generating unit, wherein the updating is, with
respect to one preference value, when the newly generated reference
value is compared with the existing reference value and is
the same as or similar to the existing reference value, removing
the existing reference value, and the updating is, when
quantization levels of the two reference values are different,
converting the value of the higher level into a value of the lower
level and comparing the two, and, when image contents identifiers
are added to the characteristic value pairs, not removing the
existing reference value if the image contents identifiers are
different, even though the two reference values are the same as or
similar to each other.
32. The apparatus of claim 22, wherein the block header of the
feature block further comprises a number-of-descriptors value
indicating the number of feature descriptors contained in the
feature block.
33. An image preference data recording medium on which, when an
image converted to have a color characteristic that a user prefers
with respect to a predetermined image is referred to as a
preference image, the predetermined image is referred to as a
reference image, a color characteristic value of the preference
image is referred to as a preference value, and a color
characteristic value of the reference image is referred to as a
reference value, preference meta-data having at least one feature
block, the feature block comprising a block header including a
feature identifier corresponding to information identifying a color
characteristic and at least one feature descriptor including the
preference value and the reference value is recorded.
34. The recording medium of claim 33, wherein the block header of
the feature block further comprises a number-of-descriptors value
indicating the number of feature descriptors contained in the
feature block, and wherein the feature descriptor further
comprises: a Bin number indicating a quantization level of the
characteristic value; a contents ID flag indicating the presence of
an image contents identifier; and a contents identifier if the
image contents identifier exists.
35. A computer readable recording medium on which the method of
claim 1 is recorded as executable program code.
Description
[0001] This application claims the priority of Korean Patent
Application Nos. 2002-79316 and 2003-87993, filed on Dec. 12, 2002
and Dec. 5, 2003, respectively, in the Korean Intellectual Property
Office, the disclosures of which are incorporated herein in their
entirety by reference.
BACKGROUND OF THE INVENTION
[0002] 1. Field of the Invention
[0003] The present invention relates to image processing, and more
particularly, to a method and apparatus for generating user
preference data regarding the color characteristic of an image and
a method and apparatus for converting image color preference using
the method and apparatus.
[0004] 2. Description of the Related Art
[0005] There are a variety of methods of converting images so that
video has better color characteristics when a user views it. The
color characteristics considered for conversion in these methods
include brightness, saturation, contrast, and color temperature. In
these methods, a color characteristic value of an input image is
obtained, and the input image is then converted so that it has a
target color characteristic value. However, the target color
characteristic value is fixed in advance as the value each method
deems desirable and is applied uniformly to all users. Thus, these
methods cannot perform a conversion that satisfies the preferences
of an individual user.
SUMMARY OF THE INVENTION
[0006] The present invention provides a method and apparatus for
generating user preference data regarding the color characteristic
of an image, in which a conversion target value is set according to
user characteristics.
[0007] The present invention also provides a method and apparatus
for converting image color preference using user preference data
regarding the color characteristic of an image.
[0008] The present invention also provides a recording medium on
which user preference data regarding the color characteristic of an
image is recorded.
[0009] The present invention also provides a computer readable
recording medium on which the method for generating user preference
data regarding the color characteristic of an image and the method
for converting image color preference using user preference data
regarding the color characteristic of an image are recorded as an
executable program code.
[0010] According to an aspect of the present invention, there is
provided a method for generating user preference data regarding a
color characteristic of an image, the method comprising (a) when an
image converted to have a color characteristic that a user prefers
with respect to a predetermined image is referred to as a
preference image and the predetermined image is referred to as a
reference image, obtaining an image color characteristic value of
the preference image and the reference image, (b) when a color
characteristic value of the preference image is referred to as a
preference value and a color characteristic value of the reference
image is referred to as a reference value, generating {preference
value, reference value} which corresponds to a pair of the
preference value and the reference value, and (c) generating the
pair {preference value, reference value} as preference meta-data
having at least one feature block, and the feature block comprises
a block header including a feature identifier corresponding to
information identifying a color characteristic, and at least one
feature descriptor including the preference value and the reference
value. The color characteristic may be at least one of color
temperature, brightness, contrast, and saturation. The method,
before step (a), may further comprise providing a plurality of
images having different color characteristic values with respect to
a predetermined image, and setting an image that the user has
selected from the plurality of images as a preference image,
setting an original image with respect to the preference image as a
reference image, and generating {preference image, reference image}
which corresponds to a pair of the preference image and the
reference image. The method, before step (a), may further comprise
installing a unit for controlling a color characteristic of an
image in an image display device, and setting an image of which
color characteristic is adjusted by a user using the unit for
controlling a color characteristic, as a preference image, setting
an original image of which color characteristic is not adjusted by
the user, as a reference image, and generating {preference image,
reference image} which corresponds to a pair of the preference
image and the reference image. The generating {preference image,
reference image} may be, when the reference image has a contents
identifier, generating {preference image, reference image, contents
identifier} which corresponds to a combination of the preference
image, the reference image, and contents identifier information.
Step (b) may be, when the reference image has a contents identifier
and when a color characteristic value of the preference image is
referred to as a preference value and a color characteristic value
of the reference image is referred to as a reference value,
generating {preference value, reference value, contents identifier}
which corresponds to a combination of the preference value, the
reference value, and the contents identifier. A color temperature
value in step (a) may be obtained by the following steps comprising
extracting a highlight region from an input color image, projecting
the highlight region on a chromaticity coordinate and calculating
geometric representation variables with respect to a shape
distributed on the chromaticity coordinate, estimating a color
temperature from the input color image by perceptive light source
estimation, and selecting geometric representation variables around
the estimated color temperature from the geometric representation
variables and calculating a final color temperature using the
selected geometric representation variables. A saturation value in
step (a) may be obtained by the following steps comprising
obtaining saturation of each pixel in a HSV color space from an RGB
value of a pixel in the image, and generating a value obtained by
adding saturation of the pixels and dividing the added saturation
by the number of pixels, as a saturation value. The saturation of
the pixel may be determined by the following steps comprising
obtaining maximum and minimum values of the RGB value of the pixel,
and when the maximum value is equal to 0, setting the saturation of
a corresponding pixel to 0, and when the maximum value is not equal
to 0, setting a value obtained by dividing a difference between the
maximum value and the minimum value by the maximum value, as the
saturation of a corresponding pixel. A brightness value in step (a)
may be determined by the following steps comprising obtaining
luminance Y of each pixel in a YCbCr color space from an RGB value
of a pixel in the image, and generating a value obtained by adding
luminance of the pixels and dividing the added luminance by the
number of pixels, as a brightness value. The luminance Y of the
pixel may be determined by
Y = 0.299×R + 0.587×G + 0.114×B. A contrast value CV in
step (a) may be, when Yx is the luminance of each pixel in the
image and NumberOfPixels is the number of pixels in the image,
determined using the equation:
CV = [ Σ_{x ∈ pixels} (Yx - BV)² ] / NumberOfPixels
where BV is the brightness value of the image.
[0011] Step (b) may further comprise, when {preference value,
reference value} exists before {preference value, reference value}
in step (b) is generated, comparing the pair {preference value,
reference value} generated in step (b) with an existing pair
{preference value, reference value} and updating the pair
{preference value, reference value}, and the updating is, with
respect to one preference value, when the reference value generated
in step (b) is compared with the existing reference value and is
the same as or similar to the existing reference value, removing
the existing reference value. The updating may comprise, when
quantization levels of the two reference values are different,
converting the value of the higher level into a value of the lower
level and comparing the two, and, when image contents identifiers
are added to the characteristic value pairs, not removing the
existing reference value if the image contents identifiers are
different, even though the two reference values are the same as or
similar to each other.
[0012] The number of the feature blocks may be four, and each of
the feature blocks may correspond to one of the four characteristic
values. The block header of the feature block may represent color
temperature if the value of the feature identifier is `0`,
brightness if the value thereof is `1`, contrast if the value
thereof is `2`, and saturation if the value thereof is `3`. The
block header of the feature block may further comprise a
number-of-descriptors value indicating the number of feature
descriptors contained in the feature block. The feature descriptor
may further comprise a Bin number indicating a quantization level
of the characteristic value, a contents ID flag indicating the
presence of an image contents identifier, and a contents identifier
if the image contents identifier exists.
[0013] According to another aspect of the present invention, there
is provided an apparatus for generating user preference data
regarding a color characteristic of an image, the apparatus
comprising a color characteristic calculating unit, which, when an
image converted to have a color characteristic that a user prefers
with respect to a predetermined image is referred to as a
preference image and the predetermined image is referred to as a
reference image, obtains an image color characteristic value of the
preference image and the reference image, and when a color
characteristic value of the preference image is referred to as a
preference value and a color characteristic value of the reference
image is referred to as a reference value, generates {preference
value, reference value} which corresponds to a pair of the
preference value and the reference value, and a meta-data
generating unit, which generates the pair {preference value,
reference value} generated in the color characteristic calculating
unit as preference meta-data having at least one feature block, and
the feature block comprises
a block header including a feature identifier corresponding to
information identifying a color characteristic, and at least one
feature descriptor including the preference value and the reference
value.
[0014] The apparatus may further comprise a first sample image
obtaining unit, which sets an image that the user has selected from
a plurality of images having different color characteristic values
with respect to a predetermined image as a preference image, sets
an original image with
respect to the preference image as a reference image, generates
{preference image, reference image} which corresponds to a pair of
the preference image and the reference image, and outputs the pair
to the color characteristic calculating unit.
[0015] The apparatus may further comprise a second sample image
obtaining unit, which, when a unit for controlling a color
characteristic of an image is installed in an image display device,
sets an image of which color characteristic is adjusted by a user
using the unit for controlling a color characteristic, as a
preference image, sets an original image of which color
characteristic is not adjusted by the user, as a reference image,
generates {preference image, reference image} which corresponds to
a pair of the preference image and the reference image, and outputs
the pair to the color characteristic calculating unit. The
generating {preference image, reference image} may be, when the
reference image has a contents identifier, generating {preference
image, reference image, contents identifier} which corresponds to a
combination of the preference image, the reference image, and
contents identifier information.
[0016] The color characteristic calculating unit, when the
reference image has a contents identifier, may further comprise a
contents identifier in the pair {preference value, reference value}
and may generate a combination {preference value, reference value,
contents identifier}. The color characteristic calculating unit
comprises a color temperature value calculating portion, which
obtains a color temperature value, and the color temperature value
calculating portion comprises a highlight detecting part, which
extracts a highlight region from an input color image, a highlight
variable calculating part, which projects the highlight region on a
chromaticity coordinate and calculates geometric representation
variables with respect to a shape distributed on the chromaticity
coordinate, a color temperature estimating part, which estimates a
color temperature from the input color image by perceptive light
source estimation, and a color temperature calculating part, which
selects geometric representation variables around the estimated
color temperature from the geometric representation variables and
calculates a final color temperature using the selected geometric
representation variables. The color characteristic calculating unit
comprises a saturation value calculating portion, which obtains
saturation of each pixel in a HSV color space from an RGB value of
a pixel in the image and generates a value obtained by adding
saturation of the pixels and dividing the added saturation by the
number of pixels, as a saturation value, and the saturation of the
pixel is determined by the following steps comprising obtaining
maximum and minimum values of the RGB value of the pixel, and when
the maximum value is equal to 0, setting the saturation of a
corresponding pixel to 0, and when the maximum value is not equal
to 0, setting a value obtained by dividing a difference between the
maximum value and the minimum value by the maximum value, as the
saturation of a corresponding pixel. The color characteristic
calculating unit comprises a brightness value calculating portion,
which obtains luminance Y of each pixel in a YCbCr color space from
an RGB value of a pixel in the image and generates a value obtained
by adding luminance of the pixels and dividing the added luminance
by the number of pixels, as a brightness value, and the luminance Y
of the pixel is determined by
Y = 0.299×R + 0.587×G + 0.114×B. The color
characteristic calculating unit comprises a contrast value
calculating portion, which, when Y_x is the luminance of each pixel
in the image and NumberOfPixels is the number of pixels in the
image, calculates a contrast value CV determined using equation 3:

CV = sqrt( [ Σ_{x ∈ (pixels)} (Y_x - BV)^2 ] / NumberOfPixels )
[0017] The apparatus may further comprise a meta-data updating
unit, which compares the pair {preference value, reference value}
generated in the color characteristic calculating unit with an
existing pair {preference value, reference value}, updates the pair
{preference value, reference value}, and outputs the pair to the
meta-data generating unit. With respect to one preference value, the
updating may be removing the existing reference value when the newly
generated reference value is the same as or similar to the existing
reference value. When quantization levels of the two reference
values are different, the value of the higher level is converted
into a value of the lower level before the two are compared. When
image contents identifiers are added to the characteristic value
pairs, the existing reference value is not removed if the image
contents identifiers are different, even though the two reference
values are the same as or similar to each other.
[0018] The block header of the feature block may further comprise a
number-of-descriptors value indicating the number of feature
descriptors contained in the feature block. The feature descriptor
may further comprise a Bin number indicating a quantization level
of the characteristic value, a contents ID flag indicating the
presence of an image contents identifier, and a contents identifier
if the image contents identifier exists.
[0019] According to another aspect of the present invention, there
is provided an apparatus for converting image color preference, the
apparatus comprising an input image color characteristic
calculating unit, which calculates a color characteristic value
with respect to an input image, a color preference data unit, which
generates preference meta-data having at least one feature block,
the feature block comprising a block header including a feature
identifier corresponding to information identifying a color
characteristic and at least one feature descriptor including the
preference value and the reference value, an image color
characteristic mapping unit, which determines a target color
characteristic value with respect to the input image using the
color characteristic value of the input image calculated by the
input image color characteristic calculating unit and the color
preference data output from the color preference data unit, and an
image color characteristic converting unit, which converts the
color characteristic of the input image so that the input image has
a color characteristic value obtained from the image color
characteristic mapping unit. The block header of the feature block
of the color preference data unit may further comprise a
number-of-descriptors value indicating the number of feature
descriptors contained in the feature block, and the feature
descriptor of the color preference data unit may further comprise a
Bin number indicating a quantization level of the characteristic
value, a contents ID flag indicating the presence of an image
contents identifier, and a contents identifier if the image
contents identifier exists.
[0020] When a contents identifier of the input image exists, the
image color characteristic mapping unit may determine a target
color characteristic value with respect to the input image using
the color characteristic value of the input image calculated by the
input image color characteristic calculating unit and a color
characteristic value of the same contents identifier stored in the
color preference data unit. The image color characteristic
converting unit may comprise a color temperature converting
portion, which converts the input image so that the input image has
a color temperature value generated in the image color
characteristic mapping unit, a brightness converting portion, which
converts the input image so that the input image has a brightness
value generated in the image color characteristic mapping unit, a
contrast converting portion, which converts the input image so that
the input image has a contrast value generated in the image color
characteristic mapping unit, and a saturation converting portion,
which converts the input image so that the input image has a
saturation value generated in the image color characteristic
mapping unit.
[0021] According to another aspect of the present invention, there
is provided a method for converting image color preference, the
method comprising (a) calculating a color characteristic value with
respect to an input image, (b) generating preference meta-data
having at least one feature block, the feature block comprising a
block header including a feature identifier corresponding to
information identifying a color characteristic and at least one
feature descriptor including the preference value and the reference
value, (c) determining a target color characteristic value with
respect to the input image using the color characteristic value of
the input image calculated in step (a) and the color preference
data output in step (b), and (d) converting the color
characteristic of the input image so that the input image has the
color characteristic value obtained in step (c). The block header
of the feature block of the color preference data unit may further
comprise a number-of-descriptors value indicating the number of
feature descriptors contained in the feature block, and the feature
descriptor in step (b) may further comprise a Bin number indicating
a quantization level of the characteristic value, a contents ID
flag indicating the presence of an image contents identifier, and a
contents identifier if the image contents identifier exists. In
step (c), when a contents identifier of the input image exists, a
color characteristic value with respect to the input image may be
obtained using the color characteristic value of the input image
calculated in step (a) and a color characteristic value of the same
contents identifier output in step (b).
[0022] According to another aspect of the present invention, there
is provided an image preference data recording medium on which,
when an image converted to have a color characteristic that a user
prefers with respect to a predetermined image is referred to as a
preference image, the predetermined image is referred to as a
reference image, a color characteristic value of the preference
image is referred to as a preference value, and a color
characteristic value of the reference image is referred to as a
reference value, preference meta-data having at least one feature
block, the feature block comprising a block header including a
feature identifier corresponding to information identifying a color
characteristic and at least one feature descriptor including the
preference value and the reference value is recorded. The block
header of the feature block may further comprise a
number-of-descriptors value indicating the number of feature
descriptors contained in the feature block, and the feature
descriptor may further comprise a Bin number indicating a
quantization level of the characteristic value, a contents ID flag
indicating the presence of an image contents identifier, and a
contents identifier if the image contents identifier exists.
[0023] There is provided a computer readable recording medium on
which the invention is recorded as an executable program code.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The above and other aspects and advantages of the present
invention will become more apparent by describing in detail
exemplary embodiments thereof with reference to the attached
drawings in which:
[0025] FIG. 1 is a block diagram illustrating a structure of an
apparatus for generating user preference data regarding the color
characteristic of an image according to the present invention;
[0026] FIG. 2 is a block diagram illustrating a structure of a
sample image obtaining unit;
[0027] FIG. 3 is a block diagram illustrating a structure of a
color characteristic calculating unit;
[0028] FIG. 4 is a block diagram illustrating a structure of a
color temperature value calculating portion;
[0029] FIG. 5 illustrates a structure of preference meta-data
according to the present invention;
[0030] FIG. 6 is a flowchart illustrating a method for generating
user preference data regarding the color characteristic of an image
according to the present invention;
[0031] FIG. 7 is a block diagram illustrating a structure of an
apparatus for converting image color preference using preference
data regarding the color characteristic of an image according to
the present invention;
[0032] FIG. 8 is a block diagram illustrating a structure of an
image color characteristic converting unit; and
[0033] FIG. 9 is a flowchart illustrating a method for converting
image color preference using user preference data regarding the
color characteristic of an image according to the present
invention.
DETAILED DESCRIPTION OF THE INVENTION
[0034] Hereinafter, a method and apparatus for generating user
preference data regarding the color characteristic of an image and
a method and apparatus for converting image color preference using
user preference data regarding the color characteristic of an image
according to the present invention will be described in detail with
reference to the accompanying drawings.
[0035] First, a method and apparatus for generating user preference
data regarding the color characteristic of an image according to
the present invention will be described as below.
[0036] FIG. 1 is a block diagram illustrating a structure of an
apparatus for generating user preference data regarding the color
characteristic of an image according to the present invention. The
apparatus for generating user preference data regarding the color
characteristic of an image according to the present invention
includes a preference obtainer 100 and a preference data generator
150. FIG. 6 is a flowchart illustrating a method for generating
user preference data regarding the color characteristic of an image
according to the present invention. The method for generating user
preference data regarding the color characteristic of an image
according to the present invention comprises obtaining a sample
image (step 600), calculating a color characteristic value (step
620), updating meta-data (step 640), and generating preference
meta-data (step 660).
[0037] First, terms used in the present invention will be
described. An image converted to have a color characteristic that a
user prefers with respect to a predetermined image is referred to
as a preference image, and the predetermined image is referred to
as a reference image. In addition, a color characteristic value of
the preference image is referred to as a preference value, and a
color characteristic value of the reference image is referred to as
a reference value.
[0038] Preferably, a color characteristic considered in the present
invention includes at least one of color temperature, brightness,
contrast, and saturation. All of the color temperature, the
brightness, the contrast, and the saturation are applied to a
general case to which the present invention is applied. In
addition, the color characteristic considered in the present
invention is not limited to the four color characteristics, but
other color characteristics may be considered in the present
invention.
[0039] The preference obtainer 100 obtains a preference image and a
reference image from a predetermined image, generates a preference
value and a reference value from the obtained preference image and
the reference image, respectively, and includes a sample image
obtaining unit 10 and a color characteristic calculating unit
12.
[0040] FIG. 2 is a block diagram illustrating a structure of the
sample image obtaining unit 10. The sample image obtaining unit 10
generates a pair of a preference image and a reference image, from
a user's selection or a user's color adjustment (step 600). There
are two methods for generating the pair of the preference image and
the reference image. The methods correspond to a first sample image
obtaining portion 200 and a second sample image obtaining portion
250 shown in FIG. 2, respectively. Thus, in the present embodiment,
the sample image obtaining unit 10 includes at least one of the
first sample image obtaining portion 200 and the second sample
image obtaining portion 250.
[0041] The first sample image obtaining portion 200 sets an image
that the user has selected from a plurality of images having
different color characteristic values with respect to a
predetermined image as a preference image, sets an original image
with respect to the preference image as a reference image,
generates {preference image, reference image} which corresponds to
a pair of the preference image and the reference image, and outputs
the pair to the color characteristic calculating unit 12. In other
words, the first sample image obtaining portion 200 constructs a set
of images converted from an original image to have different
characteristic values with respect to each of four characteristics,
such as color temperature, brightness, contrast, and saturation.
Then, the first sample image obtaining portion 200 sets the image
that the user prefers as a preference image, sets the original image
as a reference image, and generates the pair {preference image,
reference image}.
[0042] When a unit (not shown) for controlling a color
characteristic of an image is installed in an image display device,
the second sample image obtaining portion 250 sets an image of which
color characteristic is adjusted by the user using the unit for
controlling a color characteristic, as a preference image, sets an
original image of which color characteristic is not adjusted by the
user, as a reference image, generates {preference image, reference
image} which corresponds to a pair of the preference image and the
reference image, and outputs the pair to the color characteristic
calculating unit 12. In other words, when the unit for controlling
a color characteristic of an image is installed in the image
display device, the user adjusts a color characteristic using the
unit when viewing video. In this case, the image before the user
adjusts is referred to as a reference image, and the image obtained
after the user has adjusted is referred to as a preference
image.
[0043] When image contents including the reference image obtained
by the first sample image obtaining portion 200 and the second
sample image obtaining portion 250 have a contents identifier set
by MPEG-21, TV-Anytime, or a contents service provider, the sample
image obtaining unit 10 can output an image contents identifier as
well as the pair {preference image, reference image}.
[0044] The color characteristic calculating unit 12 calculates
color characteristic values of the preference image and the
reference image, generates a pair {preference value, reference
value} which corresponds to a pair of a preference value and a
reference value, and includes at least one of a color temperature
value calculating portion 300, a saturation value calculating
portion 320, a brightness value calculating portion 340, and a
contrast value calculating portion 360.
[0045] More specifically, the color characteristic calculating unit
12 calculates all or a part of a color temperature value, a
brightness value, a contrast value, and a saturation value with
respect to an input pair {preference image, reference image} and
outputs a pair {preference value, reference value} (step 620). When
the color characteristic calculating unit 12 receives an image
contents identifier together with the pair {preference image,
reference image}, the color characteristic calculating unit 12 outputs
the image contents identifier together with the pair {preference
value, reference value}. The color characteristic calculating unit
12 calculates each color characteristic of an input image by the
following method.
[0046] FIG. 4 is a block diagram illustrating a structure of the
color temperature value calculating portion 300. The color
temperature value calculating portion 300 calculates a color
temperature value from an input color image and includes a
highlight detecting part 400, a highlight variable calculating part
420, a color temperature estimating part 440, and a color
temperature calculating part 460. The highlight detecting part 400
extracts highlight regions from the input color image. The
highlight variable calculating part 420 projects the highlight
regions on a chromaticity coordinate and calculates geometric
representation variables with respect to a shape distributed on the
chromaticity coordinate. The color temperature estimating part 440
estimates a color temperature from the input color image by
perceptive light source estimation. The color temperature
calculating part 460 selects geometric representation variables
around the estimated color temperature from the geometric
representation variables and calculates a final color temperature
using the selected geometric representation variables. The color
temperature value may be expressed as 8-bit data, as specified by
the color temperature descriptor of ISO/IEC 15938-3.
[0047] The saturation value calculating portion 320 obtains
saturation corresponding to S in a HSV color space of each pixel
from an RGB value of a pixel in an image to be displayed and
generates a value obtained by adding saturation of the pixels and
dividing the added saturation by the number of pixels, as a
saturation value. When maximum and minimum values of the RGB value
of the pixel are obtained and the maximum value is equal to 0, the
saturation of a corresponding pixel is set to 0, and when the
maximum value is not equal to 0, a value obtained by dividing a
difference between the maximum value and the minimum value by the
maximum value, is set as the saturation of a corresponding
pixel.
[0048] This will be expressed using the following equation.
Saturation S in the HSV color space is obtained from (R,G,B) values
of each pixel in an input image as below.
Max=max(R,G,B),
Min=min(R,G,B)
if(Max==0) S=0 else S=(Max-Min)/Max
[0049] A saturation value SV is calculated using equation 1.
SV = [ Σ_{x ∈ (pixels)} S_x ] / NumberOfPixels    (1)
[0050] Here, S_x is the S-value of each pixel in an image to be
displayed.
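As an illustrative sketch (not the patent's implementation; function names are my own), the per-pixel saturation rule and the saturation value SV of equation 1 can be written in Python:

```python
def pixel_saturation(r, g, b):
    """HSV saturation S of one pixel, per the max/min rule above."""
    mx = max(r, g, b)
    mn = min(r, g, b)
    if mx == 0:
        return 0.0  # a black pixel has S = 0 (avoids division by zero)
    return (mx - mn) / mx

def saturation_value(pixels):
    """SV of equation 1: mean per-pixel saturation.
    `pixels` is an iterable of (R, G, B) tuples."""
    pixels = list(pixels)
    return sum(pixel_saturation(r, g, b) for r, g, b in pixels) / len(pixels)
```

For example, an image of one pure-red pixel (S = 1) and one gray pixel (S = 0) yields SV = 0.5.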
[0051] The brightness value calculating portion 340 obtains
luminance Y corresponding to Y in a YCbCr color space of each pixel
from an RGB value of a pixel in an image to be displayed and
generates a value obtained by adding luminance of the pixels and
dividing the added luminance by the number of pixels, as a
brightness value. The luminance Y of the pixel is determined by
Y = 0.299×R + 0.587×G + 0.114×B.
[0052] A brightness value BV is calculated using equation 2.
BV = [ Σ_{x ∈ (pixels)} Y_x ] / NumberOfPixels    (2)
[0053] Here, Y_x is the Y-value of each pixel in an image to be
displayed.
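A hedged Python sketch of the luminance formula and the brightness value BV of equation 2 (names are illustrative, not from the patent):

```python
def luminance(r, g, b):
    """Per-pixel luminance Y from the stated YCbCr weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def brightness_value(pixels):
    """BV of equation 2: mean luminance over all pixels.
    `pixels` is an iterable of (R, G, B) tuples."""
    pixels = list(pixels)
    return sum(luminance(r, g, b) for r, g, b in pixels) / len(pixels)
```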
[0054] When Y_x is the luminance of each pixel in the image,
NumberOfPixels is the number of pixels in the image, and the
contrast value is CV, the contrast value calculating portion 360
calculates the contrast value CV using equation 3.

CV = sqrt( [ Σ_{x ∈ (pixels)} (Y_x - BV)^2 ] / NumberOfPixels )    (3)
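Since Table 1 defines contrast as the standard deviation of Y-values, equation 3 can be sketched as follows (an illustrative reading, not the patent's code):

```python
import math

def contrast_value(luminances):
    """CV of equation 3: standard deviation of the per-pixel
    luminance values Y_x around the brightness value BV."""
    n = len(luminances)
    bv = sum(luminances) / n  # BV of equation 2
    return math.sqrt(sum((y - bv) ** 2 for y in luminances) / n)
```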
[0055] The preference obtainer 100 may include only the color
characteristic calculating unit 12 without the sample image
obtaining unit 10 when a preference image and a reference image are
previously prepared.
[0056] Meanwhile, the preference data generator 150 receives the
preference value and the reference value generated in the
preference obtainer 100, generates preference meta-data having at
least one feature block, and includes a meta-data updating unit 16
and a meta-data generating unit 18.
[0057] The meta-data updating unit 16 compares the pair {preference
value, reference value} generated in the color characteristic
calculating unit 12 with an existing pair {preference value,
reference value}, updates and outputs the pair {preference value,
reference value} to the meta-data generating unit 18 (step
640).
[0058] The meta-data updating unit 16 operates when preference
meta-data already exists. The meta-data updating unit 16 receives a
pair or a plurality of pairs {preference value, reference value}
from the preference obtainer 100. In addition, the meta-data
updating unit 16 receives characteristic value pairs in existing
preference meta-data, removes redundancy and contradiction of the
preference data, and then, outputs updated pairs {preference value,
reference value}. The meta-data updating unit 16 operates with
respect to each preference as below.
[0059] When with respect to one preference, a newly-input
characteristic value pair is A and an existing characteristic value
pair is B, a reference value of A is compared with a reference
value of B. When the reference value of A is the same as or similar
to the reference value of B, B is removed. Here, a similar case
means a case where a difference between the reference value of A
and the reference value of B is within a predetermined range. In
addition, when quantization levels of the two reference values are
different, the value of the higher level is converted into a value
of the lower level before the two are compared.
[0060] When image contents identifiers are added to the
characteristic value pairs, even though the reference value of A is
the same as or similar to the reference value of B, if the image
contents identifiers are different, B is not removed. The existing
characteristic value pairs and the new characteristic value pairs,
which are not removed in the above procedure, are output.
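The removal rule above can be sketched as follows. This is a simplified illustration: the tuple layout, the `tolerance` parameter, and the omission of quantization-level conversion are my assumptions, not the patent's specification.

```python
def update_pairs(existing, new, tolerance=2):
    """Keep an existing pair only if no new pair has the same or a
    similar reference value with the same contents identifier.
    Each pair is (preference_value, reference_value, contents_id or None).
    Quantization-level conversion before comparison is omitted here."""
    def similar(a, b):
        return abs(a - b) <= tolerance  # "similar" = within a set range
    kept = [old for old in existing
            if not any(similar(old[1], nw[1]) and old[2] == nw[2]
                       for nw in new)]
    return kept + new
```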
[0061] The meta-data generating unit 18 receives the pair
{preference value, reference value} from the meta-data updating
unit 16 or the color characteristic calculating unit 12 and
generates preference meta-data having at least one feature block
(step 660). The feature block includes a block header including a
feature identifier corresponding to information identifying a color
characteristic and at least one feature descriptor including the
preference value and the reference value.
[0062] The preference meta-data will be described in more detail as
below. When a contents identifier is input, identifier information
is added to meta-data, as shown in FIG. 5.
[0063] The preference meta-data is composed of four feature blocks
500, 505, 510, and 515. Each of the four feature blocks 500, 505,
510, and 515 has information corresponding to each of four
preferences, such as color temperature, saturation, brightness, and
contrast.
[0064] Each feature block includes one block header 520 and a
plurality of feature descriptors 525, 530, . . . , and 535 or one
thereof. The block header 520 includes a feature identifier 540
indicating the color characteristic to be represented, and a
number-of-descriptors value 545 indicating how many feature
descriptors exist in a corresponding block. If the above data
structure is represented in a binary sequence, the feature
identifier 540 may be represented with a 2-bit flag. In this case,
if the value of the feature identifier 540 is `0`, the feature
identifier 540 may be represented with color temperature. If the
value thereof is `1`, the feature identifier 540 may be represented
with brightness. If the value thereof is `2`, the feature
identifier 540 may be represented with contrast, and if the value
thereof is `3`, the feature identifier 540 may be represented with
saturation. In addition, the number-of-descriptors value 545 may be
represented with a variable flag indicating that, for example, 3
bits of 4 bits mean the number of bits and if last 1 bit is 1, 4
bits continue.
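One possible binary packing of the block header under the 2-bit feature identifier mapping just described; the fixed 6-bit count is my simplification of the variable-length scheme, kept only to keep the sketch short:

```python
FEATURE_IDS = {0: "color temperature", 1: "brightness",
               2: "contrast", 3: "saturation"}

def pack_block_header(feature_id, num_descriptors):
    """Pack a 2-bit feature identifier and a descriptor count into one
    byte (identifier in the top 2 bits, count in the low 6 bits)."""
    assert feature_id in FEATURE_IDS
    assert 0 <= num_descriptors < 64
    return (feature_id << 6) | num_descriptors
```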
[0065] If the above data structure is represented with XML, the
feature identifier 540 is represented with a string, and the
number-of-descriptors value 545 is not represented.
[0066] The feature descriptor 530 includes a Bin number 550
indicating a quantization level of the characteristic value, a
contents ID flag 555 indicating the presence of an image contents
identifier, a contents identifier 560, a preference value 565, and
a reference value 570.
[0067] The Bin number indicates the quantization level of
characteristic value representation. The quantization level of the
characteristic value is within a range of 8 bits with respect to
color temperature and is within a range of 12 bits with respect to
other characteristic values.
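For the uniformly quantized characteristics (brightness, saturation, contrast), the Bin number determines the quantization roughly as below; clamping an input of exactly 1.0 to the top bin is my assumption:

```python
def quantize(value, bin_number):
    """Uniformly quantize a characteristic value in [0, 1] into
    `bin_number` levels (bin_number <= 2**12 per the text). Color
    temperature instead uses the non-uniform 8-bit scheme of
    ISO/IEC 15938-3 and is not covered by this sketch."""
    assert 0.0 <= value <= 1.0 and 1 <= bin_number <= 2 ** 12
    return min(int(value * bin_number), bin_number - 1)
```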
[0068] If the data structure is represented with XML, the contents
ID flag is not represented. The above-described color preference
data may be represented/recorded as xml-data according to
xml-schema definition.
[0069] A data format generated in the meta-data generating unit may
be represented as xml-data according to xml-schema definition.
[0070] 1. DisplayPresentationPreferences
[0071] DisplayPresentationPreferences specifies the preferences of
a user regarding the display of images and video.
TABLE-US-00001
1.1 DisplayPresentationPreferences syntax
<!-- ################################################ -->
<!-- Definition of DisplayPresentationPreferences -->
<!-- ################################################ -->
<complexType name="DisplayPresentationPreferencesType">
  <complexContent>
    <extension base="dia:DIABaseType">
      <sequence>
        <element name="ColorTemperaturePreference" type="dia:ColorPreferenceType" minOccurs="0"/>
        <element name="BrightnessPreference" type="dia:ColorPreferenceType" minOccurs="0"/>
        <element name="SaturationPreference" type="dia:ColorPreferenceType" minOccurs="0"/>
        <element name="ContrastPreference" type="dia:ColorPreferenceType" minOccurs="0"/>
      </sequence>
    </extension>
  </complexContent>
</complexType>
[0072] 1.2 DisplayPresentationPreferences Semantics
[0073] DisplayPresentationPreferencesType is a tool that describes
the display presentation preferences of a user.
[0074] ColorTemperaturePreference describes the color temperature
that a User prefers. The color temperature is defined as the
correlated color temperature of estimated illumination of the image
to be displayed.
[0075] BrightnessPreference describes the brightness that a user
prefers. The brightness is defined as an attribute of a visual
sensation according to which an area appears to emit more or less
light.
[0076] SaturationPreference describes the saturation that a user
prefers. The saturation is defined as the colorfulness of an area
judged in proportion to its brightness.
[0077] ContrastPreference describes the contrast that a user
prefers. The contrast is defined to be the ratio of luminance
between the lightest and darkest elements of a scene.
[0078] StereoscopicVideoConversion describes the preferred
parameters of a user for stereoscopic video conversion.
[0079] 2. ColorPreference
[0080] ColorPreference specifies color preference. In
DisplayPresentationPreference, to express a user's preference
regarding the color of displayed images and video, four attributes
of color are considered: color temperature, brightness, saturation,
and contrast. The ColorPreferenceType is a tool to describe
preferences related to such attributes of color.
TABLE-US-00002
2.1 ColorPreference syntax
<!-- ################################################ -->
<!-- Definition of ColorPreference -->
<!-- ################################################ -->
<complexType name="ColorPreferenceType">
  <complexContent>
    <extension base="dia:DIABaseType">
      <sequence>
        <element name="BinNumber" type="mpeg7:unsigned12"/>
        <element name="Value" minOccurs="0" maxOccurs="unbounded">
          <complexType>
            <sequence>
              <element name="PreferredValue" type="mpeg7:unsigned12"/>
              <element name="ReferenceValue" type="mpeg7:unsigned12"/>
            </sequence>
          </complexType>
        </element>
      </sequence>
    </extension>
  </complexContent>
</complexType>
[0081] 2.2 ColorPreference Semantics
[0082] ColorPreferenceType is a tool that describes the color
preferences of a user when viewing visual images. The color
preference can be described in terms of color temperature,
brightness, saturation and contrast.
[0083] BinNumber describes the quantization level that
PreferredValue and ReferenceValue take.
[0084] Value indicates the minimal unit that describes the color
preference of a user. It includes two subelements: PreferredValue
and ReferenceValue. If PreferredValue is equal to v1, and
ReferenceValue is equal to v2, it indicates that the user wants to
convert an image of value v2 into an image of value v1 with respect
to an attribute of color that ColorPreferenceType descriptor
specifies.
[0085] PreferredValue describes the value of a color attribute that
a user prefers.
[0086] ReferenceValue describes the value of a color attribute in
an image that is used as reference to express the PreferredValue.
If ReferenceValue is equal to zero, it means that ReferenceValue is
not considered.
[0087] Table 1 gives the value definition of PreferredValue and
ReferenceValue for four attributes of color: color temperature,
brightness, saturation and contrast of images and videos to be
displayed.
TABLE-US-00003 TABLE 1

Attribute   | Value Type               | Value Definition                     | Value Range, Number of Bins, Quantization Type
------------|--------------------------|--------------------------------------|------------------------------------------------
Color       | Color temperature as     | Correlated color temperature of      | The range [1667, 25000] is quantized into 2^8
Temperature | specified in ISO/IEC     | estimated illumination of the image  | bins in a non-uniform way as specified in
            | 15938-3                  | to be displayed                      | ISO/IEC 15938-3
Brightness  | Y-value in the YCbCr*    | Mean value of Y-values of all pixels | The range [0, 1] is uniformly quantized.
            | color space              | in the image to be displayed         | Number of bins ≤ 2^12.
Saturation  | S-value in the HSV*      | Mean value of S-values of all pixels | The range [0, 1] is uniformly quantized.
            | color space              | in the image to be displayed         | Number of bins ≤ 2^12.
Contrast    | Y-value in the YCbCr*    | Standard deviation of Y-values of    | The range [0, 1] is uniformly quantized.
            | color space              | all pixels in the image to be        | Number of bins ≤ 2^12.
            |                          | displayed                            |
[0088] In Table 1, the color spaces YCbCr and HSV are specified in
ISO/IEC 15938-3. The standard expressions of Y value and S value
are also specified there.
[0089] 2.3 ColorPreference Examples
[0090] ColorPreference allows multiple occurrences of the pair
(PreferredValue, ReferenceValue), so that the pairs can be used to
find an optimal mapping of color attributes, for example, through
selection among available mapping functions or interpolation by
using the pairs as poles. Based on the obtained mapping strategy,
an application may convert images so that the resulting images
satisfy the user preference for color. The following example shows
the use of the DisplayPresentationPreferences description tool to
express the color preferences of a user.
<DIA>
  <Description xsi:type="UsageEnvironmentType">
    <UsageEnvironment xsi:type="UserCharacteristicsType">
      <UserCharacteristics xsi:type="PresentationPreferencesType">
        <Display>
          <ColorTemperaturePreference>
            <BinNumber>255</BinNumber>
            <Value>
              <PreferredValue>110</PreferredValue>
              <ReferenceValue>127</ReferenceValue>
            </Value>
            <Value>
              <PreferredValue>156</PreferredValue>
              <ReferenceValue>151</ReferenceValue>
            </Value>
            <Value>
              <PreferredValue>200</PreferredValue>
              <ReferenceValue>192</ReferenceValue>
            </Value>
          </ColorTemperaturePreference>
          <BrightnessPreference>
            <BinNumber>255</BinNumber>
            <Value>
              <PreferredValue>138</PreferredValue>
              <ReferenceValue>103</ReferenceValue>
            </Value>
            <Value>
              <PreferredValue>152</PreferredValue>
              <ReferenceValue>150</ReferenceValue>
            </Value>
          </BrightnessPreference>
          <SaturationPreference>
            <BinNumber>255</BinNumber>
            <Value>
              <PreferredValue>94</PreferredValue>
              <ReferenceValue>80</ReferenceValue>
            </Value>
          </SaturationPreference>
          <ContrastPreference>
            <BinNumber>255</BinNumber>
            <Value>
              <PreferredValue>80</PreferredValue>
              <ReferenceValue>70</ReferenceValue>
            </Value>
          </ContrastPreference>
        </Display>
      </UserCharacteristics>
    </UsageEnvironment>
  </Description>
</DIA>
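The selection-or-interpolation strategy described in paragraph [0090] can be sketched as follows. This is a minimal illustration only, not part of the standard: the function name `build_mapping` and the piecewise-linear scheme with clamping outside the covered range are assumptions for the example.

```python
# Derive a mapping f(reference) -> preferred from (PreferredValue,
# ReferenceValue) pairs, using the pairs as interpolation poles.
def build_mapping(pairs):
    """pairs: list of (preferred, reference) tuples."""
    # Sort poles by reference value; clamp outside the covered range.
    poles = sorted((ref, pref) for pref, ref in pairs)

    def mapping(ref):
        if ref <= poles[0][0]:
            return poles[0][1]
        if ref >= poles[-1][0]:
            return poles[-1][1]
        for (r0, p0), (r1, p1) in zip(poles, poles[1:]):
            if r0 <= ref <= r1:
                # Piecewise-linear interpolation between adjacent poles.
                t = (ref - r0) / (r1 - r0)
                return round(p0 + t * (p1 - p0))

    return mapping

# Brightness poles taken from the example description above.
to_preferred = build_mapping([(138, 103), (152, 150)])
```

An application would apply `to_preferred` to the measured brightness bin of an input image to obtain the target bin.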
[0091] Next, a method and apparatus for converting image color
preference using user preference data regarding the color
characteristic of an image according to the present invention will
be described in detail.
[0092] FIG. 7 is a block diagram illustrating a structure of an
apparatus for converting image color preference using preference
data regarding the color characteristic of an image according to
the present invention. The apparatus for converting image color
preference includes an input image color characteristic calculating
unit 700, a color preference data unit 720, an image color
characteristic mapping unit 740, and an image color characteristic
converting unit 760. FIG. 9 is a flowchart illustrating a method
for converting image color preference using user preference data
regarding the color characteristic of an image according to the
present invention. The method for converting image color preference
comprises calculating an input image color characteristic value
(step 900), generating and providing user preference meta-data
(step 920), generating a target color characteristic value from the
input image color characteristic value and the user preference
meta-data (step 940), and converting a color characteristic of an
input image (step 960).
[0093] The input image color characteristic calculating unit 700
calculates a color characteristic value with respect to an input
image (step 900). The input image color characteristic calculating
unit 700 calculates at least one of four color characteristics,
namely color temperature, saturation, brightness, and contrast,
from the input image. The color characteristic value may be
calculated by the color temperature value calculating portion 300,
the saturation value calculating portion 320, the brightness value
calculating portion 340, and the contrast value calculating portion
360 of the color characteristic calculating unit 12 shown in FIG.
3. The calculated color characteristic value is recorded in each
frame of an image or in each image time period, and thus, an input
image color characteristic is output.
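The brightness, contrast, and saturation calculations of step 900 can be sketched as follows, assuming (as an illustration, not the patented implementation) an image given as a list of (r, g, b) tuples in [0, 1], with brightness as the mean luma, contrast as the standard deviation of luma, and saturation as the mean HSV S-value; color temperature estimation is omitted because it requires the illuminant estimation specified in ISO/IEC 15938-3.

```python
import colorsys
import statistics

def color_characteristics(pixels):
    """pixels: list of (r, g, b) tuples with components in [0, 1]."""
    # BT.601 luma approximation for the Y-value of each pixel.
    ys = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    # S-value of each pixel in the HSV model.
    ss = [colorsys.rgb_to_hsv(r, g, b)[1] for r, g, b in pixels]
    return {
        "brightness": statistics.mean(ys),   # mean of Y-values
        "contrast": statistics.pstdev(ys),   # std. deviation of Y-values
        "saturation": statistics.mean(ss),   # mean of S-values
    }
```

The resulting values would then be quantized into the bin ranges given in Table 1.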
[0094] The color preference data unit 720 generates preference
meta-data having at least one feature block and provides the
preference meta-data to the image color characteristic mapping unit
740 (step 920). The color preference data unit 720 is the same as
the apparatus for generating user preference data regarding the
color characteristic of an image of FIG. 1.
[0095] The image color characteristic mapping unit 740 determines a
target color characteristic value with respect to the input image
using the color characteristic value of the input image calculated
by the input image color characteristic calculating unit 700 and
the color preference data output from the color preference data
unit 720 (step 940).
[0096] More specifically, input image color characteristic data is
input into the image color characteristic mapping unit 740, and
then, the image color characteristic mapping unit 740 receives
color preference data having a reference value being equal to or
approximating the input image color characteristic value from the
color preference data unit 720. If a contents identifier is
contained in an input image, the image color characteristic mapping
unit 740 may receive color preference data composed of a
combination {preference value, reference value, image contents
identifier} having the same contents identifier from the color
preference data unit 720. As a result, the image color
characteristic mapping unit 740 determines and outputs a target
color characteristic value using the input image color
characteristic value and the color preference data.
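As an illustration of step 940, the simplest strategy consistent with the description above is a nearest-reference lookup; the function name and data layout here are assumptions for the example.

```python
# Pick the stored {preference value, reference value} pair whose
# reference value is nearest the input image's characteristic value,
# and use its preference value as the target color characteristic.
def target_value(input_value, pairs):
    """pairs: iterable of (preference_value, reference_value) tuples."""
    pref, _ = min(pairs, key=lambda p: abs(p[1] - input_value))
    return pref
```

In practice the mapping unit may instead interpolate between pairs, as described for ColorPreference above.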
[0097] The image color characteristic converting unit 760 converts
the color characteristic of the input image so that the input image
has a target color characteristic value determined using the image
color characteristic mapping unit 740 (step 960). FIG. 8 is a block
diagram illustrating a structure of the image color characteristic
converting unit 760. The image color characteristic converting unit
760 includes a color temperature converting portion 800, a
brightness converting portion 820, a contrast converting portion
840, and a saturation converting portion 860.
[0098] The color temperature converting portion 800 converts the
input image so that the input image has a target color temperature
value determined by the image color characteristic mapping unit
740. Color temperature conversion may be performed in various ways. An
example of color temperature conversion is as follows: an input
color temperature of an input image is estimated. The estimated
color temperature of the input image and a user preference color
temperature are received. When a predetermined reference color
temperature is converted into the user preference color temperature
by a predetermined mapping method, a target color temperature of an
output image in which the color temperature of the input image is
converted by the mapping method is obtained. Then, a color
temperature conversion coefficient is obtained using the input
color temperature and the output color temperature, and the input
image is converted into an output image having the target color
temperature based on the color temperature conversion
coefficient.
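The paragraph above leaves the concrete conversion coefficient open. One common approach, offered here only as an assumption and not as the patented method, is a von Kries-style diagonal transform that scales each channel by the ratio of the white points associated with the target and the estimated input color temperatures.

```python
# Diagonal (per-channel) chromatic adaptation: scale each RGB channel
# by the ratio of target to input white-point components.
def convert_color_temperature(pixel, input_white, target_white):
    """pixel and white points: (r, g, b) tuples in [0, 1]."""
    gains = [t / i for t, i in zip(target_white, input_white)]
    return tuple(min(1.0, c * g) for c, g in zip(pixel, gains))
```

The white points themselves would be derived from the estimated input color temperature and the target color temperature produced by the mapping unit 740.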
[0099] The brightness converting portion 820 converts the input
image so that the input image has a brightness value generated in
the image color characteristic mapping unit 740. Brightness
conversion of an image may be performed in various ways. An example of
image brightness conversion is as follows: first, a brightness
enhancement reference value of a predetermined pixel is obtained. A
brightness enhancement ratio is obtained by dividing the brightness
enhancement reference value by a maximum component value. Then,
brightness is enhanced by multiplying each component of the
predetermined pixel by the brightness enhancement ratio.
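The brightness step described above can be sketched directly; the function name and the 8-bit clamping are assumptions added for the example.

```python
# Scale every component of a pixel by the brightness enhancement ratio,
# i.e. the enhancement reference value divided by the pixel's maximum
# component value.
def enhance_brightness(pixel, enhancement_reference):
    """pixel: (r, g, b) tuple of integers in [0, 255]."""
    ratio = enhancement_reference / max(pixel)
    return tuple(min(255, round(c * ratio)) for c in pixel)
```

Scaling all components by the same ratio raises the brightness while preserving the pixel's hue.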
[0100] The contrast converting portion 840 converts the input image
so that the input image has a contrast value generated in the
image color characteristic mapping unit 740. Contrast conversion of
an image may be performed in various ways. An example of image contrast
conversion is as follows: first, average brightness in one frame of
an image is obtained. A brightness enhancement parameter is
calculated from the average brightness of the image. Maximum and
minimum values of the brightness range of the image are calculated.
The maximum and minimum values of the brightness range to be applied
to the output are calculated. Brightness range extension is calculated in each pixel
and in each section. Brightness enhancement values are calculated
in each pixel using the brightness enhancement parameter.
Calculation of brightness range extension in each pixel and in each
section and calculation of brightness enhancement values in each
pixel using the brightness enhancement parameter are repeatedly
performed until all pixels in one frame of the image are
processed.
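The range-extension idea in the multi-step procedure above can be reduced, for illustration, to a simple linear contrast stretch; this sketch is an assumption and omits the per-section and per-pixel enhancement parameters of the described method.

```python
# Map the frame's observed brightness range onto a target output range,
# extending the range and thereby increasing contrast.
def stretch_contrast(ys, out_min=0.0, out_max=1.0):
    """ys: list of Y-values (brightness) for one frame, in [0, 1]."""
    y_min, y_max = min(ys), max(ys)
    if y_max == y_min:
        return [out_min for _ in ys]  # flat frame: nothing to stretch
    scale = (out_max - out_min) / (y_max - y_min)
    return [out_min + (y - y_min) * scale for y in ys]
```

A full implementation would additionally modulate the stretch per pixel with the brightness enhancement parameter described above.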
[0101] The saturation converting portion 860 converts the input
image so that the input image has a saturation value generated in
the image color characteristic mapping unit 740. Saturation
conversion of an image may be performed in various ways. An example of
image saturation conversion is as follows: first, a saturation
component is extracted from an input image. A saturation
enhancement function used to enhance saturation of the input image
is determined according to a predetermined reference value. Then,
the extracted saturation component is changed using the saturation
enhancement function, the changed saturation component and the
remaining component of the input image are synthesized, thereby
generating an output color value. An output image is generated
based on the output color value.
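The saturation steps above (extract the saturation component, change it with an enhancement function, and resynthesize) can be sketched as follows, assuming an HSV decomposition and a simple multiplicative gain as the enhancement function; both assumptions are for illustration only.

```python
import colorsys

# Extract the saturation component, enhance it, and resynthesize the
# pixel from the changed saturation and the remaining components.
def enhance_saturation(pixel, gain):
    """pixel: (r, g, b) tuple in [0, 1]; gain: saturation multiplier."""
    h, s, v = colorsys.rgb_to_hsv(*pixel)  # extract saturation component
    s = min(1.0, s * gain)                 # enhancement function (gain)
    return colorsys.hsv_to_rgb(h, s, v)    # resynthesize with h and v
```

Here the gain would be chosen from the predetermined reference value so that the output image reaches the target saturation generated by the mapping unit 740.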
[0102] The present invention can also be embodied on computer
(including all devices having an information processing function)
readable recording media. The computer readable recording media
include all types of recording devices in which data that can be
read by a computer system are stored, such as ROMs, RAMs, CD-ROMs,
magnetic tapes, floppy discs, and optical data storage units.
[0103] As described above, in a method and apparatus for generating
user preference data regarding the color characteristic of an image
and a method and apparatus for converting image color preference
using the method and apparatus, a target value that reflects the
user's preferences is set when a color characteristic of an image is
converted, so that a converted image satisfying the user's preference
can be obtained.
[0104] In addition, since a color preference data structure generated
according to the present invention can be represented in a data
format such as XML or a binary sequence, the color preference data
structure can be used in common, to generate video that satisfies
user preference, by image display devices having a variety of
obtained color preferences, by image display software, and by service
systems and apparatuses that supply video to a user via
wired/wireless transmission.
[0105] While this invention has been particularly shown and
described with reference to preferred embodiments thereof, it will
be understood by those skilled in the art that various changes in
form and details may be made therein without departing from the
spirit and scope of the invention as defined by the appended
claims.
* * * * *