U.S. patent application number 13/886553 was filed with the patent office on 2013-05-03 and published on 2013-12-19 for an array camera imaging system and method. This patent application is currently assigned to Sony Mobile Communications AB. The applicant listed for this patent is SONY MOBILE COMMUNICATIONS AB. The invention is credited to Jonas Gustavsson, Henrik Heringslack, Daniel Linaker, and Mats Wernersson.
Application Number | 20130335598 13/886553 |
Document ID | / |
Family ID | 48190318 |
Publication Date | 2013-12-19 |
United States Patent Application | 20130335598 |
Kind Code | A1 |
Gustavsson; Jonas; et al. | December 19, 2013 |
ARRAY CAMERA IMAGING SYSTEM AND METHOD
Abstract
Aspects of the disclosed technology relate to an imaging system
and method in which an array camera is employed along with an image
processor to make use of parallax convergence differences in color
and luminance between different cameras or groups of cameras within
an array camera to provide improved super resolution
performance.
Inventors: | Gustavsson; Jonas; (Lund, SE); Heringslack; Henrik; (Lund, SE); Linaker; Daniel; (Lund, SE); Wernersson; Mats; (Lund, SE) |
Applicant: | SONY MOBILE COMMUNICATIONS AB, Lund, SE |
Assignee: | Sony Mobile Communications AB, Lund, SE |
Family ID: | 48190318 |
Appl. No.: | 13/886553 |
Filed: | May 3, 2013 |
Related U.S. Patent Documents
Application Number | Filing Date | Patent Number |
61661199 | Jun 18, 2012 | |
Current U.S. Class: | 348/234 |
Current CPC Class: | H04N 5/232939 20180801; H04N 9/04557 20180801; H04N 5/2258 20130101; H04N 13/232 20180501; G06T 3/4069 20130101; H04N 5/23232 20130101; H04N 5/2351 20130101; H04N 9/045 20130101; H04N 2013/0088 20130101; H04N 5/232933 20180801 |
Class at Publication: | 348/234 |
International Class: | H04N 5/235 20060101 H04N005/235 |
Claims
1. An imaging device comprising: an array camera having at least M×N cameras, where each of M and N is at least two, the array camera producing M×N images, wherein the array camera
includes N groups of M cameras; and an image processor operatively
coupled to the array camera, the image processor configured to: for
each group of M cameras, parallax converge the images from the
group of M cameras based on a different predefined offset point
within a predefined sampling pattern, producing N group outputs;
parallax converge the N group outputs based on the different
predefined offset points within the predefined sampling pattern to
create a color sub-pixel sampling pattern; parallax converge
luminance output of all cameras within the array camera to the
predefined sampling pattern to create a luminance sub-pixel
sampling pattern; create a super resolution image by: sampling the
color sub-pixel sampling pattern at a first increased resolution;
sampling the luminance sub-pixel sampling pattern at a second
increased resolution greater than the first increased resolution;
and combining the output from the sampled color sub-pixel sampling
pattern and the luminance sub-pixel sampling pattern.
2. The imaging device of claim 1, wherein M=N=4 and the cameras
within the array camera are monochromatic cameras.
3. The imaging device of claim 2, wherein the first increased resolution is at least 3×, and the second increased resolution is at least 4×.
4. The imaging device of claim 1, wherein the image processor is
configured to sample the color sub-pixel sampling pattern using a
base sample contribution falloff.
5. The imaging device of claim 1, wherein the image processor is
configured to sample the luminance sub-pixel sampling pattern using
a base sample contribution falloff.
6. The imaging device of claim 1, wherein the image processor is
configured to sample the color sub-pixel sampling pattern using a
spherical sample contribution falloff sampling to surrounding
samples.
7. The imaging device of claim 1, wherein the image processor is
configured to sample the luminance sub-pixel sampling pattern using
a spherical sample contribution falloff sampling to surrounding
samples.
8. The imaging device of claim 1, wherein the predefined sampling
pattern is one of a Hilbert sampling pattern or a recursive
sampling pattern.
9. The imaging device of claim 1, wherein the different predefined
offset points create vertical, horizontal and diagonal differences
between the 4 group outputs.
10. The imaging device of claim 1, wherein the predefined offset
points include at least four points per pixel that are relatively
offset vertically, horizontally and diagonally.
11. A method of generating an image, the method comprising: capturing image data using an array camera having at least M×N cameras, where each of M and N is at least two, the array camera producing M×N images, wherein the array camera
includes N groups of M cameras; for each group of M cameras,
parallax converging the images from the group of M cameras based on
a different predefined offset point within a predefined sampling
pattern, producing N group outputs; parallax converging the N group
outputs based on the different predefined offset points within the
predefined sampling pattern to create a color sub-pixel sampling
pattern; parallax converging luminance output of all cameras within
the array camera to the predefined sampling pattern to create a
luminance sub-pixel sampling pattern; creating a super resolution
image by: sampling the color sub-pixel sampling pattern at a first
increased resolution; sampling the luminance sub-pixel sampling
pattern at a second increased resolution greater than the first
increased resolution; and combining the output from the sampled
color sub-pixel sampling pattern and the luminance sub-pixel
sampling pattern.
12. The method of claim 11, wherein M=N=4 and the cameras within
the array camera are monochromatic cameras.
13. The method of claim 11, wherein the first increased resolution is at least 3×, and the second increased resolution is at least 4×.
14. The method of claim 11, wherein sampling the color sub-pixel
sampling pattern includes using one of a base sample contribution
falloff or a spherical sample contribution falloff to surrounding
samples.
15. The method of claim 11, wherein sampling the luminance
sub-pixel sampling pattern includes using one of a base sample
contribution falloff or a spherical sample contribution falloff to
surrounding samples.
16. The method of claim 11, wherein the predefined sampling pattern
is one of a Hilbert sampling pattern or a recursive sampling
pattern.
17. The method of claim 11, wherein the different predefined offset
points create vertical, horizontal and diagonal differences between
the 4 group outputs.
18. The method of claim 11, wherein the predefined offset points
include at least four points per pixel that are relatively offset
vertically, horizontally and diagonally.
19. The method of claim 11, wherein M=N=3.
Description
TECHNICAL FIELD OF THE INVENTION
[0001] The technology of the present disclosure relates generally
to an imaging device and method, and more particularly to an array
camera imaging device and method for producing super resolution
images.
BACKGROUND
[0002] Conventional cameras typically include one or more lenses
and a photodetector element for capturing image data. Array cameras
(also known as light-field cameras or plenoptic cameras) employ an
array of cameras that capture images from slightly different
perspectives. Array cameras can be useful for capturing images when
motion is involved. One drawback of conventional array cameras is
that image resolution can be less than commercially desirable.
SUMMARY
[0003] Aspects of the disclosed technology relate to an imaging
system and method in which an array camera is employed along with
an image processor to make use of parallax convergence differences
in color and luminance between different cameras or groups of
cameras within an array camera to provide improved super resolution
performance.
[0004] One aspect of the disclosed technology relates to an imaging
device that includes an array camera having at least M×N cameras, where each of M and N is at least two, the array camera producing M×N images, wherein the array camera includes N
groups of M cameras; and an image processor operatively coupled to
the array camera, the image processor configured to: for each group
of M cameras, parallax converge the images from the group of M
cameras based on a different predefined offset point within a
predefined sampling pattern, producing N group outputs; parallax
converge the N group outputs based on the different predefined
offset points within the predefined sampling pattern to create a
color sub-pixel sampling pattern; parallax converge luminance
output of all cameras within the array camera to the predefined
sampling pattern to create a luminance sub-pixel sampling pattern;
create a super resolution image by: sampling the color sub-pixel
sampling pattern at a first increased resolution; sampling the
luminance sub-pixel sampling pattern at a second increased
resolution greater than the first increased resolution; and
combining the output from the sampled color sub-pixel sampling
pattern and the luminance sub-pixel sampling pattern.
[0005] According to one feature, M=N=4 and the cameras within the
array camera are monochromatic cameras.
[0006] According to one feature, the first increased resolution is at least 3×, and the second increased resolution is at least 4×.
[0007] According to one feature, the image processor is configured
to sample the color sub-pixel sampling pattern using a base sample
contribution falloff.
[0008] According to one feature, the image processor is configured
to sample the luminance sub-pixel sampling pattern using a base
sample contribution falloff.
[0009] According to one feature, the image processor is configured
to sample the color sub-pixel sampling pattern using a spherical
sample contribution falloff sampling to surrounding samples.
[0010] According to one feature, the image processor is configured
to sample the luminance sub-pixel sampling pattern using a
spherical sample contribution falloff sampling to surrounding
samples.
[0011] According to one feature, the predefined sampling pattern is
a Hilbert sampling pattern.
[0012] According to one feature, the predefined sampling pattern is
a recursive sampling pattern.
[0013] According to one feature, the different predefined offset
points create vertical, horizontal and diagonal differences between
the 4 group outputs.
[0014] According to one feature, the predefined offset points
include at least four points per pixel that are relatively offset
vertically, horizontally and diagonally.
[0015] According to one feature, a portable communication device
includes the imaging device as described above.
[0016] According to one feature, the portable communication device
is a mobile phone.
[0017] Another aspect of the disclosed technology relates to a
method of generating an image that includes capturing image data
using an array camera having at least M×N cameras, where each of M and N is at least two, the array camera producing M×N
images, wherein the array camera includes N groups of M cameras;
for each group of M cameras, parallax converging the images from
the group of M cameras based on a different predefined offset point
within a predefined sampling pattern, producing N group outputs;
parallax converging the N group outputs based on the different
predefined offset points within the predefined sampling pattern to
create a color sub-pixel sampling pattern; parallax converging luminance output of all cameras within the array camera to the predefined sampling pattern to create a luminance sub-pixel
sampling pattern; creating a super resolution image by: sampling
the color sub-pixel sampling pattern at a first increased
resolution; sampling the luminance sub-pixel sampling pattern at a
second increased resolution greater than the first increased
resolution; and combining the output from the sampled color
sub-pixel sampling pattern and the luminance sub-pixel sampling
pattern.
[0018] According to one feature, M=N=4 and the cameras within the
array camera are monochromatic cameras.
[0019] According to one feature, the first increased resolution is at least 3×, and the second increased resolution is at least 4×.
[0020] According to one feature, sampling the color sub-pixel
sampling pattern includes using a base sample contribution
falloff.
[0021] According to one feature, sampling the color sub-pixel
sampling pattern includes using a spherical sample contribution
falloff to surrounding samples.
[0022] According to one feature, sampling the luminance sub-pixel
sampling pattern includes using a base sample contribution
falloff.
[0023] According to one feature, sampling the luminance sub-pixel
sampling pattern includes using a spherical sample contribution
falloff to surrounding samples.
[0024] According to one feature, the predefined sampling pattern is
a Hilbert sampling pattern.
[0025] According to one feature, the predefined sampling pattern is
a recursive sampling pattern.
[0026] According to one feature, the different predefined offset
points create vertical, horizontal and diagonal differences between
the 4 group outputs.
[0027] According to one feature, the predefined offset points
include at least four points per pixel that are relatively offset
vertically, horizontally and diagonally.
[0028] According to one feature, M=N=3.
[0029] These and further features will be apparent with reference
to the following description and attached drawings. In the
description and drawings, particular embodiments of the invention
have been disclosed in detail as being indicative of some of the
ways in which the principles of the invention may be employed, but
it is understood that the invention is not limited correspondingly
in scope. Rather, the invention includes all changes, modifications
and equivalents coming within the scope of the claims appended
hereto.
BRIEF DESCRIPTION OF THE DRAWINGS
[0030] FIG. 1 is a functional block diagram of an imaging device in
accordance with one aspect of the disclosed technology;
[0031] FIG. 2 is a diagrammatic illustration of an exemplary array
camera in accordance with one aspect of the disclosed
technology;
[0032] FIG. 3 is a diagrammatic illustration of an exemplary array
camera in accordance with one aspect of the disclosed
technology;
[0033] FIG. 4 is a flow diagram of an imaging method in accordance
with one exemplary embodiment of the disclosed technology;
[0034] FIG. 5 is a flow diagram of an imaging method in accordance
with one exemplary embodiment of the disclosed technology;
[0035] FIG. 6 is a diagrammatic illustration of predefined offset
positions and an associated sampling technique in accordance with
one aspect of the disclosed technology;
[0036] FIG. 7 is a diagrammatic illustration of a
resolution-enhanced sampling technique in accordance with one
aspect of the disclosed technology;
[0037] FIG. 8 is a diagrammatic illustration of predefined offset
positions and an associated sampling technique in accordance with
one aspect of the disclosed technology;
[0038] FIG. 9 is a diagrammatic illustration of a
resolution-enhanced sampling technique in accordance with one
aspect of the disclosed technology;
[0039] FIG. 10 is an exemplary optical source in the form of a
scene to be captured by an array camera;
[0040] FIG. 11 is an exemplary low resolution pixel output
corresponding to the scene of FIG. 10;
[0041] FIG. 12 is an exemplary color sub-pixel sampling pattern
corresponding to FIG. 10 and FIG. 11;
[0042] FIG. 13 is an exemplary color sub-pixel sampling pattern
corresponding to FIG. 10 and FIG. 11;
[0043] FIGS. 14 and 15 are respectively a front view and a rear
view of an exemplary electronic device that includes an exemplary
array camera; and
[0044] FIG. 16 is a schematic block diagram of the electronic device of FIGS. 14 and 15 as part of a communications system in which the electronic device may operate.
DETAILED DESCRIPTION OF EMBODIMENTS
[0045] Embodiments will now be described with reference to the
drawings, wherein like reference numerals are used to refer to like
elements throughout. It will be understood that the figures are not
necessarily to scale. Features that are described and/or
illustrated with respect to one embodiment may be used in the same
way or in a similar way in one or more other embodiments and/or in
combination with or instead of the features of the other
embodiments.
[0046] Described more fully below in conjunction with the appended
figures are various embodiments of an imaging system and method in
which an array camera is employed along with an image processor to
make use of parallax convergence differences between different
cameras or groups of cameras within an array camera to provide
improved super resolution performance. The present disclosure uses
color and luminance difference information from respective images
from cameras within an array camera to create additional image
pixels for resolution enhancement based on color and luminance. In
accordance with one exemplary embodiment in which the array camera includes four groups of monochromatic cameras, the use of parallax convergence differences between different cameras or groups of cameras drives a super resolution implementation with at least a three times (3×) resolution improvement for color and at least a four times (4×) resolution improvement for luminance (also referred to as detail information).
[0047] It will be appreciated that the disclosed imaging device and
imaging method technology may be applied to other operational
contexts such as, but not limited to, a dedicated camera or another
type of electronic device that has a camera. Examples of these
other devices include, but are not limited to, a video camera, a
digital photo viewer (sometimes referred to as a digital picture
frame), a navigation device (commonly referred to as a "GPS" or
"GPS device"), a personal digital assistant (PDA), a media player
(e.g., an MP3 player), a gaming device, a "web" camera, a computer
(including a laptop, an "ultra-mobile PC" or other type of
computer), and an accessory for another electronic device.
[0048] Turning now to FIG. 1, a functional block diagram of an
imaging device 10 is provided. In accordance with one exemplary
embodiment, the imaging device 10 can include, among other
components, a suitable array camera 12, an image processor 14
operatively coupled to the camera array and a controller 16
operatively coupled to the array camera 12 and the image
processor.
[0049] It will be appreciated that the array camera 12 can take on
any suitable geometry or configuration. For example, in accordance
with one embodiment, the array camera can be configured to include an array of M×N cameras (designated generally as 18), where M and N are at least two. For example, as shown in FIG. 2 and FIG. 3, the array camera can be configured as an array of N×N cameras 18, where each camera 18 within the array is a monochromatic camera, and the monochromatic cameras are arranged in groups. For example, as shown in FIGS. 2 and 3, the array camera can be configured as an array of 4×4 monochromatic cameras arranged in four groups. The cameras 18 may be arranged in a Bayer pattern (see FIG. 3, for example) or in any other suitable geometry or configuration for producing color images.
[0050] The array camera 12 can be configured to include a number of single cameras (e.g., including individual lenses and individual photosensors) or as an array of lenses or microlenses together with a larger photosensor. The individual cameras within the array camera can be arranged in any suitable configuration without departing from the scope of the disclosed technology. As discussed above, in one embodiment, the cameras are arranged into a grid format. In other embodiments, the cameras can be arranged in a non-grid format. For example, the cameras can be arranged in a linear pattern, a circular pattern or any other suitable pattern, including sub-pixel offsets. It also will be appreciated that the array camera could include as few as two camera groups (although super resolution performance would be less than that of an array camera having at least four camera groups).
[0051] In accordance with one embodiment, the array camera can be
fabricated on a semiconductor chip to include a plurality of
photosensor elements. Each of the cameras can include a plurality
of pixels (e.g., 0.32 megapixels). The array camera can include
two or more types of heterogeneous cameras, with each camera
including two or more sensor elements or pixels. It will be
appreciated that each one of the cameras can have different imaging
characteristics. Alternatively, there may be two or more different
types of cameras where the same type of imager shares the same
imaging characteristics.
[0052] In accordance with one embodiment, each camera can include
its own filter and/or optical element (e.g., lens). Each of the
cameras or group of cameras can be associated with spectral color
filters to receive certain wavelengths of light. Example filters
include a traditional filter used in the Bayer pattern (R, G, B or
their complements C, M, Y), an IR-cut filter, a near-IR filter, a
polarizing filter, and a custom filter to suit the needs of
hyper-spectral imaging. Some cameras can have no filter to allow
reception of both the entire visible spectrum and near-IR, which increases the camera's signal-to-noise ratio. The number of distinct
filters may be as large as the number of cameras in the camera
array. Further, each of the cameras or group of cameras can receive
light through lenses having different optical characteristics
(e.g., focal lengths) or apertures of different sizes.
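As a concrete sketch of one filter layout consistent with the Bayer-pattern grouping mentioned above, consider a 4×4 array built from four 2×2 groups, where every camera in a group shares one filter. The grouping geometry and color assignment here are illustrative assumptions, not values specified by the application:

```python
# One plausible filter layout for a 4x4 array built from four 2x2
# groups of monochromatic cameras, with the groups themselves arranged
# in a Bayer pattern (R G / G B). Purely illustrative.
GROUP_COLOR = [["R", "G"],
               ["G", "B"]]

def filter_at(row, col):
    """Color filter of the camera at (row, col) in the 4x4 array:
    all four cameras in a 2x2 group share one filter."""
    return GROUP_COLOR[row // 2][col // 2]

layout = [[filter_at(r, c) for c in range(4)] for r in range(4)]
```

The same lookup generalizes to any group size by changing the divisor; the point is only that a whole group, not an individual camera, carries one spectral filter.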
[0053] It will be appreciated that a photosensor, a sensor element
or pixel refers to an individual light sensing element in a camera.
The light sensing element can be, but is not limited to,
traditional CIS (CMOS Image Sensor), CCD (charge-coupled device),
high dynamic range pixel, multispectral pixel and various
alternatives thereof. In addition, a photosensor or simply a sensor
can refer to a two-dimensional array of pixels used to capture an
image formed on the sensor by the optics of the camera. The sensor
elements of each sensor have similar physical properties and
receive light through the same optical component. Further, the
sensor elements in each sensor may be associated with the same
color filter.
[0054] Image data is captured by each camera within the array. The
image data is processed by the processor 14 as set forth in more
detail below to provide super resolution enhancement both with
respect to color and luminance (detail information). In accordance with one exemplary embodiment, where the array camera includes at least four groups of monochromatic cameras, the processor 14 processes the image data from the various cameras to provide an output image with at least a three times (3×) resolution increase with respect to color and at least a four times (4×) resolution increase with respect to luminance. It will be
appreciated that other super resolution performance may be achieved
depending on the number of cameras within the array camera.
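The resolution relationship in this paragraph (color at 3×, luminance at a greater 4×) can be sketched dimensionally. The nearest-neighbour upscale below is only a stand-in for the actual sub-pixel sampling step, which the application defines via parallax convergence; the 2×2 source size is an arbitrary illustration:

```python
def upscale(img, factor):
    """Nearest-neighbour upscale of a 2-D list of values; a stand-in
    for sampling a sub-pixel pattern at an increased resolution."""
    return [[img[r // factor][c // factor]
             for c in range(len(img[0]) * factor)]
            for r in range(len(img) * factor)]

# From a 2x2 source, color sampled at 3x yields a 6x6 grid and
# luminance sampled at 4x yields an 8x8 grid; the combine step then
# merges the two outputs into the super resolution image.
color = upscale([[1, 2], [3, 4]], 3)
luma = upscale([[10, 20], [30, 40]], 4)
```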
[0055] It will be appreciated that the array camera 12 can include
other related circuitry. The other circuitry may include, among
others, circuitry to control imaging parameters and sensors to
sense physical parameters. The control circuitry may control
imaging parameters such as exposure times, gain, and black level
offset. The sensor may include dark pixels to estimate dark current
at the operating temperature. The dark current may be measured for
on-the-fly compensation for any thermal creep that the substrate
may suffer from. Alternatively, compensation of thermal effects
associated with the optics, e.g., because of changes in refractive
index of the lens material, may be accomplished by calibrating the
PSF (point spread function) for different temperatures.
[0056] In accordance with one embodiment, the controller 16 (e.g.,
a circuit for controlling imaging parameters) may trigger each
camera independently or in a synchronized manner. The start of the
exposure periods for the various cameras in the array camera
(analogous to opening a shutter) may be staggered in an overlapping
manner so that the scenes are sampled sequentially while having
several cameras being exposed to light at the same time. In a
conventional video camera sampling a scene at X exposures per
second, the exposure time per sample is limited to 1/X seconds.
With a plurality of cameras, there is no such limit to the exposure
time per sample because multiple cameras may be operated to capture
images in a staggered manner.
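The staggered-exposure arithmetic above can be sketched as follows; the camera count and sampling rate are hypothetical values chosen for illustration:

```python
def staggered_starts(num_cameras, samples_per_sec, num_samples):
    """Assign scene samples to cameras round-robin. Successive samples
    start 1/X seconds apart, so each camera has num_cameras / X seconds
    between its own consecutive exposures - which is why exposure time
    is no longer capped at 1/X as it is for a single camera."""
    interval = 1.0 / samples_per_sec
    return [(i % num_cameras, i * interval) for i in range(num_samples)]

# 4 cameras sampling a scene at 30 samples/s: camera 0 shoots samples
# 0, 4, 8, ..., leaving it 4/30 s per exposure instead of 1/30 s.
schedule = staggered_starts(4, 30.0, 8)
```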
[0057] Each camera can be operated independently. Entire or most
operations associated with each individual camera can be
individualized. In one embodiment, a master setting is programmed
and deviation (i.e., offset or gain) from such master setting is
configured for each camera. The deviations may reflect functions
such as high dynamic range, gain settings, integration time
settings, digital processing settings or combinations thereof.
These deviations can be specified at a low level (e.g., deviation
in the gain) or at a higher level (e.g., difference in the ISO
number, which is then automatically translated to deltas for gain,
integration time, or otherwise as specified by context/master
control registers) for the particular array camera. By setting the
master values and deviations from the master values, higher levels
of control abstraction can be achieved to facilitate a simpler
programming model for many operations. In one embodiment, the
parameters for the cameras are arbitrarily fixed for a target
application. In another embodiment, the parameters are configured
to allow a high degree of flexibility and programmability.
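The master-setting-plus-deviation control model in this paragraph can be sketched as follows; the field names and delta values are illustrative assumptions, not taken from the application:

```python
# Master imaging settings plus per-camera deviations (offsets), as in
# the master/deviation control model described above. All field names
# and numbers are illustrative.
MASTER = {"gain_db": 6.0, "integration_ms": 10.0}

DEVIATIONS = {
    0: {},                                   # camera 0 follows the master
    1: {"gain_db": -3.0},                    # darker exposure, e.g. for HDR
    2: {"integration_ms": +10.0},            # longer integration time
    3: {"gain_db": +3.0, "integration_ms": -5.0},
}

def effective_settings(camera_id):
    """Resolve a camera's settings as master value plus its deviation,
    giving the higher level of control abstraction described above."""
    settings = dict(MASTER)
    for key, delta in DEVIATIONS.get(camera_id, {}).items():
        settings[key] += delta
    return settings
```

Changing one master value retargets every camera at once, while the per-camera deltas preserve the bracketing scheme, which is the simpler programming model the paragraph describes.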
[0058] In accordance with one embodiment, the array camera can be
designed as a drop-in replacement for existing camera image sensors
used in cell phones and other mobile devices. For this purpose, the
array camera can be designed to be physically compatible with
conventional image sensors of approximately the same resolution,
although the achieved resolution of the array camera can exceed
conventional image sensors in many photographic situations. Taking
advantage of the increased performance, the array camera, in
accordance with embodiments of the disclosed technology, can
include fewer pixels to obtain equal or better quality images
compared to conventional image sensors. Alternatively, the size of
the pixels in the imager may be reduced compared to pixels in
conventional image sensors while achieving comparable results.
[0059] The controller 16 can be configured or otherwise implemented
as hardware, software, firmware or a combination thereof for
controlling various operation parameters of the camera array 12 and
the image processor 14. The controller 16 can receive inputs from a
user or other external components and send operation signals to
control the camera array 12. The controller 16 may also send
information to the image processor 14 to assist in processing the
images.
[0060] Likewise, the image processor 14 can include suitable
hardware, firmware, software or a combination for processing the
images received from the array camera 12. The image processor 14
processes multiple images from the cameras or groups of cameras,
for example, as described below in detail with reference to FIG. 4
and FIG. 5. The super resolution processed image then can be sent
for display, storage, transmittal or further processing.
[0061] As is discussed more fully below, the processor is
configured to converge image data from different cameras onto
different sample points that are relatively offset vertically,
horizontally and/or diagonally. This convergence based on offset
sample points can be performed with respect to color image data and
luminance image data. After the sample points are redistributed
onto a new image, it is possible to achieve an image that is higher
in resolution than the original image (a so-called super resolution
image).
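A toy sketch of the redistribution step follows; the nearest-bin placement, grid sizes, and sample offsets are illustrative assumptions, since the application does not prescribe an algorithm at this level:

```python
def redistribute(samples, scale, width, height):
    """Place converged samples, given as (x, y, value) with sub-pixel
    coordinates in source-pixel units, onto a scale-times-larger output
    grid, averaging samples that land in the same cell. A toy
    nearest-bin version of the redistribution described above."""
    w, h = width * scale, height * scale
    acc = [[0.0] * w for _ in range(h)]
    cnt = [[0] * w for _ in range(h)]
    for x, y, value in samples:
        col = min(int(x * scale), w - 1)
        row = min(int(y * scale), h - 1)
        acc[row][col] += value
        cnt[row][col] += 1
    return [[acc[r][c] / cnt[r][c] if cnt[r][c] else 0.0
             for c in range(w)] for r in range(h)]

# Four samples converged at offset points inside source pixel (0, 0)
# land in four distinct cells of the 2x-resolution output grid.
hi = redistribute([(0.25, 0.25, 1.0), (0.75, 0.25, 2.0),
                   (0.25, 0.75, 3.0), (0.75, 0.75, 4.0)],
                  scale=2, width=2, height=2)
```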
[0062] Turning now to FIG. 4 and FIG. 5, exemplary imaging methods
are illustrated. FIG. 4 illustrates a simplified exemplary imaging
method based on a number of images captured using an array camera
including groups of monochromatic cameras (e.g., an array camera
including a 4×4 array of monochromatic cameras arranged in a Bayer pattern). FIG. 5 illustrates a more detailed exemplary imaging method also based on a number of images captured using an array camera including groups of monochromatic cameras (e.g., an array camera including a 4×4 array of monochromatic cameras
arranged in a Bayer pattern). It will be appreciated that while
FIG. 4 and FIG. 5 include steps 100 and 150 for capturing image
data with an array camera, the below-described imaging methods can
be performed by simply receiving image data corresponding to array
camera image data (e.g., previously captured array camera image
data) without departing from the scope of the disclosed
technology.
[0063] Variations to the illustrated methods are possible and,
therefore, the illustrated embodiments should not be considered the
only manner of carrying out the techniques that are disclosed in
this document. Also, while FIG. 4 and FIG. 5 show a specific order
of executing functional logic blocks, the order of executing the
blocks may be changed relative to the order shown and/or may be
implemented in an object-oriented manner or a state-oriented
manner. In addition, two or more blocks shown in succession may be
executed concurrently or with partial concurrence. Certain blocks
also may be omitted. The exemplary method may be carried out by
executing code stored by the electronic device, for example. The
code may be embodied as a set of logical instructions that may be
executed by a processor. Therefore, the methods may be embodied as
software in the form of a computer program that is stored on a
computer readable medium, such as a memory.
[0064] With reference to FIG. 4, at step 100, image data is
captured using the array camera. As is discussed more fully above,
the array camera can be configured in numerous ways without
departing from the scope of the disclosed technology. For example,
the array camera can be configured to include four RGB cameras or four
groups of four monochromatic cameras in a Bayer configuration. At
step 110, the image data from the array camera is parallax
converged based on predefined sampling offset points to create a
color sub-pixel sampling pattern. In accordance with one
embodiment, the predefined sampling offset points are selected to
include at least four points per pixel that are relatively offset
vertically, horizontally and diagonally. It will be appreciated
that defining offset sampling points (e.g., sampling points offset
vertically, horizontally and diagonally relative to one another),
provides a situation where the images are converged, but not
completely converged (only converged at designated sample
points).
[0065] FIG. 6 is a diagrammatic representation of a pixel 30 (e.g.,
a pixel of image data captured with one or more cameras within an
array camera) together with predefined offset sample points A, B, C
and D within the pixel 30. The exemplary sampling pattern provides
parallax converging in an indirect manner to capture vertical,
horizontal and diagonal details (difference details). Each of
offset sample points A, B, C and D within pixel 30 is offset
vertically, horizontally and/or diagonally with respect to the
center of the pixel and with respect to each other. The exemplary
predefined offset sampling pattern shown in FIG. 6 can be thought
of as a slightly shifted cross pattern that provides vertical,
horizontal and diagonal differences between the four images
captured by the array camera. In accordance with one embodiment,
the sampling points A, B, C and D are defined at offset positions
as far apart as possible, while providing a repeatable sampling
pattern throughout the image data. FIG. 8 shows a different
exemplary sampling pattern for a pixel 30 in which the predefined
offset points for parallax converging are point 6, point 3, point 5
and point 2. As with the sampling pattern shown in FIG. 6, the
sampling points 6, 3, 5 and 2 in FIG. 8 are defined at offset
positions as far apart as possible, while providing a repeatable
sampling pattern throughout the image data.
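The offset-point layout described above can be sketched as follows. The point names, coordinates and helper function are illustrative assumptions made for this sketch (the actual positions are fixed by the figures); the sketch only shows four points that are mutually offset vertically, horizontally and diagonally, and a pattern that repeats throughout the image data.

```python
# Illustrative (dx, dy) offsets, in pixel units relative to the pixel
# centre, forming a slightly shifted cross. The exact values in FIG. 6
# are not reproduced here.
OFFSET_POINTS = {
    "A": (-0.125, -0.25),   # above centre, nudged left
    "B": (0.25, -0.125),    # right of centre, nudged up
    "C": (0.125, 0.25),     # below centre, nudged right
    "D": (-0.25, 0.125),    # left of centre, nudged down
}

def sample_positions(row, col, offsets=OFFSET_POINTS):
    """Absolute sub-pixel sample coordinates for pixel (row, col).

    Because the same offsets are reused for every pixel, the sampling
    pattern repeats throughout the image data."""
    cx, cy = col + 0.5, row + 0.5  # pixel centre
    return {name: (cx + dx, cy + dy)
            for name, (dx, dy) in offsets.items()}
```

Reusing one offset table for every pixel is what makes the pattern "repeatable": neighbouring pixels sample at the same relative positions, shifted by exactly one pixel.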
[0066] Stated differently, the incomplete parallax convergence
takes one pixel from each of the four images and uses the
difference between the different respective image pixels (e.g.,
pixel 1 on each of the four images) to converge on the different
sample points to allow re-creation of information within the image.
Instead of converging at the center of a given pixel for each
camera, the convergence is offset to a predefined location
desirable for sampling the image data from the specific camera to
allow re-creation of information within the image.
[0067] While this embodiment has been described with respect to
four predefined offset sample points, it will be appreciated that
the disclosed technology is not limited to this number of
predefined offset sampling points. For example, three, four, five
or more sampling points per pixel could be defined without
departing from the scope of the disclosed technology. Regardless of
the number of predefined offset sampling points, the sampling
points should be chosen to be spaced evenly over the given pixel
and the adjacent pixels.
[0068] It will be appreciated that the parallax converging
discussed above can be achieved using a suitable parallax
correction algorithm (except the algorithm is based on the
predefined offset points). It also will be appreciated that the
disclosed sampling methodology allows for sampling image
information outside the current pixel of interest (e.g., outside
pixel 30 in FIG. 6).
[0069] At step 120, the monochromatic image data from the array
camera is parallax converged based on predefined sampling offset
points to create a luminance sub-pixel sampling pattern. In
accordance with one embodiment, the predefined sampling offset
points are selected to include a number of points per pixel equal
to the total number of camera images, where the sampling offset
points are relatively offset vertically, horizontally and
diagonally. It will be appreciated that defining offset sampling
points (e.g., sampling points offset vertically, horizontally and
diagonally relative to one another) provides a situation where the
image data is converged, but not completely converged (only
converged at designated sample points). FIG. 8 shows an example in
which nine monochromatic cameras are included in the array camera,
so that there are nine predefined offset sampling points for
luminance within pixel 30.
[0070] As with the sampling pattern shown in FIG. 6, the sampling
points 1-9 in FIG. 8 are defined at offset positions as far apart
as possible, while providing a repeatable sampling pattern
throughout the image data. It will be appreciated that this
methodology for determining offset sampling points will be employed
regardless of the number of sampling points for luminance values
within a given pixel.
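A pattern with one sample point per camera image, spread as evenly (and as far apart) as possible while remaining repeatable, might be generated as below. The even square grid is an assumption made for this sketch, not the exact layout of FIG. 8.

```python
def luminance_offsets(n_cameras):
    """One offset sample point per camera image, spaced evenly on a
    square grid inside the pixel (this illustration assumes the
    camera count is a perfect square, e.g. nine cameras gives a
    3x3 grid of points)."""
    side = int(round(n_cameras ** 0.5))
    if side * side != n_cameras:
        raise ValueError("illustration assumes a square camera array")
    step = 1.0 / side             # spacing between neighbouring points
    start = -0.5 + step / 2.0     # keeps the grid centred in the pixel
    return [(start + ix * step, start + iy * step)
            for iy in range(side) for ix in range(side)]
```

For nine cameras this yields points at -1/3, 0 and +1/3 in each direction, which is the largest uniform spacing that still tiles identically from pixel to pixel.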
[0071] Stated differently, the incomplete parallax convergence
takes one pixel from each of the images (in the FIG. 8 example,
nine images) and uses the difference between the different
respective image pixels (e.g., pixel 1 on each of the images) to
converge on the different sample points to allow re-creation of
information within the image with respect to luminance (detail
information). Instead of converging at the center of a given pixel
for each camera, the convergence is offset to a predefined location
desirable for sampling the luminance image data from the specific
camera to allow re-creation of information within the image.
[0072] While this embodiment has been described in FIG. 8 with
respect to nine predefined offset sample points, it will be
appreciated that the disclosed technology is not limited to this
number of predefined offset sampling points. For example, sixteen
or more luminance sampling points per pixel could be defined
without departing from the scope of the disclosed technology.
Regardless of the number of predefined offset sampling points, the
sampling points should be chosen to be spaced evenly over the given
pixel and the adjacent pixels.
[0073] It will be appreciated that the parallax converging
discussed above can be achieved using a suitable parallax
correction algorithm (except the algorithm is based on the
predefined offset points). It also will be appreciated that the
disclosed sampling methodology allows for sampling image
information outside the current pixel of interest (e.g., outside
pixel 30 in FIG. 6 or FIG. 8).
[0074] At step 130, the color sub-pixel sampling pattern created at
step 110 and the luminance sub-pixel sampling pattern created at
step 120 are sampled according to the predefined sampling pattern
at an increased resolution (providing a super resolution image). In
accordance with one exemplary embodiment in which four images from
four cameras are converged, the resolution-increased sampling is in
the form of a three-times (3×) super resolution increase for
color and at least a four-times (4×) super resolution
increase for luminance.
[0075] FIG. 6 shows an exemplary sample distribution in which the
single pixel 30 and the sampling points A, B, C, D are depicted. In
accordance with one embodiment, for every pixel of interest in the
original image data, the color sub-pixel sampling pattern is
sampled at nine pixels for every original one pixel of interest
using a spherical falloff (e.g., linear falloff) from each
predefined offset position. The circles in FIG. 6 are
two-dimensional representations of the spherical falloff sampling,
showing the contribution value to the output pixel on the
resolution enhanced (e.g., 3× resolution enhanced) output
image. This contribution falloff is linear from one sample point to
the next. For example, with respect to sampling point D, the
contribution from sampling point D falls off linearly from
approximately 100% to approximately 0% at sampling point C.
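The linear contribution falloff just described can be written as a small weighting function. The parameter name `spacing` (the distance at which the contribution reaches zero, i.e. the distance to the neighbouring sample point) is introduced here only for this sketch.

```python
def falloff_weight(distance, spacing):
    """Linear contribution falloff: approximately 100% at the sample
    point itself, dropping linearly to 0% one `spacing` away (i.e.
    at the neighbouring sample point), and 0% beyond that."""
    return max(0.0, 1.0 - distance / spacing)
```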
[0076] In accordance with one exemplary embodiment, the predefined
sampling pattern is a Hilbert pattern. It will be appreciated that
other sampling patterns can be employed without departing from the
scope of the disclosed technology.
[0077] FIG. 7 shows the single pixel 30 depicted in FIG. 6 now
sampled at nine pixels, providing a three-times (3×)
resolution increase. The arrows in FIG. 7 also show sample
contributions to the individual super resolution pixels (the nine
pixels within original pixel 30). It will be appreciated that the
linear falloff means that samples closer to a destination pixel
will contribute more to the individual pixel in the new
resolution-enhanced sampling pattern.
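A minimal sketch of this 3× resampling step is given below. It assumes the samples are supplied as offset-to-value pairs and that the falloff weights are normalised per output sub-pixel; the normalisation is an implementation choice not spelled out above.

```python
import math

def resample_3x(samples, spacing=0.5):
    """Resample one original pixel onto a 3x3 grid of output pixels.

    `samples` maps (dx, dy) offsets from the pixel centre to sampled
    values. Each output sub-pixel is a weighted average of the
    samples, with a linear distance falloff so that samples closer to
    a destination pixel contribute more to it."""
    out = [[0.0] * 3 for _ in range(3)]
    for oy in range(3):
        for ox in range(3):
            # Centre of this output sub-pixel, relative to the
            # original pixel centre.
            cx, cy = (ox - 1) / 3.0, (oy - 1) / 3.0
            wsum = vsum = 0.0
            for (dx, dy), value in samples.items():
                w = max(0.0, 1.0 - math.hypot(cx - dx, cy - dy) / spacing)
                wsum += w
                vsum += w * value
            out[oy][ox] = vsum / wsum if wsum else 0.0
    return out
```

With equal sample values the weighted averages reproduce that value in every sub-pixel, which is a quick sanity check that the weighting is consistent.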
[0078] FIG. 8 shows an exemplary sample distribution in which the
single pixel 30 and the sampling points 1-9 are depicted. In
accordance with one embodiment, for every pixel of interest in the
original luminance image data, the luminance sub-pixel sampling
pattern is sampled at sixteen pixels for every original one pixel
of interest using a spherical falloff (e.g., linear falloff) from
each predefined offset position. FIG. 8 shows a single pixel 30
depicted in FIG. 6 now sampled for luminance at sixteen pixels,
providing a four-times (4×) resolution increase for
luminance. It will be appreciated that the linear falloff means
that samples closer to a destination pixel will contribute more
luminance to the individual pixel in the new resolution-enhanced
sampling pattern. This luminance contribution falloff is linear
from one sample point to the next. For example, with respect to
sampling point 9, the luminance contribution from sampling point 9
falls off linearly from approximately 100% to approximately 0% at
sampling point 5.
[0079] At step 140, the output from the sampled color sub-pixel
sampling pattern and the sampled luminance sub-pixel sampling
pattern are combined to form a resolution-enhanced output image. As
discussed above, in accordance with one exemplary embodiment in
which four images from four cameras are converged, the
resolution-increased sampling is in the form of a three-times
(3×) super resolution increase for color and at least a
four-times (4×) super resolution increase for luminance.
Generation of the output image can be thought of as placing the
3×3 grid of FIG. 7 on top of the 4×4 grid of FIG. 8 or FIG. 9.
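That overlay can be pictured with the sketch below, which pairs each of the 16 luminance sub-pixels with the colour of the 3×3 cell it falls in. The nearest-cell mapping is an illustrative assumption, since the paragraph does not specify how the two grids are aligned or blended.

```python
def combine_grids(color_3x3, luma_4x4):
    """Overlay the 3x3 colour grid (cf. FIG. 7) on the 4x4 luminance
    grid (cf. FIG. 8) for one original pixel. Each of the 16 output
    pixels keeps its luminance value and borrows the (r, g, b) of
    the 3x3 colour cell its centre falls in."""
    out = []
    for y in range(4):
        row = []
        for x in range(4):
            # Map this 4x4 cell centre into 3x3 cell coordinates.
            cy = int((y + 0.5) * 3 / 4)
            cx = int((x + 0.5) * 3 / 4)
            row.append((*color_3x3[cy][cx], luma_4x4[y][x]))
        out.append(row)
    return out
```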
[0080] Turning now to FIG. 5, a more detailed exemplary imaging
method is illustrated where image data is captured using an array
camera including groups of monochromatic cameras (e.g., an array
camera including a 4×4 array of monochromatic cameras
arranged in a Bayer pattern). Aspects of the method in FIG. 5,
which correspond to elements depicted in FIG. 4, will not be
re-described here. Rather, reference will be made to the above
description for FIG. 4. At step 150 image data is captured using
the array camera. In this exemplary embodiment, the array camera
includes four groups of four monochromatic cameras in a Bayer
configuration. Of course, other array camera configurations could
be employed (e.g., a 3×3 array, a 3×4 array, a
5×5 array, etc.).
[0081] At step 160, for each camera group, image data (e.g., RGB
image data) is parallax converged based on the predefined offset
sample point within the predefined sampling pattern. In other
words, the four monochromatic images for each camera group are
parallax converged into a full color image to an offset sampling
point within the predefined offset sampling points. In the
exemplary embodiment discussed here, the 16 monochromatic images
are parallax converged into 4 color images, each based on a
different predefined offset sampling point. Step 160 of FIG. 5
corresponds substantially to step 110 of FIG. 4, so the detailed
discussion will not be repeated here. Rather, reference is made to
the above discussion of step 110 in FIG. 4. Similarly, step 170 of
FIG. 5 corresponds substantially to step 120 of FIG. 4, so the
detailed discussion will not be repeated here. Rather, reference is
made to the above discussion of step 120 in FIG. 4.
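The per-group convergence of step 160 might be sketched as below. Real parallax convergence shifts each image toward its group's offset sample point; that shift is abstracted away here, and the channel stacking (with the two green images of a Bayer group averaged) is an illustrative assumption rather than the method fixed by the claims.

```python
def converge_groups(mono_images):
    """Fold 16 monochromatic images (four Bayer groups, each ordered
    R, G, G, B) into four full-colour images, one per predefined
    offset sampling point. Images are equal-sized 2-D lists; the
    per-group parallax shift is omitted for brevity."""
    if len(mono_images) != 16:
        raise ValueError("expects 4 groups of 4 cameras")
    color_images = []
    for grp in range(4):
        r, g1, g2, b = mono_images[4 * grp: 4 * grp + 4]
        # Stack channels per pixel, averaging the two green images.
        rgb = [[(r[y][x], (g1[y][x] + g2[y][x]) / 2.0, b[y][x])
                for x in range(len(r[0]))]
               for y in range(len(r))]
        color_images.append(rgb)
    return color_images
```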
[0082] At step 180, the color sub-pixel sampling pattern is sampled
at a first increased resolution. For example, as discussed above
with respect to FIG. 4, for every pixel of interest in the original
image data, the color sub-pixel sampling pattern is sampled at nine
pixels for every original one pixel of interest using a spherical
falloff (e.g., linear falloff) from each predefined offset
position. The circles in FIG. 6 are two-dimensional representations
of the spherical falloff sampling, showing the contribution value
to the output pixel on the resolution enhanced (e.g., 3×
resolution enhanced) output image. This contribution falloff is
linear from one sample point to the next. For example, with respect
to sampling point D, the contribution from sampling point D falls
off linearly from approximately 100% to approximately 0% at
sampling point C. FIG. 7 shows the single pixel 30 depicted in FIG.
6 now sampled at nine pixels, providing a three-times (3×)
resolution increase. The arrows in FIG. 7 also show sample
contributions to the individual super resolution pixels (the nine
pixels within original pixel 30). It will be appreciated that the
linear falloff means that samples closer to a destination pixel
will contribute more to the individual pixel in the new
resolution-enhanced sampling pattern.
[0083] At step 190, the luminance sub-pixel sampling pattern is
sampled at a second increased resolution greater than the first
increased resolution (e.g., a 4× resolution increase). For
example, as described above, FIG. 8 shows an exemplary sample
distribution in which the single pixel 30 and the sampling points
1-9 are depicted. In accordance with one embodiment, for every
pixel of interest in the original luminance image data, the
luminance sub-pixel sampling pattern is sampled at sixteen pixels
for every original one pixel of interest using a spherical falloff
(e.g., linear falloff) from each predefined offset position. FIG. 8
shows a single pixel 30 depicted in FIG. 6 now sampled for
luminance at sixteen pixels, providing a four-times (4×)
resolution increase for luminance. It will be appreciated that the
linear falloff means that samples closer to a destination pixel
will contribute more luminance to the individual pixel in the new
resolution-enhanced sampling pattern. This luminance contribution
falloff is linear from one sample point to the next. For example,
with respect to sampling point 9, the luminance contribution from
sampling point 9 falls off linearly from approximately 100% to
approximately 0% at sampling point 5.
[0084] Step 200 of FIG. 5 corresponds substantially to step 140 of
FIG. 4, so the detailed discussion will not be repeated here.
[0085] As is discussed above, the imaging method provides for
converging image data from different cameras onto different sample
points that are relatively offset vertically, horizontally and
diagonally. After the sample points are redistributed onto a new
image, it is possible to achieve an image that is higher in
resolution than the original image (a so-called super resolution
image). In accordance with one exemplary embodiment, the array
camera can be configured as a 2.5 megapixel camera with a 4×4
array, which generates a super resolution image of approximately 12
megapixels.
[0086] Turning now to FIGS. 10-13, exemplary image data
illustrating the effect of the above-described exemplary imaging
methods is provided. FIG. 10 shows an exemplary optical source
(e.g., a scene to be captured by the array camera). FIG. 11 shows
exemplary pixel output from one low-resolution camera or camera
group. FIG. 12 shows an exemplary color sub-pixel sampling pattern
(e.g., step 110 of FIG. 4 and step 160 of FIG. 5). FIG. 13 shows an
exemplary luminance sub-pixel sampling pattern generated by the
imaging method described above (e.g., step 120 of FIG. 4 and step
170 of FIG. 5). The color sub-pixel sampling pattern of FIG. 12 is
sampled as described above (at step 130 of FIG. 4 or step 180 of
FIG. 5), while the luminance sub-pixel sampling pattern of FIG. 13
is sampled as described above (at step 130 of FIG. 4 or step 190 of
FIG. 5).
[0087] Turning now to FIGS. 14-16, it will be appreciated that the
imaging device 10 may be embodied in a standalone camera, in a
portable communication device, such as a mobile telephone, or in
any other suitable electronic device.
[0088] Referring initially to FIGS. 14 and 15, an electronic device
40 is shown. The illustrated electronic device 40 is a mobile
telephone. The electronic device 40 includes an imaging device 10
in the form of an array camera 12 for taking digital still pictures
and/or digital video clips. It is emphasized that the electronic
device 40 need not be a mobile telephone, but could be a dedicated
camera or some other device as indicated above.
[0089] With additional reference to FIG. 16, the imaging device 10
can include imaging optics 42 to focus light from a scene within
the field of view of the imaging device 10. As is discussed above,
the imaging device can include an array camera 12 operatively
coupled to a processor 14 and a controller 16.
[0090] The imaging optics 42 may include a lens assembly and any
other components that supplement the lens assembly, such as a
protective window, a filter, a prism, and/or a mirror. To adjust
the focus of the array camera 12, a focusing assembly that includes
focusing mechanics and/or focusing control electronics may be
present in conjunction with the imaging optics 42. A zooming
assembly also may be present to optically change the magnification
of captured images.
[0091] Other array camera 12 components may include a distance
meter (also referred to as a rangefinder), a supplemental
illumination source (e.g., a flash 44), a light meter 46, a display
48 for functioning as an electronic viewfinder, an optical
viewfinder (not shown), and any other components commonly
associated with cameras.
[0092] A user input 50 may be present for accepting user inputs.
The user input 50 may take one or more forms, such as a touch input
associated with the display 48, a keypad, buttons, and so forth.
One user input function may be a shutter key function that allows
the user to command the taking of a photograph. In one embodiment,
the display 48 has a relatively fast refresh rate. Most
commercially available organic light emitting diode (OLED) displays
have a satisfactory refresh rate for purposes of the disclosed
techniques for displaying a photo.
[0093] Another component of the array camera 12 may be an
electronic controller 16 that controls operation of the array
camera 12. As discussed above, the controller 16 may be embodied,
for example, as a processor that executes logical instructions that
are stored by an associated memory, as firmware, as an arrangement
of dedicated circuit components, or as a combination of these
embodiments. Thus, processes for operating the array camera 12 may
be physically embodied as executable code (e.g., software) that is
stored on a computer readable medium (e.g., a memory), or may be
physically embodied as part of an electrical circuit. In another
embodiment, the functions of the controller 16 may be carried out
by a control circuit 50 that is responsible for overall operation
of the electronic device 40. In this case, the controller 16 may be
omitted. In another embodiment, array camera 12 control functions
may be distributed between the controller 16 and the control
circuit 50.
[0094] In addition to the array camera 12, the electronic device 40
may include a rearward facing camera 52. The rearward facing camera
52 may be arranged so as to be directed toward and capable of
capturing images or video of the user of the electronic device 40
when the user views the display 48. In one embodiment, the rearward
camera 52 may be used in connection with video telephony to capture
images or video of the user for transmission to a called or calling
device.
[0095] Additional features of the electronic device 40, when
implemented as a mobile telephone, will be described with
additional reference to FIG. 16. The display 48 displays graphical
user interfaces, information and content (e.g., images, video and
other graphics) to a user to enable the user to utilize the various
features of the electronic device 40. The display 48 may be coupled
to the control circuit 50 by a video processing circuit 54 that
converts video data to a video signal used to drive the display 48.
The video processing circuit 54 may include any appropriate
buffers, decoders, video data processors and so forth.
[0096] The electronic device 40 includes communications circuitry
that enables the electronic device 40 to establish communication
with another device. Communications may include voice calls, video
calls, data transfers, and the like. Communications may occur over
a cellular circuit-switched network or over a packet-switched
network (e.g., a network compatible with IEEE 802.11, which is
commonly referred to as WiFi, or a network compatible with IEEE
802.16, which is commonly referred to as WiMAX). Data transfers may
include, but are not limited to, receiving streaming content,
receiving data feeds, downloading and/or uploading data (including
Internet content), receiving or sending messages (e.g., text
messages, instant messages, electronic mail messages, multimedia
messages), and so forth. This data may be processed by the
electronic device 40, including storing the data in the memory 56,
executing applications to allow user interaction with the data,
displaying video and/or image content associated with the data,
outputting audio sounds associated with the data, and so forth.
[0097] In the exemplary embodiment, the communications circuitry
may include an antenna 58 coupled to a radio circuit 60. The radio
circuit 60 includes a radio frequency transmitter and receiver for
transmitting and receiving signals via the antenna 58. The radio
circuit 60 may be configured to operate in a mobile communications
system 62. Radio circuit 60 types for interaction with a mobile
radio network and/or broadcasting network include, but are not
limited to, global system for mobile communications (GSM), code
division multiple access (CDMA), wideband CDMA (WCDMA), general
packet radio service (GPRS), WiFi, WiMAX, integrated services
digital broadcasting (ISDB), high speed packet access (HSPA), etc.,
as well as advanced versions of these standards or any other
appropriate standard. It will be appreciated that the electronic
device 40 may be capable of communicating using more than one
standard. Therefore, the antenna 58 and the radio circuit 60 may
represent one or more than one radio transceiver.
[0098] The system 62 may include a communications network 64 having
a server 66 (or servers) for managing calls placed by and destined
to the electronic device 40, transmitting data to and receiving
data from the electronic device 40, and carrying out any other
support functions. The server 66 communicates with the electronic
device 40 via a transmission medium. The transmission medium may be
any appropriate device or assembly, including, for example, a
communications base station (e.g., a cellular service tower, or
"cell" tower), a wireless access point, a satellite, etc. The
network 64 may support the communications activity of multiple
electronic devices 40 and other types of end user devices. As will
be appreciated, the server 66 may be configured as a typical
computer system used to carry out server functions and may include
a processor configured to execute software containing logical
instructions that embody the functions of the server 66 and a
memory to store such software. In alternative arrangements, the
electronic device 40 may wirelessly communicate directly with
another electronic device 40 (e.g., another mobile telephone or a
computer) and without an intervening network.
[0099] As indicated, the electronic device 40 may include the
primary control circuit 50 that is configured to carry out overall
control of the functions and operations of the electronic device
40. The control circuit 50 may include a processing device 68, such
as a central processing unit (CPU), microcontroller or
microprocessor. The processing device 68 executes code stored in a
memory (not shown) within the control circuit 50 and/or in a
separate memory, such as the memory 56, in order to carry out
operation of the electronic device 40. The memory 56 may be, for
example, one or more of a buffer, a flash memory, a hard drive, a
removable media, a volatile memory, a non-volatile memory, a random
access memory (RAM), or other suitable device. In a typical
arrangement, the memory 56 may include a non-volatile memory for
long term data storage and a volatile memory that functions as
system memory for the control circuit 50. The memory 56 may
exchange data with the control circuit 50 over a data bus.
Accompanying control lines and an address bus between the memory 56
and the control circuit 50 also may be present.
[0100] The electronic device 40 further includes a sound signal
processing circuit 70 for processing audio signals transmitted by
and received from the radio circuit 60. Coupled to the sound
processing circuit 70 are a speaker 72 and a microphone 74 that
enable a user to listen and speak via the electronic device 40, and
hear sounds generated in connection with other functions of the
device 40. The sound processing circuit 70 may include any
appropriate buffers, encoders, decoders, amplifiers and so
forth.
[0101] The electronic device 40 may further include one or more
input/output (I/O) interface(s) 76. The I/O interface(s) 76 may be
in the form of typical mobile telephone I/O interfaces and may
include one or more electrical connectors for operatively
connecting the electronic device 40 to another device (e.g., a
computer) or an accessory (e.g., a personal handsfree (PHF) device)
via a cable. Further, operating power may be received over the I/O
interface(s) 76 and power to charge a battery of a power supply
unit (PSU) 78 within the electronic device 40 may be received over
the I/O interface(s) 76. The PSU 78 may supply power to operate the
electronic device 40 in the absence of an external power
source.
[0102] The electronic device 40 also may include various other
components. A position data receiver 80, such as a global
positioning system (GPS) receiver, may be involved in determining
the location of the electronic device 40. A local wireless
transceiver 82, such as a Bluetooth chipset, may be used to
establish communication with a nearby device, such as an accessory
(e.g., a PHF device), another mobile radio terminal, a computer or
another device.
[0103] Although certain embodiments have been shown and described,
it is understood that equivalents and modifications falling within
the scope of the appended claims will occur to others who are
skilled in the art upon the reading and understanding of this
specification.
* * * * *