U.S. patent application number 16/599761 was published by the patent office on 2022-03-17 as publication number 20220086352, for a method and electronic device for switching between a first lens and a second lens.
This patent application is currently assigned to SAMSUNG ELECTRONICS CO., LTD. The applicant listed for this patent is SAMSUNG ELECTRONICS CO., LTD. Invention is credited to Gururaj Bhat, Ravi Prasad Mohan KINI, Girish Kulkarni, Vineeth Thanikonda Munirathnam, Sanjay Narasimha Murthy, Balvinder Singh, and Pavan Sudheendra.
Application Number: 16/599761
Publication Number: 20220086352
Family ID: 1000006178452
Publication Date: 2022-03-17

United States Patent Application 20220086352
Kind Code: A9
KINI; Ravi Prasad Mohan; et al.
March 17, 2022
METHOD AND ELECTRONIC DEVICE FOR SWITCHING BETWEEN FIRST LENS AND
SECOND LENS
Abstract
A method for switching between a first lens and a second lens in
an electronic device includes displaying, by the electronic device,
a first frame showing a field of view (FOV) of the first lens;
detecting, by the electronic device, an event that causes the
electronic device to transition from displaying the first frame to
displaying a second frame showing a FOV of the second lens;
generating, by the electronic device and based on the detecting the
event, at least one intermediate frame for transitioning from the
first frame to the second frame; and switching, by the electronic
device and based on the detecting the event, from the first lens to
the second lens and displaying the second frame, wherein the at
least one intermediate frame is displayed after the displaying the
first frame and before the displaying the second frame while the
switching is performed.
Inventors: KINI; Ravi Prasad Mohan (Karnataka, IN); Bhat; Gururaj (Karnataka, IN); Sudheendra; Pavan (Karnataka, IN); Kulkarni; Girish (Karnataka, IN); Munirathnam; Vineeth Thanikonda (Karnataka, IN); Murthy; Sanjay Narasimha (Karnataka, IN); Singh; Balvinder (Karnataka, IN)

Applicant: SAMSUNG ELECTRONICS CO., LTD. (Suwon-si, KR)

Assignee: SAMSUNG ELECTRONICS CO., LTD. (Suwon-si, KR)

Prior Publication: US 20200120284 A1, April 16, 2020

Family ID: 1000006178452
Appl. No.: 16/599761
Filed: October 11, 2019
Current U.S. Class: 1/1
Current CPC Class: G02B 13/0015 20130101; G02B 15/14 20130101; H04N 5/23245 20130101; H04N 5/2259 20130101; H04N 5/23296 20130101
International Class: H04N 5/232 20060101 H04N005/232; G02B 15/14 20060101 G02B015/14; H04N 5/225 20060101 H04N005/225

Foreign Application Data

Date | Code | Application Number
Oct 12, 2018 | IN | 201841038833
Oct 7, 2019 | IN | 201841038833
Claims
1. A method for switching between a first lens and a second lens in
an electronic device, comprising: displaying, by the electronic
device, a first frame showing a field of view (FOV) of the first
lens; detecting, by the electronic device, an event that causes the
electronic device to transition from displaying the first frame to
displaying a second frame showing a FOV of the second lens;
generating, by the electronic device and based on the detecting the
event, at least one intermediate frame for transitioning from the
first frame to the second frame; and switching, by the electronic
device and based on the detecting the event, from the first lens to
the second lens and displaying the second frame, wherein the at
least one intermediate frame is displayed after the displaying the
first frame and before the displaying the second frame while the
switching is performed.
2. The method of claim 1, wherein the generating the at least one
intermediate frame comprises: determining a lens switching delay
from a first time at which the first frame is displayed to a second
time at which the second frame is displayed; identifying at least
one transition parameter of the first lens and the second lens to
generate the at least one intermediate frame; obtaining at least
one from among a spatial alignment data, a photometric alignment
data and a color alignment data of the first lens and the second
lens; and generating the at least one intermediate frame based on
the at least one transition parameter, the lens switching delay,
and at least one from among the spatial alignment data, the
photometric alignment data and the color alignment data.
3. The method of claim 1, wherein the at least one intermediate
frame is at least one from among spatially aligned with respect to
the first frame and the second frame, photometrically aligned with
respect to the first frame and the second frame and color aligned
with respect to the first frame and the second frame.
4. The method of claim 2, further comprising determining the at
least one intermediate frame to be generated, wherein determining
the at least one intermediate frame to be generated comprises:
determining the spatial alignment data using the at least one
transition parameter; and determining the at least one intermediate
frame to be generated based on the determined spatial alignment
data and the at least one transition parameter.
5. The method of claim 2, wherein the spatial alignment data is
obtained by: capturing a first single frame associated with the
first lens and a second single frame of a same scene associated
with the second lens when the electronic device is in an idle mode;
resizing the first single frame and the second single frame into a
preview resolution size; computing feature points in the first
single frame and the second single frame; computing a
transformation matrix using the identified at least one transition
parameter and a homography relationship between the first single
frame and the second single frame, wherein the transformation
matrix includes a first scaling of the first single frame, a second
scaling of the second single frame, a first rotation of the first
single frame, a second rotation of the second single frame, a first
translation of the first single frame, and a second translation of
the second single frame; and obtaining the spatial alignment data
using the transformation matrix.
6. The method of claim 2, wherein the photometric alignment data is
obtained by: computing a transformation matrix for the generated at
least one intermediate frame; computing a correction factor based
on the transformation matrix; and obtaining the photometric
alignment data based on the correction factor.
7. The method of claim 2, wherein the color alignment data is
obtained by: computing a transformation matrix for the generated at
least one intermediate frame; computing a correction factor for the
color alignment data based on the transformation matrix; and
obtaining the color alignment data based on the correction
factor.
8. The method of claim 2, wherein the at least one transition
parameter includes an F-value of the first lens, the FOV of the
first lens, a color profile of the first lens, a saturation profile
of the first lens, an F-value of the second lens, the FOV of the
second lens, a color profile of the second lens, a saturation
profile of the second lens, a scale factor of the first lens, a
scale factor of the second lens, a scale factor between the first
lens and the second lens, a single scale factor of a combination of
the first lens and the second lens, a pivot between the first lens
and the second lens, and a single pivot value of a combination of
the first lens and the second lens.
9. An electronic device for switching between a first lens and a
second lens, comprising: a memory; a processor coupled with the
memory, the processor being configured to: display a first frame
showing a field of view (FOV) of the first lens; detect an event
that causes the electronic device to transition from displaying the
first frame to displaying a second frame showing a FOV of the
second lens; generate, based on detecting the event, at least one
intermediate frame for transitioning from the first frame to the
second frame; and switch, based on detecting the event, from the
first lens to the second lens and display the second frame, wherein
the at least one intermediate frame is displayed after the first
frame is displayed and before the second frame is displayed while
the switching is performed.
10. The electronic device of claim 9, wherein the processor is
further configured to: determine a lens switching delay from a
first time at which the first frame is displayed to a second time
at which the second frame is displayed; identify at least one
transition parameter of the first lens and the second lens to
generate the at least one intermediate frame; obtain at least one
from among a spatial alignment data, a photometric alignment data
and a color alignment data of the first lens and the second lens;
and generate the at least one intermediate frame based on the at
least one transition parameter, the lens switching delay, and at
least one from among the spatial alignment data, the photometric
alignment data and the color alignment data.
11. The electronic device of claim 9, wherein the at least one
intermediate frame is at least one from among spatially aligned
with respect to the first frame and the second frame,
photometrically aligned with respect to the first frame and the
second frame and color aligned with respect to the first frame and
the second frame.
12. The electronic device of claim 10, wherein the processor is
further configured to: determine the spatial alignment data using
the at least one transition parameter; and determine the at least
one intermediate frame to be generated based on the determined
spatial alignment data and the at least one transition
parameter.
13. The electronic device of claim 10, wherein the processor is
further configured to: capture a first single frame associated with
the first lens and a second single frame of a same scene associated
with the second lens when the electronic device is in an idle mode;
resize the first single frame and the second single frame into a
preview resolution size; compute feature points in the first single
frame and the second single frame; compute a transformation matrix
using the identified at least one transition parameter and a
homography relationship between the first single frame and the
second single frame, wherein the transformation matrix includes a
first scaling of the first single frame, a second scaling of the
second single frame, a first rotation of the first single frame, a
second rotation of the second single frame, a first translation of
the first single frame, and a second translation of the
second single frame; and obtain the spatial alignment data using
the transformation matrix.
14. The electronic device of claim 10, wherein the photometric
alignment data is obtained by: computing a transformation matrix
for the generated at least one intermediate frame; computing a
correction factor based on the transformation matrix; and obtaining
the photometric alignment data based on the correction factor.
15. The electronic device of claim 10, wherein the color alignment
data is obtained by: computing a transformation matrix for the
generated at least one intermediate frame; computing a correction
factor for the color alignment data based on the transformation
matrix; and obtaining the color alignment data based on the
correction factor.
16. The electronic device of claim 10, wherein the at least one
transition parameter includes an F-value of the first lens, the FOV
of the first lens, a color profile of the first lens, a saturation
profile of the first lens, an F-value of the second lens, the FOV
of the second lens, a color profile of the second lens, a
saturation profile of the second lens, a scale factor of the first
lens, a scale factor of the second lens, a scale factor between the
first lens and the second lens, a scale factor of a combination of
the first lens and the second lens, a pivot between the first lens
and the second lens, and a single pivot value of a combination of
the first lens and the second lens.
17. A device comprising: a memory configured to store a first
attribute of a first camera and a second attribute of a second
camera; and a processor configured to: generate at least one
intermediate image frame based on the first attribute and the
second attribute; output the at least one intermediate image frame
after outputting a first image frame captured by the first camera
and before outputting a second image frame captured by the second
camera.
18. The device of claim 17, wherein the memory is further
configured to store at least one transition parameter based on the
first attribute and the second attribute, and wherein the processor
is further configured to generate the at least one intermediate
image frame based on the at least one transition parameter.
19. The device of claim 18, wherein the first image frame captured
by the first camera and the second image frame captured by the
second camera are of a same scene, and wherein the processor is
configured to generate a transformation matrix based on a
homography relationship between the first image frame and the
second image frame, the homography relationship being determined
based on the at least one transition parameter, and generate the at
least one intermediate image frame using the transformation
matrix.
20. The device of claim 19, wherein the processor is further
configured to determine a coefficient for each intermediate image
frame from among the at least one intermediate image frame and
generate each intermediate image frame based on the transformation
matrix and on the respective determined coefficient.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is based on and claims priority under 35
U.S.C. § 119 to Indian Provisional Patent Application No.
201841038833, filed on Oct. 12, 2018, and Indian Complete Patent
Application No. 201841043788, filed on Oct. 7, 2019, the
disclosures of which are incorporated by reference herein in their
entirety.
BACKGROUND
1. Field
[0002] The disclosure relates to an image processing system, and
more specifically is related to a method and electronic device for
switching between a first lens and a second lens.
2. Description of Related Art
[0003] In general, flagship features such as dual and triple cameras
are being introduced in electronic devices (e.g., smart phones and
the like). However, there are various constraints on the
implementation of these features. One major constraint is that all
camera systems cannot be kept turned on simultaneously, which causes
a camera switching delay. The transition between the cameras is
therefore not seamless, which degrades the user experience.
[0004] In existing methods, switching from one camera to another
camera may be performed when frames from both cameras are available
during the transition. A multi-frame fusion module combines
information from the frames and sends a resulting frame to a display
for preview. This also degrades the user experience.
[0005] Thus, it is desirable to address the above-mentioned
disadvantages or other shortcomings, or to provide a useful
alternative.
SUMMARY
[0006] In accordance with an aspect of the disclosure, a method for
switching between a first lens and a second lens in an electronic
device includes displaying, by the electronic device, a first frame
showing a field of view (FOV) of the first lens; detecting, by the
electronic device, an event that causes the electronic device to
transition from displaying the first frame to displaying a second
frame showing a FOV of the second lens; generating, by the
electronic device and based on the detecting the event, at least
one intermediate frame for transitioning from the first frame to
the second frame; and switching, by the electronic device and based
on the detecting the event, from the first lens to the second lens
and displaying the second frame, wherein the at least one
intermediate frame is displayed after the displaying the first
frame and before the displaying the second frame while the
switching is performed.
[0007] The generating the at least one intermediate frame may
include determining a lens switching delay from a first time at
which the first frame is displayed to a second time at which the
second frame is displayed; identifying at least one transition
parameter of the first lens and the second lens to generate the at
least one intermediate frame; obtaining at least one from among a
spatial alignment data, a photometric alignment data and a color
alignment data of the first lens and the second lens; and
generating the at least one intermediate frame based on the at
least one transition parameter, the lens switching delay, and at
least one from among the spatial alignment data, the photometric
alignment data and the color alignment data.
[0008] The at least one intermediate frame may be at least one from
among spatially aligned with respect to the first frame and the
second frame, photometrically aligned with respect to the first
frame and the second frame and color aligned with respect to the
first frame and the second frame.
[0009] The method may further include determining the at least one
intermediate frame to be generated, wherein determining the at
least one intermediate frame to be generated includes determining
the spatial alignment data using the at least one transition
parameter; and determining the at least one intermediate frame to
be generated based on the determined spatial alignment data and the
at least one transition parameter.
[0010] The spatial alignment data is obtained by capturing a first
single frame associated with the first lens and a second single
frame of a same scene associated with the second lens when the
electronic device is in an idle mode; resizing the first single
frame and the second single frame into a preview resolution size;
computing feature points in the first single frame and the second
single frame; computing a transformation matrix using a homography
relationship between the first single frame and the second single
frame, wherein the transformation matrix includes a first scaling
of the first single frame, a second scaling of the second single
frame, a first rotation of the first single frame, a second
rotation of the second single frame, a first translation of the
first single frame, and a second translation of the second single
frame; and obtaining the spatial alignment data using the
transformation matrix.
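The transformation matrix in the procedure above combines a scaling, a rotation, and a translation recovered from matched feature points. As an illustrative sketch only, not the patented implementation, a least-squares similarity fit over matched point pairs recovers such a matrix (production code would typically estimate a full homography from detected feature points, e.g. with OpenCV's `cv2.findHomography`):

```python
import numpy as np

def estimate_similarity(src, dst):
    """Least-squares similarity transform (uniform scaling, rotation,
    translation) mapping matched feature points src onto dst.
    Umeyama-style closed form; assumes no reflection in the data."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s_c, d_c = src - mu_s, dst - mu_d
    cov = d_c.T @ s_c / len(src)          # cross-covariance of the point sets
    U, S, Vt = np.linalg.svd(cov)
    R = U @ Vt                            # best-fit rotation (polar factor)
    scale = S.sum() / s_c.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    H = np.eye(3)                         # pack into a 3x3 transformation matrix
    H[:2, :2] = scale * R
    H[:2, 2] = t
    return H
```

In practice the matched point pairs would come from feature points computed on the resized preview frames, as described above.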
[0011] The photometric alignment data may be obtained by computing
a transformation matrix for the generated at least one intermediate
frame; computing a correction factor based on the transformation
matrix; and obtaining the photometric alignment data based on the
correction factor.
[0012] The color alignment data may be obtained by computing a
transformation matrix for the generated at least one intermediate
frame; computing a correction factor for the color alignment data
based on the transformation matrix; and obtaining the color
alignment data based on the correction factor.
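Paragraphs [0011] and [0012] both reduce to computing a correction factor between statistics of the two lenses' frames. A minimal numpy sketch of one plausible choice of correction factors, mean-ratio gains over a shared region (the document does not spell out the exact statistics, so these are assumptions for illustration):

```python
import numpy as np

def correction_factors(frame_a, frame_b):
    """Illustrative correction factors mapping frame_a's statistics
    onto frame_b's: a global luminance gain (photometric alignment)
    and per-channel gains (color alignment)."""
    a = np.asarray(frame_a, dtype=float)
    b = np.asarray(frame_b, dtype=float)
    luma_gain = b.mean() / a.mean()                            # photometric correction
    channel_gains = b.mean(axis=(0, 1)) / a.mean(axis=(0, 1))  # color correction
    return luma_gain, channel_gains
```

Applying a growing fraction of these gains to successive intermediate frames would move brightness and color gradually from the first lens's rendering toward the second's.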
[0013] The at least one transition parameter may include an F-value
of the first lens, the FOV of the first lens, a color profile of
the first lens, a saturation profile of the first lens, an F-value
of the second lens, the FOV of the second lens, a color profile of
the second lens, a saturation profile of the second lens, a scale
factor of the first lens, a scale factor of the second lens, a
scale factor between the first lens and the second lens, a single
scale factor of a combination of the first lens and the second
lens, a pivot between the first lens and the second lens, and a
single pivot value of a combination of the first lens and the
second lens.
[0014] In accordance with an aspect of the disclosure, an
electronic device for switching between a first lens and a second
lens includes a memory; a processor coupled with the memory, the
processor being configured to display a first frame showing a field
of view (FOV) of the first lens; detect an event that causes the
electronic device to transition from displaying the first frame to
displaying a second frame showing a FOV of the second lens;
generate, based on detecting the event, at least one intermediate
frame for transitioning from the first frame to the second frame;
and switch, based on detecting the event, from the first lens to
the second lens and display the second frame, wherein the at least
one intermediate frame is displayed after the first frame is
displayed and before the second frame is displayed while the
switching is performed.
[0015] The processor may be further configured to determine a lens
switching delay from a first time at which the first frame is
displayed to a second time at which the second frame is displayed;
identify at least one transition parameter of the first lens and
the second lens to generate the at least one intermediate frame;
obtain at least one from among a spatial alignment data, a
photometric alignment data and a color alignment data of the first
lens and the second lens; and generate the at least one
intermediate frame based on the at least one transition parameter,
the lens switching delay, and at least one from among the spatial
alignment data, the photometric alignment data and the color
alignment data.
[0016] The at least one intermediate frame may be at least one from
among spatially aligned with respect to the first frame and the
second frame, photometrically aligned with respect to the first
frame and the second frame and color aligned with respect to the
first frame and the second frame.
[0017] The processor may be further configured to determine the
spatial alignment data using the at least one transition parameter;
and determine the at least one intermediate frame to be generated
based on the determined spatial alignment data and the at least one
transition parameter.
[0018] The processor may be further configured to capture a first
single frame associated with the first lens and a second single
frame of a same scene associated with the second lens when the
electronic device is in an idle mode; resize the first single frame
and the second single frame into a preview resolution size; compute
feature points in the first single frame and the second single
frame; compute a transformation matrix using a homography
relationship between the first single frame and the second single
frame, wherein the transformation matrix includes a first scaling
of the first single frame, a second scaling of the second single
frame, a first rotation of the first single frame, a second
rotation of the second single frame, a first translation of the
the first single frame, and a second translation of the second
single frame; and obtain the spatial alignment data using the
transformation matrix.
[0019] The photometric alignment data may be obtained by computing
a transformation matrix for the generated at least one intermediate
frame; computing a correction factor based on the transformation
matrix; and obtaining the photometric alignment data based on the
correction factor.
[0020] The color alignment data may be obtained by computing a
transformation matrix for the generated at least one intermediate
frame; computing a correction factor for the color alignment data
based on the transformation matrix; and obtaining the color
alignment data based on the correction factor.
[0021] The at least one transition parameter may include an F-value
of the first lens, the FOV of the first lens, a color profile of
the first lens, a saturation profile of the first lens, an F-value
of the second lens, the FOV of the second lens, a color profile of
the second lens, a saturation profile of the second lens, a scale
factor of the first lens, a scale factor of the second lens, a
scale factor between the first lens and the second lens, a scale
factor of a combination of the first lens and the second lens, a
pivot between the first lens and the second lens, and a single
pivot value of a combination of the first lens and the second
lens.
[0022] In accordance with an aspect of the disclosure, a device
includes a memory configured to store a first attribute of a first
camera and a second attribute of a second camera; and a processor
configured to generate at least one intermediate image frame based
on the first attribute and the second attribute; output the at
least one intermediate image frame after outputting a first image
frame captured by the first camera and before outputting a second
image frame captured by the second camera.
[0023] The memory may be further configured to store at least one
transition parameter based on the first attribute and the second
attribute, and the processor may be further configured to generate
the at least one intermediate image frame based on the at least one
transition parameter.
[0024] The first image frame captured by the first camera and the
second image frame captured by the second camera may be of a same
scene, and the processor may be configured to generate a
transformation matrix based on a homography relationship between
the first image frame and the second image frame, the homography
relationship being determined based on the at least one transition
parameter, and generate the at least one intermediate image frame
using the transformation matrix.
[0025] The processor may be further configured to determine a
coefficient for each intermediate image frame from among the at
least one intermediate image frame and generate each intermediate
image frame based on the transformation matrix and on the
respective determined coefficient.
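One way to read paragraph [0025] is that each intermediate frame gets its own partially applied transform. As a sketch under the assumption of linear matrix interpolation (the document does not prescribe the interpolation scheme), the per-frame matrices can be blended between the identity and the full alignment matrix:

```python
import numpy as np

def per_frame_transforms(H, coefficients):
    """One transform per intermediate frame: linearly interpolate
    between the identity (first frame shown as-is) and the full
    alignment matrix H (fully warped toward the second frame)."""
    H = np.asarray(H, dtype=float)
    identity = np.eye(3)
    return [(1.0 - c) * identity + c * H for c in coefficients]
```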
[0026] The memory may be further configured to store a switching
delay between the first camera and the second camera and a frame
rate, and the processor may be further configured to determine the
respective determined coefficient based on the switching delay and
the frame rate.
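Given the stored switching delay and frame rate, the coefficient schedule of paragraph [0026] can be sketched as follows (uniform spacing is an assumption for illustration):

```python
def transition_coefficients(switch_delay_ms, frame_rate_fps):
    """How many intermediate frames fit in the switching delay, and an
    evenly spaced, monotonically increasing coefficient for each."""
    n = max(int(round(switch_delay_ms / 1000.0 * frame_rate_fps)), 1)
    return [(i + 1) / (n + 1) for i in range(n)]
```

For example, a 100 ms switching delay at 30 fps leaves room for three intermediate frames, with coefficients 0.25, 0.5, and 0.75.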
[0027] The first image frame captured by the first camera and the
second image frame captured by the second camera may be of a same
scene, and the processor may be further configured to generate a
photometric alignment coefficient based on the at least one
transition parameter and generate the at least one intermediate
image frame using the photometric alignment coefficient.
[0028] The processor may be further configured to determine a
transition coefficient for each intermediate image frame from among
the at least one intermediate image frame and generate each
intermediate image frame based on the photometric alignment
coefficient and on the respective determined transition
coefficient.
[0029] The memory may be further configured to store a switching
delay between the first camera and the second camera and a frame
rate, and the processor may be further configured to determine the
respective determined transition coefficient based on the switching
delay and the frame rate.
[0030] The first image frame captured by the first camera and the
second image frame captured by the second camera may be of a same
scene, and the processor may be further configured to generate a
color alignment coefficient based on the at least one transition
parameter and generate the at least one intermediate image frame
using the color alignment coefficient.
[0031] The processor may be further configured to determine a
transition coefficient for each intermediate image frame from among
the at least one intermediate image frame and generate each
intermediate image frame based on the color alignment coefficient
and on the respective determined transition coefficient.
[0032] The memory may be further configured to store a switching
delay between the first camera and the second camera and a frame
rate, and the processor may be further configured to determine the
respective determined transition coefficient based on the switching
delay and the frame rate.
BRIEF DESCRIPTION OF FIGURES
[0033] The above and other aspects, features, and advantages of
certain embodiments of the present disclosure will be more apparent
from the following description taken in conjunction with the
accompanying drawings, in which:
[0034] FIG. 1 is an example illustration in which an electronic
device switches between a first lens and a second lens, according
to an embodiment;
[0035] FIG. 2 shows various hardware components of the electronic
device, according to an embodiment;
[0036] FIG. 3 shows various hardware components of a processor
included in the electronic device, according to an embodiment;
[0037] FIG. 4 is an example scenario in which a frame generation
engine computes a number of frames to be generated, according to an
embodiment;
[0038] FIG. 5 is a flow chart illustrating a method for switching
between the first lens and the second lens in the electronic
device, according to an embodiment;
[0039] FIG. 6 and FIG. 7 are example scenarios in which seamless
transitions between the lenses are depicted, according to an
embodiment;
[0040] FIG. 8 is an example flow chart illustrating various
operations for switching between the first lens and the second lens
in the electronic device, according to an embodiment;
[0041] FIG. 9 is an example scenario in which the electronic device
computes the spatial alignment data, according to an
embodiment;
[0042] FIG. 10 is an example scenario in which the electronic
device computes the spatial alignment data with respect to position
(i.e., frame center alignment) and scale (i.e., size of objects),
according to an embodiment;
[0043] FIG. 11 is an example scenario in which the electronic
device computes the photometric alignment data, according to an
embodiment;
[0044] FIG. 12 is an example scenario in which the electronic
device determines the color alignment data, according to an
embodiment;
[0045] FIG. 13 is an example flow chart illustrating various
operations for generating the intermediate frames for smooth
transformation from the first lens to the second lens, according to
an embodiment;
[0046] FIG. 14 is an example flow chart illustrating various
processes for performing the spatial alignment procedure, according
to an embodiment;
[0047] FIG. 15 is an example flow chart illustrating various
processes for performing the photometric alignment procedure,
according to an embodiment;
[0048] FIG. 16 is an example flow chart illustrating various
processes for performing the color alignment procedure, according
to an embodiment; and
[0049] FIG. 17 is an example flow chart illustrating various
processes for determining the number of frames to be generated,
according to an embodiment.
DETAILED DESCRIPTION
[0050] The embodiments herein and the various features and
advantageous details thereof are explained more fully with
reference to the non-limiting embodiments that are illustrated in
the accompanying drawings and detailed in the following
description. Descriptions of well-known components and processing
techniques are omitted so as to not unnecessarily obscure the
embodiments herein. Also, the various embodiments described herein
are not necessarily mutually exclusive, as some embodiments can be
combined with one or more other embodiments to form new
embodiments. The term "or" as used herein refers to a
non-exclusive or, unless otherwise indicated. The examples used
herein are intended merely to facilitate an understanding of ways
in which the embodiments herein can be practiced and to further
enable those skilled in the art to practice the embodiments herein.
Accordingly, the examples should not be construed as limiting the
scope of the embodiments herein.
[0051] Embodiments may be described and illustrated in terms of
blocks which carry out a described function or functions. These
blocks, which may be referred to herein as units or modules or the
like, are physically implemented by analog or digital circuits such
as logic gates, integrated circuits, microprocessors,
microcontrollers, memory circuits, passive electronic components,
active electronic components, optical components, hardwired
circuits, or the like, and may optionally be driven by firmware and
software. The circuits may, for example, be embodied in one or more
semiconductor chips, or on substrate supports such as printed
circuit boards and the like. The circuits constituting a block may
be implemented by dedicated hardware, or by a processor (e.g., one
or more programmed microprocessors and associated circuitry), or by
a combination of dedicated hardware to perform some functions of
the block and a processor to perform other functions of the block.
Each block of the embodiments may be physically separated into two
or more interacting and discrete blocks without departing from the
scope of the disclosure. Likewise, the blocks of the embodiments
may be physically combined into more complex blocks without
departing from the scope of the disclosure.
[0052] The accompanying drawings are used to help easily understand
various technical features and it should be understood that the
embodiments presented herein are not limited by the accompanying
drawings. As such, the present disclosure should be construed to
extend to any alterations, equivalents and substitutes in addition
to those which are particularly set out in the accompanying
drawings. Although the terms first, second, etc. may be used herein
to describe various elements, these elements should not be limited
by these terms. These terms are generally only used to distinguish
one element from another.
[0053] Accordingly, embodiments herein achieve a method for
switching between a first lens and a second lens in an electronic
device. The method includes displaying, by the electronic device, a
first frame showing a field of view (FOV) of the first lens.
Further, the method includes detecting, by the electronic device,
an event to switch from the first lens to the second lens. Further,
the method includes generating, by the electronic device, at least
one intermediate frame for smooth transformation from the first
frame to a second frame showing a FOV of the second lens. Further,
the method includes switching, by the electronic device, from the
first lens to the second lens and displaying the second frame
showing the FOV of the second lens. The at least one intermediate
frame is displayed between the first frame and the second frame
while the switching is performed.
[0054] Unlike conventional methods and systems, the electronic
device generates intermediate frames based on offline-computed
information relating to spatial transformation, photometric
alignment, and color alignment. The generated frames smoothly
transition from the first lens preview (i.e., the source lens
preview) to the second lens preview (i.e., the destination lens
preview). The electronic device
utilizes the source frame, offline spatial alignment data,
photometric data and color alignment data to perform this
transformation. This results in enhancing the user experience.
[0055] The electronic device switches from displaying a frame from
one camera to a frame from another camera. The transition table
contains pre-calculated and pre-calibrated information for various
transitions (e.g., wide to ultra-wide, tele to wide, wide to tele,
etc.). The information includes precise switching delay, FOV of
lenses, color and saturation profiles of the cameras, etc. The
electronic device utilizes the frames from the single camera and
the transition table to generate a final output of intermediate
frames. For example, the electronic device computes an interval
between successive frames, the scale and position transformation
for each generated frame, the color correction for each frame, and
the photometric correction for each frame. This results in
enhancing the user experience.
[0056] FIG. 1 is an example illustration showing a process by which
an electronic device (100) switches between a first lens and a
second lens, according to an embodiment. The electronic device
(100) can be, for example, but not limited to a cellular phone, a
smart phone, a Personal Digital Assistant (PDA), a tablet computer,
a laptop computer, a smart watch, an Internet of Things (IoT)
device, a multi-camera system or the like. The first lens and the
second lens can be, for example, but not limited to a wide lens, an
ultra-wide lens, a tele-lens or the like.
[0057] In an embodiment, the electronic device (100) is configured
to display a first frame showing a FOV of the first lens. Further,
the electronic device (100) is configured to detect an event to
switch from the first lens to the second lens. Further, the
electronic device (100) is configured to generate at least one
intermediate frame for smooth transformation from the first lens to
the second lens.
[0058] In an embodiment, the at least one intermediate frame is
generated for smooth transformation from the first lens to the
second lens by determining a lens switching delay from a first
frame showing the FOV of the first lens to a second frame showing a
FOV of the second lens, detecting at least one lens transition
parameter to generate the at least one intermediate frame,
obtaining at least one of a spatial alignment data, a photometric
alignment data and a color alignment data, and generating the at
least one intermediate frame between the first frame and the second
frame based on the at least one lens transition parameter, the lens
switching delay, and at least one of the spatial alignment data,
the photometric alignment data and the color alignment data.
[0059] In an embodiment, the spatial alignment data is obtained by
capturing a single frame associated with the first lens and a
single frame associated with the second lens when the electronic
device (100) is in an idle mode, resizing the single frame
associated with the first lens and the single frame associated with
the second lens into a preview resolution size, computing feature
points in the single frame associated with the first lens and the
single frame associated with the second lens, computing a
transformation matrix using a Homography relationship between the
single frame associated with the first lens and the single frame
associated with the second lens, wherein the transformation matrix
includes a scaling of the single frame associated with the first
lens and the single frame associated with the second lens, a
rotation of the single frame associated with the first lens and the
single frame associated with the second lens and a translation of
the single frame associated with the first lens and the
single frame associated with the second lens, and obtaining the
spatial alignment data using the transformation matrix. The
detailed operations of the spatial alignment procedure are
explained in FIG. 14.
[0060] In an embodiment, the photometric alignment data is obtained
by computing a transformation matrix for the generated frame,
computing a correction factor based on the transformation matrix,
and obtaining the photometric alignment data based on the
correction factor. The detailed operations of the photometric
alignment procedure are explained in FIG. 15.
[0061] In an embodiment, the color alignment data is obtained by
computing a transformation matrix for the generated frame,
computing a correction factor for the color alignment data based on
the transformation matrix, and obtaining the color alignment data
based on the correction factor. The detailed operations of the
color alignment procedure are explained in FIG. 16.
[0062] In an embodiment, the at least one lens transition parameter
is an F-value of the first lens, a FOV of the first lens, a color
profile of the first lens, a saturation profile of the first lens,
an F-value of the second lens, a FOV of the second lens, a color
profile of the second lens, a saturation profile of the second
lens, a scale factor between the first lens and the second lens, a
single scale factor of a combination of the first lens and the
second lens, a pivot between the first lens and the second lens,
and a single pivot value of a combination of the first lens and the
second lens.
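In code, the lens transition parameters enumerated above might be grouped into a single structure. The following is a minimal Python sketch; the class name, field names, and sample values are hypothetical illustrations, not taken from the disclosure.

```python
from dataclasses import dataclass

# Hypothetical container for the lens transition parameters listed above.
# Field names and sample values are illustrative only.
@dataclass
class LensTransitionParams:
    f_value_first: float     # F-value of the first lens
    f_value_second: float    # F-value of the second lens
    fov_first_deg: float     # FOV of the first lens, in degrees
    fov_second_deg: float    # FOV of the second lens, in degrees
    scale_factor: float      # scale factor between the two lenses
    pivot: tuple             # pivot point (x, y) between the two lenses

params = LensTransitionParams(2.2, 1.8, 123.0, 83.0, 1.6, (0.55, 0.5))
```

A structure like this could be populated once per lens pair and consulted each time a transition is triggered.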
[0063] In an embodiment, the at least one intermediate frame is at
least one of spatially aligned with respect to the first frame and
the second frame, photometrically aligned with respect to the first
frame and the second frame and color aligned with respect to the
first frame and the second frame. The detailed operations of the
intermediate frames generated for smooth transformation from the
first lens (150) to the second lens (160) are explained in FIG.
13.
[0064] Further, the electronic device (100) is configured to switch
from the first lens to the second lens and display the second frame
showing the FOV of the second lens. The at least one intermediate
frame is displayed between the first frame and the second frame
while the switching is performed. In other words, the at least one
intermediate frame is displayed after the first frame is displayed
and before the second frame is displayed.
[0065] In an embodiment, the at least one intermediate frame to be
generated is determined by determining the spatial alignment data
using the lens transition parameter, and determining the at least
one intermediate frame to be generated based on the determined
spatial alignment data and the lens transition parameter.
[0066] In an embodiment, the seamless transitions between the
lenses are illustrated in FIG. 6 and FIG. 7.
[0067] For example, as shown in FIG. 7, the user of the electronic
device (100) invokes the pinch-in-zoom on the image with the wider
FOV to cause a smooth transition to an image with a narrower FOV
when enough details are not available in the wide FOV image.
Similarly, the user of the electronic device (100) invokes the
pinch-out-zoom on the image with the narrower FOV to cause a smooth
transition to an image with wider FOV when enough details are not
available in the narrow FOV image.
[0068] FIG. 2 shows various hardware components of the electronic
device (100), according to an embodiment as disclosed herein. In an
embodiment, the electronic device (100) includes a processor (110),
a communicator (120), a memory (130), a display (140), the first
lens (150), the second lens (160), and an application (170). The
processor (110) is provided with the communicator (120), the memory
(130), the display (140), the first lens (150), the second lens
(160), and the application (170). The application (170) can be, for
example, but not limited to a beauty related application, camera
application, health related application or the like.
[0069] In an embodiment, the processor (110) is configured to
display the first frame showing the FOV of the first lens (150).
Further, the processor (110) is configured to detect an event that
causes a switch from the first lens (150) to the second lens (160).
Further, the processor (110) is configured to generate at least one
intermediate frame for smooth transformation from the first lens
(150) to the second lens (160). Further, the processor (110) is
configured to switch from the first lens (150) to the second lens
(160) and display the second frame showing the FOV of the second
lens (160). The at least one intermediate frame is displayed, on
the display (140), between the first frame and the second frame
while the switching is performed. The at least one intermediate
frame displayed on the display (140) may be visible to the user or
it may not be visible to the user.
[0070] The processor (110) is configured to execute instructions
stored in the memory (130) and to perform various processes. The
communicator (120) is configured for communicating internally
between internal hardware components and with external devices via
one or more networks.
[0071] The memory (130) also stores instructions to be executed by
the processor (110). The memory (130) may include non-volatile
storage elements. Examples of such non-volatile storage elements
may include magnetic hard discs, optical discs, floppy discs, flash
memories, or forms of electrically programmable memories (EPROM) or
electrically erasable and programmable (EEPROM) memories. In
addition, the memory (130) may, in some examples, be considered a
non-transitory storage medium. The term "non-transitory" may
indicate that the storage medium is not embodied in a carrier wave
or a propagated signal. However, the term "non-transitory" should
not be interpreted that the memory (130) is non-movable. In some
examples, the memory (130) can be configured to store larger
amounts of information than a volatile memory. In certain examples, a
non-transitory storage medium may store data that can, over time,
change (e.g., in Random Access Memory (RAM) or cache).
[0072] Although FIG. 2 shows various hardware components of the
electronic device (100), it is to be understood that other
embodiments are not limited thereto. In other embodiments, the
electronic device (100) may include fewer or more components.
Further, the labels or names of the components are used only for
illustrative purpose and do not limit the scope of the disclosure.
One or more components can be combined together to perform the same
or a substantially similar function to switch between the first
lens (150) and the second lens (160) in the electronic device
(100).
[0073] FIG. 3 shows various hardware components of the processor
(110) included in the electronic device (100), according to an
embodiment. In an embodiment, the processor (110) includes an event
detector (110a), a frame generation engine (110b), a lens switching
delay determination engine (110c), a spatial alignment data
determination engine (110d), a photometric alignment data
determination engine (110e) and a color alignment data
determination engine (110f).
[0074] In an embodiment, the event detector (110a) is configured to
display the first frame showing the FOV of the first lens (150) and
detect the event that causes the switch from the first lens (150)
to the second lens (160). Further, the frame generation engine
(110b) is configured to generate at least one intermediate frame
for smooth transformation from the first lens (150) (i.e., from the
first frame showing the FOV of the first lens) to the second lens
(160) (i.e., to a second frame showing the FOV of the second lens)
using the lens switching delay determination engine (110c), the
spatial alignment data determination engine (110d), the photometric
alignment data determination engine (110e) and the color alignment
data determination engine (110f). Further, the frame generation
engine (110b) is configured to switch from the first lens (150) to
the second lens (160) and display the second frame showing the FOV
of the second lens (160). The at least one intermediate frame is
displayed between the first frame and the second frame while the
switching is performed.
[0075] The spatial alignment data determination engine (110d) is
configured to handle the spatial alignment mismatch between the
first and second frames. The photometric alignment data
determination engine (110e) is configured to handle the different
exposure between the first and second frames. The color alignment
data determination engine (110f) is configured to handle the color
difference between the first and second frames.
[0076] Although FIG. 3 shows various hardware components of the
processor (110), it is to be understood that other embodiments are
not limited thereto. In other embodiments, the processor (110) may
include fewer or more components. Further, the labels or names of
the components are used only for illustrative purpose and do not
limit the scope of the disclosure. One or more components can be
combined together to perform the same or a substantially similar
function to switch between the first lens (150) and the second lens
(160) in the electronic device (100).
[0077] FIG. 4 is an example scenario in which the frame generation
engine (110b) computes a number of frames to be generated, according
to an embodiment. The frame generation engine (110b) utilizes a
transition table which holds pre-calculated and pre-calibrated
information for various transitions (e.g., wide lens to ultra-wide
lens, tele-lens to wide lens, wide lens to tele-lens, etc.). The
information includes switching delay, scale factor, color and
saturation profiles of the lenses (150 and 160), etc. Using the
transition table, the frame generation engine (110b) computes the
number of frames to be generated for display.
[0078] In an example, the transition tables 1-6 are example tables
for the electronic device with three lenses (e.g., Ultra-wide lens,
wide lens, tele-lens or the like). For each combination of lens
transition, the transition table shows the corresponding transition
parameters used in the transition. A brief explanation of the
parameters shown in the transition tables follows.
[0079] The "Enabled" parameter indicates whether the transition
table is enabled or disabled (depending on a lens configuration of the
electronic device (100)). The "Switching delay" parameter indicates
a delay between the frame of the source lens and the frame of the
target lens. The "Scale Factor X" parameter indicates an X Scale
factor between the source lens and the target lens. The "Scale
Factor Y" parameter indicates a Y Scale factor between source and
target lens. The "Pivot X" parameter indicates an X value of the
pivot point for transition between source and target lens frames.
The "Pivot Y" parameter indicates a Y value of the pivot point for
transition between source and target lens frames. The "Brightness"
parameter indicates a Brightness profile of the target lens frame,
expressed in terms of mean and standard deviation. The "Color"
parameter indicates a Color profile of the target lens frame,
expressed in terms of mean and standard deviation. The pivot point
is the point between source and target lens frames which remains
constant during transition and may be specified using X and Y
coordinates.
Transition Table 1 for Ultra-wide lens to Wide lens
    Enabled: True
    Switching delay: 700
    Scale Factor X: 1.6
    Scale Factor Y: 1.6
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.2, 0.2
    Color: 1.1, 0.1

Transition Table 2 for Ultra-wide lens to Tele-lens
    Enabled: True
    Switching delay: 700
    Scale Factor X: 2.77
    Scale Factor Y: 2.77
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.25, 0.2
    Color: 1.15, 0.2

Transition Table 3 for Wide lens to Tele-lens
    Enabled: True
    Switching delay: 800
    Scale Factor X: 1.73
    Scale Factor Y: 1.73
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.25, 0.2
    Color: 1.15, 0.2

Transition Table 4 for Wide lens to Ultra-wide lens
    Enabled: True
    Switching delay: 750
    Scale Factor X: 0.625
    Scale Factor Y: 0.625
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.2, 0.2
    Color: 1.1, 0.2

Transition Table 5 for Tele-lens to Ultra-wide lens
    Enabled: True
    Switching delay: 650
    Scale Factor X: 0.361
    Scale Factor Y: 0.361
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.25, 0.2
    Color: 1.15, 0.2

Transition Table 6 for Tele-lens to Wide lens
    Enabled: True
    Switching delay: 700
    Scale Factor X: 0.578
    Scale Factor Y: 0.578
    Pivot X: 0.55
    Pivot Y: 0.5
    Brightness: 1.25, 0.2
    Color: 1.15, 0.2
[0080] The transition tables 1-6 are only examples and are provided
for the purpose of understanding the transition parameters.
Further, values for the transition tables 1-6 may be varied based
on at least one of the user setting, an original equipment
manufacturer (OEM), and a configuration of the electronic device
(100).
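As a sketch, one of these transition tables could be encoded as a plain dictionary, with the number of intermediate frames derived from it using the N = T_SD/FPS relation stated later in the disclosure. The key and function names below are illustrative, and the values are copied from the example Transition Table 1 above.

```python
# Transition Table 1 (Ultra-wide lens to Wide lens), values copied from
# the example table above; key names are illustrative.
TABLE_ULTRAWIDE_TO_WIDE = {
    "enabled": True,
    "switching_delay_ms": 700,   # T_SD
    "scale_factor_x": 1.6,
    "scale_factor_y": 1.6,
    "pivot_x": 0.55,
    "pivot_y": 0.5,
    "brightness": (1.2, 0.2),    # (mean, standard deviation)
    "color": (1.1, 0.1),         # (mean, standard deviation)
}

def frames_to_generate(switching_delay_ms: int, target_fps: int) -> int:
    """Number of intermediate frames, N = T_SD / FPS (the disclosure's formula)."""
    return switching_delay_ms // target_fps

# For Table 1 at a 60 fps target, this yields 700 // 60 = 11 frames.
n = frames_to_generate(TABLE_ULTRAWIDE_TO_WIDE["switching_delay_ms"], 60)
```

At runtime, the frame generation engine would select the table matching the detected source/target lens pair and feed its entries to the alignment procedures.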
[0081] FIG. 5 is a flow chart (500) illustrating a method for
switching between the first lens (150) and the second lens (160) in
the electronic device (100), according to an embodiment. The
operations (502-508) are performed by the processor (110).
[0082] At 502, the method includes displaying the first frame
showing the FOV of the first lens (150). At 504, the method
includes detecting the event that causes a switch from the first
lens (150) to the second lens (160). At 506, the method includes
generating at least one intermediate frame for smooth
transformation from the first lens (150) to the second lens (160).
At 508, the method includes switching from the first lens (150) to
the second lens (160) and displaying the second frame showing the
FOV of the second lens (160). The at least one intermediate frame
is displayed between the first frame and the second frame while the
switching is performed.
[0083] FIG. 8 is an example flow chart illustrating various
operations for switching between the first lens (150) and the
second lens (160) in the electronic device (100), according to an
embodiment. At 802, the electronic device (100) obtains the frame
(i.e., the last frame from the first lens FOV and the first frame
from the second lens FOV). At 804, the electronic device (100)
generates the intermediate frames with the spatial alignment data
based on the last frame of the first lens FOV and first frame of
the second lens FOV. At 806, the electronic device (100) generates
intermediate frames with the photometric alignment data. At 808,
the electronic device (100) generates intermediate frames with the
color alignment data. At 810, the electronic device (100) combines
the frames. At 812, the electronic device (100) renders the
frames.
[0084] FIG. 9 is an example scenario in which the electronic device
(100) computes the spatial alignment data, according to an
embodiment.
[0085] For example, the electronic device (100) captures the pair
of wide and ultra-wide frames while keeping the electronic device
(100) stationary. Further, the electronic device (100) resizes both
images to preview resolution. Further, the electronic device (100)
computes corner points in both images. Further, the electronic
device (100) computes the transformation matrix using Homography.
Here, the Homography is a transformation matrix (e.g., a 3×3
matrix) that maps the points in one image to the corresponding
points in the other image. When applied to the source frame data,
the transformation matrix effects scaling, rotation and translation
of the source frame data. For spatial alignment, the electronic
device (100) needs scale and pivot data. Further, the electronic
device (100) constructs the transformation matrix using only scale
and pivot data. In an example, the matrix below is used for computing
the spatial alignment data:

$$\begin{bmatrix}1&0&p_x\\0&1&p_y\\0&0&1\end{bmatrix}\begin{bmatrix}s_x&0&0\\0&s_y&0\\0&0&1\end{bmatrix}\begin{bmatrix}1&0&-p_x\\0&1&-p_y\\0&0&1\end{bmatrix}=\begin{bmatrix}s_x&0&p_x(1-s_x)\\0&s_y&p_y(1-s_y)\\0&0&1\end{bmatrix}=\begin{bmatrix}s_x&0&t_x\\0&s_y&t_y\\0&0&1\end{bmatrix}$$
[0086] In this example, s_x and s_y represent scale factors X and Y,
and p_x and p_y represent pivot factors X and Y as described with
reference to the Transition Tables above.
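This translate-scale-translate composition can be checked with a short Python sketch, which reproduces the translation terms p_x(1-s_x) and p_y(1-s_y). The helper names are illustrative, and the matrices are plain nested lists.

```python
def matmul3(a, b):
    """Multiply two 3x3 matrices represented as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def spatial_transform(sx, sy, px, py):
    """Compose translate(+pivot) * scale * translate(-pivot), which scales
    about the pivot point instead of the origin."""
    t_fwd = [[1, 0, px], [0, 1, py], [0, 0, 1]]
    scale = [[sx, 0, 0], [0, sy, 0], [0, 0, 1]]
    t_back = [[1, 0, -px], [0, 1, -py], [0, 0, 1]]
    return matmul3(matmul3(t_fwd, scale), t_back)

# Scale 1.6 about pivot (0.55, 0.5), as in Transition Table 1.
T = spatial_transform(1.6, 1.6, 0.55, 0.5)
# The translation terms T[0][2] and T[1][2] equal px*(1-sx) and py*(1-sy).
```

Keeping the construction as translate-scale-translate makes the pivot point the only fixed point of the transform, matching the pivot definition given for the transition tables.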
[0087] FIG. 10 is an example scenario in which the electronic
device (100) computes the spatial alignment data with respect to
position (i.e., frame center alignment) and scale (i.e., size of
objects), according to an embodiment.
[0088] The center point of the frame is different in the Ultra-wide
frame and the wide frame (for example, the crosshair position on
the displayed bottle is different in the Ultra-wide frame and the
wide frame). In an embodiment, the spatial alignment data gradually
shifts the center from the Ultra-wide center to the wide center
using the generated intermediate frames. The scales of the
Ultra-wide frame and the wide frame are different (for example, the
size of the bottle is smaller in the Ultra-wide frame than in the
wide frame). Hence, the scale as well is shifted gradually from the
Ultra-wide scale to the wide scale using the generated intermediate
frames.
[0089] FIG. 11 is an example scenario in which the electronic
device (100) computes the photometric alignment data, according to
an embodiment.
[0090] On the left side, the photometric histogram of the last
frame from the first lens (i.e., a wide lens) is shown. The last
frame from the first lens is used as a reference image for
photometric alignment. On the right side, the first preview frame
from the second lens (i.e., the ultra-wide lens) is shown. As shown
in FIG. 11, the photometric histogram of the first preview frame
from the second lens is different from the photometric histogram of
the last frame from the first lens. The electronic device (100)
gradually aligns the intermediate frames photometrically to the
target frame using the photometric alignment data determination
engine (110e). The frames from the first lens and the second lens
are photometrically different as seen from the histogram shift.
Without photometric alignment, there will be a sudden change in
brightness.
[0091] FIG. 12 is an example scenario in which the electronic
device (100) determines the color alignment data, according to an
embodiment.
[0092] Consider the reference image for the color alignment data
shown in the left side of FIG. 12 (i.e., the last frame from the
first lens (i.e., the wide lens)). When the electronic device (100)
generates the frames without photometric alignment, the frames have
lower saturation. After the transition, the electronic device (100)
shows the first preview frame from the second lens (i.e., the
Ultra-wide lens). The electronic device (100) generates the frames
with the photometric alignment data along with a gradual saturation
change. The electronic device (100) modifies the color of the
intermediate frames such that the color changes gradually for a
seamless transition using the color alignment data determination
engine (110f).
[0093] FIG. 13 is an example flow chart (1300) illustrating various
operations for generating the intermediate frames for smooth
transformation from the first lens (150) to the second lens (160),
according to an embodiment. The operations (1302-1318) are
performed by the frame generation engine (110b).
[0094] At 1302, the user of the electronic device (100) initiates a
smooth transformation from the first lens (150) to the second lens
(160). At 1304, the electronic device (100) computes the alignment
parameters using the lens transition parameter from a corresponding
transition table. At 1306, the electronic device (100) generates
frame indices for each of the intermediate frames to be generated.
The number of intermediate frames N to be generated is determined
based on a switching delay (T_SD) and a target frame rate (FPS) as
described below with reference to FIG. 14. At 1308, the electronic
device (100) compares a frame number to a frame number N of a last
frame. If the frame number is not less than that of the last frame,
at 1318, the method will stop. If the frame number is less than
that of the last frame, then at 1310, the electronic device (100)
performs the spatial alignment procedure. At 1312, the electronic
device (100) performs the photometric alignment procedure. At 1314,
the electronic device (100) performs the color alignment
procedure. At 1316, the electronic device (100) displays the frames
by combining the frames after performing the spatial alignment
procedure, the photometric alignment procedure, and the color
alignment procedure.
[0095] FIG. 14 is an example flow chart (1400) illustrating various
processes for performing the spatial alignment procedure, according
to an embodiment. The operations (1402-1416) are performed by the
spatial alignment data determination engine (110d).
[0096] At 1402, the electronic device (100) obtains the transition
table information. At 1404, the electronic device (100) computes
the number of frames to be generated (i.e., N = T_SD/FPS). At 1406,
the electronic device (100) determines whether a frame number is
greater than that of a last frame. If the frame number is greater
than that of the last frame, at 1416, the method will stop. If the
frame number is not greater than that of the last frame, then at
1408, the electronic device (100) computes α_f (i.e.,
α_f = F(f, N, Mode)). Here, the term α_f is a coefficient to be used
when determining the transformation matrix for frame f. For frame f
and total number of frames N, α_f = (N-f)/N. At 1410, the electronic
device (100) computes the transformation matrix (i.e.,
M_f = α_f*Z + (1-α_f)*T). Here, the matrix Z is the transformation
matrix for frame 0 and the matrix T is the transformation matrix
for frame N as described below. At 1412, the electronic device
(100) performs an affine transformation with the determined
transformation matrix. At 1414, the electronic device (100)
displays the frames.
[0097] In an example, working of the frame generation engine (110b)
with respect to the spatial alignment data is illustrated
below:
[0098] Spatial Alignment Example:
[0099] The transformation matrix may be computed using the at least
one lens transition parameter identified from the transition table
information.
[0100] Consider the switching delay (ms): T_SD and the target frame
rate (FPS): F. Then the number of frames to be generated is
N = T_SD/F.
[0101] Transformation matrix for Frame 0:

$$Z=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$$

[0102] Transformation matrix for Frame N:

$$T=\begin{bmatrix}s_x&0&p_x(1-s_x)\\0&s_y&p_y(1-s_y)\\0&0&1\end{bmatrix}$$

[0103] Then, the transformation matrix for each generated frame is

$$M_f=\alpha_f Z+(1-\alpha_f)T$$

[0104] where f is the frame number and α_f = F(f, N, Mode).
[0105] Consider an example wherein T_SD = 720 ms and F = 60 fps; then
N = 720/60 = 12, and

$$Z=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix},\qquad T=\begin{bmatrix}1.62&0&-0.25\\0&1.62&-0.18\\0&0&1\end{bmatrix}$$

[0106] α_1 = 0.92, α_2 = 0.83, α_3 = 0.75, ..., α_N = 0
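The worked example above can be checked with a short Python sketch, assuming the linear coefficient α_f = (N-f)/N; the function names are illustrative.

```python
def alpha(f: int, n: int) -> float:
    """Blending coefficient alpha_f = (N - f) / N for frame f of N."""
    return (n - f) / n

def blend(z, t, a):
    """Element-wise M_f = a*Z + (1 - a)*T for 3x3 matrices (nested lists)."""
    return [[a * z[i][j] + (1 - a) * t[i][j] for j in range(3)] for i in range(3)]

Z = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]                 # frame 0 (identity)
T = [[1.62, 0, -0.25], [0, 1.62, -0.18], [0, 0, 1]]   # frame N, from the example

N = 12  # T_SD = 720 ms at F = 60 fps
frames = [blend(Z, T, alpha(f, N)) for f in range(1, N + 1)]
# The last generated frame coincides with T, since alpha_N = 0.
```

Each matrix in `frames` would then drive one affine warp of the source frame, moving the preview gradually from the identity toward the target geometry.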
[0107] FIG. 15 is an example flow chart (1500) illustrating various
processes for performing the photometric alignment procedure,
according to an embodiment. The operations (1502-1516) are
performed by the photometric alignment data determination engine
(110e).
[0108] At 1502, the electronic device (100) obtains the transition
table information. At 1504, the electronic device (100) computes
the number of frames to be generated (i.e., N = T_SD/FPS). At 1506,
the electronic device (100) determines whether a frame number is
greater than that of a last frame. If the frame number is greater
than that of the last frame, at 1516, the method will stop. If the
frame number is not greater than that of the last frame, then at
1508, the electronic device (100) computes α_f (i.e.,
α_f = F(f, N, Mode)). Here, the term α_f is a coefficient to be used
when determining the correction factor for frame f. For frame f and
total number of frames N, α_f = (N-f)/N. At 1510, the electronic
device (100) computes the intensity mean-SD correction
P_f = α_f*Y + (1-α_f)*S. Here, Y is the correction factor for frame
0 and S is the correction factor for frame N as described below. At
1512, the electronic device (100) performs the photometric alignment
with the correction factor P_f. At 1514, the electronic device (100)
displays the frames.
[0109] In an example, working of the frame generation engine (110b)
with respect to the photometric alignment data is illustrated
below:
[0110] Consider the switching delay (ms): T_SD, the target frame
rate (FPS): F, the number of frames to be generated N = T_SD/F, the
mean correction factor C_Mean, and the standard deviation correction
factor C_STD.
[0111] The correction factor for Frame 0 is Y: Y_Mean = 1.0,
Y_STD = 1.0, and the correction factor for Frame N is S:
S_Mean = C_Mean, S_STD = C_STD.
[0112] Then the correction factor P_f (i.e., [P_fMean, P_fSTD]) for
each generated frame is

P_f = α_f*Y + (1-α_f)*S

[0113] where f is the frame number and α_f = F(f, N, Mode).
[0114] Consider T_SD = 720 ms, F = 60 fps, N = 720/60 = 12,
α_1 = 0.92, α_2 = 0.83, α_3 = 0.75, ..., α_N = 0.
[0115] Using the above formula, the correction factors P_1, P_2,
P_3, ..., P_N for each intermediate frame may be computed.
[0116] Photometric alignment is applied for generating each
intermediate frame according to the following relation:
L'.sub.f(x, y)={P.sub.fMean+(L.sub.f(x, y)/L.sub.fMean-1)*P.sub.fSTD}*L.sub.fMean
[0117] where,
[0118] L.sub.f is the intensity channel of the frame f
[0119] L.sub.fMean is the mean intensity of the frame f
[0120] The electronic device (100) applies the same logic for the
color alignment data.
[0121] FIG. 16 is an example flow chart (1600) illustrating various
processes for performing the color alignment procedure, according
to an embodiment. The operations (1602-1616) are performed by the
color alignment data determination engine (1100).
[0122] At 1602, the electronic device (100) obtains the transition
table information. At 1604, the electronic device (100) computes
the number of frames to be generated (i.e., N=T.sub.SD/F). At 1606,
the electronic device (100) determines whether a frame number is
greater than that of a last frame. If the frame number is greater
than that of the last frame, at 1616, the method will stop. If the
frame number is not greater than that of the last frame, then at
1608, the electronic device (100) computes .alpha.f (i.e.,
.alpha.f=F(f, N, Mode)). Here, the term .alpha.f is a coefficient
to be used when determining the transformation matrix for frame f.
For frame f and total number of frames N, .alpha.f=(N-f)/N. At
1610, the electronic device (100) computes the color mean-SD
correction Cf=.alpha.f*X+(1-.alpha.f)*R. Here, X is the correction factor
for frame 0 and R is the correction factor for frame N as described
below. At 1612, the electronic device (100) provides the color
alignment data with the correction factor Cf. At 1614, the
electronic device (100) displays the frames.
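The color mean-SD interpolation of FIG. 16 follows the same pattern as the photometric case, applied per color channel. A minimal sketch under assumptions: the channel names and the end-point factors X and R below are illustrative values, not taken from the disclosure.

```python
def color_correction(f, n, x, r):
    """C_f = alpha_f * X + (1 - alpha_f) * R, applied per color channel;
    X is the frame-0 factor and R the frame-N factor."""
    a = (n - f) / n
    return {ch: a * x[ch] + (1.0 - a) * r[ch] for ch in x}

# Assumed chrominance channels and end-point factors (illustrative only).
X = {"u": 1.0, "v": 1.0}           # identity correction at frame 0
R = {"u": 0.9, "v": 1.1}           # assumed target correction at frame N
```

As with P.sub.f, the correction starts at the identity for frame 0 and reaches the second lens's factor R at frame N, so the color shift is spread evenly over the generated intermediate frames.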
[0123] FIG. 17 is an example flow chart (1700) illustrating various
processes for determining the number of frames to be generated,
according to an embodiment. The operations (1702-1706) are
performed by the frame generation engine (110b).
[0124] At 1702, the electronic device (100) obtains the transition
table information. At 1704, the electronic device (100) computes
the number of frames to be generated (i.e., N=T.sub.SD/F). At 1706,
the electronic device (100) displays the frames.
[0125] The various actions, acts, blocks, steps, or the like in the
flow charts (500, 800, and 1300-1700) may be performed in the order
presented, in a different order, or simultaneously. Further, in some
embodiments, some of the actions, acts, blocks, steps, or the like
may be omitted, added, modified, skipped, or the like without
departing from the scope of the disclosure.
[0126] The embodiments disclosed herein can be implemented using at
least one software program running on at least one hardware device
and performing network management functions to control the
elements.
[0127] The foregoing description of the specific embodiments will
so fully reveal the general nature of the embodiments herein that
others can, by applying current knowledge, readily modify and/or
adapt for various applications such specific embodiments without
departing from the generic concept, and, therefore, such
adaptations and modifications should and are intended to be
comprehended within the meaning and range of equivalents of the
disclosed embodiments. It is to be understood that the phraseology
or terminology employed herein is for the purpose of description
and not of limitation. Therefore, while the embodiments herein have
been described in terms of preferred embodiments, those skilled in
the art will recognize that the embodiments herein can be practiced
with modification within the spirit and scope of the embodiments as
described herein.
* * * * *