U.S. patent application number 14/370456 was published by the patent office on 2015-01-01 for alternate view video playback on a second screen.
The applicant listed for this patent is THOMSON LICENSING. The invention is credited to Mark Leroy Walker.
Application Number: 20150003798 (14/370456)
Family ID: 47472156
Publication Date: 2015-01-01

United States Patent Application 20150003798
Kind Code: A1
Walker, Mark Leroy
January 1, 2015
ALTERNATE VIEW VIDEO PLAYBACK ON A SECOND SCREEN
Abstract
The present disclosure is directed towards a method and system
for providing an alternate view of content being displayed on a
first or primary screen on a second screen device. This alternate
view can be synched with the primary view and displayed on the
second screen device without interrupting the view or playback on
the primary screen device.
Inventors: Walker, Mark Leroy (Castaic, CA)

Applicant: THOMSON LICENSING, Issy-les-Moulineaux, FR
Family ID: 47472156
Appl. No.: 14/370456
Filed: December 27, 2012
PCT Filed: December 27, 2012
PCT No.: PCT/US2012/071822
371 Date: July 2, 2014
Related U.S. Patent Documents

Application Number    Filing Date    Patent Number
61584134              Jan 6, 2012
Current U.S. Class: 386/201

Current CPC Class: H04N 21/42204 20130101; H04N 21/4307 20130101; H04L 65/403 20130101; H04N 9/87 20130101; G06T 11/206 20130101; G09G 5/12 20130101; G09G 2370/06 20130101; H04N 21/42203 20130101; H04L 51/32 20130101; H04N 21/42224 20130101; G06F 16/95 20190101; H04N 21/4126 20130101; H04N 21/4122 20130101; G09G 2370/16 20130101; H04L 43/10 20130101; H04N 21/8547 20130101; H04N 21/845 20130101; H04N 21/442 20130101; H04N 21/4788 20130101; G06F 3/1423 20130101; H04L 43/045 20130101; G06Q 30/0255 20130101; G11B 27/32 20130101; H04N 21/8133 20130101; G06T 2215/16 20130101

Class at Publication: 386/201

International Class: G11B 27/32 20060101 G11B027/32; H04N 21/81 20060101 H04N021/81; H04N 21/43 20060101 H04N021/43; H04N 21/41 20060101 H04N021/41; H04N 21/422 20060101 H04N021/422
Claims
1. A method of providing alternate version content on a second
screen related to primary content being displayed on a first screen,
the method comprising: synching, using a synching mechanism, timing
of events on the second screen device with content being displayed
on the first screen device; and providing alternate version content
on the second screen synched with content being displayed on the
first screen device.
2. The method of claim 1, wherein the step of providing alternate
version content on the second screen comprises: determining that an
event involves alternate version content; and preloading the
alternate version content so the alternate version content can be
synchronized with content being displayed on the first screen
device.
3. The method of claim 2, wherein the step of preloading the
alternate version content comprises: contacting a server; loading
the alternate version content from the contacted server; and
providing the alternate version content on the second screen
device.
4. The method of claim 1, wherein the step of synching comprises
adjusting a reference timer on the second screen device.
5. The method of claim 4, wherein the adjustment to the reference
timer is based on the synching mechanism used.
6. The method of claim 1, wherein the synching mechanism comprises
a wireless synching mechanism.
7. The method of claim 1, wherein the synching mechanism comprises
an audio watermark synching mechanism.
8. The method of claim 1, wherein the alternate version content is
automatically displayed on the second screen device when the
corresponding primary content is displayed on the first screen
device.
9. The method of claim 1, further comprising the step of: providing
a user the ability to have the alternate version content provided
on the second screen device displayed on the first screen
device.
10. A second screen device comprising: a screen configured to
display content; storage for storing data; and a processor
configured to synch, using a synching mechanism, timing of events
on the second screen device with content being displayed on the
first screen device, and provide alternate version content on the
second screen synched with content being displayed on the first
screen device.
11. The second screen device of claim 10, further comprising a
wireless network interface for receiving synching data.
12. The second screen device of claim 10, further comprising a
microphone for detecting synching information in the audio from a
first screen display device.
13. The second screen device of claim 10, wherein the second screen
device comprises a touch screen device.
14. The second screen device of claim 10, wherein the processor is
further configured to maintain a reference timer.
15. A machine readable medium containing instructions that when
executed perform the steps comprising: synching, using a synching
mechanism, timing of events on the second screen device with
content being displayed on the first screen device; and providing
alternate version content on the second screen synched with content
being displayed on the first screen device.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional
Application Ser. No. 61/584,134, filed Jan. 6, 2012, which is
incorporated by reference herein in its entirety.
[0002] This application is also related to the applications
entitled: "METHODS AND SYSTEMS FOR SYNCHRONIZING CONTENT ON A SECOND
SCREEN", "METHOD AND SYSTEM FOR PROVIDING A GRAPHICAL
REPRESENTATION ON A SECOND SCREEN OF SOCIAL MESSAGES RELATED TO
CONTENT ON A FIRST SCREEN", "METHOD AND SYSTEM FOR SYNCHING SOCIAL
MESSAGES WITH CONTENT TIMELINE", "METHOD AND SYSTEM FOR PROVIDING A
DISPLAY OF SOCIAL MESSAGES ON A SECOND SCREEN WHICH IS SYNCHED WITH
CONTENT ON A FIRST SCREEN", and "METHOD AND SYSTEM FOR PROVIDING
DYNAMIC ADVERTISING ON SECOND SCREEN BASED ON SOCIAL MESSAGES"
which have been filed concurrently and are incorporated by
reference herein in their entirety.
BACKGROUND
[0003] 1. Technical Field
[0004] The present invention generally relates to providing
additional content on a second screen related to content displayed
on a primary screen device.
[0005] 2. Description of Related Art
[0006] Traditionally, additional content related to a displayed
movie or program (such as supplemental materials on a DVD or
Blu-ray Disc) has to be viewed separately from the main movie or
program. That is, the user has to stop or otherwise interrupt the
playback of the main movie or program to access the supplemental
materials. One such supplemental feature is alternative version or
"multi-angle" video that displays alternate audio/video from a
different angle or view than the one displayed in the primary view
of the content. In many instances this feature is not utilized, as
it requires special indication on the primary viewing screen as
well as the pressing of an infrequently used and poorly labeled
button on the remote. Furthermore, once activated, everyone watching
the primary view of the content is forced to view the alternate
view. This can interfere with the viewing enjoyment of some viewers.
SUMMARY
[0007] The present disclosure is directed towards a method and
system for providing an alternate view of content being displayed
on a first or primary screen on a second screen device. This
alternate view can be synched with the primary view and displayed
on the second screen device without interrupting the view or
playback on the primary screen device.
[0008] In accordance with one embodiment, a method of providing
alternate version content on a second screen related to primary
content being displayed on a first screen is disclosed. The method
involves synching, using a synching mechanism, timing of events on
the second screen device with content being displayed on the first
screen device, and providing alternate version content on the
second screen synched with content being displayed on the first
screen device.
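The reference-timer behavior implied by this method (and recited in claims 4 and 5) can be pictured with a minimal sketch of a timer the second screen device might maintain. The class and method names below are invented for illustration only and are not part of the application:

```python
import time

class ReferenceTimer:
    """Hypothetical reference timer a second screen device might keep
    to estimate the playback position of the primary content."""

    def __init__(self):
        self.started_at = time.monotonic()
        self.offset = 0.0  # seconds between local elapsed time and content timeline

    def adjust(self, content_position):
        """Re-anchor the timer to a playback position reported by the
        synching mechanism (e.g. a network message or audio watermark)."""
        self.offset = content_position - (time.monotonic() - self.started_at)

    def now(self):
        """Current estimated position on the primary content timeline."""
        return (time.monotonic() - self.started_at) + self.offset
```

Under this sketch, each synch event simply calls `adjust()` with the reported position, and alternate version content is scheduled against `now()`.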
[0009] In accordance with another embodiment, a second screen
device capable of providing alternative version content synched to
primary content displayed on a first screen device is disclosed.
The second screen device includes a screen, storage, and a
processor. The screen is configured to display content. The storage
is for storing data. The processor is configured to synchronize,
using a synching mechanism, timing of events on the second screen
device with content being displayed on the first screen device, and
provide alternate version content on the second screen synched with
content being displayed on the first screen device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a system diagram outlining the delivery of video
and audio content to the home in accordance with one
embodiment.
[0011] FIG. 2 is a system diagram showing further detail of a
representative set top box receiver.
[0012] FIG. 3 is a diagram depicting a touch panel control device
in accordance with one embodiment.
[0013] FIG. 4 is a diagram depicting some exemplary user
interactions for use with a touch panel control device in
accordance with one embodiment.
[0014] FIG. 5 is a system diagram depicting one embodiment of a
system for implementing techniques of the present invention.
[0015] FIG. 6 is a flow diagram depicting an exemplary process in
accordance with one embodiment.
[0016] FIG. 7 is a diagram depicting an exemplary methodology of
synching between devices in accordance with one embodiment.
[0017] FIG. 8 is a diagram depicting an exemplary methodology of
synching between devices in accordance with one embodiment.
[0018] FIGS. 9A-9F are exemplary skeletal screen views depicting
features in accordance with one embodiment when used in passive
mode.
[0019] FIGS. 10A-10D are exemplary skeletal screen views depicting
features in accordance with one embodiment when used in active
mode.
[0020] FIGS. 11A-11C are exemplary skeletal views depicting a
social media sharing feature in accordance with one embodiment.
[0021] FIGS. 12A and 12B are exemplary skeletal views depicting
content selection features in accordance with one embodiment.
[0022] FIGS. 13A-13E are exemplary skeletal views depicting
additional features in accordance with one embodiment.
[0023] FIGS. 14A-14L are exemplary skinned screen views depicting
how certain features could appear to a user.
[0024] FIG. 15 is a diagram depicting an exemplary methodology of
providing alternative version content on a second screen device in
accordance with one embodiment.
[0025] FIG. 16 is an exemplary skeletal view depicting an event
offering alternative version content.
[0026] FIG. 17 is an exemplary skeletal view depicting an
alternative view being displayed on a second screen device.
DETAILED DESCRIPTION
[0027] Turning now to FIG. 1, a block diagram of an embodiment of a
system 100 for delivering content to a home or end user is shown.
The content originates from a content source 102, such as a movie
studio or production house. The content may be supplied in at least
one of two forms. One form may be a broadcast form of content. The
broadcast content is provided to the broadcast affiliate manager
104, which is typically a national broadcast service, such as the
American Broadcasting Company (ABC), National Broadcasting Company
(NBC), Columbia Broadcasting System (CBS), etc. The broadcast
affiliate manager may collect and store the content, and may
schedule delivery of the content over a delivery network, shown as
delivery network 1 (106). Delivery network 1 (106) may include
satellite link transmission from a national center to one or more
regional or local centers. Delivery network 1 (106) may also
include local content delivery using local delivery systems such as
over the air broadcast, satellite broadcast, or cable broadcast.
The locally delivered content is provided to a receiving device 108
in a user's home, where the content will subsequently be searched
by the user. It is to be appreciated that the receiving device 108
can take many forms and may be embodied as a set top box/digital
video recorder (DVR), a gateway, a modem, etc. Further, the
receiving device 108 may act as an entry point, or gateway, for a home
network system that includes additional devices configured as
either client or peer devices in the home network.
[0028] A second form of content is referred to as special or
additional content. Special or additional content may include
content delivered as premium viewing, pay-per-view, or other
content otherwise not provided to the broadcast affiliate manager,
e.g., movies, video games or other video elements. In many cases,
the special content may be content requested by the user. The
special content may be delivered to a content manager 110. The
content manager 110 may be a service provider, such as an Internet
website, affiliated, for instance, with a content provider,
broadcast service, or delivery network service. The content manager
110 may also incorporate Internet content into the delivery system.
The content manager 110 may deliver the content to the user's
receiving device 108 over a separate delivery network, delivery
network 2 (112). Delivery network 2 (112) may include high-speed
broadband Internet type communications systems. It is important to
note that the content from the broadcast affiliate manager 104 may
also be delivered using all or parts of delivery network 2 (112)
and content from the content manager 110 may be delivered using all
or parts of delivery network 1 (106). In addition, the user may
also obtain content directly from the Internet via delivery network
2 (112) without necessarily having the content managed by the
content manager 110.
[0029] Several adaptations for utilizing the separately delivered
additional content may be possible. In one possible approach, the
additional content is provided as an augmentation to the broadcast
content, providing alternative displays, purchase and merchandising
options, enhancement material, etc. In another embodiment, the
additional content may completely replace some programming content
provided as broadcast content. Finally, the additional content may
be completely separate from the broadcast content, and may simply
be a media alternative that the user may choose to utilize. For
instance, the additional content may be a library of movies that
are not yet available as broadcast content.
[0030] The receiving device 108 may receive different types of
content from one or both of delivery network 1 and delivery network
2. The receiving device 108 processes the content, and provides a
separation of the content based on user preferences and commands.
The receiving device 108 may also include a storage device, such as
a hard drive or optical disk drive, for recording and playing back
audio and video content. Further details of the operation of the
receiving device 108 and features associated with playing back
stored content will be described below in relation to FIG. 2. The
processed content is provided to a display device 114. The display
device 114 may be a conventional 2-D type display or may
alternatively be an advanced 3-D display.
[0031] The receiving device 108 may also be interfaced to a second
screen such as a touch screen control device 116. The touch screen
control device 116 may be adapted to provide user control for the
receiving device 108 and/or the display device 114. The touch
screen device 116 may also be capable of displaying video content.
The video content may be graphics entries, such as user interface
entries, or may be a portion of the video content that is delivered
to the display device 114. The touch screen control device 116 may
interface to receiving device 108 using any well known signal
transmission system, such as infra-red (IR) or radio frequency (RF)
communications and may include standard protocols such as infra-red
data association (IRDA) standard, Wi-Fi, Bluetooth and the like, or
any other proprietary protocols. In some embodiments, the touch
screen control device 116 can be interfaced directly with delivery
networks 1 and 2. Operations of touch screen control device 116
will be described in further detail below.
[0032] In the example of FIG. 1, the system 100 also includes a
back end server 118 and a usage database 120. The back end server
118 includes a personalization engine that analyzes the usage
habits of a user and makes recommendations based on those usage
habits. The usage database 120 is where the usage habits for a user
are stored. In some cases, the usage database 120 may be part of
the back end server 118. In the present example, the back end
server 118 (as well as the usage database 120) is connected to the
system 100 and accessed through the delivery network 2 (112).
[0033] Turning now to FIG. 2, a block diagram of an embodiment of a
receiving device 200 is shown. Receiving device 200 may operate
similar to the receiving device described in FIG. 1 and may be
included as part of a gateway device, modem, set-top box, or other
similar communications device. The device 200 shown may also be
incorporated into other systems including an audio device or a
display device. In either case, several components necessary for
complete operation of the system are not shown in the interest of
conciseness, as they are well known to those skilled in the
art.
[0034] In the device 200 shown in FIG. 2, the content is received
by an input signal receiver 202. The input signal receiver 202 may
be one of several known receiver circuits used for receiving,
demodulation, and decoding signals provided over one of the several
possible networks including over the air, cable, satellite,
Ethernet, fiber and phone line networks. The desired input signal
may be selected and retrieved by the input signal receiver 202
based on user input provided through a control interface or touch
panel interface 222. Touch panel interface 222 may include an
interface for a touch screen device. Touch panel interface 222 may
also be adapted to interface to a cellular phone, a tablet, a
mouse, a high end remote or the like.
[0035] The decoded output signal is provided to an input stream
processor 204. The input stream processor 204 performs the final
signal selection and processing, and includes separation of video
content from audio content for the content stream. The audio
content is provided to an audio processor 206 for conversion from
the received format, such as compressed digital signal, to an
analog waveform signal. The analog waveform signal is provided to
an audio interface 208 and further to the display device or audio
amplifier. Alternatively, the audio interface 208 may provide a
digital signal to an audio output device or display device using a
High-Definition Multimedia Interface (HDMI) cable or alternate
audio interface such as via a Sony/Philips Digital Interconnect
Format (SPDIF). The audio interface may also include amplifiers for
driving one or more sets of speakers. The audio processor 206 also
performs any necessary conversion for the storage of the audio
signals.
[0036] The video output from the input stream processor 204 is
provided to a video processor 210. The video signal may be one of
several formats. The video processor 210 provides, as necessary, a
conversion of the video content, based on the input signal format.
The video processor 210 also performs any necessary conversion for
the storage of the video signals.
[0037] A storage device 212 stores audio and video content received
at the input. The storage device 212 allows later retrieval and
playback of the content under the control of a controller 214 and
also based on commands, e.g., navigation instructions such as
fast-forward (FF) and rewind (Rew), received from a user interface
216 and/or touch panel interface 222. The storage device 212 may be
a hard disk drive, one or more large capacity integrated electronic
memories, such as static RAM (SRAM), or dynamic RAM (DRAM), or may
be an interchangeable optical disk storage system such as a compact
disk (CD) drive or digital video disk (DVD) drive.
[0038] The converted video signal, from the video processor 210,
either originating from the input or from the storage device 212,
is provided to the display interface 218. The display interface 218
further provides the display signal to a display device of the type
described above. The display interface 218 may be an analog signal
interface such as red-green-blue (RGB) or may be a digital
interface such as HDMI. It is to be appreciated that the display
interface 218 will generate the various screens for presenting the
search results in a three dimensional grid as will be described in
more detail below.
[0039] The controller 214 is interconnected via a bus to several of
the components of the device 200, including the input stream
processor 204, audio processor 206, video processor 210, storage
device 212, and a user interface 216. The controller 214 manages
the conversion process for converting the input stream signal into
a signal for storage on the storage device or for display. The
controller 214 also manages the retrieval and playback of stored
content. Furthermore, as will be described below, the controller
214 performs searching of content and the creation and adjusting of
the grid display representing the content, either stored or to be
delivered via the delivery networks, described above.
[0040] The controller 214 is further coupled to control memory 220
(e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM,
ROM, programmable ROM (PROM), flash memory, electronically
programmable ROM (EPROM), electronically erasable programmable ROM
(EEPROM), etc.) for storing information and instruction code for
controller 214. Control memory 220 may store instructions for
controller 214. Control memory may also store a database of
elements, such as graphic elements containing content. The database
may be stored as a pattern of graphic elements. Alternatively, the
memory may store the graphic elements in identified or grouped
memory locations and use an access or location table to identify
the memory locations for the various portions of information
related to the graphic elements. Additional details related to the
storage of the graphic elements will be described below. Further,
the implementation of the control memory 220 may include several
possible embodiments, such as a single memory device or,
alternatively, more than one memory circuit communicatively
connected or coupled together to form a shared or common memory.
Still further, the memory may be included with other circuitry,
such as portions of bus communications circuitry, in a larger
circuit.
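The access/location-table arrangement described above can be illustrated with a small sketch: a table maps each graphic element's identifier to the memory location where its data is grouped. The element names and addresses below are hypothetical:

```python
# Hypothetical grouped memory locations holding graphic element data.
graphic_store = {
    0x1000: b"...icon bitmap...",
    0x2000: b"...grid background...",
}

# Access/location table mapping element identifiers to locations.
location_table = {
    "play_icon": 0x1000,
    "grid_background": 0x2000,
}

def fetch_element(name):
    """Resolve a graphic element's data through the location table."""
    return graphic_store[location_table[name]]
```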
[0041] The user interface process of the present disclosure employs
an input device that can be used to express functions, such as fast
forward, rewind, etc. To allow for this, a touch panel device 300
may be interfaced via the user interface 216 and/or touch panel
interface 222 of the receiving device 200, as shown in FIG. 3. The
touch panel device 300 allows operation of the receiving device or
set top box based on hand movements, or gestures, and actions
translated through the panel into commands for the set top box or
other control device. In one embodiment, the touch panel 300 may
simply serve as a navigational tool to navigate the grid display.
In other embodiments, the touch panel 300 will additionally serve
as the display device allowing the user to more directly interact
with the navigation through the grid display of content. The touch
panel device may be included as part of a remote control device
containing more conventional control functions such as actuator or
activator buttons. The touch panel 300 can also include at least
one camera element. In some embodiments, the touch panel 300 may
also include a microphone.
[0042] Turning now to FIG. 4, the use of a gesture sensing
controller or touch screen, such as shown, provides for a number of
types of user interaction. The inputs from the controller are used
to define gestures and the gestures, in turn, define specific
contextual commands. The configuration of the sensors may permit
defining movement of a user's fingers on a touch screen or may even
permit defining the movement of the controller itself in either one
dimension or two dimensions. Two-dimensional motion, such as a
diagonal, and a combination of yaw, pitch and roll can be used to
define any three-dimensional motion, such as a swing. A number of
gestures are illustrated in FIG. 4. Gestures are interpreted in
context and are identified by defined movements made by the
user.
[0043] Bumping 420 is defined by a two-stroke drawing indicating
pointing in one direction, either up, down, left or right. The
bumping gesture is associated with specific commands in context.
For example, in a TimeShifting mode, a left-bump gesture 420
indicates rewinding, and a right-bump gesture indicates
fast-forwarding. In other contexts, a bump gesture 420 is
interpreted to increment a particular value in the direction
designated by the bump. Checking 430 is defined as in drawing a
checkmark. It is similar to a downward bump gesture 420. Checking
430 is identified in context to designate a reminder, user tag, or
to select an item or element. Circling 440 is defined as drawing a
circle in either direction. It is possible that both directions
could be distinguished. However, to avoid confusion, a circle is
identified as a single command regardless of direction. Dragging
450 is defined as an angular movement of the controller (a change
in pitch and/or yaw) while pressing a button (virtual or physical)
on the tablet 300 (i.e., a "trigger drag"). The dragging gesture
450 may be used for navigation, speed, distance, time-shifting,
rewinding, and forwarding. Dragging 450 can be used to move a
cursor, a virtual cursor, or a change of state, such as
highlighting, outlining, or selecting on the display. Dragging 450
can be in any direction and is generally used to navigate in two
dimensions. However, in certain interfaces, it is preferred to
modify the response to the dragging command. For example, in some
interfaces, operation in one dimension or direction is favored with
respect to other dimensions or directions depending upon the
position of the virtual cursor or the direction of movement.
Nodding 460 is defined by two fast trigger-drag up-and-down
vertical movements. Nodding 460 is used to indicate "Yes" or
"Accept." X-ing 470 is defined as in drawing the letter "X." X-ing
470 is used for "Delete" or "Block" commands. Wagging 480 is
defined by two trigger-drag fast back-and-forth horizontal
movements. The wagging gesture 480 is used to indicate "No" or
"Cancel."
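The context-dependent interpretation of gestures described above can be sketched as a lookup from (gesture, context) pairs to commands. All identifiers below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical mapping showing how the same gesture resolves to
# different commands depending on the current context.
GESTURE_COMMANDS = {
    ("bump_left",  "timeshift"): "rewind",
    ("bump_right", "timeshift"): "fast_forward",
    ("bump_up",    "value"):     "increment",
    ("check",      "browse"):    "select",
    ("circle",     "browse"):    "select",   # direction intentionally ignored
    ("nod",        "dialog"):    "accept",
    ("wag",        "dialog"):    "cancel",
    ("x",          "browse"):    "delete",
}

def interpret(gesture, context):
    """Return the contextual command for a recognized gesture,
    or 'ignored' when the gesture has no meaning in this context."""
    return GESTURE_COMMANDS.get((gesture, context), "ignored")
```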
[0044] Depending on the complexity of the sensor system, only
simple one dimensional motions or gestures may be allowed. For
instance, a simple right or left movement on the sensor as shown
here may produce a fast forward or rewind function. In addition,
multiple sensors could be included and placed at different
locations on the touch screen. For instance, a horizontal sensor
for left and right movement may be placed in one spot and used for
volume up/down, while a vertical sensor for up and down movement
may be placed in a different spot and used for channel up/down. In
this way specific gesture mappings may be used.
[0045] The system and methodology can be implemented in any number
of ways depending on the hardware and the content involved.
Examples of such deployment include DVD, Blu-ray Disc (BD),
streaming video or video on demand (VOD), and broadcast (satellite,
cable, over the air). Each of these deployments would have
different architectures but one could standardize the triggers for
each of these events (the additional content) that represents what
would be queued by the application running on the second screen.
For example, event A and event B would be triggered by a synching
mechanism associated with any of these sources of a video. When the
tablet encounters "event A", the program running on the second
screen device (e.g. tablet) will enact "event A". Similarly, if
"event B" is encountered, the program running on the second screen
device would do "event B".
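The standardized triggers described above might be modeled as a registry of handlers that the synching mechanism invokes when playback reaches an event, regardless of the content source. The event identifiers and handler names below are hypothetical:

```python
# Registry of event handlers for the second-screen application.
EVENT_HANDLERS = {}

def on_event(event_id):
    """Register a handler for a standardized trigger."""
    def register(fn):
        EVENT_HANDLERS[event_id] = fn
        return fn
    return register

@on_event("event_A")
def show_alternate_angle():
    return "preloading and displaying alternate-angle video"

@on_event("event_B")
def show_merchandise():
    return "displaying merchandising offer"

def trigger(event_id):
    """Called by the synching mechanism (disc, VOD, or broadcast)
    when the content timeline reaches a trigger point."""
    handler = EVENT_HANDLERS.get(event_id)
    return handler() if handler else None
```

Because the triggers are standardized, the same second-screen application code would serve DVD/BD, streaming, and broadcast deployments; only the synching mechanism that calls `trigger()` differs.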
[0046] FIG. 5 depicts a generic system 500 on which such
methodology could be implemented. Here the system 500 includes a
first screen device 510, a second screen device 520, a playback
device 530, a network 540 and server 550. Each of these elements
will be discussed in more detail below.
[0047] The first screen device 510 is a display device, such as
display device 114 described above in relation to FIG. 1, for
displaying content such as television programs, movies, and
websites. Examples of such first screen display devices include,
but are not limited to, a television, monitor, projector, or the
like. The first screen device 510 is connected to the playback
device 530 which can provide the primary content to the first
screen device 510 for display. Examples of such communication
include, but are not limited to, HDMI, VGA, DisplayPort, USB,
component, composite, radio frequency (RF), and infrared (IR), and
the like. In certain embodiments, the first screen display device
510 may be connected to the network 540, in either a wired or
wireless (WiFi) manner, providing additional connection to the
second screen device 520 and server 550. In some embodiments, the
first display device 510 may include the functionality of the
playback device 530. In still other embodiments, the first screen
display device 510 may be in non-networked communication 560 with
the second screen device 520. Examples of such non-networked
communication 560 include, but are not limited to, RF, IR,
Bluetooth, and other audio communication techniques and
protocols.
[0048] The second screen device 520 is a device capable of displaying
additional content related to the primary content being displayed
on the first screen device 510. The second screen device may be a
touch screen control device 116 or touch screen device 300 as
described above. Examples of second screen devices include, but are
not limited to, a smart phone, tablet, laptop, personal media
player (e.g., iPod), or the like. The second screen device 520 is in
communication with playback device 530 using either network 540,
non-networked communication 560, or both. The second screen device
520 is also in communication with the server 550 via the network
540 for requesting and receiving additional content related to the
primary content being displayed on the first screen device 510. In
some embodiments, the second screen device 520 may be in networked
or non-networked communication 560 with the first screen device
510. Examples of such non-networked communication 560 include, but
are not limited to, RF, IR, Bluetooth (BT), audio communication
techniques and protocols, or the like.
[0049] The playback device 530 is a device capable of providing
primary content for display on the first screen device 510.
Examples of such playback display devices include, but are not
limited to, a DVD player, Blu-ray Disc (BD) player, game console,
receiver device (cable or satellite), Digital Video Recorder (DVR),
streaming device, personal computer, or the like. The playback
device 530 is connected to the first screen device 510 for
providing the primary content to the first screen device 510 for
display. Examples of such connections include, but are not limited
to, HDMI, VGA, DisplayPort, USB, component, composite, radio
frequency (RF), and infrared (IR), and the like. The playback
device 530 is also connected to the network 540, in either a wired
or wireless (WiFi) manner, providing connection to the second
screen device 520 and server 550. In some embodiments, the
functionality of the playback device 530 may be included in the
first screen display device 510. In still other embodiments, the
playback device 530 may be in non-networked communication 560 with
the second screen device 520. Examples of such non-networked
communication 560 include, but are not limited to, RF, IR,
Bluetooth (BT), and audio communication techniques and
protocols.
[0050] The network 540 can be a wired or wireless communication
network implemented using Ethernet, MoCA, and wireless protocols or
a combination thereof. Examples of such a network include, but are
not limited to, delivery network 1 (106) and delivery network 2
(112) discussed above.
[0051] The server 550 is a content server configured to provide
additional content to the second screen device 520. In certain
embodiments, the server may also provide the primary content for
display on the first screen device 510. The server 550 is connected to
the network 540 and can communicate with any of the devices that
are also connected. Examples of such a server include, but are not
limited to, content source 102, broadcast affiliate manager 104,
content manager 110, and the back end server described above.
[0052] FIG. 6 depicts a flow diagram 600 of a methodology for
displaying additional content related to primary content being
viewed. The method includes the following steps:
Displaying primary content on a first screen device 510 (step 610).
Providing, in association with the display of the primary content
on the first screen, a synching mechanism to synch additional
content (step 620). Displaying, on a second screen device 520,
additional content related to the primary content on the first
screen 510 that is synched to the content displayed on the first
screen device according to the synching mechanism (step 630). In
certain embodiments, the method also includes the steps of
receiving commands from the second screen device 520 to control the
display of primary content on the first screen device 510 (step
640) and controlling the display of the primary content on the
first screen device 510 based on the commands received from the
second screen device 520 (step 650). Each of these steps will be
described in more detail below.
[0053] The step of displaying primary content (step 610), such as a
movie or television show, is performed on the first screen device
510. This involves the primary content being provided to the first
screen display 510. The primary content can be provided by the
playback device 530 or be received directly from a content provider
at the first screen display device 510. The primary content is then
shown or otherwise displayed on the first screen device 510. The
display of the primary content also includes the control of the
content being displayed. This can include the traditional playback
commands of play, stop, pause, rewind, and fast forward as well as
the navigation of on screen menus to select the content and other
playback options. In certain embodiments, the display on the first
screen device 510 (step 610) further includes displaying an
indicator of the type of additional content being displayed on the
second screen device 520.
[0054] The provided synching mechanism (step 620) can be
implemented in a number of ways. In certain embodiments the
synching mechanism is performed by an application running on the
second screen device 520, the playback device 530, the first
screen device 510, or any combination thereof. At its most basic,
the second screen device 520 is configured (via an application) to
detect synching signals, cues, or other types of indicators that
direct the second screen device 520 to update the display of
additional content to coincide with the primary content being
displayed on the first screen 510. The synching signals, cues or
other type of indicators, can be provided as part of the primary
content or can be generated by the playback device 530 or first
screen device 510 (via an application) in accordance with the
chapter, scene, time-code, subject matter, or content being
displayed. The synching signals, cues or other type of indicators
can be transmitted to the second screen device 520 using the
network, in either a wired or wireless (WiFi) manner, or using
non-networked communication 560 such as audio signals. Examples of
some of the implementations are given below. Other possible
implementations will be apparent given the benefit of this
disclosure.
[0055] The step of displaying the additional content, such as
supplemental materials, video clips, websites, and the like (step
630) is performed on the second screen device 520. The additional
content can be stored locally on the second screen device 520 or be
provided by the server 550, playback device 530, or first screen
device 510. The display of the additional content is synched to the
primary content being displayed on the first screen device 510
according to the synching mechanism. For example, when the second
screen device 520 detects a synching signal, cue or other type of
indicator, the second screen device 520 updates the display of the
additional content accordingly. In some embodiments, this further
involves contacting and requesting the additional content from the
server 550, playback device 530, or first screen device 510 and
subsequently downloading and displaying the additional content. In
some embodiments, the additional content to be displayed can be
selected, modified, or omitted based on the user using the
system.
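The cue-driven update of step 630 can be sketched as follows. This is a minimal, hypothetical model; the class, method names, and the content-fetch callable are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of step 630: the second screen device 520 updates
# its additional content whenever a synching cue is detected, preferring
# locally stored content and falling back to a remote source (server 550,
# playback device 530, or first screen device 510).

class SecondScreenApp:
    """Updates additional content in response to detected synching cues."""

    def __init__(self, local_cache, fetch_remote):
        self.local_cache = local_cache      # content stored on the device
        self.fetch_remote = fetch_remote    # callable: cue_id -> content
        self.displayed = None

    def on_cue(self, cue_id):
        # Prefer locally stored content; otherwise contact and request the
        # additional content, then download and display it.
        content = self.local_cache.get(cue_id)
        if content is None:
            content = self.fetch_remote(cue_id)
            self.local_cache[cue_id] = content  # keep for later re-display
        self.displayed = content
        return content
```

A usage sketch: the application registers `on_cue` with whatever synching mechanism is in use, so each detected signal, cue, or indicator drives one display update.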
[0056] In certain embodiments, the display on the second screen
device 520 (step 630) further includes displaying the status of the
display of the primary content on the first screen device 510 such
as whether the display of the primary content on the first screen
device 510 has been paused. In certain other embodiments, the
display on the second screen device 520 (step 630) further includes
displaying the status of the synch between the additional content
on the second screen device 520 and the primary content on the
first screen device 510.
[0057] In certain embodiments, the second screen device 520 is
capable of transmitting as well as receiving. The optional steps
640 and 650 address this capability. In step 640 commands are
received from the second screen device 520. Ideally, these commands
are received at the device controlling the playback of the primary
content on the first screen device 510. In certain embodiments, the
playback device 530 is the device receiving the commands. The
commands can be sent via the network 540 or non-networked
communication 560. Once received, the commands can control the
display of the primary content (step 650). Examples of such control
include, but are not limited to, play, stop, pause, rewind, and
fast-forward, as well as chapter and scene selection. These
commands can also be used to synch the primary content displayed on
the first screen device 510 with the additional content being
displayed on the second screen device 520.
[0058] FIG. 7 provides a high level overview of one example of
system 700 with a synching mechanism implemented using a
non-networked communication 560. In this system 700, the
non-networked communication synching mechanism is audio
watermarking 710. In this example, audio watermarking 710 involves
inserting a high-frequency signal, cue, or other indicator into the
audio signal of the primary content being displayed on the first
screen device 510. The audio watermark is inaudible to humans but
can be detected by a microphone in the second screen device 520.
When the second screen device 520 detects an audio watermark, the
displayed additional content is updated to synch with the primary
content being displayed on the first screen device 510 based on the
detected watermark. The audio watermarks can be incorporated into
the primary content at the source of the content or inserted
locally by the playback device 530 or first screen device 510.
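Detection of a high-frequency audio watermark of the kind described above can be sketched with the Goertzel algorithm, which measures signal power at a single target frequency. This is an illustrative sketch only; the 19 kHz mark frequency, threshold, and helper function are assumptions, not values from the disclosure.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Signal power of `samples` at `target_hz`, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def watermark_present(samples, sample_rate, mark_hz=19000, threshold=1e3):
    # A tone near the top of the audible range is inaudible to most
    # listeners but easily detected by the device's microphone.
    return goertzel_power(samples, sample_rate, mark_hz) > threshold

def make_tone(freq_hz, sample_rate, n):
    """Demonstration helper: a pure sine tone of n samples."""
    return [math.sin(2.0 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]
```

In practice the second screen application would run this over short microphone buffers and treat a detection as the synching cue that triggers the content update.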
[0059] FIG. 8 provides a high-level overview of one example of a
system 800 with a synching mechanism implemented using the network
540. In this system 800 the synching mechanism is wireless
communication (WiFi) 810 between a playback device 530 (a Blu-ray
Disc player) and the second screen device 520 (an iOS device
running an application). In the example of FIG. 8, the features and
protocols of a BD-Live enabled device are used. There are two main
components of this protocol: connection and communication. Both are
described below. For simplicity the second screen iOS application
will be referred to as the "iPad" and the BD-Live enabled device
will be referred to as the "disc".
[0060] Connection occurs when an iOS enabled device 520 first
launches the second screen application and attempts to connect to a
BD-Live enabled device 530 on the same Wi-Fi network 540.
[0061] 1. Disc is inserted into BD player
[0062] 2. Disc enters UDP `listening` loop
[0063] 3. iPad launches second screen application
[0064] 4. iPad performs UDP broadcast of authentication token
[0065] 5. Disc receives authentication token and authenticates
[0066] 6. Disc retrieves IP from the token's sender (iPad's IP)
[0067] 7. Disc responds to authentication with its IP and PORT
[0068] 8. iPad confirms IP and PORT
[0069] 9. iPad closes UDP socket communication
[0070] 10. iPad establishes direct TCP socket communication with disc based on the IP and PORT provided.
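The message handling in the connection steps above can be modeled as follows. The JSON message format, field names, and addresses are assumptions for illustration; the actual BD-Live wire format is not specified in this disclosure.

```python
# Illustrative model of the discovery handshake (steps 1-10 above),
# with the socket I/O omitted so only the message handling is shown.
import json

def make_broadcast(token):
    # Step 4: the iPad broadcasts an authentication token over UDP.
    return json.dumps({"type": "auth", "token": token}).encode()

def disc_handle_broadcast(datagram, sender_ip, valid_tokens, disc_ip, tcp_port):
    # Steps 5-7: the disc authenticates the token, notes the sender's IP
    # (the iPad's IP), and replies with its own IP and TCP PORT.
    msg = json.loads(datagram.decode())
    if msg.get("type") != "auth" or msg.get("token") not in valid_tokens:
        return None  # step 5 failed: ignore unauthenticated broadcasts
    reply = json.dumps({"type": "auth_ok", "ip": disc_ip, "port": tcp_port})
    return sender_ip, reply.encode()

def ipad_handle_response(datagram):
    # Steps 8-10: the iPad confirms the IP and PORT, after which it would
    # close the UDP socket and open a direct TCP connection to the disc.
    msg = json.loads(datagram.decode())
    if msg.get("type") != "auth_ok":
        return None
    return msg["ip"], msg["port"]
```

After `ipad_handle_response` returns an (IP, PORT) pair, the application would open the TCP socket used for the communication phase described next.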
[0071] Communication occurs after a connection has been established
between the second screen iOS application and a BD-Live enabled
device.
[0072] 1. iPad and Disc are aware of each other's IPs as well as what PORT communication should occur over
[0073] 2. TCP socket communication is maintained for the duration of the application's lifecycle.
[0074] One advantage of such wireless communication, as seen in
this example, is that it is bi-directional, allowing the second
screen device to transmit as well as receive commands. This allows
for two-way synching as well as control of playback from the second
screen device 520.
[0075] In certain embodiments, the application of the second screen
device 520 could be dedicated to a specific program or movie on a
specific system (e.g., BD). In other embodiments, the second screen
application could be generic to a studio, with available plug-ins to
configure the application for a particular program or movie. In
still other embodiments, the second screen application could be
universal across systems (BD, VOD, broadcast), content, or both. Other
possible implementations and configurations will be apparent to one
skilled in the art given the benefit of this disclosure.
[0076] The system can be operated with a passive approach or an
interactive approach. In the passive approach, icons displayed on
first screen device 510 prompt the user to look at the second
screen device 520 for an additional content event being displayed
that is related to the primary content displayed on the first
screen device 510. The icon preferably indicates what type of
additional content event is available on the second screen device
520 (e.g., a shopping cart icon indicates a purchase event, an "I"
icon indicates an information event, a stickman icon indicates a
character information event, etc.). FIGS. 9A-F depict some of the
aspects that may be displayed to the user in passive mode.
[0077] FIGS. 9A-F depict skeletal examples of what may be displayed
on the screen 900 of the second screen device to a user when using
an application in passive mode that provides additional content on
the second screen device 520 that is synched with the primary
content on the first screen device 510.
[0078] FIG. 9A depicts a splash screen that may be displayed to the
user when the application is launched. It includes the product logo
and an indication of the primary content 902. Here new content screens
transition in from the right in a conveyer-belt-like manner as indicated
by arrow 904.
[0079] FIG. 9B depicts a pop-up message 906 that is displayed to a
user when no playback device 530 is detected by second screen
device 520.
[0080] The screen 900 of FIG. 9C shows a synch button/icon 908,
chapter timeline 910, active chapter indicator 912, chapter-event
indicator 914, chapter number indicator 916, event timeline 918,
chapter background 920, event card 922, and timeline view icons
924. The synch button 908 provides a mechanism to synch the content
between the first and second screen devices 510, 520. The synch
button 908 may also indicate the status of the synch between the
content on the first and second screen devices 510, 520. The
chapter timeline 910 indicates the chapters of the primary content.
The movie title leader is in the background of the chapter timeline
910 and indicates the primary content. As the primary content
progresses, the chapters move along the chapter timeline in a
conveyer-belt-like fashion, with the active chapter indicator 912
indicating the current chapter in the primary content via highlighting
and the center position of the chapter timeline 910. The chapter-event
indicator 914 indicates that events displayed in the event timeline
918 are part of the active chapter shown in the chapter timeline
910. The event timeline 918 displays event cards 922 indicating
events that correspond to what is transpiring in the current
chapter of the primary content. For each chapter, the first
displayed event card 922 indicates the chapter that the following
events occur in. As the primary content progresses the event cards
922 move along event timeline 918 in a conveyer-belt like fashion
with the current event in the center position of the event timeline
918. Each chapter may be provided with a unique background 920 for
the events of that particular chapter. The timeline view
icon/button 924 indicates that the viewer is in timeline view
showing the chapter timeline 910 and event timeline 918 as well as
provides a mechanism to access the timeline view.
[0081] The screens 900 of FIGS. 9D and 9E depict how event cards
922 progress across the event timeline 918. Here the synch
button/icon 908 indicates that the timeline view of the additional
content is in synch with the primary content on the first screen
device 510. In FIG. 9D, the current triggered event card 926 is
shown in the center position of the event timeline 918 and
represents the first triggered event. To the left of the current
triggered event card 926 in the event timeline 918 is the previous
event card 928, in this case the card indicating the chapter. To
the right of the current triggered event card 926 in the event
timeline 918 is the next event card 930, in this case the card
indicating the next scheduled event. Since, in FIG. 9D, the
current triggered event card 926 is for the first triggered event,
the chapter indicator 916 indicates that it is chapter 1. The
current triggered event card 926 includes the additional content
932 related to the primary content. The current triggered event
card 926 also provides an indicator 934 as to what type of
additional content is displayed. In certain embodiments this
indicator matches an indicator shown on the first screen display
510. The current event card 926 also includes buttons/icons for
synching 936 and sharing 938. The synch button/icon 936 provides a
mechanism that causes the primary content displayed on the first
screen device 510 to be synched with the current event. The share
button/icon 938 provides a mechanism to share the additional
content of the event with a social network. The elements of the
screen 900 of FIG. 9E are similar to the elements of FIG. 9D except
that the current triggered event card 926 is for an event that
happens later in the timeline as indicated by the chapter indicator
916 which indicates the current chapter is chapter 3.
[0082] FIG. 9F depicts examples of other possible functionality
that may be provided as part of the display on the second screen device
520. Here the chapter timeline 910 is provided with a collapse
icon/button 940 which provides a mechanism to toggle the chapter
timeline between visible 940a and hidden 940b. Likewise the synch
button/icon 908 can toggle between status indicating whether synch
is currently active 908a and status indicating synch has been lost
and re-synching is available 908b. In some embodiments a volume
button/icon 942 is provided. The volume button/icon 942 provides a
mechanism to turn the sound of the first screen display "OFF" or
"ON". The volume button 942 may also indicate the status of whether
the volume is "ON" indicating muting is available 942a, or "OFF"
indicating sound is available 942b. In some other embodiments a
play/pause button/icon 944 is provided. The play/pause button 944
provides a mechanism to pause or resume playback of content on the
first screen display 510. The pause/play button may also indicate
the status of whether the playback can be paused 944a or
resumed 944b.
[0083] In the interactive approach, the user selects an additional
content event on the second screen device 520 and what is displayed
on the primary screen device 510 is synched to the selected event.
As indicated previously, the events of additional content are
synched to the primary content. If the user swipes the movie
timeline or the events, the events become out of synch with the
movie being shown on the main screen. To re-synch, the user touches
the synch button on the tablet. The timeline or events are then
synched back to what is being displayed on the main screen.
Likewise, a user can
select a trivia event or map event, touch the synch button, and the
scene in the movie related to the selected trivia or map event will
be played on the main screen. Examples of this can be seen in FIGS.
10A-D.
[0084] FIG. 10A depicts how a user may interact with the chapter
timeline 910 and event timeline 918 on the screen 900. Here icons
1000 and 1002 represent how the user can touch the screen to scroll
left or right in the chapter or event timelines 910, 918.
[0085] FIG. 10B depicts one embodiment of the screen 900 when a
user interacts with the chapter timeline 910. In this example the
synch button/icon 908 indicates that the additional content on the
second screen display 520 is out of synch with the primary content
on the first screen display 510. Icon 1000 represents the user
scrolling through the chapter timeline 910. The current chapter
remains highlighted 912 until the transition to the new chapter is
completed. When navigating through the chapter timeline 910 a
chapter position indicator 1004 is provided that indicates what
chapter of the available chapters is selected. The chapter
indicator 916 also indicates the selected chapter and updates when
the transition to the new chapter is complete. In this example,
while the user is navigating through the chapter timeline 910, the
event timeline 918 is dimmed. In certain embodiments, the user may
jump directly to a particular chapter by selecting the chapter from
the timeline 910.
[0086] FIG. 10C depicts one embodiment of the screen 900 when a
user interacts with the event timeline 918. Icon 1002 represents
the user scrolling through the event timeline 918. Here, the
timeline 918 is being transitioned from current triggered event
card 926 to the next event card 930. When navigating through the
event timeline 918 an event position indicator 1004 is provided
that indicates what event of the available events is selected.
[0087] FIG. 10D depicts one embodiment of the screen 900 when a
user interacting with the event timeline 918 causes a transition
from one chapter to another. Icon 1002 represents the user
scrolling through the event timeline 918, causing a chapter change.
Here, the timeline 918 is being transitioned to a new event card 922
indicating a new set of events related to a new chapter. When
navigating through the event timeline 918 causes a transition to a
new chapter the event position indicator 1004 is centered until the
new series of events begins.
[0088] FIGS. 11A-C and 12A-B indicate some of the other interactive
activities that can be accessed via the event cards 922. FIGS.
11A-C depict the social media sharing feature. FIGS. 12A-B depict
the chapter selection as well as selection and playback of
additional media files.
[0089] FIGS. 11A-C show various pop-up fields on the display 900
when the sharing feature is active via the share button/icon 938.
FIG. 11A shows the field 1100 displayed when the user has logged
into their social network (in this case Facebook). Area 1102
indicates the event being shared and area 1104 indicates the
comments the user is going to share about the event. Button 1106
provides the mechanism to submit the event and comments to be
shared. FIG. 11B shows the field 1100 displayed when the user has
not yet signed in to the social network. In this example button
1108 is provided to sign into Facebook and button 1110 is provided
to sign into Twitter. Options to sign into other social networks
may also be provided. FIG. 11C shows an onscreen QWERTY keyboard
1112 that may be used to enter comments into area 1104 for user's
comments. In certain embodiments, this may be a default keyboard
provided by the second screen device 520.
[0090] FIGS. 12A-B show the selection of chapters as well as media
content for playback by the user. In the example of FIG. 12A, if the
user single taps 1200 the currently playing chapter shown in the
chapter timeline 910, the playback on the first screen device 510 is
paused. If the user double taps 1202 the currently playing chapter
shown in the chapter timeline, playback on the first screen
device will jump to the beginning of the chapter and the events
timeline 918 will be set to the first event of that chapter. In
some embodiments, the event cards 922 may include media files 1204
such as video or audio clips. If the media file is an audio clip,
then selection of the audio clip results in playback on the current
screen 900. If the media file is a video clip, then selection of
the video clip results in the launching of a full-screen media
player 1206 as seen in FIG. 12B. In this example the media player
includes on-screen controls 1208. To return to the previous screen,
the user only needs to tap the non-video surface 1210 of the media
player.
[0091] FIG. 13A-E depicts some other possible features regarding
the additional content. These include a map view 1300, family tree
1310, and settings 1320. FIG. 13A depicts the menu bars for these
options. In this example each of these menu bars are provided with
first screen device controls 1330 including pause/resume and
mute/un-mute. FIG. 13B depicts the map view display 1300. The map
view display 1300 includes a map 1302 including marked locations
1304 and information about the locations 1306. Icons are also
provided to select other maps 1308. FIG. 13C depicts the family
tree view 1310. The family tree view shows the family tree with
fields 1312 indicating the relationship between the family members.
In this example the button/icon 1314 at the bottom indicates what
view is currently being shown (i.e. the family tree view). If a
field 1312 is selected, a pop-up field 1316 is displayed, as shown
in FIG. 13D, providing information about the person in the field
1312. FIG. 13E depicts the settings view 1320. In view 1320 the
user is provided with controls for adjusting the preferences for
the audio and video 1322, events 1324, and social network sharing
1326.
[0092] FIGS. 14A-L depict skinned examples of what may be displayed
on the screen 900 of the second screen device to a user when using
an application that provides additional content on the second
screen device 520 that is synched with the primary content on the
first screen device 510. FIG. 14A is a skinned version of the
splash screen as shown and described in relation to FIG. 9A. FIGS.
14B-F depict skinned versions of the timeline view as seen and
described in relation to FIGS. 9C-F and 10A-D. FIG. 14G depicts a
skinned version of a screen display wherein all the available video
clips that are part of the additional content are displayed for the
user. FIG. 14H depicts a skinned version of a screen display
wherein all the available audio clips that are part of the
additional content are displayed for the user. FIG. 14I depicts a
skinned version of the maps view as shown and described in relation
to FIG. 13B. FIGS. 14J and 14K depict skinned versions of the
family tree view as shown and described in relation to FIGS. 13C
and 13D respectively. FIG. 14L depicts a skinned version of the
settings view as shown and described in relation to FIG. 13E.
[0093] The events and features shown in the figures are just some
examples of possible events. In certain embodiments, a user may be
able to configure or otherwise select what events they wish to be
shown (e.g., don't show me purchase events). In other embodiments
the user may be able to select or bookmark events for viewing at a
later time. In still other embodiments certain events may be
unavailable or locked out depending on the version of the program
being viewed (i.e. purchased vs. rented or BD vs. VOD vs.
Broadcast). In other embodiments, the events available can be
personalized for a user based on previous viewing habits (i.e. in a
system such as TiVo where a user's viewing habits are tracked or
using the personalization engine 118 of FIG. 1).
[0094] Other possible configurations include shopping features. For
example, a store front could be provided and accessible from the
second screen for purchasing movie merchandise. In another
embodiment points or awards could be provided to a user for
watching, reviewing, or recommending a program or film. For
example, the more movies watched or shared with friends, the more
points awarded. The points can then be used for prizes or discounts
on related goods.
[0095] Similarly, achievements can also be awarded. These
achievements could be pushed to a social networking site. Example
achievements could include:
[0096] Watching certain scenes--Achievement
[0097] Watching certain discs in a series--Achievement
[0098] Watching certain discs by a particular studio or actor--Achievement
[0099] In still other implementations a Wiki feature could be
implemented. A running Wiki could let a user and other users of a
disc comment on certain scenes. For example, tracking metadata could
be created which is pushed to a web-based Wiki. Such metadata could
include:
[0100] Chapter Information
[0101] Time Codes
[0102] Thumbnails of Scenes
[0103] Actor/Director Information
[0104] This pushed information can be used to create a running Wiki
which lets others comment on the movie. These comments could then
be reintegrated into the second screen application as events which
can be accessed.
[0105] Additional features and screens are also possible. For
example, in some embodiments alternate view content can be provided
on the second screen device 520 which is synchronized with content
being displayed on the first screen device 510. The second screen
techniques described herein are designed to display curated
"events" in sync with primary video playback. The events can
represent trivia, social content, bonus material, etc., that are
timed for display at relevant points in the primary video playback.
Alternate view or alternate-angle video can be presented just as
other events at the appropriate times. Given that alternate view
videos are typically only for select scenes this allows the
interleaving of alternate view content with other multimedia
elements.
[0106] As used herein, alternate version content refers to content
such as video, audio, or the like that relates to particular
content but is not presented as part of the primary playback of the
content. One common example of such alternative view content is the
multi-angle feature that can be provided on DVDs. With this, scenes
are provided from different angles than what is shown in the
primary view. Other possible implementations include alternate
language text, green screen footage showing the scene before
special effects are added, raw audio before sound effects or
soundtrack are added, or the like. Generally the alternative
version content is the same length and corresponds directly to
playback of the primary content. Other possible alternate content
will be apparent to one skilled in the art given the benefit of
this disclosure.
[0107] FIG. 15 depicts a flow chart 1500 of one such methodology
for providing alternate content on a second screen device. In this
example there are two primary steps. First, the timing of events on
the second screen device 520 is synched to the display of content
on the first screen device 510 (step 1510). Then, the alternative
content that is synched to the content displayed on the first
screen device can be provided (step 1520). Each of these steps will
be described in more detail below.
[0108] Referring to step 1510, there are several possible synching
mechanisms that can be used. Some examples include: Audio
Fingerprint detection, Audio Watermarking, Video Metadata Sync, and
BDLive Sync. These are discussed briefly below.
[0109] With Audio Fingerprint Detection the audio of the media is
separately indexed into a database for later lookup. The
second-screen app samples audio, sends the sample to a service
(typically a backend server) and the service returns with the
identification of the media. Typically this solution is used to
generally identify the content without providing any timing
information. A disadvantage is that this approach is subject to
environmental considerations such as the volume of the audio,
ambient noise, and the distance from the speakers to the microphone.
This makes it a less ideal synching mechanism.
[0110] With Audio Watermarking the program is preprocessed to
encode identifiers within the audio track at known locations. In
some instances the audio identifiers can be high frequency so as to
be outside the range of normal human hearing. This information can
be used for time or event-based synchronization in the second
screen app. Some advantages of this mechanism over Audio
Fingerprinting are that it is less susceptible to interference from
background or ambient noise and that the identifier can be
correlated to time.
[0111] With Video Metadata Sync a video stream (e.g. DVB) is
encoded with additional metadata corresponding to known time
locations. A set-top box or other playback device decodes the
metadata stream and provides position information to a
second-screen app upon request. The associated advantage of Video
Metadata sync over audio is the relative reliability of the sync
connection (TCP socket communication).
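The decode-and-answer behavior of Video Metadata Sync can be modeled as follows. The packet format (a time code paired with a label) and the class name are assumptions for illustration only.

```python
# Hypothetical model of Video Metadata Sync: the set-top box or other
# playback device decodes time-coded metadata from the video stream and
# answers position requests from the second-screen app.

class MetadataDecoder:
    """Tracks the most recently decoded position from the metadata stream."""

    def __init__(self):
        self.latest = None  # most recently decoded (time_code, label) pair

    def decode(self, packet):
        # Each metadata packet carries a known time location in the
        # program plus an identifying label (e.g., a chapter name).
        time_code, label = packet
        self.latest = (time_code, label)

    def position(self):
        """Answer a second-screen request with the last known position."""
        return self.latest
```

In a real system the request would arrive over the reliable TCP connection noted above, which is this mechanism's advantage over the audio-based approaches.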
[0112] With BDLive Sync a BluRay disk is produced to include a
service that is launched when a BluRay Disk is played. The second
screen device using an application queries the service for position
information (or the app can subscribe to events). The information
is then used to synchronize second-screen content to primary media
playback. The associated advantages of BDLive Sync are the reliable
connection and the additional capability to control media playback
(play, pause, etc.).
[0113] Each of these synching mechanisms has advantages and issues.
One common issue is accounting for transmission and processing
delays when synching the content. For example, in the case of socket
communications periodic requests are made by the second-screen
application to a known service (e.g. BDLive service) for the
current playback position in the media. Network communications
delays are accounted for by noting the time of request and the time
of response and dividing in half.
communicationsDelay=(timeOfResponse-timeOfRequest)/2
This communication delay accounts for the time it takes for the
return half of the communication (this is a simplistic model that
assumes that any network delay is evenly distributed across both
the request and the response portions of the message). The
communication delay can be accounted for in determining where the
synching signal, cue, or other type of indicator that triggers the
event is provided so that it coincides with playback of the content
on the first screen device 510. In other embodiments the resulting
value is used
to adjust a continuously running internal reference timer (a
flywheel) for the application providing the additional second
screen content, in this case the alternate version content.
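The delay compensation and flywheel adjustment described above can
be sketched as follows. This is a minimal illustration, not the
disclosed implementation; the names `request_position` and
`ReferenceTimer` are hypothetical, and the one-way delay is assumed
to be half the measured round trip, per the simplistic model above.

```python
import time

def query_playback_position(request_position):
    """Ask the playback service for its current position and
    compensate for the return half of the network round trip.

    `request_position` is a hypothetical callable that performs the
    round-trip request and returns the reported position in seconds.
    """
    time_of_request = time.monotonic()
    reported_position = request_position()
    time_of_response = time.monotonic()
    # Assume the delay is split evenly across request and response.
    communications_delay = (time_of_response - time_of_request) / 2
    # The reported position is stale by the return half of the trip.
    return reported_position + communications_delay

class ReferenceTimer:
    """Continuously running internal reference (a "flywheel") that
    the second-screen application reads between sync queries."""

    def __init__(self):
        self._offset = 0.0

    def resync(self, corrected_position):
        # Anchor the flywheel so now() returns the corrected playback
        # position and keeps advancing in real time thereafter.
        self._offset = corrected_position - time.monotonic()

    def now(self):
        return time.monotonic() + self._offset
```

Between periodic queries the application reads `timer.now()` rather
than issuing another network request, which keeps second-screen
events on schedule even if a query is delayed or dropped.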
[0114] Alternately, audio watermark detection may be used to
determine the position of playback. This assumes that the audio
watermark can be deterministically embedded and extracted to
determine an accurate time reference. Again, the timing delay can be
accounted for by adjusting when the synching signal, cues, or other
type of indicator that triggers the event is provided, so that it
coincides with playback of the content on the first screen device
510, or by adjusting the internal reference timer (flywheel) on the
second screen device 520. The time adjustment may be different from
that used with the BDLive system, as different transmission and
processing delays may be involved.
[0115] In certain embodiments, multiple synching mechanisms may be
available for use in synching content on the second screen with
content on the first screen. In some such embodiments, priority may
be given to some synching mechanisms over others, wherein the
application will always try to use the highest priority synching
mechanism.
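The priority-ordered selection can be illustrated with a short
sketch. The particular ordering below is a hypothetical example
(reflecting the reliability advantages noted above for socket-based
synching over audio watermarks), not an ordering specified by the
disclosure.

```python
# Hypothetical priority order: most reliable mechanisms first.
SYNC_PRIORITY = ["bdlive", "video_metadata", "audio_watermark"]

def choose_sync_mechanism(available):
    """Return the highest-priority synching mechanism present in the
    set `available`, or None if no mechanism can be used."""
    for mechanism in SYNC_PRIORITY:
        if mechanism in available:
            return mechanism
    return None
```

For example, if only video metadata and audio watermark synching are
detected, `choose_sync_mechanism({"video_metadata",
"audio_watermark"})` selects `"video_metadata"`.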
[0116] Referring to step 1520, an alternate content "event" is
triggered at a predetermined playback time. Triggering typically
requires consideration for media loading or streaming such that the
alternate-angle video segment can be started at a precise time. In
certain embodiments the alternate content needs to be preloaded to
ensure that the alternate content can be available in sync with the
content being displayed on the first screen device 510. In such an
instance a determination is made as to whether the event involves
alternate content and, if so, the content is preloaded so that the
alternate content can be provided in sync with the content on the
first screen 510.
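The preload-then-trigger behavior can be sketched as below. The lead
time and the event fields (`time`, `has_alternate_content`) are
hypothetical names chosen for illustration; the disclosure does not
specify a particular data structure or lead time.

```python
# Hypothetical lead time, chosen so loading or streaming completes
# before the trigger point.
PRELOAD_LEAD_SECONDS = 5.0

def schedule_event(event, current_position, preload, trigger):
    """Return the actions due at `current_position` for `event`.

    Events that involve alternate content get a `preload` action
    ahead of the trigger time; every event gets a `trigger` action
    once the predetermined playback time is reached.
    """
    actions = []
    if (event.get("has_alternate_content")
            and not event.get("preloaded")
            and current_position >= event["time"] - PRELOAD_LEAD_SECONDS):
        event["preloaded"] = True
        actions.append(preload)
    if not event.get("triggered") and current_position >= event["time"]:
        event["triggered"] = True
        actions.append(trigger)
    return actions
```

A playback loop would call this on each position update: for an
event at 30 seconds, the preload action fires once playback passes
25 seconds, and the trigger action fires at 30 seconds, by which
time the alternate content is already loaded.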
[0117] In some embodiments the alternate version content can be
located on a server 550, which requires transfer to the second
screen device 520. Such communication and transfer delays can also
be accounted for. The timing of the synching signals, cues, or
other type of indicators that trigger the event as well as the
adjustment to the reference timer discussed above for addressing
the communication delay can also be used to account for such
processing or loading delays.
[0118] FIG. 16 is a wireframe 1600 of a screen that can be displayed
as part of a timeline display on the second screen device 520. In
this example, the displayed screen 1600 includes alternate version
content 1620 that is synched with content on a first screen device
510. The exemplary screen also includes a header 1630 with control
buttons 1632 which correspond to the buttons found in the header of
FIGS. 10-14. In this example, two alternate versions, version 1
(1622) and version 2 (1624), are provided to a user for selection.
By selecting one of these versions, the user causes the selected
version to be displayed full screen on the second screen device 520
as seen in
FIG. 17. In certain embodiments, such as when there is only one
alternative version available, the alternate content may
automatically be displayed. In still further embodiments, the
alternate content may be displayed automatically after it has been
shown as available for a certain time period without any action
being taken by the user.
[0119] FIG. 17 is a wireframe 1700 of a screen wherein the
alternative version content is being displayed full screen on the
second screen device 520. In this example, the displayed screen
1700 includes a header 1710 and a main view area 1720 where
alternate content is displayed. In this embodiment, the header 1710
includes a back button 1730 and a transfer to first screen button
1740. The back button 1730, when selected, switches the display on
the second screen device 520 back to a traditional timeline view
such as seen in FIG. 16. The transfer to first screen button, when
selected, causes the alternate version content to be displayed on
the first screen device 510. In certain embodiments, when the
alternate view content is displayed on the first screen device 510,
the primary content can be displayed on the second screen device
520.
[0120] The present description illustrates the principles of the
present disclosure. It will thus be appreciated that those skilled
in the art will be able to devise various arrangements that,
although not explicitly described or shown herein, embody the
principles of the disclosure and are included within its spirit and
scope.
[0121] All examples and conditional language recited herein are
intended for informational purposes to aid the reader in
understanding the principles of the disclosure and the concepts
contributed by the inventor to furthering the art, and are to be
construed as being without limitation to such specifically recited
examples and conditions.
[0122] Moreover, all statements herein reciting principles,
aspects, and embodiments of the disclosure, as well as specific
examples thereof, are intended to encompass both structural and
functional equivalents thereof. Additionally, it is intended that
such equivalents include both currently known equivalents as well
as equivalents developed in the future, i.e., any elements
developed that perform the same function, regardless of
structure.
[0123] Thus, for example, it will be appreciated by those skilled
in the art that the block diagrams presented herewith represent
conceptual views of illustrative circuitry embodying the principles
of the disclosure. Similarly, it will be appreciated that any flow
charts, flow diagrams, state transition diagrams, pseudocode, and
the like represent various processes which may be substantially
represented in computer readable media and so executed by a
computer or processor, whether or not such computer or processor is
explicitly shown.
[0124] The functions of the various elements shown in the figures
may be provided through the use of dedicated hardware as well as
hardware capable of executing software in association with
appropriate software. When provided by a processor, the functions
may be provided by a single dedicated processor, by a single shared
processor, or by a plurality of individual processors, some of
which may be shared. Moreover, explicit use of the term "processor"
or "controller" should not be construed to refer exclusively to
hardware capable of executing software, and may implicitly include,
without limitation, digital signal processor ("DSP") hardware, read
only memory ("ROM") for storing software, random access memory
("RAM"), and nonvolatile storage.
[0125] Other hardware, conventional and/or custom, may also be
included. Similarly, any switches shown in the figures are
conceptual only. Their function may be carried out through the
operation of program logic, through dedicated logic, through the
interaction of program control and dedicated logic, or even
manually, the particular technique being selectable by the
implementer as more specifically understood from the context.
[0126] Although embodiments which incorporate the teachings of the
present disclosure have been shown and described in detail herein,
those skilled in the art can readily devise many other varied
embodiments that still incorporate these teachings. Having
described preferred embodiments for a method and system for
providing alternate view video playback on a second screen (which
are intended to be
illustrative and not limiting), it is noted that modifications and
variations can be made by persons skilled in the art in light of
the above teachings.
[0127] While the example set forth above has focused on an
electronic device, it should be understood that the present
invention can also be embedded in a computer program product, which
comprises all the features enabling the implementation of the
methods described herein, and which, when loaded in a computer
system, is able to carry out these methods. Computer program or
application in the present context means any expression, in any
language, code or notation, of a set of instructions intended to
cause a system having an information processing capability to
perform a particular function either directly or after either or
both of the following: a) conversion to another language, code or
notation; b) reproduction in a different material form.
[0128] Additionally, the description above is intended by way of
example only and is not intended to limit the present invention in
any way, except as set forth in the following claims.
* * * * *