U.S. patent application number 14/637357 was filed with the patent office on 2015-03-03 and published on 2015-06-25 for touch screen based interaction with traffic data. The applicant listed for this patent is Pelmorex Canada Inc. Invention is credited to Briac Blanquart and Andre Gueziec.
United States Patent Application 20150177018
Kind Code: A1
Inventors: Gueziec, Andre; et al.
Publication Date: June 25, 2015
TOUCH SCREEN BASED INTERACTION WITH TRAFFIC DATA
Abstract

Embodiments of the present invention allow a presenter to interact with traffic data and other related data in real-time using a touch screen. A virtual broadcast presentation is generated based on traffic data received from one or more information sources. A signal based on user interaction with the touch screen is received, where the user interaction may include the selection of an interactive element included in the virtual broadcast presentation. The signal generated by the touch screen is processed, and the virtual broadcast presentation is updated in response to the processed signal.
Inventors: Gueziec, Andre (Sunnyvale, CA); Blanquart, Briac (San Francisco, CA)

Applicant: Pelmorex Canada Inc., Ontario, CA

Family ID: 45605714
Appl. No.: 14/637357
Filed: March 3, 2015
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number | Child Application
12/860,700 | Aug 20, 2010 | 8,982,116 | 14/637,357
12/398,120 | Mar 4, 2009 | 8,619,072 | 12/860,700
Current U.S. Class: 345/173

Current CPC Class: G08G 1/01 (20130101); G06F 3/0488 (20130101); G06F 3/041 (20130101); G08G 1/0967 (20130101); G08G 1/091 (20130101); G06F 16/29 (20190101); G08G 1/0141 (20130101); G08G 1/0116 (20130101); G08G 1/012 (20130101); G08G 1/04 (20130101); G01C 21/3638 (20130101); G06F 3/04815 (20130101); G01C 21/3694 (20130101)

International Class: G01C 21/36 (20060101); G08G 1/0967 (20060101); G08G 1/04 (20060101); G06F 3/041 (20060101); G06F 3/0488 (20060101)
Claims
1. A method for conducting touch screen based interaction with
traffic data, the method comprising: generating a three-dimensional
virtual broadcast presentation displayed on a touch screen
including one or more interactive elements and visual
representations of the most current traffic conditions, wherein
the most current traffic conditions include information about
traffic flow and weather, and wherein the virtual broadcast
presentation is automatically synchronized with current information
regarding road conditions available from a plurality of sources
including traffic flow data and weather conditions; receiving a
touch screen signal from a user, the touch screen signal
corresponding to user interactions with one or more interactive
elements on the touch screen indicative of user interest;
processing the received touch screen signal, wherein each touch
screen signal corresponds to executable instructions that
manipulate the virtual broadcast presentation; executing the
instructions corresponding to the received touch screen signal
thereby manipulating the virtual broadcast presentation, the
virtual broadcast presentation updated to reflect the most current
traffic conditions inclusive of traffic flow and weather.
2. The method according to claim 1, wherein the plurality of
sources include public agencies.
3. The method according to claim 2, wherein the public agencies
include the National Weather Service.
4. The method according to claim 1, wherein the plurality of
sources include information from the Internet.
5. The method according to claim 1, wherein the plurality of
sources include public listings.
6. The method according to claim 1, wherein the plurality of
sources include data from other users.
7. The method according to claim 1, wherein the one or more
executed instructions include rotating, tilting, zooming in, or
zooming out of a view of the three-dimensional virtual broadcast
presentation.
8. The method according to claim 1, wherein the interactive
elements are representations of conditions affecting traffic.
9. The method according to claim 8, wherein the interactive
elements represent traffic alerts.
10. The method according to claim 9, wherein a traffic alert is
road construction.
11. The method according to claim 1, wherein the interactive
elements identify various locations, landmarks or points of
interest.
12. The method according to claim 11, wherein the points of
interest include exit ramps, highways, and city streets.
13. The method according to claim 1, wherein the touch screen
signal includes selection of one or more interactive elements.
14. The method according to claim 13, wherein the one or more
executed instructions include displaying information regarding the
interactive element selected by the user.
15. The method according to claim 14, wherein the displayed
information includes traffic video from a traffic camera
corresponding to a particular area being monitored by the traffic
camera.
16. The method according to claim 15, wherein the displayed
information includes detailed information about a traffic incident
associated with the interactive element.
17. The method according to claim 8, wherein the interactive
elements are used to represent weather conditions including rain
and snow.
18. The method according to claim 17, wherein the weather
conditions may include side effects of the weather conditions
including flooded roads, slippery/icy roads and obstructed
roads.
19. The method according to claim 1, wherein the visual
representation includes colors and objects that illustrate
real-time traffic conditions to the user.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a continuation and claims the
priority benefit of U.S. application Ser. No. 12/860,700 filed Aug.
20, 2010, which is a continuation-in-part and claims the priority
benefit of U.S. patent application Ser. No. 12/398,120 filed Mar.
4, 2009, the disclosures of which are incorporated herein by
reference.
BACKGROUND OF THE INVENTION
[0002] Existing broadcast presentations generally include a variety
of maps, images, and animations that display current or forecasted
conditions for reference by a presenter (i.e., news reporter)
during a broadcast presentation such as a traffic or weather
report. The broadcast presentation is often produced prior to a
scheduled broadcast for presentation by a traffic or weather
reporter in a fixed arrangement (much like a slide show) with a
prerehearsed script. Although the presenter has the ability to
control the speed and manner in which the broadcast presentation is
presented to a viewing audience, the content in the maps and images
remains fixed. That is, the content presented during the broadcast
presentation is not real-time and may be outdated. The reporting of
outdated information (e.g., traffic or weather information) may
have a drastic effect on a viewing audience who may rely on the
reported information to make decisions about such things as travel
or logistics.
[0003] Another shortcoming of existing broadcast technology is the
lack of interaction with the content of the virtual broadcast
presentation. Since the presentation contains pre-determined
content, a presenter is unable to directly interact with or
manipulate the maps and images of the presentation. The presenter
cannot, for example, retrieve real-time conditions or other
information associated with the maps or images of the
presentation.
[0004] As such, there is a need in the art for touch screen based
interaction with traffic data and other related data.
SUMMARY OF THE PRESENTLY CLAIMED INVENTION
[0005] Embodiments of the present invention allow a presenter to
interact with traffic data and other related data in real-time
using a touch screen or other suitable display.
[0006] In a first claimed embodiment, a method for touch screen
based interaction with traffic data is claimed. Through the method,
a virtual broadcast presentation is generated based on traffic data
received from one or more information sources. A signal based on
user interaction with a touch screen is generated and received.
User interaction may include the selection of an interactive
element included in the virtual broadcast presentation. The signal
generated by the touch screen is then processed and the virtual
broadcast presentation is updated in response to the processed
signal.
[0007] In a second claimed embodiment, a system for touch screen
based interaction with traffic data is claimed. The system includes
at least a communications module and a presentation rendering
module, each module stored in memory and executable by a processor.
Execution of the communications module by the processor receives a
signal generated by the touch screen. The signal may be based on
user interaction with the touch screen, wherein the user
interaction includes selection of an interactive element included
in a virtual broadcast presentation. Execution of the presentation
rendering module by the processor generates the virtual broadcast
presentation based on traffic data received from one or more
information sources, processes the signal generated by the touch
screen, and updates the virtual broadcast presentation in response
to the processed signal.
[0008] In a third claimed embodiment, a non-transitory
computer-readable storage medium is claimed. The storage medium
includes a computer program that is executable by a processor to
perform a method for touch screen based interaction with traffic
data. A virtual broadcast presentation is generated based on
traffic data received from one or more information sources. A
signal based on user interaction with a touch screen is generated
and received. User interaction may include the selection of an
interactive element included in the virtual broadcast presentation.
The signal generated by the touch screen is then processed and the
virtual broadcast presentation is updated in response to the
processed signal.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1 illustrates a block diagram of an environment for the
broadcast of a virtual broadcast presentation that a user may
interact with and reference in real-time.
[0010] FIG. 2 illustrates the virtual broadcast presentation engine
of FIG. 1.
[0011] FIGS. 3A-3B illustrate a virtual broadcast presentation
displayed on a touch screen.
[0012] FIG. 4 illustrates an interactive element appearing in a
virtual broadcast presentation.
[0013] FIG. 5 illustrates the interaction technique of "pinching"
used with a virtual broadcast presentation.
[0014] FIGS. 6A-6B illustrate a virtual broadcast presentation in
`trip time` mode.
[0015] FIGS. 7A-7B illustrate a traffic camera appearing within a
virtual broadcast presentation.
[0016] FIG. 8 is a flowchart illustrating a method for touch screen
based interaction with traffic data presented in a virtual
broadcast presentation.
DETAILED DESCRIPTION
[0017] The present invention provides for the use of a touch screen
to interact with traffic information and other related data during
a virtual broadcast presentation. The virtual broadcast
presentation may include maps, images, graphics, animations,
multimedia overlays, and the like, that are rendered in a
two-dimensional or three-dimensional manner on a display such as a
touch screen. A presenter may refer to the presentation in
real-time and may manipulate a view of the virtual broadcast
presentation using the touch screen. The presenter may also use the
touch screen to select an interactive element included in the
broadcast presentation.
[0018] FIG. 1 illustrates a block diagram of an environment for the
broadcast of a virtual broadcast presentation that a user may
interact with and reference in real-time. The environment 100 of
FIG. 1 includes a computing device 110 having a virtual broadcast
presentation engine 120. The computing device 110 of FIG. 1 is
communicatively coupled to information sources 130, a touch screen
140, and a broadcast system 150. While FIG. 1 illustrates one
particular environment 100 including certain elements for the
broadcast of a virtual presentation, alternative embodiments may be
implemented that utilize differing elements than those disclosed in
FIG. 1 (or combinations of the same), but that otherwise fall
within the scope and spirit of the present invention.
[0019] The computing device 110 and the virtual broadcast
presentation engine 120 may generate a composite presentation that
includes a virtual broadcast presentation. The virtual broadcast
presentation may be two-dimensional or three-dimensional. The
composite presentation may be generated using information obtained
in real-time (or near real-time) from the information sources 130
as described in further detail below. The virtual broadcast
presentation engine 120, in particular, is discussed with respect
to FIG. 2. The computing device 110 may include various components
such as one or more of communications interfaces, a processor,
memory, storage, and any number of buses providing communication
therebetween (not depicted). The processor may execute instructions
implemented through computing modules or engines while the memory
and storage may both permanently or temporarily store data
including the aforementioned modules and engines.
[0020] The information sources 130 may be provided by various
organizations and in a variety of forms. Information sources 130
may include data sources related to traffic data, such as traffic
flow, as described in U.S. patent application Ser. No.
11/302,418, now U.S. Pat. No. 7,221,287, or weather data such as
forecasts. Information sources 130 may also include data sources
related to newsworthy events or incidents, school closings,
election results, and other information that may be featured in a
virtual broadcast presentation. Information sources 130 may require
subscription or authentication for access and may be accessible via
Telnet, FTP, or web services protocols. Information may be received
from information sources 130 in real-time or near real-time to
allow for generation of an equally real-time or near real-time
presentation. That presentation may, in turn, be manipulated in
real-time.
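By way of a rough, non-authoritative sketch, such real-time or near real-time retrieval might look like the following Python loop, which polls hypothetical feed endpoints over a web services protocol and hands fresh records to a rendering callback. The URLs, the polling interval, and the on_update callback are assumptions of this sketch, not details disclosed above.

```python
import json
import time
import urllib.request

# Hypothetical endpoints standing in for information sources 130
# (e.g., traffic flow and weather services). URLs are illustrative only.
FEED_URLS = {
    "traffic_flow": "https://example.com/feeds/traffic",
    "weather": "https://example.com/feeds/weather",
}

def fetch_feed(url):
    """Fetch and decode one JSON feed; return None on failure."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)
    except (OSError, ValueError):
        return None

def poll_sources(on_update, interval_s=10):
    """Poll each source and pass fresh data to the presentation layer."""
    while True:
        for name, url in FEED_URLS.items():
            data = fetch_feed(url)
            if data is not None:
                on_update(name, data)  # e.g., re-render affected overlays
        time.sleep(interval_s)
```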
[0021] In an embodiment of the present invention utilizing traffic
data specific to the San Francisco Bay area, information sources
130 may include one or more of the 511.org system (a collaboration
of public agencies including the California Highway Patrol,
Metropolitan Transportation Commission, and CALTRANS), the
California Highway Patrol (CHP) World Wide Web server, the PeMS
system at the University of California at Berkeley, various public
event listings, or a publicly or privately accessible user input
mechanism. For weather data, the information sources 130 may
include the National Weather Service among other weather
information sources. Other data sources or alternative types of
data sources (e.g., non-traffic and non-weather related sources)
may be incorporated and utilized in various embodiments of the
present invention.
[0022] Touch screen 140 may be any multi-touch touch screen known
in the art capable of recognizing complex gestures. Touch screen
140 may employ any touch screen technology known in the art
including but not limited to resistive technology, surface acoustic
wave technology, capacitive sensing (e.g., surface capacitance,
projected capacitance, mutual capacitance, self capacitance),
infrared, force panel technology, optical imaging, dispersive
signal technology, acoustic pulse recognition, and the like. A
presenter may interact with touch screen 140 using any interaction
technique known in the art such as touch-drag motions, "pinching,"
(e.g., zooming in or out of a web page or photo by touching the
user interface and either spreading two fingers apart or bringing
two fingers close together), scrolling (e.g. sliding a finger up
and down or left and right to scroll through a page), or other
user-centered interactive effects (e.g. horizontally sliding
sub-section, bookmarks menu, menu bars, and a "back" button). Touch
screen 140 may include various sensors such as a light sensor for
adjusting touch screen brightness. Touch screen 140 may also
include a tilt sensor, an accelerometer, a gyroscopic component,
and/or magnetometer for sensing orientation of touch screen 140
and/or switching between landscape and portrait modes. Touch screen
140 may be a liquid crystal display or any other suitable
display.
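As a minimal sketch of how such gestures might be distinguished from raw contact points, the following two-frame heuristic classifies a one-finger drag versus a two-finger pinch; the thresholds and function names are illustrative assumptions, not part of the disclosure.

```python
import math

def classify_gesture(prev_touches, curr_touches):
    """Classify a touch frame as 'drag', 'pinch_in', or 'pinch_out'.

    Each argument is a list of (x, y) contact points. This is a minimal
    two-frame heuristic, not a full multi-touch gesture engine.
    """
    if len(curr_touches) == 1 and len(prev_touches) == 1:
        return "drag"
    if len(curr_touches) == 2 and len(prev_touches) == 2:
        prev_d = math.dist(prev_touches[0], prev_touches[1])
        curr_d = math.dist(curr_touches[0], curr_touches[1])
        if curr_d > prev_d * 1.05:
            return "pinch_out"  # fingers spreading apart
        if curr_d < prev_d * 0.95:
            return "pinch_in"   # fingers closing together
    return "none"
```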
[0023] The broadcast system 150 disseminates the composite
presentation to viewers. Dissemination may occur via radio waves
such as UHF or VHF, cable, satellite, or the World Wide Web.
Hardware and software necessary to effectuate a broadcast may be
included in the broadcast system 150 and are generally known to
those skilled in the broadcasting art.
[0024] FIG. 2 illustrates the virtual broadcast presentation engine
of FIG. 1. The virtual broadcast presentation engine 120 of FIG. 2
includes a communications module 210, a presentation rendering
module 220, a selection module 230, a feedback module 240, and a
trip calculation module 250. The virtual broadcast presentation
engine 120 and its constituent modules may be stored in memory and
executed by a processing device to effectuate the functionality
corresponding thereto. The virtual broadcast presentation engine
120 may be composed of more or fewer modules (or combinations of the
same) and still fall within the scope of the present invention. For
example, the functionality of the selection module 230 and the
functionality of the feedback module 240 may be combined into a
single module.
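A minimal sketch of one possible composition of these modules follows, with each module modeled as a host-supplied callable; only the module names mirror FIG. 2, and the signatures are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class VirtualBroadcastPresentationEngine:
    """Illustrative composition of the engine of FIG. 2."""
    receive_signal: Callable[[], Optional[dict]]     # communications 210
    render: Callable[[Optional[dict]], None]         # rendering 220
    select_element: Callable[[dict], Optional[str]]  # selection 230
    give_feedback: Callable[[str], None]             # feedback 240
    trip_time: Callable[[str, str], float]           # trip calculation 250

    def tick(self) -> None:
        """One update cycle: read a touch signal, select, re-render."""
        signal = self.receive_signal()
        if signal is not None and signal.get("type") == "select":
            element = self.select_element(signal)
            if element is not None:
                self.give_feedback(element)
        self.render(signal)
```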
[0025] Execution of the communications module 210 allows for
receipt of a signal generated by touch screen 140, which may be
based at least partially on a user selection such as the selection
by a presenter of an interactive element displayed within the
virtual broadcast presentation. The signal may additionally be
based, in part or in whole, on the actuation of other components
included within the virtual broadcast presentation such as various
soft keys with different functionalities.
[0026] In addition to the signal generated by touch screen 140,
execution of the communications module 210 may also allow for
receipt of dynamic information from information sources 130. This
dynamic information may be used by other modules for generating,
manipulating, and interacting with the virtual broadcast
presentation.
[0027] Referring again to FIG. 2, execution of the presentation
rendering module 220 allows for the generation of a virtual
broadcast presentation based on the dynamic information received
through execution of the communications module 210. The dynamic
information may include traffic information, weather information,
newsworthy events or incidents, election results, school closings,
or other information that may be featured in a virtual broadcast
presentation.
[0028] Execution of the presentation rendering module 220 may also
allow for manipulation of a view of the virtual broadcast
presentation in response to the signal received by the
communications module 210 from touch screen 140. Manipulating the
view of the presentation may include one or more of panning across,
rotating, tilting, or zooming in/out of the virtual broadcast
presentation. Signals corresponding to various motions of touch
screen 140 may be assigned to various other manipulations of the
virtual broadcast presentation. For example, touching touch screen
140 with one finger and moving the finger upwards may adjust the
view and scroll upwards along the map or presentation. As another
example, actuation of a soft key displayed within the virtual
broadcast presentation may affect zoom speed, whereas actuation of
a different soft key may affect zoom direction.
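The mapping from signals to view manipulations might be sketched as follows; the ViewState fields, their units, and the signal dictionary keys are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    """Camera state for the presentation view; units are assumed."""
    x: float = 0.0        # pan east/west
    y: float = 0.0        # pan north/south
    tilt: float = 45.0    # degrees from overhead
    heading: float = 0.0  # degrees clockwise from north
    zoom: float = 1.0

def apply_signal(view: ViewState, signal: dict) -> ViewState:
    """Map a processed touch signal to a view manipulation."""
    kind = signal["type"]
    if kind == "drag":               # one-finger motion scrolls the map
        view.x += signal["dx"]
        view.y += signal["dy"]
    elif kind == "pinch_out":        # spreading fingers zooms in
        view.zoom *= 1.1
    elif kind == "pinch_in":         # closing fingers zooms out
        view.zoom /= 1.1
    elif kind == "tilt_key":         # soft key adjusts vertical perspective
        view.tilt = max(0.0, min(90.0, view.tilt + signal["delta"]))
    elif kind == "rotation_key":
        view.heading = (view.heading + signal["delta"]) % 360.0
    elif kind == "orientation_key":  # snap north back to the top
        view.heading = 0.0
    return view
```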
[0029] Execution of the selection module 230 allows for selection
of an interactive element included in the virtual broadcast
presentation in response to the received signal. An interactive
element may include a soft key displayed within the virtual
broadcast presentation. The interactive element may also represent
a traffic alert. For example, if road construction is taking place
at a given intersection of two streets, an icon indicative of road
construction may be placed in the virtual broadcast presentation at
a position that corresponds to that given intersection. Execution
of the selection module 230 may also select the interactive element
when the interactive element is positioned near the center of the
virtual broadcast presentation.
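A minimal sketch of such center-proximity selection follows, assuming screen-space element positions and an illustrative pixel radius for "near the center."

```python
import math

def selectable_element(elements, screen_w, screen_h, radius_px=60):
    """Return the interactive element nearest the screen center, if any.

    `elements` maps element ids to (x, y) screen positions. The pixel
    radius is an assumed threshold for 'near the center'.
    """
    cx, cy = screen_w / 2, screen_h / 2
    best_id, best_d = None, radius_px
    for elem_id, (x, y) in elements.items():
        d = math.hypot(x - cx, y - cy)
        if d <= best_d:
            best_id, best_d = elem_id, d
    return best_id

# Example: a road-construction icon near the center becomes selectable.
elements = {"construction_5th_and_main": (962, 538), "camera_i80": (120, 40)}
print(selectable_element(elements, 1920, 1080))  # construction_5th_and_main
```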
[0030] Selecting the interactive element may cause one of a variety
of responses from the virtual broadcast presentation. For example,
selection of an interactive element may cause additional
information related to the interactive element to be displayed
within the virtual broadcast presentation. In one embodiment, the
interactive element may correspond to a traffic camera wherein
selection of the interactive element causes a live camera view to
appear within the virtual broadcast presentation.
[0031] Execution of the feedback module 240 provides feedback to
the presenter to inform the presenter that a given interactive
element is selectable. For example, the interactive element may be
selectable in certain regions of the virtual broadcast
presentation, such as the center. When the interactive element
enters or leaves the center of the virtual broadcast presentation,
the presenter may be informed via feedback. The feedback may
include highlighting of the interactive element. To avoid
distracting or otherwise undesirable imagery such as a cursor being
included in the virtual broadcast presentation, non-visible
feedback may be invoked. Examples of non-visible feedback include a
vibration of touch screen 140 or an audible tone.
[0032] Execution of the feedback module 240 also provides feedback
to the presenter that a given interactive element has been
successfully selected. For example, if the presenter has selected a
particular interactive element, feedback module 240 may highlight
the interactive element, change the color or appearance of the
interactive element, or cause the interactive element to blink or
flash continually. Such feedback confirms the selection of the
interactive element and prevents the presenter from selecting the
same interactive element multiple times.
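One hedged sketch of such feedback on entering or leaving the selectable region, assuming hypothetical vibrate and tone callbacks exposed by the touch screen hardware:

```python
def feedback_on_transition(was_centered, is_centered, vibrate, play_tone):
    """Emit non-visible feedback when an element enters or leaves the
    selectable center region. The vibrate and play_tone callbacks are
    assumed host-provided functions, not part of the disclosure.
    """
    if is_centered and not was_centered:
        vibrate(duration_ms=30)   # element just became selectable
    elif was_centered and not is_centered:
        play_tone(freq_hz=440)    # element left the selectable region
```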
[0033] Execution of the trip calculation module 250 may allow for
the determination or calculation of an estimated amount of time
(e.g., `trip time`) needed to travel from a selected location to
another location. For example, the presenter may select a first
interactive element displayed in the virtual broadcast presentation
wherein the first interactive element corresponds to a starting
point or location. The presenter may then select a second
interactive element displayed in the presentation that corresponds
to a desired end point or destination location. An interactive
element or starting/end point may include a particular street,
road, landmark or point of interest, highway, neighborhood, town,
city, area, region or the like. Trip calculation module 250 may
calculate the estimated amount of time required to traverse the
real world distance from the first selected interactive element to
the second interactive element in real-time considering, at least
in part, information from information sources 130. When calculating
a trip time, trip calculation module 250, for example, may consider
the actual distance from the starting point to the end point, as
well as various conditions affecting travel, including current
weather conditions or traffic conditions such as a recent accident
or road closure. In another embodiment, trip calculation module 250
may be used to calculate an estimated travel distance between two
selected locations. Execution of trip calculation module 250 may
occur following the actuation of a `mode key` as discussed further
in FIG. 3A below.
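A simplified sketch of such a calculation follows; the congestion, weather, and incident parameters are illustrative stand-ins for the conditions drawn from information sources 130, and the specific scaling model is an assumption.

```python
def estimate_trip_time(distance_miles, base_speed_mph,
                       congestion_factor=1.0, weather_factor=1.0,
                       incident_delay_min=0.0):
    """Estimate trip time in minutes from a start point to an end point.

    congestion_factor and weather_factor scale the free-flow travel
    time (values above 1.0 mean slower), and incident_delay_min adds a
    fixed delay for accidents or road closures.
    """
    free_flow_min = distance_miles / base_speed_mph * 60.0
    return (free_flow_min * congestion_factor * weather_factor
            + incident_delay_min)

# 20 miles at 60 mph free flow, moderate congestion, rain, one accident:
print(round(estimate_trip_time(20, 60, congestion_factor=1.4,
                               weather_factor=1.1,
                               incident_delay_min=5)))  # -> 36
```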
[0034] Execution of the virtual broadcast presentation engine 120
may output the virtual broadcast presentation to other components
of the computing device 110 for generation of the composite
presentation. Accordingly, the computing device 110 may output the
composite presentation to the broadcast system 150 for
dissemination to viewers.
[0035] FIG. 3A illustrates a virtual broadcast presentation 300
displayed on a touch screen 140. The presentation 300 of FIG. 3A
includes traffic information. The principles described herein with
respect to traffic are equally applicable to embodiments of the
present invention that include weather information, newsworthy
events or incidents, school closings, election results, or other
information that may be featured on a virtual broadcast
presentation. Presentation 300 may be generated and manipulated by
execution of the presentation rendering module 220 in real-time.
Presentation 300 may include satellite images of a given area with
an animated road traffic report. A detailed description of animated
road traffic reports may be found in U.S. patent application Ser.
No. 11/302,418, now U.S. Pat. No. 7,221,287, the disclosure of
which is incorporated by reference.
[0036] Satellite images may be manipulated by execution of the
presentation rendering module 220 to aid in generating
three-dimensional information. For example, two-dimensional
satellite images may be processed in the context of other
geographical information (e.g., topographical information) to
generate a three-dimensional satellite image that reflects
information along an x-, y-, and z-axis as illustrated in
presentation 300. The textured three-dimensional representation of
the landscape of a particular urban area aligns with and provides
the three-dimensional coordinates for the roadways that may be
animated and overlaid on the satellite images.
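A toy sketch of draping a two-dimensional image over topographic heights to obtain x-, y-, and z-coordinates; the elevation callback and grid granularity are assumptions of the sketch.

```python
def drape_image_over_terrain(width, height, elevation):
    """Build (x, y, z) vertices that drape a satellite image over terrain.

    `elevation(x, y)` returns ground height from topographic data; each
    image pixel (or a coarser grid cell) becomes a textured vertex, so
    roadway overlays can be positioned with the same 3-D coordinates.
    """
    vertices = []
    for y in range(height):
        row = [(x, y, elevation(x, y)) for x in range(width)]
        vertices.append(row)
    return vertices

# Toy example: a gentle slope rising to the east.
grid = drape_image_over_terrain(4, 3, lambda x, y: 10.0 * x)
print(grid[0])  # [(0, 0, 0.0), (1, 0, 10.0), (2, 0, 20.0), (3, 0, 30.0)]
```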
[0037] The presentation 300 may also include a variety of markers
(310A-310C) to identify or label various locations, landmarks, or
points of interest appearing in presentation 300 such as exit
ramps, highways, named sections of highways, or city streets. These
markers may be readily or universally recognizable, such as a
highway marker resembling a California state highway sign with the
appropriate highway number. Presentation 300 may also include
markers or icons corresponding to the location of traffic
incidents, road construction, and traffic cameras. Some or all of
these markers 310C may be interactive elements of the virtual
broadcast presentation 300 and show real-time conditions, such as
an average traffic speed associated with a particular location. An
interactive element may include any marker, icon, label, object, or
image appearing in presentation 300 that may be associated with
real-time content or data. An interactive element, for example, may
include a street, road, bridge, highway, landmark, point of
interest, traffic incident or alert, road construction, or traffic
camera.
[0038] A presenter 305 may select an interactive element using
touch screen 140. FIG. 4 illustrates an interactive element
appearing in a virtual broadcast presentation 300 displayed on
touch screen 140. In one embodiment, an interactive element 410
(i.e., traffic incident) may be marked by a particular icon, image,
or symbol (e.g., an arrow pointing to the location of the traffic
incident), as shown in FIG. 4. When an interactive element is
selected, additional information related to that interactive
element may be displayed. In one embodiment, an interactive element
marking a traffic incident may be selected resulting in detailed
textual information describing the traffic incident being displayed
within presentation 300 (not shown).
[0039] Returning to FIG. 3A, presentation 300 may include images of
vehicles 315 appearing along a specific roadway or highway. A
vehicle 315 may be animated, for example, to show the speed and
direction of traffic along a particular highway. Presentation 300
may also use color coding to demonstrate real-time traffic
conditions. Color coding may help a viewer of the presentation 300
to quickly understand real-time traffic conditions associated with
a depicted map or location. Presentation 300 may include a legend
320 describing various objects or color representations used in
presentation 300. A `green` colored section of a road, street, or
highway, for example, may represent that real-time traffic is
moving at a speed of 50 miles per hour or higher (e.g., normal or
optimal conditions). A `yellow` colored highway may represent
traffic speeds of 25 miles per hour or higher (e.g., delayed
conditions), while a `red` colored highway may represent traffic
speeds that are less than 25 miles per hour (e.g., slow or impacted
conditions).
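The legend's color coding reduces to a simple threshold mapping, sketched here directly from the speed ranges above.

```python
def speed_to_color(speed_mph):
    """Map real-time traffic speed to the legend colors of FIG. 3A."""
    if speed_mph >= 50:
        return "green"   # normal or optimal conditions
    if speed_mph >= 25:
        return "yellow"  # delayed conditions
    return "red"         # slow or impacted conditions

assert speed_to_color(65) == "green"
assert speed_to_color(30) == "yellow"
assert speed_to_color(10) == "red"
```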
[0040] The presentation 300 may also display one or more soft keys
with various functionalities such as orientation key 325, tilt key
330, rotation key 335, synchronization key 340, previous and next
presentation display keys 345A-345B, and mode key 350. Presenter
305 may actuate a soft key to facilitate or enhance the
understanding of the content of presentation 300. For example,
presenter 305 may use tilt key 330 to adjust or modify a view or
perspective of presentation 300 vertically or horizontally. The
presenter 305 may also change the perspective of presentation 300
by actuating rotation key 335. Changing the perspective of
presentation 300 may alter the orientation of the presentation such
that a `north` direction of a map or image is not oriented at the
top of touch screen 140. As such, presenter 305 may actuate
orientation key 325 to return the `north` direction to the top of
touch screen 140. In one embodiment, presenter 305 may touch a soft
key with one finger or hand (e.g., tilt key 330 or rotation key
335) while using the other hand to activate the functionality of
the soft key (e.g., move or adjust the touch screen in the desired
direction).
[0041] The presentation 300 may also include a synchronization key
340. Presentation 300 may be generated based on information
received in real-time or near real-time through execution of
communications module 210. Presenter 305 may actuate
synchronization key 340 to cause the synchronization of data in
real time such that presentation 300 reflects the most current
information and conditions. In one embodiment, synchronization of
data may be done automatically. In another embodiment, presenter
305 or another user may program or instruct computing device 110 to
synchronize data at regular time periods (e.g., every 10 seconds,
every minute, every two minutes, etc.).
[0042] A presenter 305 may zoom in or out of presentation 300 by
actuating keys corresponding to a particular view of the
presentation, such as a previous key 345A and a next key 345B. For
example, previous key 345A may revert to a presentation that offers
a zoom out view while next key 345B may allow a view that zooms in
from the current view. Previous and next keys may, alternatively,
be assigned zoom in or zoom out functionality. Presenter 305 may
actuate a particular key (345A, 345B) multiple times to further
zoom in or out of the current view. In one embodiment, the previous
key 345A and next key 345B may be used to display or shift to a
different image or map within presentation 300.
[0043] The presentation 300 may also include mode key 350.
Presenter 305 may operate presentation 300 in different modes such
as `trip time mode` or `navigation mode.` Presenter 305 may switch
between various modes by actuating mode key 350. Presenter 305 may
use navigation mode to view presentation 300 as described in FIG.
3B below. Trip time mode is discussed in further detail in FIGS. 6A
and 6B below.
[0044] FIG. 3B illustrates the virtual broadcast presentation 300
of FIG. 3A following manipulation by presenter 305. Presenter 305
may manipulate presentation 300 in navigation mode to review or
illustrate real-time traffic conditions (e.g., average traffic
speeds, traffic incidents, etc.) associated with various locations
depicted in presentation 300. A view of presentation 300 may be
manipulated to give the effect of `flying,` or scrolling through
the three-dimensional virtual representation of the traffic map and
images. As presenter 305 scrolls through presentation 300, various
interactive elements may be highlighted and/or become available for
selection. FIG. 3B illustrates presentation 300 of FIG. 3A
following presenter 305 touching touch screen 140 and scrolling
through presentation 300. As a result, presentation 300 (as shown
in FIG. 3B) shows a magnified portion of presentation 300 (i.e.,
the intersection of highways 287 and 107) and the associated
traffic conditions (e.g., traffic speeds).
[0045] Presenter 305 may interact with presentation 300 using other
interaction techniques known in the art. For example, FIG. 5
illustrates the interaction technique of "pinching" (e.g., zooming
in or out of presentation 300) used with virtual broadcast
presentation 300 displayed on touch screen 140. As shown in FIG. 5,
presenter 305 may interact with presentation 300 by touching touch
screen 140 and bringing two fingers closer together (on one hand or
with two). Such motion may cause the view associated with
presentation 300 to zoom out of the current viewpoint.
[0046] Besides zooming in or out of presentation 300, presenter 305
may also manipulate presentation 300 by panning, tilting, and/or
rotating the view. For example, as presenter 305 touches touch
screen 140 to scroll through presentation 300, touch screen 140
generates a corresponding signal that is received in conjunction
with execution of the communications module 210. In turn, the
presentation rendering module 220 may be executed to move or rotate
the presentation 300 a corresponding amount as presenter 305
manipulated the touch screen 140. The correspondence of the
presentation 300 to manipulation of the touch screen 140 gives the
presenter 305 the sensation of directly controlling the
presentation 300. Such manipulation of the view may also be used in
selecting interactive elements. For example, if a particular
interactive element may be selected only when near the center of
the presentation 300, the presenter may cause the view to be
manipulated such that the particular interactive element is
centered and therefore selectable.
[0047] FIGS. 6A-6B illustrate a virtual broadcast presentation in
`trip time mode.` Presenter 305 may activate trip time mode by
actuating mode key 350. Once trip time mode has been activated,
presenter 305 may select an interactive element corresponding to a
first location or starting point by touching the interactive
element within presentation 300 displayed on touch screen 140. As
shown in FIG. 6A, presenter 305 has selected or designated
"83.sup.rd Ave" as a starting point. Following selection of the
first location, display 355A may appear confirming the selection of
presenter 305.
[0048] Presenter 305 may then select another interactive element
corresponding to a second location or end point or travel
destination by touching a second interactive element within
presentation 300 displayed on touch screen 140. As shown in FIG.
6B, presenter 305 has selected or designated "1st Ave" as an end
point. Following selection of the second interactive element, trip
calculation module 250 may calculate the estimated amount of time
required to traverse the real world distance from the first
selected interactive element (i.e., "83.sup.rd Ave") to the second
interactive element (i.e., "1st Ave") in real-time considering, at
least in part, information from information sources 130. For
example, trip calculation module 250 may consider various
conditions affecting travel such as weather conditions or traffic
conditions such as a recent accident, a road closure, or any other
delay. Display 355B may then display the estimated trip time (i.e.,
"28 minutes"), as well as any condition affecting travel such as
weather conditions or a traffic delay, within presentation 300 on
touch screen 140. Display 355B may also show the route (i.e.,
highway "25") associated with the calculated trip time.
[0049] Besides calculating the estimated trip time in real-time,
trip calculation module 250 may calculate or forecast the estimated trip
time based on a time of day and/or date (i.e., special day or
occasion) designated by presenter 305. For example, presenter 305
may want to determine the estimated trip time at 9:00 AM (e.g.,
morning rush hour) or at 8:00 PM (e.g., a later evening hour). As
another example, presenter 305 may want to determine the estimated
trip time when departing at a particular time on the Labor Day
holiday or on a date when a sporting event, concert, or other large
gathering is scheduled at a venue. In trip time mode, presenter 305
may input the desired time of day and/or date and select a starting
point and end point for trip time calculation. In another
embodiment, trip time mode may also be used to calculate an
estimated travel distance between two selected locations (not
shown). The calculated estimated travel distance may also be
displayed within presentation 300.
[0050] FIGS. 7A-7B illustrate a traffic camera appearing within
virtual broadcast presentation 300 displayed on touch screen 140.
In one embodiment, an interactive element appearing in presentation
300 may include a traffic camera (710A, 710B). Presenter 305 may
select traffic camera 710A by touching the traffic camera 710A
within presentation 300 displayed on touch screen 140 (as shown in
FIG. 7A). Following selection of traffic camera 710A associated
with a particular location, a live video feed 720 corresponding to
the location of a real-world traffic camera may be displayed within
presentation 300 (as shown in FIG. 7B). Presenter 305 may then use
live video feed 720 to view actual traffic conditions associated
with the real-world location of traffic camera 710A.
[0051] FIG. 8 is a flow chart illustrating a method 800 for touch
screen interaction with traffic data presented in a virtual
broadcast presentation. The steps of method 800 may be performed in
varying orders. Steps may be added or subtracted from the method
800 and still fall within the scope of the present invention. The
steps of the process of FIG. 8 may be embodied in hardware or
software including a non-transitory computer-readable storage
medium comprising instructions executable by a processor of a
computing device.
[0052] At step 810, a real-time, virtual broadcast presentation 300
is generated. The presentation 300 may be based on dynamic
information and may be two-dimensional or three-dimensional.
Execution of the presentation rendering module 220 may perform step
810. The dynamic information may include real-time traffic
information or real-time weather information and be received from
the information sources 130 in conjunction with execution of the
communications module 210.
[0053] At step 820, a signal generated by touch screen 140 may be
received. The signal generated by touch screen 140 may be based at
least partially on the selection by a presenter of an interactive
element displayed within presentation 300 on touch screen 140. The
signal may also be based on the actuation of other components
included in the touch screen 140 such as soft keys. Step 820 may be
performed by execution of the communications module 210. Receipt of
the signal in step 820 allows for processing of presentation 300 at
step 830.
[0054] At step 830, presentation 300 is processed in response to
the signal received at step 820. Execution of the presentation
rendering module 220 may perform step 830. Presentation 300 may be
processed, for example, to allow for real-time manipulation of
presentation 300 and various views thereof such as zooming in and
out, scrolling, panning across, tilting, or rotating presentation
300. Presentation 300 may also be processed based on the actuation
of a particular soft key displayed within presentation 300 on touch
screen 140.
[0055] At step 840, presentation 300 is updated in response to the
processed signal from step 830. Execution of the presentation
rendering module 220 may perform step 840. For example,
presentation 300 may be updated to show a manipulated viewpoint
desired by presenter 305 (e.g., rotated or tilted presentation).
Presentation 300 may also be updated to show presentation 300 in a
particular mode such as `navigation mode` or `trip time mode.`
Presentation 300 may also be updated to display information
associated with an interactive element selected by presenter 305,
such as information regarding a traffic incident, road closure, or
average travel speeds.
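Steps 810 through 840 can be sketched as a simple pipeline; the four callables stand in for the module executions described above, and their exact signatures are assumptions for illustration.

```python
def run_method_800(generate, receive_signal, process, update):
    """One pass of method 800, steps 810 through 840.

    generate, receive_signal, process, and update are host-supplied
    stand-ins for the rendering and communications module executions.
    """
    presentation = generate()              # step 810: render presentation
    signal = receive_signal()              # step 820: touch screen signal
    if signal is not None:
        result = process(presentation, signal)       # step 830
        presentation = update(presentation, result)  # step 840
    return presentation
```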
[0056] Any number of additional and/or optional steps that are not
otherwise depicted may be included in method 800. These steps may
include selection of an interactive element included in the virtual
broadcast presentation using touch screen 140 or feedback being
provided to the presenter to inform the presenter that an
interactive element is selectable.
[0057] It is noteworthy that any hardware platform suitable for
performing the processing described herein is suitable for use with
the invention. Computer-readable storage media refer to any medium
or media that participate in providing instructions to a central
processing unit (CPU) for execution. Such media can take forms,
including, but not limited to, non-volatile and volatile media such
as optical or magnetic disks and dynamic memory, respectively.
Common forms of computer-readable storage media include a floppy
disk, a flexible disk, a hard disk, magnetic tape, any other
magnetic medium, a CD-ROM disk, digital video disk (DVD), any other
optical medium, RAM, PROM, EPROM, a FLASH EPROM, or any other memory
chip or cartridge.
[0058] Various forms of transmission media may be involved in
carrying one or more sequences of one or more instructions to a CPU
for execution. A bus may carry data to system RAM, from which a CPU
retrieves and executes the instructions. The instructions received
by system RAM may optionally be stored on a fixed disk either
before or after execution by a CPU.
[0059] The foregoing detailed description of the technology herein
has been presented for purposes of illustration and description. It
is not intended to be exhaustive or to limit the technology to the
precise form disclosed. Many modifications and variations are
possible in light of the above teaching. The described embodiments
were chosen in order to best explain the principles of the
technology and its practical application to thereby enable others
skilled in the art to best utilize the technology in various
embodiments and with various modifications as are suited to the
particular use contemplated. It is intended that the scope of the
technology be defined by the claims appended hereto.
* * * * *