U.S. patent application number 14/797425 was published by the patent office on 2015-11-05 for presentation of content during navigational instructions.
This patent application is currently assigned to AT&T Intellectual Property II, L.P. The applicant listed for this patent is AT&T Intellectual Property II, L.P. Invention is credited to Stephen Griesmer, Arun Kandappan, and Neerav Mehta.
Application Number | 14/797425 |
Publication Number | 20150316392 |
Document ID | / |
Family ID | 40999105 |
Publication Date | 2015-11-05 |
United States Patent Application | 20150316392 |
Kind Code | A1 |
Griesmer; Stephen; et al. |
November 5, 2015 |
Presentation of Content During Navigational Instructions
Abstract
A window of time between navigational instructions is
determined. Non-navigational content is then determined for audible
and/or visual presentation within the window of time. Emails, text
messages, and even video may thus be presented in between the
navigational instructions.
Inventors: | Griesmer; Stephen; (Westfield, NJ); Kandappan; Arun; (Morganville, NJ); Mehta; Neerav; (Edison, NJ) |
Applicant: | Name | City | State | Country | Type |
| AT&T Intellectual Property II, L.P. | Atlanta | GA | US | |
Assignee: | AT&T Intellectual Property II, L.P. | Atlanta | GA |
Family ID: | 40999105 |
Appl. No.: | 14/797425 |
Filed: | July 13, 2015 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
12072240 | Feb 25, 2008 | 9109918
14797425 | |
Current U.S. Class: | 701/487 |
Current CPC Class: | G01C 21/3691 20130101; G01C 21/3655 20130101; G01C 21/3626 20130101; G01S 19/52 20130101; G01C 21/3697 20130101 |
International Class: | G01C 21/36 20060101 G01C021/36; G01S 19/52 20060101 G01S019/52 |
Claims
1. A method, comprising: receiving, by a processor, global
positioning system information; determining, by the processor,
navigational instructions based on the global positioning system
information; determining, by the processor, a duration of time
between successive aural announcements of the navigational
instructions based on the global positioning system information;
receiving, by the processor, electronic content; and processing, by the
processor, the electronic content for presentation during the
duration of time between the successive aural announcements of the
navigational instructions.
2. The method of claim 1, further comprising determining a speed
based on the global positioning system information.
3. The method of claim 1, further comprising determining a distance
based on the global positioning system information.
4. The method of claim 1, further comprising displaying at least
one of the navigational instructions.
5. The method of claim 1, further comprising receiving digital
music as the electronic content.
6. The method of claim 1, further comprising receiving electronic
advertising as the electronic content.
7. The method of claim 1, further comprising receiving weather
information as the electronic content.
8. The method of claim 1, further comprising receiving traffic
information as the electronic content.
9. A system, comprising: a processor; a global positioning system;
and a memory storing instructions that when executed cause the
processor to perform operations, the operations comprising:
receiving global positioning system information; determining
navigational instructions based on the global positioning system
information; determining a duration of time between successive
aural announcements of the navigational instructions; querying an
electronic database for the duration of time, the electronic
database having electronic database associations between different
electronic content and different durations of time; retrieving
electronic content from the electronic database having an
electronic database association with the duration of time between
the successive aural announcements of the navigational
instructions; and processing the electronic content for
presentation during the duration of time between the successive
aural announcements of the navigational instructions.
10. The system of claim 9, wherein the operations further comprise
determining a speed based on the global positioning system
information.
11. The system of claim 9, wherein the operations further comprise
determining a distance based on the global positioning system
information.
12. The system of claim 9, wherein the operations further comprise
displaying at least one of the navigational instructions.
13. The system of claim 9, wherein the operations further comprise
retrieving digital music as the electronic content.
14. The system of claim 9, wherein the operations further comprise
retrieving electronic advertising as the electronic content.
15. The system of claim 9, wherein the operations further comprise
retrieving weather information as the electronic content.
16. The system of claim 9, wherein the operations further comprise
retrieving traffic information as the electronic content.
17. A memory device storing instructions that when executed cause a
processor to perform operations, the operations comprising:
receiving global positioning system information; determining
successive navigational instructions based on the global
positioning system information; determining a duration of time
between successive aural announcements of the successive
navigational instructions based on the global positioning system
information; querying an electronic database for the duration of
time, the electronic database having electronic database
associations between different filenames and different durations of
time; retrieving a filename from the electronic database having an
electronic database association with the duration of time between
the successive aural announcements of the successive navigational
instructions; retrieving an electronic file associated with the
filename; and processing the electronic file for presentation
during the duration of time between the successive aural
announcements of the successive navigational instructions.
18. The memory device of claim 17, wherein the operations further
comprise retrieving digital music as the electronic file.
19. The memory device of claim 17, wherein the operations further
comprise retrieving electronic advertising as the electronic
file.
20. The memory device of claim 17, wherein the operations further
comprise retrieving at least one of weather information and traffic
information as the electronic file.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of U.S. application Ser.
No. 12/072,240 filed Feb. 25, 2008 and since issued as U.S. Pat.
No. ______, which is incorporated herein by reference in its
entirety.
BACKGROUND OF THE INVENTION
[0002] The present invention relates generally to managing
aural/visual delivery, and more particularly to managing
aural/visual delivery in a navigational environment.
[0003] A primary function of a navigational device is to present
navigational content, such as aural or visual-text directions and
visual maps, in a timely manner to a mobile user. This navigational
content may be transmitted to the mobile user through a service
provider or the navigation device may have all of the navigational
content already stored in the memory of the device. In the latter
case, the navigational content may be updated by means of downloads
or software packages.
[0004] Historically, there has been a clear delineation between
navigational devices and other devices. Currently, however, this
delineation has become blurred. Navigational devices, which historically have been
solely designated to receive navigational content, are currently
being expanded to include the ability to present non-navigational
content. Additionally, other devices, not historically used to
present navigational content--such as personal digital assistants
(PDAs) or cell phones--are now able to receive navigational
content. Therefore, a possibility exists where multiple types of
content--both navigational and non-navigational--may be presented
on the same device.
[0005] When a mobile user activates a device's navigational
capabilities, it is likely that the mobile user is in a situation
where receiving navigational information in a timely manner is
critical. An example of such an environment is when a mobile user
activates a device's navigational capabilities during an automobile
trip.
[0006] The environment within an automobile, as well as other
environments where a mobile user may desire to be presented with
navigational content, is often full of various aural and visual
presentation mediums. For example, while a mobile user is presented
with navigational content, he may also be listening to a radio or
talking on a cell phone. Additionally, as automobiles become more
advanced, more aural and visual information will be presented to a
mobile user by the automobile itself regarding certain urgent and
non-urgent matters relating to its operation.
[0007] With so many different possible combinations for receiving
both navigational and non-navigational content, there is a need to
manage the presentation of the various contents.
SUMMARY OF THE INVENTION
[0008] We have recognized a problem that may arise when delivering
navigational and non-navigational content--whether the presentation
is on a single device or on multiple devices in a single
presentational environment. This problem is that presentation of at
least certain non-navigational content may interfere with
presentation of the navigational content and vice versa. For
example, the aural presentation of an extended news story may not
be completed by the time that navigational content, such as the
aural instruction "turn left 200 yards ahead," needs to be
presented. It is certainly undesirable to allow the news story to
continue and supplant the navigational content. But it is also
undesirable--albeit less so--to interrupt the complete presentation
of the news story in order to present the more urgent navigational
content. The concurrent presentation of navigational and
non-navigational content may not be a significant concern if the
two types of content are in different presentational media, i.e.
aural and visual. Thus if the mobile user has disabled the
presentation of aural navigational content--choosing to rely only
on visually presented map data and/or textual instructions--the
concurrent aural presentation of a news story and the presentation
of updated visual navigational content may not be problematic.
Split-screen techniques might also allow both navigational and
non-navigational content to be displayed on a screen concurrently.
But the problem will certainly arise when both navigational and
non-navigational content are to be presented aurally, as in the
example presented above.
[0009] In accordance with an aspect of this invention, the above
mentioned problems are solved by two steps. The first step is
ascertaining an available window of time before a piece of
navigational content needs to be presented to the mobile user. The
second step is identifying a piece of non-navigational content
which can be presented in its totality within the available window
of time. Choosing non-navigational content which may be completely
presented during an available window of time avoids the problem
noted above. The first step is illustratively carried out as a
function of (a) a current location and travel speed of a mobile
user and (b) a projected location at which the piece of
navigational content should be presented to the mobile user.
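The two steps described above can be sketched in code. This is a minimal illustrative sketch, not the patented implementation; the function names, units, and candidate list are assumptions introduced for illustration:

```python
# Step 1 (sketch): ascertain the available window of time before the
# next piece of navigational content must be presented, as a function
# of distance to the action point and travel speed.
def window_of_time(distance_to_action_m, speed_mps):
    if speed_mps <= 0:
        return float("inf")  # not moving: no imminent instruction
    return distance_to_action_m / speed_mps

# Step 2 (sketch): identify a piece of non-navigational content that
# can be presented in its totality within the available window.
def pick_content(window_s, candidates):
    """candidates: list of (name, duration_in_seconds)."""
    for name, duration_s in candidates:
        if duration_s <= window_s:
            return name
    return None  # nothing fits; wait for the navigational instruction

# Example: 900 m to the turn at 15 m/s leaves a 60-second window,
# long enough for a 45-second traffic report but not a 180-second story.
window = window_of_time(900, 15.0)
choice = pick_content(window, [("news story", 180), ("traffic report", 45)])
```

Because only content that fits the window is chosen, the news-story interruption described earlier cannot occur under this scheme.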
[0010] The term "navigational content" is used herein to refer
either to (a) an aural presentation of the instruction, (b) a
visual presentation of the instruction, or (c) a visual
presentation of a map for illustrating the instruction.
[0011] The term "non-navigational content" is used herein to refer
to all remaining content which is not classified as a piece of
navigational content. Some examples of non-navigational content
include: a weather report, a traffic report, an indication of the
duration of an available window of time before navigational content
will need to be presented, a news story, a phone call, an
advertisement, an automobile's mechanical report, an entertainment
option such as a piece of music or a video clip, miscellaneous
locally relevant information, and other miscellaneous content.
[0012] In one embodiment, the step of identifying a piece of
non-navigational content is performed by a service provider.
Alternatively, this step is performed utilizing a device of a
mobile user.
[0013] The identified piece of non-navigational content may be
transmitted via a wireless channel from a server of a service
provider to a mobile user's device. Alternatively, it may be
already available in the mobile user's device.
[0014] In accordance with a feature of the invention, the step of
identifying a piece of non-navigational content involves selecting from at
least two potential pieces of non-navigational content which can be
presented during an available window of time. In one example of
this feature of the invention, the service provider determines the
prioritization of the non-navigational content. In a second
example, the mobile user determines the prioritization of the
non-navigational content. In a third example, the prioritization is
based on a factory or merchant's default setting.
[0015] In accordance with another feature of the invention, an
available window of time may be determined with a calculation. In
another example, an available window of time is determined by looking up
the window of time in a table, the table containing values pertaining to
the mobile user's trip.
[0016] In accordance with another feature of the invention, in one
embodiment, a presentation medium of the navigational content and
non-navigational content is aural. In another embodiment, a
presentation medium of both contents is visual.
[0017] These features and the various advantages of the invention
can be more fully appreciated by reference to the following
detailed description and the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] FIG. 1 illustrates a device--mounted on an automobile
dashboard--able to present navigational content.
[0019] FIG. 2 is a table showing various types of non-navigational
content.
[0020] FIG. 3 illustrates a device--presenting non-navigational
content--being interrupted to present navigational content.
[0021] FIG. 4 illustrates a device presenting non-navigational
content aurally and navigational content visually.
[0022] FIG. 5 is a data table pertinent to presentation of
non-navigational content.
[0023] FIG. 6 is a diagram illustrating notices of upcoming
navigational actions and locations of completed navigational
actions for a typical trip.
[0024] FIG. 7 is a flowchart illustrating an aspect of the
principles of the invention.
[0025] FIG. 8 is a data table of prioritized non-navigational
content.
[0026] FIG. 9 is a block diagram illustrating an embodiment of a
server device.
[0027] FIG. 10 is a block diagram illustrating an embodiment of a
mobile user's device.
[0028] FIG. 11 is a flowchart illustrating an embodiment for
carrying out the invention.
DETAILED DESCRIPTION
[0029] When a mobile user activates a device's navigational
capabilities, the likelihood is that the mobile user is in a
situation where receiving navigational information in a timely
manner is critical. An example of such an environment is when a
mobile user receives navigational content during an automobile
trip.
[0030] FIG. 1 illustrates a device 101--mounted on a
dashboard--able to present navigational content. Out of the
automobile's window can be seen a road sign 102 for "Exit 14--Main
Street, Business District". The navigational device issues an aural
direction 103 "Make a slight right turn onto the exit ramp for Exit
14--'Business District' in 200 yards. Make a right turn at the end of
the exit ramp onto Main Street." The navigational device also
presents a visual map. An alternative visual display may be a
visual listing of the trip's navigational actions. Other visual
options may be envisioned.
[0031] Automobiles--as well as other environments where a mobile
user may desire to receive navigational content--are often full of
various devices on which all sorts of aural and visual content may
be presented. Some examples of these devices include: a
navigational device which is able to receive non-navigational
content, a cell phone, a PDA, a car stereo, and a processor
connected to the car which may issue audio or visual alerts such as
informing the driver concerning the car's operational status. Other
examples of such devices may be envisioned.
[0032] Some examples of non-navigational content are listed in the
table of FIG. 2. Non-navigational content may include: a weather
report; a traffic report; an indication of the duration of an
available window of time before navigational content will need to
be presented, which is useful--for example--in deciding whether to
place or receive a phone call at a particular time during a trip; a
news story; a phone call; an advertisement; an automobile's
mechanical report; an entertainment option such as a piece of music
or a video clip; miscellaneous locally relevant information, such
as emergency reports or local tourist attractions; and other
miscellaneous content.
[0033] We have recognized a problem that may arise when delivering
navigational and non-navigational content, whether the presentation
is on a single device or on multiple devices in a single
presentational environment. This problem is that presentation of at
least certain non-navigational content may interfere with
presentation of the navigational content and vice versa. An example
of how this problem may arise on a single device is illustrated in
FIG. 3. In this figure, a traffic report presented on a device 301
is providing information about the traffic on the local "bridges
and tunnels". The information concerning the traffic report is
interrupted with the navigational content 302 directing the driver,
"Make a right turn at the end of the exit ramp." An alternative mode of
presentation is for the traffic report to continue despite the
mobile user's need for the navigational information. Either
alternative may be frustrating to the mobile user, who desires to
know both the traffic on the Bayside Tunnel as well as the
instruction regarding when to turn onto the Bayside Tunnel.
[0034] A problem can also arise when the navigational and
non-navigational content are presented in the same environment.
Even though the multiple contents are not presented on the same
device, the multiple presentations of content may still distract a
mobile user from being able to focus on either content. For
example, the non-navigational information may be an aural traffic
report playing on the car stereo while the navigational information
may be an aural direction being presented on a navigational device.
There is no problem to mechanically play both pieces of aural
content simultaneously. However, the mobile user will likely find
it difficult to effectively listen to both pieces of information at
the same time.
[0035] The concurrent presentation of navigational and
non-navigational content may not be a significant concern if the
two types of content are in different presentational media, as is
shown in FIG. 4. In this example, the mobile user has disabled the
presentation of aural navigational instructions--choosing to rely
only on visually presented textual instructions 401. Therefore, the
concurrent aural presentation of a traffic report 402 and visual
presentation of textual navigational content may not be
problematic. Split-screen techniques might also allow both
navigational and non-navigational information to be displayed on a
screen concurrently. But, as mentioned previously concerning the
example in FIG. 3, the problem will certainly arise when both
navigational and non-navigational content are to be presented in
the same presentational medium.
[0036] In accordance with an aspect of this invention, the above
mentioned problems are solved by two steps. The first step is
ascertaining an available window of time before the piece of
navigational content needs to be presented to the mobile user. The
second step is identifying a piece of non-navigational content
which can be presented in its totality within the available window
of time. Choosing non-navigational content which may be completely
presented during an available window of time avoids the problems
noted above.
[0037] Before discussing this solution in further detail, some
additional background material will be useful.
[0038] FIG. 5 is a data table pertinent to a presentation of
non-navigational content. The data table indicates a mobile user's
decisions alongside factory settings. The mobile user's
decisions may have been obtained using a graphical user interface
(GUI). A GUI may be presented to a mobile user before each trip.
Alternatively, the GUI may be presented to the mobile user during
registration with a service provider to receive non-navigational
content.
[0039] Questions and settings that are shown in Group 1 allow the
mobile user to specify in which types of locations non-navigational
content may be displayed to the mobile user. The designation of
highway travel and local-road travel, in one embodiment, depends on
the road's speed limit. In a different embodiment, these
designations depend on the presence or absence of stop lights. In
another embodiment, these designations depend on the number of
lanes in the road. Other embodiments may be envisioned.
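The speed-limit embodiment of this designation can be sketched as follows; the 50 mph threshold is an assumed cutoff, not a value given in the text:

```python
# Sketch: classify a road as "highway" or "local" from its speed limit.
# Alternative embodiments mentioned above could instead inspect stop
# lights or lane counts.
HIGHWAY_MIN_MPH = 50  # assumed threshold

def road_class(speed_limit_mph):
    return "highway" if speed_limit_mph >= HIGHWAY_MIN_MPH else "local"
```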
[0040] Questions and settings in Group 2 are used for ascertaining
in which scenarios non-navigational content may be provided to a
mobile user. In one embodiment, when a mobile user begins using a
navigational device for a particular trip, the mobile user enters a
destination location. The navigational device determines the
current location of the mobile user. This determination may either
be from mobile user input or through a locating means in the
proximity to the mobile user. In one example, the locating means
are a global positioning system (GPS).
[0041] The default input values for all the groups may be set by a
manufacturer or a service provider at the time of the initiation of
the service. The customized settings may be determined by a mobile
user, a service provider, or a third party.
[0042] In Group 2, it is determined how much advance notice will
be given to a mobile user before an upcoming navigational action.
Additionally, it is determined in this group whether non-navigational
content may be displayed during: (1) "relaxed travel", which is
travel between a completed, prior navigational action and a notice
of an upcoming navigational action; and (2) "anxious travel",
which is travel between a notice of an upcoming navigational action
and the location for performing the navigational action.
[0043] The question and setting in Group 3 are used in determining
which travel speed will be used in determining an available window
of time. Some possible options for travel speed include: the legal
speed limit or limits for each stretch of travel; the actual travel
speed of the vehicle of the mobile user, which may include periodic
updates; and some offset value from either of these two
values, such as 5 miles per hour (mph) above the legal speed limit.
Values for the legal speed limit may be determined from a map
stored in the device or a server. Actual travel speed values may be
determined by a GPS. The GPS can make a time stamp at two different
locations during travel. By dividing the distance traveled by the
time difference between the two time stamps, the resulting value
gives the speed of travel.
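The speed computation above can be sketched as follows. The great-circle (haversine) distance between the two fixes is one reasonable choice of distance measure, though the text does not prescribe one; all names here are illustrative:

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix1, fix2):
    """Speed from two GPS fixes, each (latitude, longitude, unix_time_s):
    distance traveled divided by the time between the two time stamps."""
    lat1, lon1, t1 = fix1
    lat2, lon2, t2 = fix2
    if t2 == t1:
        return 0.0
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)
```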
[0044] Certain values in the data table have an effect on the
determination of the available window of time. First, it is determined
whether the mobile user desires to receive any non-navigational content
in various scenarios. These scenarios include highway travel,
local-road travel, relaxed travel, and anxious travel. Second,
in a scenario where non-navigational content may be provided, other
values in the data table are used to determine the available window
of time for presenting this content. One example is the value for a
distance before a navigational action when a "notice" of an
upcoming navigational action is displayed. This value indicates
that an early warning is being provided to the mobile user. Instead
of receiving navigational content immediately before a needed
action, this feature gives the mobile user a certain fixed time or
travel distance to prepare for the upcoming navigational
action.
[0045] Additionally, the travel speed obtained from the data table
is used to determine an available window of time. In one example
where the available window of time is determined from a
calculation, the travel speed is needed to determine how much time
the traveler will take to get from a starting point to a point
where navigational content will be presented. In another example, where
the available window of time is determined by looking up a value in a
table, a travel speed may be one of the inputs necessary to
determine the value.
[0046] FIG. 6 is a diagram illustrating (a) the positions of a
vehicle at which notices of upcoming navigational actions are given
during a typical trip and (b) locations of completed navigational
actions. A mobile user starts a trip at point A traveling on a road
with a speed limit of 65 mph. If the legal speed limit was chosen
in FIG. 5 as the travel speed to use, then 65 mph would be used
here. Additionally, the determination in FIG. 5 regarding whether a
mobile user wants to receive non-navigational content during
highway travel will affect where the mobile user may receive the
content during this phase of travel. Even if the mobile user
selects "yes" or the default value is "yes", it must still be
determined if there is a sufficient window of time for presenting
the non-navigational content.
[0047] The first notice of an upcoming navigational action is at
point B which is a certain distance before the initial road ends
and turns into a 45 mph road. At point C, the mobile user has
completed the navigational action and entered the 45 mph road.
Between points B and C is what is referred to as "anxious travel",
since the traveler is anticipating having to perform an upcoming
navigational action responsive to the notice that was delivered at
point B. The next notice of a navigational action is at point D,
where the mobile user is instructed to make a right turn onto a 30
mph road. Between points C and D is what is referred to as "relaxed
travel", since the traveler has not received any instructions for
upcoming, imminent navigational actions. However, since travel
between points C and D may be considered "local travel", this
option in FIG. 5 must have been set to "yes" in order to
potentially receive non-navigational content. Additionally, there
must still be a sufficient window of time available before the
navigational content is needed.
[0048] At point E, the mobile user has completed the navigational
action and entered the 30 mph road. The next notice of a
navigational action is at point F, where the mobile user is
instructed to enter an entrance ramp for a 65 mph road. At point G,
the mobile user has completed the navigational action and has
merged onto the 65 mph road. The next notice of a navigational
action is at point H, where the mobile user is instructed to turn
right on a 25 mph road at the end of the previous road. At point I,
the mobile user has completed the navigational action and has
entered the 25 mph road. At point J, the mobile user receives the
last notice of a navigational action where the mobile user is
instructed to turn into a movie theater's parking lot.
[0049] The flowchart of FIG. 7 illustrates an aspect of the
principles of the invention. The process begins at step 701. Step
702 involves ascertaining an available window of time between a
current location and a next navigational action point. In one
example, a window of time is calculated based on the distance
between the current location of the mobile user, a next
navigational action point, and the speed of travel of the mobile
user. A next navigational action point is a projected location
where a mobile user receives navigational content. The navigational
content may be received immediately before the necessary action
needs to be performed, or it may be received some time or distance
in advance of when the necessary action needs to be performed.
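Step 702's calculation can be sketched as follows. The advance-notice lead time (expressed here in seconds rather than the distance form shown in FIG. 5) and the default value are assumptions for illustration:

```python
# Sketch of step 702: the available window is the time to reach the
# next navigational action point, less any advance-notice lead time,
# floored at zero.
def available_window_s(distance_to_action_m, speed_mps, notice_lead_s=10.0):
    if speed_mps <= 0:
        return 0.0
    return max(0.0, distance_to_action_m / speed_mps - notice_lead_s)
```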
[0050] Step 703 involves identifying a piece of non-navigational
content which can be presented within the available window of time.
This step may be performed by a server managed by a service
provider. Alternatively, this step may be performed by a mobile
user's device.
[0051] When the step of identifying is performed by a server, the
non-navigational content may be transmitted to a mobile user's
device and stored for a later presentation. Alternatively, the
non-navigational content may be transmitted and then presented
immediately. In either case, the transmission from the server to
the mobile user's device may take place via a wireless
transmission, such as over a cellular network.
[0052] When the step of identifying is performed by a mobile user's
device, the non-navigational content may be provided to the device
from a server or from various other sources. Once the mobile user's
device has identified the non-navigational content, the content may
be presented immediately. Alternatively, the non-navigational
content may be presented at a later time.
[0053] When multiple pieces of non-navigational content may be
presented within a window of time, the step of identifying further
involves selecting a piece of non-navigational content with a
highest prioritization. A prioritization may be established as a
factory setting or by a service provider. Alternatively, a
prioritization may be established by a mobile user.
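The prioritized selection can be sketched as follows, using numeric ranks in the style of the FIG. 8 table (lower rank meaning higher priority); the candidate tuples are illustrative:

```python
# Sketch: among the pieces of non-navigational content that fit within
# the window, select the one with the highest priority (lowest rank).
def select_by_priority(window_s, candidates):
    """candidates: list of (name, duration_s, priority_rank)."""
    fitting = [c for c in candidates if c[1] <= window_s]
    if not fitting:
        return None
    return min(fitting, key=lambda c: c[2])[0]
```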
[0054] FIG. 8 is a data table of prioritized non-navigational
content. A default priority setting is shown alongside user-selected
values. The default priority setting may be a factory
setting or a setting made by a service provider. The user selected
values may be obtained from presenting a mobile user with a
graphical user interface (GUI) at the onset of a trip.
Alternatively, a GUI may be presented to the mobile user during a
registration period.
[0055] FIG. 9 is a block diagram illustrating an embodiment of a
server device 901. A mobile user profile database 902 contains
mobile users listed by "user names", such as Jay_123, Bob_214, and
Dan_412. Associated with each user name are priorities. These
priorities may have been selected by the mobile user.
Alternatively, the priorities may reflect a factory setting.
[0056] A non-navigational content database, 903, contains various
non-navigational contents which are available for transmission to
the mobile user's device. Included in this database is the length
of the content and the display type of the content--for
example--visual, aural, or both.
[0057] A processor 904 in the server device uses databases 902 and
903 to identify non-navigational content for each mobile user. A
transmitter 905 then transmits the identified content to a mobile
user's device. The transmitter illustratively transmits the
identified content over a cellular network.
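The server's use of databases 902 and 903 can be sketched as follows. The user names follow the example given above; the content categories, lengths, and lookup logic are assumptions for illustration:

```python
# Sketch: combine a user's priority profile (database 902) with the
# content catalog and its lengths (database 903) to identify what to
# transmit for a given available window.
USER_PRIORITIES = {         # database 902: user name -> ordered priorities
    "Jay_123": ["traffic", "weather"],
    "Bob_214": ["music"],
}
CONTENT_DB = {              # database 903: category -> length in seconds
    "traffic": 30,
    "weather": 20,
    "music": 200,
}

def identify_for_user(user, window_s):
    """Return the highest-priority content that fits the window."""
    for category in USER_PRIORITIES.get(user, []):
        if CONTENT_DB.get(category, float("inf")) <= window_s:
            return category
    return None  # nothing suitable to transmit
```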
[0058] FIG. 9 further shows three different navigational devices
906, 907 and 908 that have received non-navigational content
transmitted from the server. In one example, the data received by
each mobile user's device reflects the interests for that
particular mobile user. In this embodiment, multiple pieces of
non-navigational content may be stored in each mobile user's
navigational device. During the available window of time for
presenting non-navigational content, the selection of which content
to present is determined by the mobile user's device.
[0059] FIG. 10 is a block diagram illustrating an embodiment of a
mobile user's device 1001. GPS unit 1002 determines the mobile
user's location on a trip. This location is needed to know what
navigational content to provide to the mobile user. In one
embodiment, the mobile user's device serves as a navigational
device for displaying navigational content. In another embodiment,
the mobile user's device is in the same environment as a
navigational device, but is not the actual device which is used to
display a piece of navigational content.
[0060] The navigational content is illustratively stored in a map
computer 1005. Alternatively, the navigational content may be
stored on the server device and transmitted to the mobile user's
device when the navigational content is needed.
[0061] A memory 1003 contains a mobile user preferences profile,
and another memory 1004 contains a database of non-navigational
content. Most non-navigational content may be stored on a server,
and this memory 1004 is a more limited database for storing
non-navigational content. Alternatively, memory 1004 may be a
primary storage location for non-navigational content.
[0062] A processor, 1006, identifies which content to present to a
mobile user. If there is navigational content that needs to be
presented to a mobile user, then that content takes precedence.
However, if there is a window of time available before this content
needs to be presented, then non-navigational content of
sufficiently short length may be presented.
[0063] The appropriate content--whether aural or visual or both--is
displayed via an audio/visual display driver, 1007.
[0064] FIG. 11 is a flowchart illustrating an embodiment for
carrying out the invention. The process starts at 1101.
[0065] At step 1102, a mobile user sets preferences on the mobile
user's device. At step 1103, the mobile-user-specified preferences
are obtained by a server device. At step 1104, a server device is
utilized to identify non-navigational content which is appropriate
for the mobile user. Once identified, this non-navigational content
is transmitted to a mobile user's device, at step 1105. The
transmitted content may be stored until it is displayed at a later
available window of time; or, the step of transmitting may be done
at the same time as the step of displaying, when an available time
window exists.
[0066] At step 1106, the mobile user's device presents the
appropriate content. This may be navigational content or
non-navigational content, depending on the mobile user's location
and the available window of time. Thus, in this embodiment, the
mobile user's device is also a navigational device, i.e. a device
able to display navigational content. At step 1107, it is
determined if the trip is finished. If the trip is completed, then
the method ends at step 1108.
[0067] If the trip is still in progress, then the method continues,
at step 1109, with an update sent from the mobile user's device
updating location and any changes in preferences to the server
device. At step 1110, the server device updates preferences and
location for the mobile user's device. The method then continues at
1104, as is described above.
[0068] The embodiment shown in FIG. 11 illustrates one scenario
where certain functions are performed by a server device and
certain functions are performed by a mobile user's device. However,
other scenarios are possible which also fall under the general
principles of this invention. In some scenarios, the server device
performs more of the steps of the invention. In other scenarios,
the mobile user's device may perform more of the steps of the
invention.
[0069] It will thus be appreciated that those skilled in the art
will be able to devise numerous alternative arrangements that,
while not shown or described herein, embody the principles of the
invention and thus are within its spirit and scope.
* * * * *