U.S. patent application number 14/042670, for an autonomous vehicle entertainment system, was published by the patent office on 2015-04-02.
This patent application is currently assigned to Ford Global Technologies, LLC, which is also the listed applicant. The invention is credited to Mark A. Cuddihy, Jialiang Le, and Manoharprasad K. Rao.
Application Number: 14/042670
Publication Number: 20150094897
Family ID: 52673370
Publication Date: 2015-04-02

United States Patent Application 20150094897
Kind Code: A1
Cuddihy; Mark A.; et al.
April 2, 2015
AUTONOMOUS VEHICLE ENTERTAINMENT SYSTEM
Abstract
A vehicle system includes an autonomous mode controller that
controls a vehicle in an autonomous mode and an entertainment
system controller that presents media content while the vehicle is
operating in the autonomous mode. The entertainment system actuates
a projection screen inside a passenger compartment of the vehicle
and enables a projector to project media content onto the
projection screen. A method includes determining whether a vehicle
is operating in an autonomous mode, and if so, actuating a
projection screen inside a passenger compartment of the vehicle and
enabling a projector to project media content onto the projection
screen.
Inventors: Cuddihy; Mark A. (New Boston, MI); Rao; Manoharprasad K. (Novi, MI); Le; Jialiang (Canton, MI)
Applicant: Ford Global Technologies, LLC; Dearborn, MI, US
Assignee: Ford Global Technologies, LLC; Dearborn, MI
Family ID: 52673370
Appl. No.: 14/042670
Filed: September 30, 2013
Current U.S. Class: 701/23
Current CPC Class: H04N 21/41422 20130101; B60W 30/00 20130101; H04N 21/2146 20130101; B60K 2370/66 20190501; B60R 11/0229 20130101; H04N 21/4122 20130101; B60K 2370/334 20190501; B60K 35/00 20130101; B60K 2370/175 20190501; B60K 2370/77 20190501
Class at Publication: 701/23
International Class: B60R 11/02 20060101 B60R011/02
Claims
1. A vehicle system comprising: an autonomous mode controller
programmed to control a vehicle in an autonomous mode; an
entertainment system controller programmed to present media content
while the vehicle is operating in the autonomous mode, wherein the
entertainment system is programmed to determine, based on a signal
output by the autonomous mode controller, whether the vehicle is
operating in the autonomous mode and actuate a projection screen
inside a passenger compartment of the vehicle and enable a
projector configured to project media content onto the projection
screen if the entertainment system determines that the vehicle is
operating in the autonomous mode.
2. (canceled)
3. (canceled)
4. The vehicle system of claim 1, wherein the entertainment system
controller is programmed to actuate the projection screen and
enable the projector while the vehicle is operating in the
autonomous mode and in response to a user input.
5. The vehicle system of claim 1, wherein the entertainment system
controller is programmed to retract the projection screen and
disable the projector prior to the vehicle operating in a
non-autonomous mode.
6. The vehicle system of claim 1, wherein the entertainment system
controller is programmed to retract the projection screen and
disable the projector in response to a user input.
7. A method comprising: determining, via a computing device,
whether a vehicle is operating in an autonomous mode based on a
signal received from an autonomous mode controller; and if the
vehicle is operating in the autonomous mode: actuating, via the
computing device, a projection screen inside a passenger
compartment of the vehicle, and enabling, via the computing device,
a projector to project media content onto the projection
screen.
8. The method of claim 7, wherein the determination of whether the
vehicle is operating in the autonomous mode includes monitoring at
least one autonomous driving sensor.
9. (canceled)
10. The method of claim 7, wherein the projection screen is
actuated and the projector is enabled in response to a user input
received while the vehicle is operating in the autonomous mode.
11. The method of claim 7, further comprising, prior to the vehicle
entering a non-autonomous mode, retracting the projection
screen.
12. The method of claim 7, further comprising, prior to the vehicle
entering a non-autonomous mode, disabling the projector.
13. The method of claim 7, further comprising retracting the
projection screen in response to a user input.
14. The method of claim 7, further comprising disabling the
projector in response to a user input.
15. A non-transitory computer-readable medium tangibly embodying
computer-executable instructions that cause a processor to execute
operations comprising: receiving a signal output by an autonomous
mode controller, the signal indicating whether a vehicle is
operating in an autonomous mode; determining whether the vehicle is
operating in the autonomous mode based at least in part on the
signal output by the autonomous mode controller; and if the vehicle
is operating in the autonomous mode: actuating a projection screen
inside a passenger compartment of the vehicle, and enabling a
projector to project media content onto the projection screen.
16. The non-transitory computer-readable medium of claim 15,
wherein the determination of whether the vehicle is operating in
the autonomous mode includes monitoring at least one autonomous
driving sensor.
17. (canceled)
18. The non-transitory computer-readable medium of claim 15,
wherein the projection screen is actuated and the projector is
enabled in response to a user input received while the vehicle is
operating in the autonomous mode.
19. The non-transitory computer-readable medium of claim 15, the
operations further comprising, prior to the vehicle entering a
non-autonomous mode, retracting the projection screen and disabling
the projector.
20. The non-transitory computer-readable medium of claim 15, the
operations further comprising retracting the projection screen and
disabling the projector in response to a user input.
Description
BACKGROUND
[0001] Vehicles operating in an autonomous (e.g., driverless) mode
can relieve occupants, especially the driver, from some
driving-related responsibilities. When operating in an autonomous
mode, the vehicle can navigate to various locations using on-board
sensors, allowing the vehicle to travel with minimal human
interaction or in some cases without any passengers. Therefore,
autonomous vehicles give passengers, especially the person who
would otherwise be driving the vehicle, the opportunity to do other
things while travelling. Instead of concentrating on numerous
driving-related responsibilities, the driver may be free to watch
movies or other media content, converse with other passengers,
read, etc., while riding in an autonomous vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 is a block diagram of components of an exemplary
autonomous vehicle.
[0003] FIGS. 2A-2B are views of an exemplary entertainment system
of the vehicle of FIG. 1 while operating in an autonomous mode.
[0004] FIG. 3 illustrates an exemplary graphical user interface in
the vehicle.
[0005] FIG. 4 illustrates the graphical user interface when the
entertainment system is in use.
[0006] FIG. 5 illustrates an alternate location of the exemplary
graphical user interface for when the entertainment system is in
use.
[0007] FIG. 6 is a flowchart of an exemplary process that may be
implemented by the entertainment system.
[0008] FIG. 7 is a flowchart of another exemplary process that may
be implemented by the entertainment system.
DETAILED DESCRIPTION
[0009] An exemplary vehicle system includes an autonomous mode
controller that controls a vehicle in an autonomous mode and an
entertainment system controller that presents media content while
the vehicle is operating in the autonomous mode. The entertainment
system actuates a projection screen inside a passenger compartment
of the vehicle and enables a projector to project media content
onto the projection screen. A method includes determining whether a
vehicle is operating in an autonomous mode, and if so, actuating a
projection screen inside a passenger compartment of the vehicle and
enabling a projector to project media content onto the projection
screen.
[0010] The FIGS. illustrate an exemplary vehicle entertainment
system for an autonomous vehicle. The system may take many
different forms and include multiple and/or alternate components
and facilities. While an exemplary system is shown, the exemplary
components illustrated are not intended to be limiting. Indeed,
additional or alternative components and/or implementations may be
used.
[0011] As illustrated in FIG. 1, a vehicle 100 includes a user
interface device 105, autonomous driving sensors 110, an autonomous
mode controller 115, an entertainment system 120, and an
entertainment system controller 125. The vehicle 100 may include
any passenger or commercial vehicle such as a car, a truck, a sport
utility vehicle, a taxi, a bus, a train, an airplane, etc.
[0012] The user interface device 105 may be configured to present
information to a user, such as a driver, during operation of the
vehicle 100. Moreover, the user interface device 105 may be
configured to receive user inputs. Thus, the user interface device
105 may be located in a passenger compartment 130 (see FIGS. 2A-2B)
of the vehicle 100. In some possible approaches, the user interface
device 105 may include a touch-sensitive display screen. The user
interface device 105 may further be configured to generate an
audible alarm, a visual alarm, or both.
[0013] The autonomous driving sensors 110 may include any number of
devices configured to generate signals that help navigate the
vehicle 100 while the vehicle 100 is operating in an autonomous
(e.g., driverless) mode. Examples of autonomous driving sensors 110
may include a radar sensor, a lidar sensor, a camera, or the like.
The autonomous driving sensors 110 help the vehicle 100 "see" the
roadway and/or negotiate various obstacles while the vehicle 100 is
operating in the autonomous mode.
[0014] The autonomous mode controller 115 may be configured to
control one or more subsystems 135 while the vehicle 100 is
operating in the autonomous mode. Examples of subsystems 135 that
may be controlled by the autonomous mode controller 115 may include
a brake subsystem, a suspension subsystem, a steering subsystem,
and a powertrain subsystem. The autonomous mode controller 115 may
control any one or more of these subsystems 135 by outputting
signals to control units associated with these subsystems 135. The
autonomous mode controller 115 may control the subsystems 135
based, at least in part, on signals generated by the autonomous
driving sensors 110.
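The control loop described in this paragraph can be sketched as follows. This is a minimal Python illustration only; the class and signal names are hypothetical and are not part of the disclosed system, which would in practice communicate over a vehicle bus such as CAN.

```python
class Sensor:
    """Stand-in for an autonomous driving sensor (radar, lidar, camera)."""
    def __init__(self, name, value):
        self.name, self.value = name, value

    def read(self):
        return self.value


class AutonomousModeController:
    """Routes sensor-derived readings to subsystem control units and
    collects the commands each control unit produces."""
    def __init__(self, sensors, subsystems):
        self.sensors = sensors          # list of Sensor objects
        self.subsystems = subsystems    # dict: name -> control-unit callable

    def step(self):
        # Gather signals generated by the autonomous driving sensors.
        readings = {s.name: s.read() for s in self.sensors}
        # Output a command to each subsystem's control unit.
        return {name: unit(readings) for name, unit in self.subsystems.items()}


# Usage: a brake control unit that brakes when radar reports a close obstacle.
ctrl = AutonomousModeController(
    sensors=[Sensor("radar_distance_m", 8.0)],
    subsystems={"brake": lambda r: "apply" if r["radar_distance_m"] < 10 else "release"},
)
print(ctrl.step())  # {'brake': 'apply'}
```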
[0015] The entertainment system 120 may be configured to present
media content or other types of content to one or more passengers.
Examples of media content may include movies, television shows,
games, music, videos, or the like. The entertainment system 120 may
include a projector 140 and a projection screen 145 (see FIGS. 2A
and 2B), both of which may be located within the passenger
compartment 130 of the vehicle 100. The projector 140 may be
mounted to a ceiling of the vehicle 100 and generally aimed toward
the projection screen 145 to project media content onto the
projection screen 145 for viewing by one or more passengers of the
vehicle 100. The projection screen 145 may be located near the
front of the vehicle 100, such as near the windshield. In some
possible implementations, the projector 140, the projection screen
145, or both, may be configured to retract into the ceiling when
the vehicle 100 is operating in a manual (e.g., a non-autonomous)
mode. The projector 140, the projection screen 145, or both may be
actuated (e.g., lowered from the ceiling) when the vehicle 100 is
operating in the autonomous mode. FIG. 2A shows the projection
screen 145 retracted (with the outline of the projection screen 145
when lowered shown for illustrative purposes only) and FIG. 2B
shows the projector 140 and projection screen 145 lowered from the
ceiling. FIG. 2B also illustrates that some of the seats 150 in the
passenger compartment 130 may be stowed during presentation of the
media content.
[0016] As shown in FIGS. 3-5, the entertainment system 120 may
include other display devices 205 located in the passenger
compartment 130 of the vehicle 100 for presenting media content
when the vehicle 100 is operating in the autonomous or
non-autonomous modes. For example, the entertainment system 120 may
be configured to present media content via a dashboard 155, an
instrument cluster 160 (See FIGS. 3-4), or a rearview mirror 165
(See FIG. 5).
[0017] The entertainment system 120 may be configured to receive
media content from any number of sources. In some possible
implementations, the entertainment system 120 may be configured to
access media content locally from a memory device (not shown)
incorporated into the vehicle 100 or remotely via a network. The
entertainment system 120 may be further configured to receive media
content from, e.g., a mobile device brought into the vehicle 100 by
one of the passengers. The entertainment system 120 may communicate
with the mobile device via a wired (e.g., USB) or wireless (e.g.,
Bluetooth®) communication protocol.
[0018] The entertainment system controller 125 may be configured to
control the operation of the entertainment system 120. The
entertainment system controller 125 may present media content in
the passenger compartment 130 while the vehicle 100 is operating in
the autonomous mode. Prior to presenting media content, the
entertainment system controller 125 may determine whether the
vehicle 100 is operating in the autonomous mode based on, e.g.,
signals received from the autonomous mode controller 115, signals
received from the autonomous driving sensors 110, and/or a user
input provided via the user interface device 105. After determining
that the vehicle 100 is operating in the autonomous mode, the
entertainment system controller 125 may actuate (e.g., lower) the
projection screen 145 and/or the projector 140 from the ceiling.
The entertainment system controller 125 may further turn on the
projector 140, cause the entertainment system 120 to access the
media content (either locally or remotely), and cause the
entertainment system 120 to present the media content to the
passengers of the vehicle 100.
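The gating logic of this paragraph can be sketched as below: the screen is actuated and the projector enabled only while the vehicle reports the autonomous mode, and both are retracted and disabled otherwise. All names are illustrative assumptions, not the disclosed controller's actual interface.

```python
class EntertainmentSystemController:
    """Sketch of the mode-gated actuation logic: lower the projection
    screen and enable the projector only in the autonomous mode."""

    def __init__(self):
        self.screen_lowered = False
        self.projector_on = False

    def update(self, in_autonomous_mode, user_requested_media):
        if in_autonomous_mode and user_requested_media:
            self.screen_lowered = True   # actuate (lower) the screen
            self.projector_on = True     # enable the projector
        else:
            self.screen_lowered = False  # retract before manual operation
            self.projector_on = False    # disable the projector
        return self.screen_lowered and self.projector_on


ctrl = EntertainmentSystemController()
print(ctrl.update(True, True))    # True: screen down, projector on
print(ctrl.update(False, True))   # False: retracted for manual driving
```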
[0019] Before the vehicle 100 begins to operate in a non-autonomous
mode, or if the passengers no longer wish to consume media content
via the entertainment system 120, the entertainment system
controller 125 may turn off the projector 140 and retract the
projection screen 145 and/or the projector 140 into the ceiling.
The entertainment system controller 125 may do so in response to a
user input or a signal received from, e.g., the autonomous mode
controller 115.
[0020] Interaction from one of the passengers (e.g., the
driver) may sometimes be required while other passengers wish to
continue consuming media content. In such instances, the entertainment system
controller 125 may transfer the presentation of the media content
to a different display device 205. That is, the entertainment
system controller 125 may retract the projection screen 145 and
projector 140 when the driver assumes control of the vehicle 100
(i.e., the vehicle 100 is no longer operating in the autonomous
mode) and transfer the presentation of the media content to another
display device 205 such as a display in the dashboard 155, the
instrument cluster 160, or the rearview mirror 165. Alternatively,
the entertainment system controller 125 may transfer the
presentation of the media content from, e.g., a display device 205
in the instrument cluster 160 to, e.g., a display device 205 in the
rearview mirror 165. In other possible approaches, or in response
to a user input, the entertainment system controller 125 may stop
or pause the presentation of the media content when the vehicle 100
switches from operating in the autonomous mode to the
non-autonomous mode.
[0021] In general, computing systems and/or devices, such as the
user interface device 105, the autonomous mode controller 115, and
the entertainment system controller 125, may employ any of a number
of computer operating systems, including, but by no means limited
to, versions and/or varieties of the SYNC® operating system by
Ford Motor Company, the Microsoft Windows® operating system,
the Unix operating system (e.g., the Solaris® operating system
distributed by Oracle Corporation of Redwood Shores, Calif.), the
AIX UNIX operating system distributed by International Business
Machines of Armonk, N.Y., the Linux operating system, the Mac OS X
and iOS operating systems distributed by Apple Inc. of Cupertino,
Calif., the BlackBerry OS distributed by Research In Motion of
Waterloo, Canada, and the Android operating system developed by the
Open Handset Alliance. Examples of computing devices include,
without limitation, a computer workstation, a server, a desktop,
notebook, laptop, or handheld computer, or some other computing
system and/or device.
[0022] Computing devices generally include computer-executable
instructions, where the instructions may be executable by one or
more computing devices such as those listed above.
Computer-executable instructions may be compiled or interpreted
from computer programs created using a variety of programming
languages and/or technologies, including, without limitation, and
either alone or in combination, Java™, C, C++, Visual Basic,
JavaScript, Perl, etc. In general, a processor (e.g., a
microprocessor) receives instructions, e.g., from a memory, a
computer-readable medium, etc., and executes these instructions,
thereby performing one or more processes, including one or more of
the processes described herein. Such instructions and other data
may be stored and transmitted using a variety of computer-readable
media.
[0023] A computer-readable medium (also referred to as a
processor-readable medium) includes any non-transitory (e.g.,
tangible) medium that participates in providing data (e.g.,
instructions) that may be read by a computer (e.g., by a processor
of a computer). Such a medium may take many forms, including, but
not limited to, non-volatile media and volatile media. Non-volatile
media may include, for example, optical or magnetic disks and other
persistent memory. Volatile media may include, for example, dynamic
random access memory (DRAM), which typically constitutes a main
memory. Such instructions may be transmitted by one or more
transmission media, including coaxial cables, copper wire and fiber
optics, including the wires that comprise a system bus coupled to a
processor of a computer. Common forms of computer-readable media
include, for example, a floppy disk, a flexible disk, hard disk,
magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other
optical medium, punch cards, paper tape, any other physical medium
with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM,
any other memory chip or cartridge, or any other medium from which
a computer can read.
[0024] Databases, data repositories or other data stores described
herein may include various kinds of mechanisms for storing,
accessing, and retrieving various kinds of data, including a
hierarchical database, a set of files in a file system, an
application database in a proprietary format, a relational database
management system (RDBMS), etc. Each such data store is generally
included within a computing device employing a computer operating
system such as one of those mentioned above, and is accessed via a
network in any one or more of a variety of manners. A file system
may be accessible from a computer operating system, and may include
files stored in various formats. An RDBMS generally employs the
Structured Query Language (SQL) in addition to a language for
creating, storing, editing, and executing stored procedures, such
as the PL/SQL language.
[0025] In some examples, system elements may be implemented as
computer-readable instructions (e.g., software) on one or more
computing devices (e.g., servers, personal computers, etc.), stored
on computer readable media associated therewith (e.g., disks,
memories, etc.). A computer program product may comprise such
instructions stored on computer readable media for carrying out the
functions described herein.
[0026] FIGS. 3-5 illustrate an exemplary graphical user interface
170 presented on different display devices 205 in the vehicle 100.
FIG. 3 illustrates an example of a graphical user interface 170
that may be presented in the passenger compartment 130 of the
vehicle 100. As shown, the graphical user interface 170 includes a
speedometer 175, a fuel gauge 180, a battery charge indicator 185
(for, e.g., electric or hybrid vehicles), an engine temperature
indicator 190, a fuel economy indicator 195, and an odometer 200.
This graphical user interface 170 may be part of the instrument
cluster 160. The graphical user interface 170 shown in FIG. 3 may
be presented while the vehicle 100 is operating in the
non-autonomous mode. With reference now to FIG. 4, when the vehicle
100 is operating in the autonomous mode, and in response to user
input, the graphical user interface 170 may be updated to present
media content. For example, as shown in FIG. 4, the speedometer
175, the fuel gauge 180, and the battery charge indicator 185 may
be replaced with a display device 205 for viewing media content.
The display device 205 may be part of the entertainment system 120,
and the display of the media content may be controlled by the
entertainment system controller 125. Display devices 205 may be
incorporated into other components in the passenger compartment 130
of the vehicle 100. As shown in FIG. 5, a display device 205 may be
incorporated into the rearview mirror 165. Thus, passengers other
than the driver can continue to view media content even after the
driver has assumed control of the vehicle 100 (i.e., the vehicle
100 is no longer operating in the autonomous mode).
[0027] FIG. 6 is a flowchart of a process 600 that may be
implemented in the entertainment system controller 125 to control
the operation of the entertainment system 120 during use of the
vehicle 100.
[0028] At block 605, the entertainment system controller 125 may
confirm that one or more components of the entertainment system 120
are deactivated. For instance, the entertainment system controller
125 may confirm that the projector 140, the projection screen 145,
or both are deactivated. If one or more of the components are
currently active, the entertainment system controller 125 may
deactivate any active components.
[0029] At block 610, the entertainment system controller 125 may
monitor a status of one or more of the autonomous driving sensors
110. The status of the autonomous driving sensors 110 may be
determined from one or more signals output by the autonomous mode
controller 115. The status may indicate whether the vehicle 100 is
operating in the autonomous mode or needs to switch from the
autonomous mode to a non-autonomous mode.
[0030] At decision block 615, the entertainment system controller
125 may determine whether the vehicle 100 is operating in the
autonomous mode. As discussed above, the entertainment system
controller 125 may determine whether the vehicle 100 is operating
in the autonomous mode by monitoring the status of the autonomous
driving sensors 110. The process 600 may only continue if the
vehicle 100 is operating in the autonomous mode. Therefore, the
process 600 may return to block 610 if the vehicle 100 is not
operating in the autonomous mode. If the vehicle 100 is operating
in the autonomous mode, the process 600 may continue at block
620.
[0031] At block 620, the entertainment system controller 125 may
enable the entertainment system 120. Enabling the entertainment
system 120 may include lowering the projection screen 145 and/or
the projector 140 from the ceiling and turning on the projector
140.
[0032] At decision block 625, the entertainment system controller
125 may determine whether a user input has been received via, e.g.,
the user interface device 105 that indicates the user's desire to
view media content via the entertainment system 120. If the user
input has been received, the process 600 may continue at block 630.
If the user input has not been received, the process 600 may repeat
block 625 until the user input is received.
[0033] At block 630, the entertainment system controller 125 may
cause the entertainment system 120 to present the media content in
the passenger compartment 130 of the vehicle 100. The entertainment
system 120 may continue to present media content until either a
user input is received indicating a user's desire for the
entertainment system 120 to stop presenting the media content or
until the vehicle 100 switches from the autonomous mode to a
non-autonomous mode of operation.
[0034] At block 635, the entertainment system controller 125 may
continue to monitor the autonomous driving sensors 110 and also for
any user inputs indicating the user's desire to no longer view
media content through the entertainment system 120. For example,
the entertainment system controller 125 may monitor the autonomous
driving sensors 110 for signals indicating that user intervention
is necessary or that the vehicle 100 is going to stop operating in
the autonomous mode.
[0035] At decision block 640, the entertainment system controller
125 may determine whether the vehicle 100 is still operating in the
autonomous mode. If so, the process 600 may continue at decision
block 645. If the vehicle 100 is operating in a non-autonomous
mode, the process 600 may continue at block 650.
[0036] At decision block 645, the entertainment system controller
125 may determine whether a user input indicating the user's desire
to stop presenting media content through the entertainment system
120 has been received. If such a user input has been received, the
process 600 may continue at block 650. If no such user input has
been received, the process 600 may return to block 635.
[0037] At block 650, the entertainment system controller 125 may
disable one or more components of the entertainment system 120.
Disabling one or more components of the entertainment system 120
may include retracting the projection screen 145, disabling the
projector 140, or both. Moreover, disabling one or more components
of the entertainment system 120 may include causing any displays in
the passenger compartment 130 to return to a normal operating mode.
After block 650, the process 600 may end or return to block
610.
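The FIG. 6 flow just described (blocks 605-650) can be condensed into a small state machine. The sketch below is a hypothetical simplification for illustration, not the controller's actual firmware; the state names map to the blocks noted in the comments.

```python
from enum import Enum, auto


class S(Enum):
    IDLE = auto()        # blocks 605-610: components off, monitoring sensors
    ENABLED = auto()     # block 620: screen lowered, projector turned on
    PRESENTING = auto()  # block 630: media content playing


def step_600(state, autonomous, user_start=False, user_stop=False):
    """One transition of the FIG. 6 flow."""
    if not autonomous:
        return S.IDLE                                      # block 640 -> 650
    if state is S.IDLE:
        return S.ENABLED                                   # block 615 -> 620
    if state is S.ENABLED:
        return S.PRESENTING if user_start else S.ENABLED   # block 625
    return S.IDLE if user_stop else S.PRESENTING           # block 645


# Walk the happy path: enable, present, then the driver takes over.
state = S.IDLE
state = step_600(state, autonomous=True)                   # -> ENABLED
state = step_600(state, autonomous=True, user_start=True)  # -> PRESENTING
state = step_600(state, autonomous=False)                  # -> IDLE (block 650)
print(state)  # S.IDLE
```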
[0038] FIG. 7 is a flowchart of an example process 700 that may be
implemented by the entertainment system controller 125 during,
e.g., a transition from the vehicle 100 operating in the autonomous
mode to the non-autonomous mode.
[0039] At block 705, the entertainment system controller 125 may
cause the displays of the entertainment system 120 to operate in a
normal mode of operation. This may include disabling one or more
components of the entertainment system 120 such as retracting the
projection screen 145, disabling the projector 140, or both.
[0040] At block 710, the entertainment system controller 125 may
monitor a status of one or more of the autonomous driving sensors
110. The status of the autonomous driving sensors 110 may be
determined from one or more signals output by the autonomous mode
controller 115. The status may indicate whether the vehicle 100 is
operating in the autonomous mode or needs to switch from the
autonomous mode to a non-autonomous mode.
[0041] At decision block 715, the entertainment system controller
125 may determine whether the vehicle 100 is operating in the
autonomous mode. As discussed above, the entertainment system
controller 125 may determine whether the vehicle 100 is operating
in the autonomous mode by monitoring the status of the autonomous
driving sensors 110. The process 700 may only continue if the
vehicle 100 is operating in the autonomous mode. Therefore, the
process 700 may return to block 710 if the vehicle 100 is not
operating in the autonomous mode. If the vehicle 100 is operating
in the autonomous mode, the process 700 may continue at block
720.
[0042] At block 720, the entertainment system controller 125 may
enable the entertainment system 120. Enabling the entertainment
system 120 may include lowering the projection screen 145 and/or
the projector 140 from the ceiling and turning on the projector
140.
[0043] At decision block 725, the entertainment system controller
125 may determine whether a user input has been received via,
e.g., the user interface device 105 that indicates the user's
desire to view media content via the entertainment system 120. If
the user input has been received, the process 700 may continue at
block 730. If the user input has not been received, the process 700
may repeat block 725 until the user input is received.
[0044] At block 730, the entertainment system controller 125 may
cause the entertainment system 120 to present the media content in
the passenger compartment 130 of the vehicle 100. The entertainment
system 120 may continue to present media content until either a
user input is received indicating a user's desire for the
entertainment system 120 to stop presenting the media content or
until the vehicle 100 switches from the autonomous mode to a
non-autonomous mode of operation.
[0045] At block 735, the entertainment system controller 125 may
continue to monitor the autonomous driving sensors 110 and also for
any user inputs indicating the user's desire to no longer view
media content through the entertainment system 120. For example,
the entertainment system controller 125 may monitor the autonomous
driving sensors 110 for signals indicating that user intervention
is necessary or that the vehicle 100 is going to stop operating in
the autonomous mode.
[0046] At decision block 740, the entertainment system controller
125 may determine whether the vehicle 100 is still operating in the
autonomous mode. If so, the process 700 may continue at decision
block 745. If the vehicle 100 is operating in a non-autonomous
mode, the process 700 may continue at block 750.
[0047] At decision block 745, the entertainment system controller
125 may determine whether a user input indicating the user's desire
to stop presenting media content through the first display has been
received. Alternatively or in addition, the entertainment system
controller 125 may determine whether the autonomous mode controller
115 has indicated that the driver should assume command of the
vehicle 100. If such a user input or indication has been received,
the process 700 may continue at block 750. If neither has been
received, the process 700 may return to block 735.
[0048] At block 750, the entertainment system controller 125 may
set one or more components of the entertainment system 120 to,
e.g., operate in a normal (i.e., non-autonomous) mode. For example,
the entertainment system controller 125 may disable a first display
so that the first display stops presenting media content. Instead
of media content, the entertainment system controller 125 may cause
the first display to present information useful to a driver for
operating the vehicle 100.
[0049] At block 755, the entertainment system controller 125 may
transfer the presentation of the media content to another display
(i.e., a second display) in the passenger compartment 130. The
presentation of the media content on the second display may not
interfere with the driver's manual operation of the vehicle 100.
Thus, passengers other than the driver can continue to view media
content even after the driver has assumed control of the vehicle
100 (i.e., the vehicle 100 is no longer operating in the autonomous
mode). The process 700 may end after block 755 or continue at block
710.
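The handover in blocks 750-755 amounts to swapping what each display shows. The sketch below uses a hypothetical data model (plain dictionaries) purely to illustrate the transfer; the display names echo the reference numerals but are assumptions, not the disclosed implementation.

```python
def handover_to_driver(first_display, second_display, media):
    """Blocks 750-755: return the first display to normal driving
    information and move the media content to a second display."""
    first_display["content"] = "driving_info"  # block 750: normal mode
    second_display["content"] = media          # block 755: transfer media


cluster = {"name": "instrument cluster 160", "content": "movie"}
mirror = {"name": "rearview mirror 165", "content": None}

handover_to_driver(cluster, mirror, cluster["content"])
print(cluster["content"], mirror["content"])  # driving_info movie
```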
[0050] With regard to the processes, systems, methods, heuristics,
etc. described herein, it should be understood that, although the
steps of such processes, etc. have been described as occurring
according to a certain ordered sequence, such processes could be
practiced with the described steps performed in an order other than
the order described herein. It further should be understood that
certain steps could be performed simultaneously, that other steps
could be added, or that certain steps described herein could be
omitted. In other words, the descriptions of processes herein are
provided for the purpose of illustrating certain embodiments, and
should in no way be construed so as to limit the claims.
[0051] Accordingly, it is to be understood that the above
description is intended to be illustrative and not restrictive.
Many embodiments and applications other than the examples provided
would be apparent upon reading the above description. The scope
should be determined, not with reference to the above description,
but should instead be determined with reference to the appended
claims, along with the full scope of equivalents to which such
claims are entitled. It is anticipated and intended that future
developments will occur in the technologies discussed herein, and
that the disclosed systems and methods will be incorporated into
such future embodiments. In sum, it should be understood that the
application is capable of modification and variation.
[0052] All terms used in the claims are intended to be given their
broadest reasonable constructions and their ordinary meanings as
understood by those knowledgeable in the technologies described
herein unless an explicit indication to the contrary is made
herein. In particular, use of the singular articles such as "a,"
"the," "said," etc. should be read to recite one or more of the
indicated elements unless a claim recites an explicit limitation to
the contrary.
[0053] The Abstract of the Disclosure is provided to allow the
reader to quickly ascertain the nature of the technical disclosure.
It is submitted with the understanding that it will not be used to
interpret or limit the scope or meaning of the claims. In addition,
in the foregoing Detailed Description, it can be seen that various
features are grouped together in various embodiments for the
purpose of streamlining the disclosure. This method of disclosure
is not to be interpreted as reflecting an intention that the
claimed embodiments require more features than are expressly
recited in each claim. Rather, as the following claims reflect,
inventive subject matter lies in less than all features of a single
disclosed embodiment. Thus the following claims are hereby
incorporated into the Detailed Description, with each claim
standing on its own as a separately claimed subject matter.
* * * * *