U.S. patent application number 14/457353, for a vehicle imaging system and method, was published by the patent office on 2015-08-20. The applicant listed for this patent is General Electric Company. The invention is credited to Anwarul Azam, Matthew Lawrence Blair, Shannon Joseph Clouse, Mark Bradshaw Kraeling, Nidhi Naithani, Scott Daniel Nelson, and Dattaraj Jagdish Rao.
United States Patent Application 20150235094
Kind Code: A1
KRAELING; Mark Bradshaw; et al.
Published: August 20, 2015
VEHICLE IMAGING SYSTEM AND METHOD
Abstract
An imaging system and method generate image data within a field
of view that includes a cab of a vehicle and a portion of a route
being traveled and/or wayside devices disposed along the route.
The cab includes a space where an operator of the vehicle is
located. The image data is examined to identify route damage, a
deteriorating condition of the route, and/or a condition of the
wayside devices. The condition of the wayside devices can include
damage to the wayside devices, a missing wayside device,
deterioration of the wayside devices, or a change in terrain.
Inventors: KRAELING; Mark Bradshaw; (Melbourne, FL); Blair; Matthew Lawrence; (Lawrence Park, PA); Clouse; Shannon Joseph; (Lawrence Park, PA); Nelson; Scott Daniel; (Melbourne, FL); Naithani; Nidhi; (Bangalore, IN); Rao; Dattaraj Jagdish; (Bangalore, IN); Azam; Anwarul; (Erie, PA)

Applicant: General Electric Company; Schenectady, NY; US

Family ID: 53798380
Appl. No.: 14/457353
Filed: August 12, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
61940660 | Feb 17, 2014 |
61940610 | Feb 17, 2014 |
61940813 | Feb 17, 2014 |
61940696 | Feb 17, 2014 |
Current U.S. Class: 348/148
Current CPC Class: H04N 7/181 (2013.01); B61L 23/047 (2013.01); G06K 9/00791 (2013.01); B61L 23/042 (2013.01); G06K 9/6217 (2013.01)
International Class: G06K 9/00 (2006.01); B60R 1/00 (2006.01); H04N 5/225 (2006.01)
Claims
1. A system comprising: a digital camera configured to be disposed
in a first vehicle system, the camera configured to generate image
data within a field of view of the camera, the field of view
including at least a portion of a cab of the first vehicle system
and at least one of a portion of a route being traveled by the
first vehicle system or one or more wayside devices disposed along
the route being traveled by the first vehicle system, the cab
including a space where an operator of the first vehicle system is
located during travel of the first vehicle system; and one or more
analysis processors configured to examine the image data generated
by the camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices, the condition of the one or more wayside
devices including at least one of damage to the one or more wayside
devices, a missing wayside device, deterioration of the one or more
wayside devices, or a change in terrain at or near the one or more
wayside devices.
2. The system of claim 1, wherein the digital camera is a high
definition camera.
3. The system of claim 1, wherein the one or more analysis
processors are configured to be disposed onboard the first vehicle
system for examination of the image data.
4. The system of claim 1, wherein the one or more analysis
processors are configured to identify at least one of the damage to
the route, the deteriorating condition of the route, or the
condition of the one or more wayside devices based on at least one
of an edge detection algorithm, pixel metrics, an object detection
algorithm, baseline image data, or a pixel gradient in the image
data.
5. The system of claim 1, wherein the one or more wayside devices
include a signaling light and the one or more analysis processors
are configured to identify a broken or missing light of the
signaling light based on the image data.
6. The system of claim 1, wherein the one or more analysis
processors are configured to identify the damage to the route as
shifting of one or more supporting bodies that connect rails of the
route, bending of the rails of the route, twisting of the rails of
the route, or a spacing between the rails of the route that differs
from a designated distance.
7. The system of claim 1, wherein the one or more analysis
processors are configured to edit the image data acquired during a
trip of the first vehicle system to create edited image data that
includes the image data representative of the at least one of the
damage to the route, the deteriorating condition of the route, or
the condition of the one or more wayside devices but that does not
include other image data.
8. The system of claim 1, wherein the one or more analysis
processors are configured to determine a location of the first
vehicle system when the image data representative of at least one
of damage to the route, the deteriorating condition of the route,
or the condition of the one or more wayside devices is obtained,
and the one or more analysis processors are configured to examine
the image data representative of at least one of the damage to the
route, the deteriorating condition of the route, or the condition
of the one or more wayside devices at the location, but to not
examine the image data acquired at one or more other locations.
9. A method comprising: generating image data within a field of
view of a camera disposed onboard a first vehicle system, the field
of view including at least a portion of a cab of the first vehicle
system and at least one of a portion of a route being traveled by
the first vehicle system or one or more wayside devices disposed
along the route being traveled by the first vehicle system, the cab
including a space where an operator of the first vehicle system is
located during travel of the first vehicle system; and examining,
using one or more analysis processors, the image data generated by
the camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices, the condition of the one or more wayside
devices including at least one of damage to the one or more wayside
devices, a missing wayside device, deterioration of the one or more
wayside devices, or a change in terrain at or near the one or more
wayside devices.
10. The method of claim 9, wherein the image data is generated and
examined while the first vehicle system is moving along the
route.
11. The method of claim 9, wherein the at least one of damage to
the route, the deteriorating condition of the route, or the
condition of the one or more wayside devices is identified based on
at least one of an edge detection algorithm, pixel metrics, an
object detection algorithm, baseline image data, or a pixel
gradient in the image data.
12. The method of claim 9, wherein the one or more wayside devices
include a signaling light and the image data is examined to
identify a broken or missing light of the signaling light based on
the image data.
13. The method of claim 9, wherein the damage to the route is
identified by the one or more analysis processors as a shifting of
one or more supporting bodies that connect rails of the route,
bending of the rails of the route, twisting of the rails of the
route, or a spacing between the rails of the route that differs
from a designated distance.
14. The method of claim 9, further comprising editing the image
data acquired during a trip of the first vehicle system to create
edited image data that includes the image data representative of
the at least one of damage to the route, the deteriorating
condition of the route, or the condition of the one or more wayside
devices but does not include other image data.
15. The method of claim 9, further comprising determining a
location of the first vehicle system when the image data
representative of the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices is generated, wherein the image data
representative of the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices is examined based on the location and the
image data acquired at one or more other locations is not
examined.
16. A system comprising: a digital camera configured to be disposed
in a rail vehicle, the camera configured to generate image data
within a field of view of the camera, the field of view including
at least a portion of a cab of the rail vehicle and at least one of
a portion of a track outside of the rail vehicle or one or more
wayside devices along the track being traveled by the rail vehicle,
the cab including a space where an operator of the rail vehicle is
located during travel of the rail vehicle; and one or more analysis
processors configured to be disposed onboard the rail vehicle and
to examine the image data generated by the camera to identify a
condition of at least one of the track or the one or more wayside
devices, the condition including at least one of damage to the
track, damage to the one or more wayside devices, a missing wayside
device, or a changing condition of terrain at or near the one or
more wayside devices.
17. The system of claim 16, wherein the digital camera is a high
definition camera.
18. The system of claim 16, wherein the one or more analysis
processors are configured to identify the condition of the at least
one of the track or the one or more wayside devices based on at
least one of an edge detection algorithm, pixel metrics, an object
detection algorithm, baseline image data, or a pixel gradient in
the image data.
19. The system of claim 16, wherein the one or more wayside devices
include at least one of an inspection wayside device that inspects
the rail vehicle as the rail vehicle moves past the inspection
wayside device or a signaling wayside device that communicates
information with the rail vehicle as the rail vehicle moves past
the signaling wayside device.
20. The system of claim 16, wherein the one or more analysis
processors are configured to determine a location of the rail
vehicle when the image data representative of the condition of the
at least one of the track or the one or more wayside devices is
imaged.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional
Application No. 61/940,660, which was filed on 17 Feb. 2014, and is
titled "Route Imaging System And Method" (the "'660 Application"),
U.S. Provisional Application No. 61/940,610, which also was filed
on 17 Feb. 2014, and is titled "Wayside Imaging System And Method"
(the "'610 Application"), U.S. Provisional Application No.
61/940,813, which was filed on 17 Feb. 2014, and is titled
"Portable Camera System And Method For Transportation Data
Communication" (the "'813 Application"), and U.S. Provisional
Application No. 61/940,696, which was filed on 17 Feb. 2014, and is
titled "Vehicle Image Data Management System And Method" (the "'696
Application"). The entire disclosures of these applications (e.g.,
the '660 Application, the '610 Application, the '813 Application,
and the '696 Application) are incorporated by reference.
FIELD
[0002] Embodiments of the subject matter described herein relate to
imaging systems, such as imaging systems onboard or near vehicle
systems.
BACKGROUND
[0003] Vehicle systems such as trains or other rail vehicles can
include cameras disposed on or near the vehicle systems. These
cameras can be used to record actions occurring outside of the
vehicle systems. For example, forward facing cameras can
continuously record video of the locations ahead of a train. If a
collision between the train and another vehicle occurs (e.g., an
automobile is struck at a crossing), then this video can later be
reviewed to determine liability for the collision, whether the
other vehicle improperly moved through a gate or signal, whether
the train was moving too fast, or the like. However, the image data
obtained by these cameras typically is saved only on a temporary
loop. Older image data is discarded when no accident occurs, even
though this image data may reveal one or more other problems with
the vehicle and/or the track.
[0004] In order to inspect the routes, the wayside devices disposed
along the routes traveled by the vehicle systems, or the like, crews
are periodically sent out over the routes. This is a labor-intensive
and costly operation that ties up the routes and can interfere with
the normal operations of other vehicles using the routes.
Additionally, because these inspections are periodic, a fault in the
route or in a wayside device may not be observed by inspection
crews in time to prevent catastrophic events.
BRIEF DESCRIPTION
[0005] In one example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera configured to be disposed in a first vehicle system. The
camera is configured to generate image data within a field of view
of the camera. The field of view includes at least a portion of a
cab of the first vehicle system and at least one of a portion of a
route being traveled by the first vehicle system or one or more
wayside devices disposed along the route being traveled by the
first vehicle system. The cab includes a space where an operator of
the first vehicle system is located during travel of the first
vehicle system. The system also can include one or more analysis
processors configured to examine the image data generated by the
camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices. The condition of the one or more wayside
devices includes at least one of damage to the one or more wayside
devices, a missing wayside device, deterioration of the one or more
wayside devices, or a change in terrain at or near the one or more
wayside devices.
[0006] In another example of the inventive subject matter described
herein, a method (e.g., an imaging method) includes
generating image data within a field of view of a camera disposed
onboard a first vehicle system. The field of view includes at least
a portion of a cab of the first vehicle system and at least one of
a portion of a route being traveled by the first vehicle system or
one or more wayside devices disposed along the route being traveled
by the first vehicle system. The cab includes a space where an
operator of the first vehicle system is located during travel of
the first vehicle system. The method also can include examining,
using one or more analysis processors, the image data generated by
the camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices. The condition includes at least one of damage
to the one or more wayside devices, a missing wayside device,
deterioration of the one or more wayside devices, or a change in
terrain at or near the one or more wayside devices.
[0007] In another example of the inventive subject matter described
herein, another system (e.g., an imaging system) includes a digital
camera configured to be disposed in a rail vehicle. The camera is
configured to generate image data within a field of view of the
camera. The field of view includes at least a portion of a cab of
the rail vehicle and at least one of a portion of a track outside
of the rail vehicle or one or more wayside devices along the track
being traveled by the rail vehicle. The cab includes a space where
an operator of the rail vehicle is located during travel of the
rail vehicle. The system also includes one or more analysis
processors configured to be disposed onboard the rail vehicle and
to examine the image data generated by the camera to identify a
condition of at least one of the track or the one or more wayside
devices. The condition includes at least one of damage to the
track, damage to the one or more wayside devices, a missing wayside
device, or a changing condition of terrain at or near the one or
more wayside devices.
[0008] In another example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a first vehicle system to generate
image data within a field of view of the camera. The field of view
includes at least a portion of a cab of the first vehicle system
and one or more wayside devices disposed along a route being
traveled by the first vehicle system. The cab includes a space
where an operator of the first vehicle system is located during
travel of the first vehicle system. The one or more analysis
processors are configured to examine the image data generated by
the camera to identify a condition of the one or more wayside
devices. The condition includes at least one of damage to the one
or more wayside devices, a missing wayside device, deterioration of
the one or more wayside devices, or a change in terrain at or near
the one or more wayside devices.
[0009] In another example of the inventive subject matter described
herein, a method (e.g., a method for imaging) includes generating
image data within a field of view of a camera disposed onboard a
first vehicle system. The field of view includes at least a portion
of a cab of the first vehicle system and one or more wayside
devices disposed along a route being traveled by the first vehicle
system. The cab includes a space where an operator of the first
vehicle system is located during travel of the first vehicle
system. The method also includes examining (using one or more
analysis processors) the image data generated by the camera to
identify a condition of the one or more wayside devices. The
condition includes at least one of damage to the one or more
wayside devices, a missing wayside device, deterioration of the one
or more wayside devices, or a change in terrain at or near the one
or more wayside devices.
[0010] In another example of the inventive subject matter described
herein, another system (e.g., a rail vehicle imaging system)
includes a digital camera and one or more analysis processors. The
camera is configured to be disposed in a rail vehicle and to
generate image data within a field of view of the camera. The field
of view includes at least a portion of a cab of the rail vehicle
and one or more wayside devices along a track being traveled by the
rail vehicle. The cab includes a space where an operator of the
rail vehicle is located during travel of the rail vehicle. The one
or more analysis processors are configured to be disposed onboard
the rail vehicle and to examine the image data generated by the
camera to identify a condition of the one or more wayside devices.
The condition includes at least one of damage to the one or more
wayside devices, a missing wayside device, or a changing condition
of terrain at or near the one or more wayside devices.
[0011] In one example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a first vehicle system and to generate
image data within a field of view of the camera. The field of view
includes at least a portion of a cab of the first vehicle system
and a portion of a route being traveled by the first vehicle
system. The cab includes a space where an operator of the first
vehicle system is located during travel of the first vehicle
system. The one or more analysis processors are configured to
examine the image data generated by the camera to identify at least
one of damage to the route or a deteriorating condition of the
route.
[0012] In another example of the inventive subject matter described
herein, a method (e.g., an imaging method) includes generating
image data within a field of view of a camera disposed onboard a
first vehicle system. The field of view includes at least a portion
of a cab of the first vehicle system and a portion of a route being
traveled by the first vehicle system. The cab includes a space
where an operator of the first vehicle system is located during
travel of the first vehicle system. The method also includes
examining (using one or more analysis processors) the image data
generated by the camera to identify at least one of damage to the
route or a deteriorating condition of the route.
[0013] In another example of the inventive subject matter described
herein, another system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a rail vehicle and to generate image
data within a field of view of the camera. The field of view
includes at least a portion of a cab of the rail vehicle and a
portion of a track being traveled by the rail vehicle. The cab
includes a space where an operator of the rail vehicle is located
during travel of the rail vehicle. The one or more analysis
processors are configured to be disposed onboard the rail vehicle
and to examine the image data generated by the camera to identify
at least one of damage to the track or a deteriorating condition of
the track.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The subject matter described herein will be better
understood from reading the following description of non-limiting
embodiments, with reference to the attached drawings, wherein
below:
[0015] FIG. 1 is a schematic illustration of a vehicle system
having a wayside imaging system disposed thereon according to one
embodiment;
[0016] FIG. 2 illustrates an image representative of image data
generated by a camera shown in FIG. 1 for examination by one or
more analysis processors shown in FIG. 1 according to one
embodiment;
[0017] FIG. 3 illustrates one example of a comparison between the
image data acquired by and output from the camera shown in FIG. 1
with baseline image data according to one embodiment;
[0018] FIG. 4 illustrates another example of a comparison between
the image data acquired by and output from the camera shown in FIG.
1 with baseline image data according to one embodiment;
[0019] FIG. 5 illustrates an image representative of additional
image data generated by the camera shown in FIG. 1 for examination
by the analysis processor shown in FIG. 1 according to one
embodiment;
[0020] FIG. 6 illustrates one example of a comparison between the
image data acquired by and output from the camera shown in FIG. 1
with baseline image data according to one embodiment;
[0021] FIG. 7 illustrates a flowchart of a method for imaging one
or more wayside devices disposed along a route traveled by one or
more vehicle systems according to one embodiment;
[0022] FIG. 8 illustrates an image representative of image data
generated by a camera shown in FIG. 1 for examination by an
analysis processor shown in FIG. 1 according to one embodiment;
[0023] FIG. 9 illustrates one example of a comparison between a
route shown in FIG. 1 as represented by image data generated by the
camera shown in FIG. 1 with a baseline route image according to one
embodiment;
[0024] FIG. 10 illustrates a schematic diagram of a vehicle consist
having one or more imaging systems shown in FIG. 1 in accordance
with one embodiment;
[0025] FIG. 11 illustrates a top view of the vehicle consist shown
in FIG. 10 according to one embodiment; and
[0026] FIG. 12 illustrates a flowchart of a method for imaging a
route traveled by one or more vehicle systems according to one
embodiment.
DETAILED DESCRIPTION
[0027] One or more embodiments of the inventive subject matter
described herein relate to imaging systems and methods for vehicle
systems. While several examples of the inventive subject matter are
described in terms of rail vehicles (e.g., trains, locomotives,
locomotive consists, and the like), not all embodiments of the
inventive subject matter are limited to rail vehicles. At least some
of the inventive subject matter may be used in connection with
other off-highway vehicles (e.g., vehicles that are not permitted
or designed for travel on public roadways, such as mining
equipment), automobiles, marine vessels, airplanes, or the
like.
[0028] FIG. 1 is a schematic illustration of a vehicle system 100
having an imaging system 102 disposed thereon according to one
embodiment. The vehicle system 100 includes a single vehicle, such
as a single rail vehicle (e.g., locomotive), but alternatively may
include two or more vehicles mechanically coupled with each other
to travel together along a route 104, such as a train or other
vehicle consist.
[0029] The imaging system 102 includes one or more cameras 106 that
obtain image data of the interior and/or exterior of the vehicle
system 100. The camera 106 shown in FIG. 1 is an internal or
interior camera that is coupled with the vehicle system 100 so that
a field of view 110 of the camera (e.g., the space that is imaged
or otherwise represented by image data generated by the camera)
includes at least part of an interior of the vehicle system 100.
For example, the camera 106 in FIG. 1 may be referred to as a cab
camera that is disposed inside a cab of the vehicle system 100
where an operator of the vehicle system 100 is located to control
and/or monitor operations of the vehicle system 100.
[0030] The camera 106 can be positioned and oriented so that the
field of view of the camera 106 includes the interior space of the
cab in the vehicle system 100, as well as a portion of the exterior
of the vehicle system 100. This portion of the exterior of the
vehicle system 100 can be the space outside of the vehicle system
100 that is viewable through one or more windows 108 of the vehicle
system 100. In the illustrated example, the camera 106 is oriented
so that at least a portion of the route 104 that is ahead of the
vehicle system 100 is viewable in the field of view 110 of the
camera 106. In this way, the camera 106 can be used to both monitor
events inside the vehicle system 100 and examine the route and/or
wayside devices outside of the vehicle system 100, as described
herein. For example, the camera 106 can be used to record
performance of the operator of the vehicle system 100 to ensure
that the operator is controlling the vehicle system 100 safely,
according to rules, requirements, or regulations, is present and
aware, and the like. The camera 106 may then also be used to
determine if any external problems exist with the route and/or the
wayside devices.
[0031] As described herein, one or more wayside devices also may be
within the field of view 110 of the camera 106. These wayside
devices include equipment, systems, assemblies, and the like, that
are located outside of the vehicle system 100 at, near, or
alongside the route 104. The wayside devices can provide
functionality to guide, warn, examine, or otherwise assist travel
of the vehicle system 100. By way of non-limiting examples, the
wayside devices can include signals that display signs, illuminate
lights, or otherwise visually notify an onboard operator of the
vehicle system 100 of a vacancy or occupancy of an upcoming segment
of the route 104, a reduced speed limit, an upcoming segment of the
route 104 being repaired or maintained, or the like. The wayside
devices can include examination systems that examine the vehicle
system 100 as the vehicle system 100 travels past the wayside
devices. Examples of such wayside devices can include cameras,
infrared detectors, radio frequency identification (RFID)
transponders, readers, or tags, or the like. Other wayside devices
can include systems that communicate with the vehicle system 100
using wired and/or wireless connections, such as by wirelessly
communicating messages to the vehicle system 100 and/or
communicating messages through the route 104 (e.g., as electric
signals communicated through one or more conductive rails of the
route 104). Optionally, the wayside devices may include other
devices or systems.
[0032] The images and/or video captured and output by the camera
106 can record actions performed by the operator onboard the
vehicle system 100, but also may capture the wayside devices as the
vehicle system 100 travels past the wayside devices. While the
camera 106 may be disposed onboard and inside the vehicle system
100 in one embodiment, the camera 106 may be oriented such that the
field of view 110 of the camera 106 includes the wayside devices
(e.g., through one or more windows 108 of the vehicle system
100).
[0033] Optionally, the imaging system 102 can include one or more
external or exterior cameras, or the camera 106 may be an external
or exterior camera. An external or exterior camera is a camera that
is coupled with the vehicle system 100 outside of the cab (e.g., on
an exterior surface of the vehicle system 100) so that the field of
view 110 of the camera 106 includes at least part of the exterior
of the vehicle system 100. For example, the field of view 110 can
capture portions of the route 104 and/or the wayside devices during
movement of the vehicle system 100 on the route 104.
[0034] The camera 106 may be a digital camera capable of obtaining
relatively high quality image data (e.g., static or still images
and/or videos). For example, the camera 106 may be an Internet
protocol (IP) camera that generates packetized image data. The
camera 106 can be a high definition (HD) camera capable of
obtaining image data at relatively high resolutions. For example,
the camera 106 may obtain image data having at least 480 horizontal
scan lines, at least 576 horizontal scan lines, at least 720
horizontal scan lines, at least 1080 horizontal scan lines, or an
even greater resolution. The image data generated by the camera 106
can include still images and/or videos.
[0035] A controller 112 of the imaging system 102 includes or
represents hardware circuits or circuitry that includes and/or is
connected with one or more computer processors, such as one or more
computer microprocessors. The controller 112 can dictate
operational states of the camera 106, save image data obtained by
the camera 106 to one or more memory devices 114 of the imaging
system 102, generate alarm signals responsive to identifying one or
more problems with the route 104 and/or the wayside devices based
on the image data that is obtained, or the like.
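The controller behavior described in paragraph [0035] can be sketched as follows. This is only an illustrative example: the class, method, and dictionary keys below are hypothetical and do not appear in the application.

```python
# Illustrative sketch only: a minimal controller that stores incoming
# image frames (standing in for the memory device 114) and records an
# alarm signal when an analysis result reports a route or wayside-device
# problem. All names here are hypothetical.
class ImagingController:
    def __init__(self):
        self.stored_frames = []   # stands in for the onboard memory device
        self.alarms = []          # alarm signals generated so far

    def handle_frame(self, frame, analysis_result):
        """Save a frame; generate an alarm if a problem was identified."""
        self.stored_frames.append(frame)
        if analysis_result.get("problem"):
            self.alarms.append({
                "frame_index": len(self.stored_frames) - 1,
                "problem": analysis_result["problem"],
            })

controller = ImagingController()
controller.handle_frame("frame-0", {"problem": None})
controller.handle_frame("frame-1", {"problem": "missing wayside device"})
# One alarm is recorded, tied to the second frame.
```

In practice the alarm could also carry the vehicle location at the time the frame was generated, as claims 8, 15, and 20 describe.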
[0036] The memory device 114 includes one or more computer readable
media used to at least temporarily store the image data provided by
the camera 106. Without limitation, the memory device 114 can
include a computer hard drive, flash drive, optical disk, or the
like. The memory device 114 may be disposed entirely onboard the
vehicle system 100, or may be at least partially disposed off-board
the vehicle system 100, such as at a dispatch facility, in another
vehicle, or the like.
[0037] During travel of the vehicle system 100 along the route 104,
the camera 106 can generate image data representative of images
and/or video of the field of view 110 of the camera 106. This image
data can represent actions occurring in the interior of the vehicle
system 100 (e.g., the operator changing operational settings of the
vehicle system 100). For example, one use for the image data may be
for an accident investigation, where the actions of an onboard
operator are examined to determine if the operator was present at
the controls of the vehicle system 100 at the time of the accident,
if the operator was awake and aware leading up to the accident, if
the proper actions were taken leading up to the accident (e.g., a
horn or other alarm was activated, the brakes were engaged, etc.),
and the like.
[0038] Additionally or alternatively, the image data may be used to
inspect the health of the route 104, status of wayside devices
along the route 104 being traveled on by the vehicle system 100, or
the like. As described above, the field of view 110 of the camera
106 can encompass at least some of the route 104 and/or wayside
devices disposed ahead of the vehicle system 100 along a direction
of travel of the vehicle system 100. During movement of the vehicle
system 100 along the route 104, the camera 106 can obtain image
data representative of the route 104 and/or the wayside devices for
examination to determine if the route 104 and/or wayside devices
are functioning properly, or have been damaged, need repair, and/or
need further examination.
[0039] The image data created by the camera 106 can be referred to
as machine vision, as the image data represents what is seen by the
imaging system 102 in the field of view 110 of the camera 106. One
or more analysis processors 116 of the imaging system 102 may
examine the image data to identify conditions of the route 104
and/or wayside devices. Optionally, the analysis processor 116 can
examine the terrain at, near, or surrounding the route 104 and/or
wayside devices to determine if the terrain has changed such that
maintenance of the route 104, wayside devices, and/or terrain is
needed. For example, the analysis processor 116 can examine the
image data to determine if vegetation (e.g., trees, vines, bushes,
and the like) is growing over the route 104 or a wayside device
(such as a signal) such that travel over the route 104 may be
impeded and/or view of the wayside device may be obscured from an
operator of the vehicle system 100. As another example, the
analysis processor 116 can examine the image data to determine if
the terrain has eroded away from, onto, or toward the route 104
and/or wayside device such that the eroded terrain is interfering
with travel over the route 104, is interfering with operations of
the wayside device, or poses a risk of interfering with operation
of the route 104 and/or wayside device. Thus, the terrain "near"
the route 104 and/or wayside device may include the terrain that is
within the field of view of the camera 106 when the route 104
and/or wayside device is within the field of view of the camera
106, the terrain that encroaches onto or is disposed beneath the
route 104 and/or wayside device, and/or the terrain that is within
a designated distance from the route 104 and/or wayside device
(e.g., two meters, five meters, ten meters, or another
distance).
[0040] The analysis processor 116 can represent hardware circuits
and/or circuitry that include and/or are connected with one or more
computer processors, such as one or more computer microprocessors.
While the analysis processor 116 is shown as being disposed onboard
the vehicle system 100, optionally, all or part of the analysis
processor 116 may be located off-board of the vehicle system 100,
such as in a dispatch facility or other location.
[0041] Acquisition of HD image data from the camera 106 can allow
for the analysis processor 116 to have access to sufficient
information to examine individual video frames, individual still
images, several video frames, or the like, and determine the
condition of the wayside devices and/or terrain at or near the
wayside device. The HD image data optionally can allow for the
analysis processor 116 to have access to sufficient information to
examine individual video frames, individual still images, several
video frames, or the like, and determine the condition of the route
104. The condition of the route 104 can represent the health of the
route 104, such as a state of damage to one or more rails of a
track, the presence of foreign objects on the route, overgrowth of
vegetation onto the route, and the like. As used herein, the term
"damage" can include physical damage to the route (e.g., a break in
the route, pitting of the route, or the like), movement of the
route from a prior or designated location, growth of vegetation
toward and/or onto the route, deterioration in the supporting
material (e.g., ballast material) beneath the route, or the like.
For example, the analysis processor 116 may examine the image data
to determine if one or more rails are bent, twisted, broken, or
otherwise damaged. Optionally, the analysis processor 116 can
measure distances between the rails to determine if the spacing
between the rails differs from a designated distance (e.g., a gauge
or other measurement of the route).
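The gauge measurement described above can be sketched as follows. This is a minimal illustration, assuming the rail positions have already been located in a frame as horizontal pixel coordinates and that the image scale (meters per pixel) is known; the function names, gauge value, and tolerance are illustrative and not taken from this application.

```python
# Hypothetical sketch: checking rail spacing (gauge) against a designated
# distance. Assumes rail centerlines were already detected as horizontal
# pixel positions and the meters-per-pixel scale is calibrated.

STANDARD_GAUGE_M = 1.435   # designated rail spacing (standard gauge)
TOLERANCE_M = 0.01         # allowed deviation before flagging

def gauge_ok(left_rail_px, right_rail_px, meters_per_pixel,
             designated_m=STANDARD_GAUGE_M, tolerance_m=TOLERANCE_M):
    """Return (measured_gauge_m, within_tolerance)."""
    measured = abs(right_rail_px - left_rail_px) * meters_per_pixel
    return measured, abs(measured - designated_m) <= tolerance_m

# Rails detected 287 pixels apart at 0.005 m/pixel -> 1.435 m, in tolerance.
measured, ok = gauge_ok(100, 387, 0.005)
```

A gauge outside the tolerance (e.g., spreading rails) would return `False` in the second position, prompting the flagging behavior described later.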
[0042] In one embodiment, because the HD image data includes a
sufficiently large amount of data, the analysis processor 116 may
examine the image data for damage to the route 104 in real time. By
"real time," it is meant that the analysis processor 116 examines
the image data to identify damage to the route 104, examine the
wayside device, and/or examine nearby terrain while the vehicle
system 100 is moving along the route 104. Optionally, the analysis
processor 116 (and/or another off-board analysis processor 116) may
perform post-hoc processing and/or analysis of the image data.
"Post-hoc" refers to the examination of the image data after the
vehicle system 100 has completed a trip. For example, during a trip
of the vehicle system 100 from a starting location to a destination
location, the camera 106 can generate image data that captures
views of the route 104 and/or wayside devices. Real time
examination of this image data includes examination of the image
data while the vehicle system 100 is moving from the starting
location to the destination location, while post-hoc examination of
the image data includes examination of the image data onboard
and/or off-board the vehicle system 100 after the vehicle system
100 has arrived at the destination location (and/or is no longer
moving).
[0043] The analysis of the image data by the analysis processor 116
can be performed using one or more image and/or video processing
algorithms, such as edge detection, pixel metrics, comparisons to
benchmark images, object detection, gradient determination, or the
like.
[0044] Edge detection refers to the examination of the image data
to identify edges of objects in the image data, such as the edges
of the rails of a track. As one example, the edges of the objects
can be identified by finding the pixels in one or more frames of
the image data that have the same or similar (e.g., within a
designated range of each other) intensity and/or that are located
adjacent or near each other (e.g., within a designated distance of
each other). Those pixels having the same or similar intensity and
located adjacent or near each other can be identified as
representing an object. The pixels on the outer edges of the object
can be differentiated from other pixels that are outside of the
object based on differences between the pixel intensities. For
example, the differences in intensities between pixels within the
object may be smaller than the differences in intensities between a
pixel on the outer edge of the object and a pixel outside of the
object (and/or that is adjacent to or near the outer edge of the
object). Based on these differences, the analysis processor 116 can
identify the edges of the object.
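A minimal sketch of the intensity-based edge detection described above, using a small grid of intensity values in place of a real camera frame; the similarity range and the 4-neighborhood choice are illustrative assumptions.

```python
# Minimal sketch of intensity-based edge detection: pixels whose
# 4-neighborhood contains an intensity jump larger than the similarity
# range are treated as edge pixels.

def edge_pixels(frame, similarity=10):
    """Return the set of (row, col) pixels bordering an intensity jump."""
    rows, cols = len(frame), len(frame[0])
    edges = set()
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    if abs(frame[r][c] - frame[nr][nc]) > similarity:
                        edges.add((r, c))
                        break
    return edges

# A bright vertical stripe (a rail, say) on a dark background:
frame = [
    [20, 20, 200, 200, 20],
    [20, 20, 200, 200, 20],
    [20, 20, 200, 200, 20],
]
edges = edge_pixels(frame)   # pixels bordering the intensity jump
```

Pixels deep inside the uniform background have no large neighbor difference and are excluded, while both sides of the stripe boundary are identified as edges.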
[0045] Pixel metrics can refer to one or more algorithms that
measure qualities of the pixels to identify objects in the image
data. While pixel intensities are described above, optionally,
pixel metrics can include measuring color, luminance, or another
parameter of the pixels in the image data. The pixel metrics can be
compared or otherwise examined in order to determine locations of
objects (e.g., rails) in the image data for identifying damage to
the route 104.
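As one hypothetical illustration of a pixel metric other than raw intensity, luminance can be derived from color values and compared between a region where a rail is expected and the surrounding ballast. The weights below are the common Rec. 601 luma approximation; the sample pixel values and region names are invented.

```python
# Sketch of a pixel-metric check: measure a derived parameter (luminance
# from RGB) over a region where a rail is expected and compare it to the
# surrounding ballast. Region contents are illustrative.

def luminance(rgb):
    """Rec. 601 luma approximation from an (R, G, B) triple."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def mean_luminance(pixels):
    return sum(luminance(p) for p in pixels) / len(pixels)

rail_region = [(180, 180, 190), (175, 176, 188)]   # shiny rail head
ballast_region = [(90, 85, 80), (88, 84, 82)]      # darker crushed stone

# A large luminance difference suggests the rail is where it is expected.
contrast = mean_luminance(rail_region) - mean_luminance(ballast_region)
```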
[0046] The use of benchmark images includes the analysis processor
116 comparing actual image data to one or more benchmark images to
determine if any differences (e.g., significant differences other
than noise) between the actual and benchmark images are present.
The benchmark images can include previously acquired image data of
a wayside device and/or nearby terrain. The previously acquired
image data may be obtained at a time when the wayside device and/or
nearby terrain was known to be in good or at least acceptable
condition. Optionally, the benchmark image may be an image or video
that is created (e.g., not an actual image) to represent the shape
or other appearance of the wayside device and/or nearby terrain.
The created benchmark image may be designed to represent the
wayside device and/or nearby terrain when the wayside device and/or
nearby terrain are in good or at least acceptable condition.
[0047] The analysis processor 116 can compare the actual image data
with the benchmark image data to determine if the shape, size,
arrangement, color, or the like, of objects, edges, pixels, or the
like, identified in the actual image data are the same as or within
a designated range of those in the benchmark image data. For
example, the analysis processor
116 can identify edges of objects in the actual image data (e.g.,
using edge detection algorithms described above or in another
manner). These edges can be used to estimate the shape, location,
and/or arrangement of objects in the actual image data. Some of
these objects can represent the wayside devices and/or the terrain
nearby. The identified objects in the actual image data can be
compared with the known shape, location, and/or arrangement of
objects in the benchmark image data. Depending on the amount of
spatial overlap and/or the lack of spatial overlap between the
object(s) in the actual image data and the object(s) in the
benchmark image data, the analysis processor 116 may determine that
the wayside device is or is not in the proper position, has or has
not been damaged, and the like. For example, if the object(s) in
the actual image data match or are relatively close to matching the
object(s) in the benchmark image data, then the analysis processor
116 can determine that the wayside device and/or nearby terrain is
in acceptable condition. Otherwise, the analysis processor 116 can
determine that the wayside device and/or nearby terrain is damaged,
has moved, or otherwise is in an unacceptable position.
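The overlap-based comparison against benchmark image data might be sketched as below, reducing each identified object to a bounding box; the (x0, y0, x1, y1) box format and the 0.8 threshold are illustrative assumptions, not values from this application.

```python
# Sketch of the benchmark comparison: the imaged object's bounding box is
# intersected with the benchmark object's box, and the device is judged
# acceptable when enough of the benchmark is covered.

def box_area(b):
    return max(0, b[2] - b[0]) * max(0, b[3] - b[1])

def overlap_fraction(actual, benchmark):
    """Fraction of the benchmark box covered by the actual box."""
    ix0, iy0 = max(actual[0], benchmark[0]), max(actual[1], benchmark[1])
    ix1, iy1 = min(actual[2], benchmark[2]), min(actual[3], benchmark[3])
    return box_area((ix0, iy0, ix1, iy1)) / box_area(benchmark)

def acceptable(actual, benchmark, threshold=0.8):
    return overlap_fraction(actual, benchmark) >= threshold

# An object shifted slightly still covers 90% of the benchmark:
frac = overlap_fraction((12, 10, 30, 50), (10, 10, 30, 50))   # 0.9
```

A displaced or missing object yields little or no overlap, and the device is reported as not in acceptable condition.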
[0048] The analysis processor 116 can examine one or more
parameters of the pixels, such as the intensities, color,
luminance, or the like, of the pixels in one or more areas of the
actual image data from the camera 106 to determine gradients of
these pixel parameters. The gradients can represent a degree or
rate at which these parameters change over a designated area, such
as across a frame of the image data or another area or distance.
The analysis processor 116 can compare the determined gradients to
one or more designated gradients that are associated with image
data representative of the wayside device and/or nearby terrain
that is in acceptable condition (e.g., not damaged, overgrown,
eroded, or the like). If the determined gradients differ from the
designated gradients, then the analysis processor 116 can determine
that the image data does not include the wayside device, or that
the wayside device and/or nearby terrain is damaged or otherwise in
unacceptable condition.
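A simplified sketch of the gradient comparison, using successive differences of intensity along one scan line as the gradient, with an invented designated profile representing an intact device and an invented tolerance.

```python
# Sketch of the gradient check: the rate of change of a pixel parameter
# (intensity here) across a scan line is compared against a designated
# gradient profile associated with an intact wayside device.

def gradients(scanline):
    """Successive differences of a parameter along one row of pixels."""
    return [b - a for a, b in zip(scanline, scanline[1:])]

def matches_designated(scanline, designated, tolerance=5):
    got = gradients(scanline)
    return len(got) == len(designated) and all(
        abs(g - d) <= tolerance for g, d in zip(got, designated))

# Designated profile: dark background, sharp rise onto the signal mast,
# then a sharp fall back to background.
designated = [0, 150, 0, -150, 0]
intact = [20, 20, 170, 170, 20, 20]     # gradients: 0, 150, 0, -150, 0
damaged = [20, 20, 60, 60, 20, 20]      # weak rise: mast missing/obscured
```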
[0049] With continued reference to the imaging system 102 shown in
FIG. 1, FIG. 2 illustrates an image 200 representative of image
data generated by the camera 106 for examination by the analysis
processor 116 according to one embodiment. The image 200
illustrates portions of rails 202, 204 of the route 104 as viewed
from the interior camera 106 through the windows 108 of the vehicle
system 100. Also viewable in the image 200 alongside the route 104
is an inspection wayside device 206 and a signaling wayside device
208. The inspection wayside device 206 may inspect one or more
components of the vehicle system 100 as the vehicle system 100
moves past the inspection wayside device 206. For example, the
inspection wayside device 206 may include an infrared detector, or
"hot box detector," or other inspection device. The signaling
wayside device 208 is shown as a light signal that changes color to
indicate a status of an upcoming segment of the route 104 and/or a
speed that the vehicle system 100 should use, such as by
illuminating one or more lights with a green, yellow, or red color.
While only one light 210 is shown in the signaling wayside device
208, optionally, the signaling wayside device 208 may include
multiple lights or may signal the vehicle system 100 in another
manner.
[0050] As described above, the analysis processor 116 can examine
the image data to identify objects (e.g., shapes) in the image
data. These shapes may correspond to the wayside devices 206, 208
and/or terrain near the wayside devices 206, 208. The shapes of
these objects can be identified using pixel metrics, edge
detection, pixel gradients, comparisons to benchmark images, object
detection, or the like.
[0051] The analysis processor 116 can compare the actual image data
shown in FIG. 2 with benchmark image data that represents locations
of where objects representative of the wayside devices 206, 208
and/or nearby terrain are to be located. In order to determine
which benchmark image data to compare to the actual image data, the
analysis processor 116 can determine the location of the vehicle
system 100 and select at least one set of benchmark image data from
several sets of benchmark image data (e.g., stored on the memory
device 114 or otherwise accessible by the analysis processor 116).
The different sets of benchmark image data can be representative of
the wayside devices at different locations along the route 104.
Once the analysis processor 116 identifies the location of the
vehicle system 100 when the image data shown in the image 200 was
obtained, the analysis processor 116 can obtain the benchmark image
data representative of the wayside devices and/or nearby terrain
associated with that location.
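Selecting which set of benchmark image data to use based on the vehicle location could look like the following sketch, with locations reduced to milepost values; the benchmark identifiers and the distance cutoff are invented for illustration.

```python
# Sketch of benchmark selection: given the vehicle's current position,
# pick the stored benchmark whose recorded location is nearest, within
# a cutoff. Keys are illustrative milepost values.

BENCHMARKS = {
    12.4: "benchmark_signal_mp12.4",
    27.9: "benchmark_hotbox_mp27.9",
    41.2: "benchmark_signal_mp41.2",
}

def select_benchmark(milepost, max_distance=0.5):
    """Return the benchmark id nearest to `milepost`, or None if no
    benchmark location is within `max_distance`."""
    nearest = min(BENCHMARKS, key=lambda mp: abs(mp - milepost))
    if abs(nearest - milepost) <= max_distance:
        return BENCHMARKS[nearest]
    return None
```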
[0052] In order to determine the locations of the vehicle system
100, the imaging system 102 can include or be coupled with a
location determining device 118 that generates location data
representative of where the vehicle system 100 is located. The
location determining device 118 can represent a global positioning
system (GPS) receiver, a radio frequency identification (RFID)
transponder that communicates with RFID tags or beacons disposed
alongside the route 104, a computer that triangulates the location
of the vehicle system 100 using wireless signals communicated with
cellular towers or other wireless signals, a speed sensor (that
outputs data representative of speed, which is translated into a
distance from a known or entered location by the controller 112),
or the like. The location determining device 118 can include an
antenna 120 (and associated hardware receiver or transceiver
circuitry) for determining the location data. The analysis
processor 116 can receive this data and can determine the location
of the vehicle system 100.
[0053] The location data can be associated with the image data in
order to indicate where the portion of the route 104 that is shown
in the image data is located. For example, the location data can be
included in the image data as metadata or other data that is saved
with the image data. Optionally, the location data may be stored
separately from the image data but associated with the image data,
such as in a table, list, database, or other memory structure. The
location and/or time information can be shown on the image 200,
such as by overlaying this information on the image. The location
and/or time information can represent the location where the image
data that is shown in the image 200 was acquired and/or when this
image data was acquired. For example, the location and/or time
information can indicate the GPS coordinates of the segment of the
route 104 that is shown in the image 200.
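Associating location and time data with the image data as metadata might be sketched as follows; the field names and values are illustrative, not taken from this application.

```python
# Sketch of tagging a frame with location/time metadata. The record can be
# saved with the image data itself or in a separate table keyed by frame id.
import json
import time

def tag_frame(frame_id, latitude, longitude, timestamp=None):
    """Build a metadata record associating a frame with where/when it
    was acquired."""
    return {
        "frame_id": frame_id,
        "latitude": latitude,
        "longitude": longitude,
        "timestamp": timestamp if timestamp is not None else time.time(),
    }

record = tag_frame("frame_000123", 28.0836, -80.6081, timestamp=1408838400)
serialized = json.dumps(record)   # e.g., stored alongside the image data
```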
[0054] In one embodiment, the location data may be used to control
when the camera 106 obtains image data and/or which image data is
examined by the analysis processor 116. For example, different
areas, or zones, may be identified as including wayside devices to
be examined using the image data acquired and output by the camera
106. Responsive to the vehicle system 100 entering into such a zone
based on the location data, the camera 106 may begin obtaining
and/or sending image data to the analysis processor 116, and/or the
analysis processor 116 may begin examining the image data. Upon
exit from the zone, the camera 106 may stop obtaining and/or
sending the image data to the analysis processor 116, and/or the
analysis processor 116 may stop examining the image data.
Alternatively, the analysis processor 116 may only examine the
image data that is acquired by the camera 106 when the vehicle
system 100 is inside such a zone.
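The zone-based control described above reduces to a simple membership test over designated zones; the zone bounds below are invented milepost ranges for illustration.

```python
# Sketch of zone logic: image data is only captured/examined while the
# reported location falls inside a zone known to contain wayside devices.

ZONES = [(12.0, 13.0), (27.5, 28.5)]   # illustrative milepost ranges

def in_examination_zone(milepost, zones=ZONES):
    return any(start <= milepost <= end for start, end in zones)

def frames_to_examine(positions):
    """Keep only the indices of frames recorded inside a zone."""
    return [i for i, mp in enumerate(positions) if in_examination_zone(mp)]
```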
[0055] The analysis processor 116 examines the image data to
identify problems with the route 104. In one aspect, the analysis
processor 116 compares locations and/or arrangements of the route
104 in the image data with designated locations and/or arrangements
of the route 104, which can be stored in the memory device 114.
[0056] FIG. 3 illustrates one example of a comparison between the
image data acquired by and output from the camera 106 with baseline
image data according to one embodiment. The baseline image data
represents how the wayside devices 206, 208 and/or surrounding
terrain is expected to appear in the image data output by the
camera 106 if the wayside devices 206, 208 are in acceptable
condition (e.g., have not been moved, vandalized or otherwise
damaged, are not overgrown with vegetation, are not covered due to
erosion, or the like).
[0057] The baseline image data includes baseline objects 300, 302,
304 shown in dashed lines in FIG. 3. These objects 300, 302, 304
represent where the wayside devices 206, 208 are expected to be
when the vehicle system 100 is at or near a designated location
associated with the baseline objects 300, 302, 304 and the wayside
devices 206, 208 and/or nearby terrain are in acceptable condition.
The baseline objects 300, 302, 304 may change location, size,
arrangement, or the like, for image data acquired when the vehicle
system 100 is in another location.
[0058] The analysis processor 116 can determine the locations of
the objects representative of the wayside devices 206, 208 (e.g.,
the signal, the light, the inspection device, or the like) from the
actual image data that is output by the camera 106 and compare this
to the baseline objects of the baseline image data. The comparison
can involve determining if the actual objects (e.g., representative
of the wayside devices 206, 208) are in the same or overlapping
locations as the baseline objects in the image data or image 200.
In the illustrated example, the analysis processor 116 can
determine that the object representative of the wayside device 206
is in the same location as the baseline object 304 or overlaps the
area encompassed by the baseline object 304 for at least a
designated fraction of the baseline object 304. If the actual
imaged object and the baseline object overlap by at least this
designated fraction, then the analysis processor 116 can determine
that the wayside device 206, 208 is in acceptable condition.
Otherwise, the analysis processor 116 may determine that the
wayside device 206, 208 and/or terrain is in an unacceptable
condition.
[0059] The designated fraction may be a percentage such as 50%,
60%, 70%, 80%, 90%, or another amount, depending on how sensitive
the analysis processor 116 is to be in identifying problems with
the wayside devices 206, 208 and/or nearby terrain. For example,
lowering the designated fraction may cause the analysis processor
116 to identify more problems with the wayside devices 206, 208
and/or terrain, but also may cause the analysis processor 116 to
falsely identify such problems when no problems actually exist.
Increasing the designated fraction may cause the analysis processor
116 to identify fewer problems with the wayside devices 206, 208
and/or terrain, but also may cause the analysis processor 116 to
miss identification of some problems.
[0060] Based on the large amount of overlap between the signal
wayside device 208 and the baseline object 300, between the light
210 and the baseline object 302, and/or between the inspection
wayside device 206 and the baseline object 304, the analysis
processor 116 can determine that the wayside devices 206, 208
and/or the terrain are in acceptable condition.
[0061] FIG. 4 illustrates another example of a comparison between
the image data acquired by and output from the camera 106 with
baseline image data according to one embodiment. In contrast to the
image data shown in FIG. 3, the image data shown in FIG. 4 includes
a foreign object 400 at least partially obstructing the view of the
inspection wayside device 206 and shows the signaling wayside device
208 as damaged (e.g., bent out of position, with the light 210 shown
in FIGS. 2 and 3 removed or damaged). The foreign object 400
can represent growth of vegetation on or around the wayside device
206, erosion of the terrain onto or around the wayside device 206,
or another object disposed on or near the wayside device 206. The
presence of the foreign object 400 may interfere with functions of
the wayside device 206. Similarly, the damage to the wayside device
208 may interfere with signaling functions of the wayside device
208.
[0062] The analysis processor 116 can compare the actual objects in
the image data representative of the wayside devices 206, 208 with
the baseline objects 300, 302, 304 to determine if the actual
objects match the baseline objects. For example, the analysis
processor 116 can calculate the amount of overlap between the
actual objects and the baseline objects 300, 302, 304. With respect
to the wayside device 208 and the baseline object 300, very little
overlap exists. With respect to the light 210 and the baseline
object 302, the damage or removal of the light 210 may prevent the
analysis processor 116 from identifying any object where the light
210 should be located. With respect to the wayside device 206 and
the baseline object 304, the object 400 (e.g., eroded terrain,
vegetation, other foreign objects, or the like) is partially
blocking the view of the wayside device 206.
[0063] The analysis processor 116 can determine that the object 400
is not part of the wayside device 206 based on edge detection
between the object 400 and the wayside device 206, differences in
pixel metrics between the object 400 and the wayside device 206, or
the like. The presence of the object 400 prevents the analysis
processor 116 from identifying the wayside device 206 as being in
acceptable condition because not enough of the wayside device 206
overlaps the baseline object 304. For example, because the fraction
of the baseline object 304 that is overlapped by the wayside device
206 does not exceed a designated amount, the analysis processor 116
may determine that the wayside device 206 is not in a location
where the wayside device 206 was installed, that the wayside device
206 has become damaged or stolen, that the terrain is occluding the
view of the wayside device 206, or the like.
[0064] Responsive to identifying the unacceptable conditions of the
wayside devices 206, 208, the analysis processor 116 may flag or
otherwise mark the image data that represents the damage. By "flag"
or "mark," it is meant that the portion or subset of the image data
that includes the damaged wayside devices 206, 208 can be saved to
the memory device 114 with additional data (e.g., metadata or other
information) that indicates which portion of the image data
represents these wayside devices 206, 208. Optionally, the analysis
processor 116 may save data representative of which portion of the
image data shows these wayside devices 206, 208 in another location
(e.g., separate from the image data).
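The flagging or marking of image data might be represented with metadata records like the following sketch; all field names and values are illustrative, not from this application.

```python
# Sketch of flagging: frames in which a device was found in unacceptable
# condition are marked with metadata indicating which region of the frame
# shows the device, so reviewers can jump straight to them.

def flag_frame(frame_index, device_id, region, reason):
    return {
        "frame": frame_index,
        "device": device_id,
        "region": region,   # (x0, y0, x1, y1) within the frame
        "reason": reason,
    }

flags = [
    flag_frame(412, "signal_208", (520, 80, 610, 260),
               "low benchmark overlap"),
    flag_frame(413, "inspection_206", (150, 300, 240, 420), "occluded"),
]
flagged_frames = sorted({f["frame"] for f in flags})
```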
[0065] In the illustrated example, the image 200 generated from the
image data includes location and/or time information 214 that is
shown on the image. The location and/or time information 214 can
represent the location where the image data that is shown in the
image 200 was acquired and/or when this image data was acquired.
For example, the location and/or time information 214 can indicate
the GPS coordinates of the wayside devices 206, 208 that are shown
in the image 200. This information can be overlaid on the image 200
to assist viewers of the image 200 in determining where and/or when
the wayside devices 206, 208 were identified as being damaged or
otherwise in unacceptable condition.
[0066] The analysis processor 116 can generate one or more
reporting signals responsive to identifying the unacceptable
conditions of the wayside devices 206, 208. The reporting signals
can include the portions of the image data that show the wayside
devices 206, 208. For example, the reporting signals can include an
edited version of the image data, with the portions of the image
data that show the wayside devices 206, 208 included and other
portions of the image data that do not show the wayside devices
206, 208 not included.
[0067] The reporting signals with the included edited image data
can be locally saved onto the memory device 114 onboard the vehicle
system 100 and/or communicated to an off-board location having a
memory device 114 for storing the edited image data. As described
above, location data may be included in the image data such that
the location data is included in the edited image data.
[0068] In one aspect, the analysis processor 116 may examine image
data of the wayside devices obtained at different times to
determine if the conditions of the wayside devices are changing
over time. For example, the analysis processor 116 can compare
image data of the same location of the route 104 obtained at
different times over the course of several days, weeks, months, or
years. Based on this comparison and changes in the wayside devices,
the analysis processor 116 can determine that the condition of the
wayside devices is exhibiting a downward or negative trend.
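The trend determination could be sketched as a least-squares slope over a per-device condition score (here, a benchmark-overlap score) computed from image data gathered on different dates; the scores and the slope threshold are invented for illustration.

```python
# Sketch of trend detection: fit a least-squares slope to a condition
# score over time; a clearly negative slope suggests a deteriorating
# wayside device.

def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def negative_trend(days, scores, threshold=-0.001):
    """True when the per-day change in score falls below `threshold`."""
    return slope(days, scores) < threshold

days = [0, 30, 60, 90, 120]
scores = [0.95, 0.92, 0.90, 0.85, 0.81]   # overlap score over four months
```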
[0069] The analysis processor 116 can compile the image data of the
same section of the route 104 obtained at different times into
compilation image data. This compilation image data (and/or other
image data described herein) can be presented to an operator of the
imaging system 102, such as on a display device 122. The display
device 122 may be a monitor, television, touchscreen, or other
output device that visually presents image data. The display device
122 can be disposed onboard the vehicle system 100 and/or at an
off-board facility, such as a dispatch center or elsewhere. The
operator can examine the compilation image data to identify the
changes in the route 104 over time. Optionally, the analysis
processor 116 may use one or more software algorithms, such as edge
detection, pixel metrics, or the like, to identify objects in the
compilation image data. The analysis processor 116 can then
automatically identify changes or trends in the condition of the
wayside devices using the compilation image data.
[0070] In one embodiment, the memory device 114 and/or the analysis
processor 116 may receive image data of the same wayside devices
obtained by different vehicle systems 100. For example, as separate
vehicle systems 100 that are not traveling together travel over the
same segment of the route 104 and obtain image data of the wayside
devices at different times, the image data can be sent to the
analysis processor 116 and/or memory device 114. The analysis
processor 116 and/or memory device 114 can be disposed onboard or
off-board the vehicle system 100. The analysis processor 116 can
examine this image data of the same wayside devices from diverse
sources (e.g., the imaging systems 102 on different vehicle systems
100) to identify damage to the wayside devices and/or trends in the
condition of the wayside devices.
[0071] A communication system 124 of the imaging system 102
represents hardware circuits or circuitry that include and/or are
connected with one or more computer processors (e.g.,
microprocessors) and communication devices (e.g., wireless antenna
126 and/or wired connections 128) that operate as transmitters
and/or transceivers for communicating signals with one or more
locations disposed off-board the vehicle system 100. For example,
the communication system 124 may wirelessly communicate signals
(e.g., image data) via the antenna 126 and/or communicate the
signals over the wired connection 128 (e.g., a cable, bus, or wire
such as a multiple unit cable, train line, or the like) to a
facility and/or another vehicle system, or the like.
[0072] In one embodiment, the imaging system 102 is disposed
onboard a vehicle system 100 that propels cargo. For example,
instead of the imaging system 102 being disposed onboard a
specially outfitted vehicle (e.g., a vehicle designed for the sole
purpose of inspecting the route), the imaging system 102 may be
disposed onboard a vehicle system that is designed to propel cargo
so that the imaging system 102 can inspect the wayside devices 206,
208 at the same time that the vehicle system 100 is moving cargo
(e.g., goods such as commodities, commercial products, or the
like). The imaging system 102 may automatically obtain and/or
examine image data during travel of the vehicle system 100 and
during other operations of the vehicle system 100 (e.g., moving freight)
and, as a result, anomalies (e.g., damage) to the wayside devices
206, 208 can be identified on a repeated basis. As the vehicle
system 100 travels and the imaging system 102 discovers problems
with the wayside devices 206, 208, the communication system 124 may
communicate (e.g., transmit and/or broadcast) alarm signals to
notify others of the damage to the wayside devices 206, 208, trends
in the conditions of the wayside devices 206, 208, or the like, to
one or more off-board locations. For example, upon identifying a
problem with one or more of the wayside devices 206, 208, the
imaging system 102 can cause the communication system 124 to send
an alarm signal to a repair facility so that one or more
maintenance crews of workers can be sent to the location of the
wayside devices 206, 208 for further inspection of the wayside
devices 206, 208 and/or repair to the wayside devices 206, 208.
[0073] FIG. 5 illustrates an image 1100 representative of
additional image data generated by the camera 106 for examination
by the analysis processor 116 according to one embodiment. The
image 1100 illustrates portions of the rails 202, 204 of the route
104 as viewed from the interior camera 106 through the windows 108
of the vehicle system 100. Also viewable in the image 1100 between
the rails 202, 204 are supporting bodies 1102. The supporting
bodies 1102 can represent ties, such as railroad ties, railway
ties, crossties, railway sleepers, or other bodies that support the
rails 202, 204 and/or assist in keeping the rails 202, 204 parallel
to each other. As described above, the analysis processor 116 can
examine the image data to identify objects (e.g., shapes) in the
image data. These shapes may correspond to the supporting bodies
1102. The shapes of these objects can be identified using pixel
metrics, edge detection, pixel gradients, comparisons to benchmark
images, object detection, or the like. The analysis processor 116
examines the image data to identify problems with the route 104. In
one aspect, the analysis processor 116 compares locations and/or
arrangements of the route 104 in the image data with designated
locations and/or arrangements of the route 104, which can be stored
in the memory device 114.
[0074] FIG. 6 illustrates one example of a comparison between the
image data acquired by and output from the camera 106 with baseline
image data according to one embodiment. The baseline image data
represents how the supporting bodies 1102 are expected to appear in
the image data output by the camera 106 if the supporting bodies
1102 are in acceptable condition (e.g., have not been moved or
otherwise damaged). The baseline image data includes baseline
objects 1200 shown in dashed lines in FIG. 6. These objects 1200
represent where the supporting bodies 1102 are expected to be when
the vehicle system 100 is at or near a designated location
associated with the baseline objects 1200 and the supporting bodies
The baseline objects 1200 may change location, size, arrangement,
or the like, for image data acquired when the vehicle system 100 is
in another location.
[0075] The analysis processor 116 can determine the locations of
the objects representative of the supporting bodies 1102 from the
actual image data that is output by the camera 106 and compare this
to the baseline objects of the baseline image data. The comparison
can involve determining if the actual objects are in the same or
overlapping locations as the baseline objects in the image data or
image 1100. In the illustrated example, the analysis processor 116
can determine that one of the objects representative of one of the
supporting bodies 1102 is in the same location as one of the
baseline objects 1200 or overlaps the area encompassed by the
baseline object 1200 for at least a designated fraction of the
baseline object 1200. As a result, the outer boundaries of the
baseline object 1200 are not visible outside of or inside of the
object representative of one of the supporting bodies 1102.
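The overlap test described above can be sketched with axis-aligned bounding boxes. The box format, the helper name, and the threshold value below are illustrative assumptions, not defined in the application.

```python
# Illustrative sketch: fraction of a baseline object's area covered by the
# object detected in the actual image data.
def overlap_fraction(actual, baseline):
    """Boxes are (left, top, right, bottom) in pixel coordinates."""
    al, at, ar, ab = actual
    bl, bt, br, bb = baseline
    w = min(ar, br) - max(al, bl)
    h = min(ab, bb) - max(at, bt)
    if w <= 0 or h <= 0:
        return 0.0
    baseline_area = (br - bl) * (bb - bt)
    return (w * h) / baseline_area

# A tie shifted to the right so it covers only half of its baseline box.
print(overlap_fraction((50, 10, 150, 30), (0, 10, 100, 30)))  # 0.5

SHIFT_THRESHOLD = 0.8  # designated fraction; illustrative value
print(overlap_fraction((50, 10, 150, 30), (0, 10, 100, 30)) < SHIFT_THRESHOLD)  # True -> flag
```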
[0076] However, another object representative of a supporting body
1102 in the image data 1100 appears to be angled with respect to
another baseline object 1200. The analysis processor 116 can compare
edges of the supporting body 1102 with the baseline object 1200,
the overlap (or lack of overlap) between the areas in the image
data 1100 that are encompassed by the supporting body 1102 and the
baseline object 1200, or the like, to determine if the supporting
body 1102 is out of position. In the illustrated example, the
analysis processor 116 may determine that the supporting body 1102
has shifted or otherwise moved out of position due to the area
encompassed by the supporting body 1102 only overlapping the area
encompassed by the baseline object 1200 by less than a designated
threshold amount.
[0077] As described above, responsive to determining that the
supporting body 1102 is out of position, the analysis processor 116
may flag or otherwise mark the image data 1100. The image data 1100
may be saved with or otherwise associated with time and/or location
data to assist in later determining where the out-of-position
supporting body 1102 is located. The analysis processor 116 can
generate one or more reporting signals responsive to identifying
the unacceptable condition of the supporting body 1102, similar to
as described above.
[0078] FIG. 7 illustrates a flowchart of a method 500 for imaging
one or more wayside devices disposed along a route traveled by one
or more vehicle systems according to one embodiment. The method 500
may be used to obtain image data of the wayside devices to
determine if one or more problems exist with the wayside devices,
such as damage to the wayside devices, a deteriorating condition of
the wayside devices, or the like. At least one embodiment of the
method 500 may be practiced using the imaging system 102 (shown in
FIG. 1) described herein.
[0079] At 502, image data of one or more wayside devices is
obtained from a camera onboard a vehicle system. For example, a
camera that is positioned to obtain video of an operator disposed
on the vehicle system also may obtain video of wayside devices
disposed outside the vehicle system along a route being traveled by
the vehicle system. This camera may be installed to monitor actions
(or lack thereof) of an operator for liability or other purposes
(e.g., in post-accident investigations or reconstructions), but it
also may obtain video of the wayside devices, such as through
windows of the vehicle system.
[0080] At 504, the image data is examined. As described above, the
image data can be examined in real time and/or after travel of the
vehicle system is complete. The image data can be examined to
identify locations and/or arrangements of the wayside devices. At
506, a determination is made as to whether the image data indicates
a problem with the wayside devices. For example, the image data can
be examined to determine if the wayside devices are damaged,
vandalized, stolen, or the like, and/or if other objects are on or
near the wayside devices such that the objects could prevent the
wayside devices from performing one or more functions.
[0081] If the image data indicates a problem with the wayside
devices, then one or more corrective actions may need to be taken.
As a result, flow of the method 500 can proceed to 514. On the
other hand, if the image data does not indicate such a problem,
then additional image data (e.g., previously and/or subsequently
acquired image data of the same wayside devices) may be examined to
determine if the conditions of the wayside devices are
deteriorating, if foreign objects (e.g., vegetation) are moving
toward the wayside devices, or the like. As a result, flow of the
method 500 can proceed to 508. Alternatively, no further analysis
is performed on this or other image data, and the method 500 can
terminate.
[0082] At 508, additional image data of the same wayside devices
can be obtained. This additional image data may be generated by one
or more cameras at different times than when the image data is
obtained at 502. For example, the additional image data can be
generated by cameras of imaging systems on other vehicle systems on
prior days, weeks, months, or years, and/or subsequent days, weeks,
months, or years.
[0083] At 510, the image data and additional image data are
compared with each other and examined. The image data and
additional image data are compared and examined in order to
determine if the image data and additional image data exhibit
changes in the wayside devices over time. For example, while one or
a few sets of image data acquired at one or more times may not
indicate damage to the wayside devices because the damage is
relatively minor, examining more image data acquired over longer
periods of time may reveal a change and/or damage to the wayside
devices that a smaller amount of image data, or image data obtained
over shorter time periods, would not.
[0084] At 512, a determination is made as to whether the image data
and the additional image data indicate a trend in the condition of
the one or more wayside devices. For example, the image data and
additional image data can be examined to determine if the wayside
device is gradually becoming more damaged, if the wayside device is
gradually changing location, if vegetation is growing toward and/or
onto the wayside device, if the ground near the wayside device is
eroding or building up onto the wayside device, or the like.
Alternatively, the image data acquired at different times can be
examined to identify damage to the wayside device without
identifying a trend in the condition of the wayside device.
[0085] If the image data and additional image data indicate a
problem with the wayside device, then one or more corrective
actions may need to be taken. As a result, flow of the method 500
can proceed to 514. On the other hand, if the image data and the
additional image data do not indicate such a problem, then flow of
the method 500 may return to 502 so that additional image data of
the same or other wayside devices may be obtained. Alternatively,
no further analysis is performed on this or other image data, and
the method 500 can terminate.
[0086] At 514, one or more alarm signals are generated. The alarm
signals may be communicated from the analysis processor (which is
onboard and/or off-board a vehicle system) to an off-board facility
to request further inspection of the wayside device and/or repair
to the wayside device. Optionally, an alarm signal may be
communicated to one or more vehicle systems to warn the vehicle
systems of the identified problem with the wayside device. In one
aspect, the alarm signal can be communicated to a scheduling system
that generates schedules for vehicle systems so that the scheduling
system can alter the schedules of the vehicle systems to avoid
and/or slow down over the section of the route where the wayside
device identified as having the problem is located.
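The decision flow of method 500 (steps 502 through 514) can be sketched as follows. The analysis callables and field names are hypothetical stand-ins for the image-data examination, not defined in the application.

```python
def inspect_wayside(image_data, prior_image_data, shows_problem, shows_trend):
    """Sketch of method 500: return the action the imaging system would take.
    shows_problem and shows_trend stand in for the image-data analyses."""
    if shows_problem(image_data):                 # step 506
        return "alarm"                            # step 514
    combined = prior_image_data + [image_data]    # step 508: additional image data
    if shows_trend(combined):                     # steps 510-512
        return "alarm"                            # step 514
    return "continue"                             # return to step 502

# Hypothetical stand-in analyses: a frame flagged as damaged is a problem;
# a steadily growing measured offset across frames is a trend.
is_damaged = lambda f: f["damaged"]
is_trending = lambda frames: frames[-1]["offset"] - frames[0]["offset"] > 5

print(inspect_wayside({"damaged": True, "offset": 0}, [],
                      is_damaged, is_trending))   # alarm
print(inspect_wayside({"damaged": False, "offset": 9},
                      [{"damaged": False, "offset": 1}],
                      is_damaged, is_trending))   # alarm
print(inspect_wayside({"damaged": False, "offset": 2},
                      [{"damaged": False, "offset": 1}],
                      is_damaged, is_trending))   # continue
```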
[0087] FIG. 8 illustrates another image 600 representative of image
data generated by the camera 106 for examination by the analysis
processor 116 according to one embodiment. The image 600
illustrates portions of the rails 202, 204 of the route 104 as
viewed from the interior camera 106 through the windows 108 of the
vehicle system 100. As described above, the analysis processor 116
can examine the image data to identify edges 606, 608, 610, 612 of
the rails 202, 204. These edges can be used to determine locations
of the rails 202, 204 in the image 600.
[0088] As described above, location data can be associated with the
image data in order to indicate where the portion of the route 104
that is shown in the image data is located. For example, the
location data can be included in the image data as metadata or
other data that is saved with the image data. Optionally, the
location data may be stored separately from the image data but
associated with the image data, such as in a table, list, database,
or other memory structure. In the illustrated example, the image
600 generated from the image data includes the location and/or time
information 214 that is shown on the image.
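One way to keep location data associated with image data when it is stored separately, as described above, is a keyed lookup structure. The field names and values below are illustrative assumptions, not from the application.

```python
# Illustrative sketch: location/time data stored separately from the image
# data but associated with it by a frame identifier.
frame_metadata = {}

def tag_frame(frame_id, latitude, longitude, timestamp):
    """Record where and when a frame of image data was generated."""
    frame_metadata[frame_id] = {
        "lat": latitude, "lon": longitude, "time": timestamp,
    }

tag_frame("frame_0042", 41.14, -80.08, "2014-08-12T10:31:05Z")
print(frame_metadata["frame_0042"]["lat"])  # 41.14
```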
[0089] The analysis processor 116 examines the image data to
identify problems with the route 104. In one aspect, the analysis
processor 116 compares locations and/or arrangements of the route
104 in the image data with designated locations and/or arrangements
of the route 104, which can be stored in the memory device 114.
[0090] FIG. 9 illustrates one example of a comparison between the
route 104 as represented by the image data generated by the camera
106 with a baseline route image 700 according to one embodiment.
The baseline route image 700 represents how the route 104 (e.g.,
the rails 202, 204 of the route 104) is to appear in the image data
if the route 104 is not damaged. For example, the baseline route
image 700 can represent locations of the route 104 prior to the
route 104 being damaged or degrading.
[0091] The baseline route image 700 is shown using dashed lines,
while locations of the rails 202, 204 are shown using solid lines in
FIG. 9. With respect to the rail 202, the location of the rail 202 in
the image data exactly or closely matches or otherwise corresponds
to the designated location of the rail 202 that is represented by
the baseline route image 700 (e.g., the location of the rail 202 is
within a designated range of distances, pixels, or the like). As a
result, the dashed lines of the baseline route image 700 that
correspond to the rail 202 are not visible due to the solid lines
of the actual rail 202 in the image data.
[0092] With respect to the rail 204, however, the location of the
rail 204 in the image data does not exactly or closely match the
designated location of the rail 204 in the baseline route image
700. As a result, the baseline route image 700 and the rail 204 do
not exactly or closely overlap in the image data.
[0093] In one example, the analysis processor 116 can identify this
mismatch between the baseline route image 700 and the image data
and determine that one or more of the rails 202, 204 have become
damaged, such as by being bent, twisted, broken, or otherwise
damaged. Optionally, the analysis processor 116 can measure
distances between the rails 202, 204 in the image data and compare
these distances to one or more designated distances representative
of the gauge of the route 104. If the measured distances differ
from the designated distances by more than a threshold amount, then
the analysis processor 116 determines that the route 104 is damaged
at or near the location where the image data was obtained (as
determined from the location data).
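The gauge check described above reduces to comparing each measured rail separation against a designated distance with a threshold. The standard-gauge value below is the common 1435 mm figure; the tolerance and function name are illustrative assumptions.

```python
# Illustrative sketch: flag gauge measurements that differ from the
# designated gauge by more than a threshold, indicating possible damage.
STANDARD_GAUGE_MM = 1435   # designated distance (common standard gauge)
GAUGE_TOLERANCE_MM = 10    # threshold amount; illustrative value

def gauge_out_of_spec(measured_gauges_mm):
    """Return the measurements exceeding the designated tolerance."""
    return [g for g in measured_gauges_mm
            if abs(g - STANDARD_GAUGE_MM) > GAUGE_TOLERANCE_MM]

print(gauge_out_of_spec([1436, 1433, 1462, 1438]))  # [1462]
```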
[0094] Responsive to identifying the damage to the route 104, the
analysis processor 116 may flag or otherwise mark the image data
that represents the damage. By "flag" or "mark," it is meant that
the portion or subset of the image data that includes the damaged
section of the route 104 can be saved to the memory device 114 with
additional data (e.g., metadata or other information) that
indicates which portion of the image data represents the damaged
section of the route 104. Optionally, the analysis processor 116
may save data representative of which portion of the image data
includes the damaged portion of the route 104 in another location
(e.g., separate from the image data).
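Flagging a portion of the image data, as described above, can be sketched as saving the frame together with metadata identifying the damaged region. The field names, and the in-memory list standing in for the memory device 114, are illustrative assumptions.

```python
# Illustrative sketch: "flag" a frame by saving it with metadata that
# indicates which portion of the image data shows the damaged section.
def flag_frame(frame, region, location, memory):
    memory.append({
        "frame": frame,
        "damaged_region": region,   # e.g., pixel bounding box of the damage
        "location": location,       # route location from the location data
    })

storage = []  # stands in for the memory device
flag_frame(frame="<raw image bytes>", region=(120, 40, 310, 90),
           location="milepost 87.3", memory=storage)
print(storage[0]["location"])  # milepost 87.3
```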
[0095] The analysis processor 116 can generate one or more
reporting signals responsive to identifying one or more damaged
sections of the route 104. The reporting signals can include the
portions of the image data that show the damaged sections. For
example, the reporting signals can include an edited version of the
image data, with the portions of the image data that show the
damaged sections of the route included and other portions of the
image data that do not show the damaged sections not included.
[0096] The reporting signals with the included edited image data
can be locally saved onto the memory device 114 onboard the vehicle
system 100 and/or communicated to an off-board location having a
memory device 114 for storing the edited image data. As described
above, location data may be included in the image data such that
the location data is included in the edited image data.
[0097] In one aspect, the analysis processor 116 may examine image
data of the same route 104 obtained at different times to determine
if the health of the route 104 is changing over time. For example,
the analysis processor 116 can compare image data of the same
location of the route 104 obtained at different times over the
course of several days, weeks, months, or years. Based on this
comparison and changes in the shape, location, arrangement, and the
like, of the rails of the route 104, other objects near the route
104 (e.g., vegetation, ballast material, erosion or movement of
hills near the route 104, etc.), and the like, the analysis
processor 116 can determine that the health of the route 104 at
that location is exhibiting a downward or negative trend. For
example, the location of one or more rails may gradually change,
the vegetation and/or hillside may gradually move toward and/or
encroach onto the rails, the amount of ballast material may
gradually change, or the like.
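A simple way to quantify such a downward trend is to fit a least-squares slope to a measurement extracted from image data acquired on different days. The choice of measurement (a rail's lateral offset in pixels) and the sample values are illustrative assumptions.

```python
# Illustrative sketch: least-squares slope of a measurement against time.
# A sustained nonzero slope suggests a negative trend in route health.
def trend_slope(days, measurements):
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(measurements) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, measurements))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den

# Hypothetical rail offsets (pixels) from image data gathered over four weeks.
print(trend_slope([0, 7, 14, 21, 28], [0.0, 0.6, 1.1, 1.9, 2.4]))
```

A slope meaningfully above zero over many samples would indicate the rail is gradually shifting, even if no single frame shows reportable damage.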
[0098] The analysis processor 116 can compile the image data of the
same section of the route 104 obtained at different times into
compilation image data. This compilation image data (and/or other
image data described herein) can be presented to an operator of the
imaging system 102, such as on the display device 122. The operator
can examine the compilation image data to identify the changes in
the route 104 over time. Optionally, the analysis processor 116 may
use one or more software algorithms, such as edge detection, pixel
metrics, or the like, to identify objects in the compilation image
data. The analysis processor 116 can then automatically identify
changes or trends in the health of the route 104 using the
compilation image data.
[0099] In one embodiment, the memory device 114 and/or the analysis
processor 116 may receive image data of the same section of the
route 104 obtained by different vehicle systems 100. For example,
as separate vehicle systems 100 that are not traveling together
travel over the same segment of the route 104 and obtain image data
of the route 104 at different times, the image data can be sent to
the analysis processor 116 and/or memory device 114. The analysis
processor 116 and/or memory device 114 can be disposed onboard or
off-board the vehicle system 100. The analysis processor 116 can
examine this image data of the same segment of the route 104 from
diverse sources (e.g., the imaging systems 102 on different vehicle
systems 100) to identify damage to the route and/or trends in the
health of the route 104.
[0100] In one embodiment, the imaging system 102 is disposed
onboard a vehicle system 100 that propels cargo. For example,
instead of the imaging system 102 being disposed onboard a
specially outfitted vehicle (e.g., a vehicle designed for the sole
purpose of inspecting the route), the imaging system 102 may be
disposed onboard a vehicle system that is designed to propel cargo
so that the imaging system 102 can inspect the route 104 at the
same time that the vehicle system 100 is moving cargo (e.g., goods
such as commodities, commercial products, or the like). The imaging
system 102 may automatically obtain and/or examine image data
during travel of the vehicle system 100 during other operations of
the vehicle system 100 (e.g., moving freight) and, as a result,
anomalies in the route 104 (e.g., damage) can be identified on a
repeated basis. As the vehicle system 100 travels and the imaging
system 102 discovers problems with the route 104, the communication
system 124 may communicate (e.g., transmit and/or broadcast) alarm
signals to notify others of the damage to the route 104, trend in
health of the route 104, or the like, to one or more off-board
locations. For example, upon identifying a problem with the route
104, the imaging system 102 can cause the communication system 124
to send an alarm signal to a repair facility so that one or more
maintenance crews of workers can be sent to the location of the
problem for further inspection of the route 104 and/or repair to
the route 104.
[0101] FIG. 10 illustrates a schematic diagram of a vehicle consist
800 having one or more imaging systems 102 in accordance with one
embodiment. The vehicle consist 800 can include two or more vehicle
systems 100 (e.g., vehicle systems 100a-c) mechanically connected
with each other directly or by one or more other vehicles 802
(e.g., non-propulsion generating vehicles such as rail cars, or
other propulsion-generating vehicle systems). The vehicle systems
100 and/or the cameras 106 (shown in FIG. 1) of the imaging systems
102 may be oriented in different directions. For example, the
cameras 106 onboard the vehicle systems 100a, 100b may be oriented
along a direction of movement of the vehicle consist 800, while the
camera 106 onboard the vehicle system 100c may be oriented in an
opposite direction (e.g., rearward). The vehicle systems 100 may
have separate imaging systems 102, or a single imaging system 102
may span across multiple vehicle systems 100. For example, an
imaging system 102 may include a controller 112 (shown in FIG. 1)
that controls multiple cameras 106 onboard different vehicle
systems 100 in the consist 800 and/or an analysis processor 116
(shown in FIG. 1) that examines image data obtained from multiple
cameras 106 disposed onboard different vehicle systems 100 in the
consist 800.
[0102] The imaging systems 102 onboard the different vehicle
systems 100 in the same vehicle consist 800 can coordinate
operation of the cameras 106 onboard the different vehicle systems
100. For example, a first vehicle system 100a can relay information
to a second vehicle system 100b or 100c in the same vehicle consist
800. The imaging system 102 of the first vehicle system 100a may
identify a segment of the route 104 and/or a wayside device that
may be damaged (as described above). Responsive to this
identification, the controller 112 and/or the analysis processor
116 can instruct another camera 106 onboard the same or a different
vehicle system 100 of the consist 800 to change one or more
operational settings and obtain additional image data. This
additional image data may then be examined to confirm or refute
identification of damage to the route and/or wayside device.
[0103] The operational settings that may be changed can include the
focus, position, resolution, or the like, of the camera 106 onboard
the second vehicle system 100. For example, responsive to image
data acquired by a first camera 106 onboard the vehicle system 100a
indicating that the route and/or wayside device is potentially
damaged, the controller 112 and/or analysis processor 116 can send
a signal to one or more additional cameras 106 disposed onboard the
vehicle system 100b and/or 100c. This signal can instruct the
additional camera(s) 106 to change focus (e.g., to increase or
decrease the focal distance or point of the cameras 106) to obtain
clearer image data of the potentially damaged route segment and/or
wayside device. The signal optionally can direct the additional
camera(s) 106 to change position (e.g., to change tilt, rotation,
pitch, yaw, or the like) so that the image data of the additional
camera(s) 106 encompasses the potentially damaged route segment
and/or wayside device. The signal may instruct the additional
camera(s) 106 to change resolution (e.g., a number of pixels per
unit area) so that more image data of the potentially damaged route
segment and/or wayside device is obtained.
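The operational-settings instruction could be carried in a simple command message. The application does not define a message format, so the field names and sample values below are illustrative assumptions.

```python
# Illustrative sketch of the instruction a controller might relay to a
# trailing camera after a potential defect is spotted. Only the settings
# being changed are included in the command.
def build_camera_command(camera_id, focal_distance_m=None, pan_deg=None,
                         tilt_deg=None, resolution=None):
    command = {"camera": camera_id}
    if focal_distance_m is not None:
        command["focus_m"] = focal_distance_m
    if pan_deg is not None:
        command["pan_deg"] = pan_deg
    if tilt_deg is not None:
        command["tilt_deg"] = tilt_deg
    if resolution is not None:
        command["resolution"] = resolution
    return command

# Ask a rearward camera to focus on the suspect rail at higher resolution.
print(build_camera_command("consist800_rear", focal_distance_m=25.0,
                           pan_deg=-12.0, resolution=(1920, 1080)))
```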
[0104] The controller 112 and/or analysis processor 116 may send
information to cause the camera 106 of the second vehicle system
100 to obtain image data of one or more locations of interest that
may include the potentially damaged route and/or wayside device.
For example, if a first locomotive goes over a piece of track, and
the image data obtained by the first locomotive identifies an area
of the track and/or wayside equipment that may be in need of repair
or further inspection, then information could be relayed to a second
locomotive (for instance, to the distributed power unit at the rear
of the train) to focus in on that particular rail or location, and
even change the resolution to get better image data of the issue.
Optionally, the information can be relayed between different
vehicle consists that are not connected with each other. For
example, responsive to identifying a damaged route and/or wayside
device, the controller 112 and/or analysis processor 116 of an
imaging system 102 onboard one vehicle consist 800 can direct the
communication system 124 (shown in FIG. 1) to transmit and/or
broadcast an assistance signal to one or more other vehicle
consists 800. This signal can request the other vehicle consists
800 obtain image data of the damaged route and/or wayside device so
that the damage can be confirmed, refuted, further characterized,
or the like.
[0105] FIG. 11 illustrates a top view of the vehicle consist 800
shown in FIG. 10 according to one embodiment. Only the vehicle
system 100a and the vehicle system 100c are shown in FIG. 11. The
imaging system 102 may determine an orientation of the field of
view 110 of one or more cameras 106. For example, the imaging
system 102 may be able to automatically determine if the field of
view 110 of a camera 106 is oriented forward or backward relative
to a direction of travel 900 of the consist 800. This could help
identify trouble areas of the track, and also provides another way
to determine direction without relying on a hardware switch.
[0106] As one example, the imaging system 102 can use a change in
the size of an object 902 in the field of view 110 of a camera 106.
As the consist 800 approaches and moves by the object 902 (e.g., a
wayside device, sign, or signal), the object 902 may change in size
in the image data obtained by the camera 106 of the vehicle system
100a. As the consist 800 passes the object 902, the object 902 may
appear in the image data obtained by the camera 106 of the vehicle
system 100c and then change size.
[0107] If the object 902 is detected in the image data and the
object 902 gets smaller in the image data over time (e.g., in the
image data obtained by the camera 106 of the vehicle system 100c),
then the analysis processor 116 of the imaging system 102 can
determine that the camera 106 is facing opposite of the direction
of travel 900 of the vehicle consist 800. Conversely, if the object
902 is progressively getting larger in the image data (e.g., in the
image data obtained by the camera 106 of the vehicle system 100a),
then the imaging system 102 may determine that the field of view
110 of the camera 106 is oriented in the same direction as the
direction of travel 900.
[0108] Optionally, changing intensities of colors or lights in the
image data may be used to determine the direction in which a camera
106 is facing. The locations of wayside signals can be updated in
the memory device 114 (shown in FIG. 1) so the controller 112
and/or analysis processor 116 of the vehicle consist 800 knows
which wayside signals to communicate with and where the limits of
the wayside signals are located. If the analysis processor 116
identifies red, yellow, or green light (or another color) in an
expected location of the wayside signal in the image data, and the
light is getting more intense (e.g., brighter, as could occur in
the image data obtained by the camera 106 of the vehicle system
100a as the consist 800 approaches a light along the direction of
travel 900), then the analysis processor 116 can determine that the
camera 106 is facing toward the direction of travel 900 of the
vehicle consist 800. Conversely, if the intensity of the light is
decreasing (e.g., becoming dimmer, as could occur in the image data
obtained by the camera 106 of the vehicle system 100c), then the
analysis processor 116 can determine that the camera 106 is facing
away from the direction of travel 900. The analysis processor 116
can combine this analysis of the light intensity with the changing
size of the wayside device (e.g., signal) in order to determine the
direction that the camera 106 is facing. Optionally, the analysis
processor 116 can examine changing sizes of objects or changing
light intensities, but not both, to determine orientations of
cameras 106.
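The size-trend and intensity-trend cues described above can be combined into a simple orientation check. The function, its return convention, and the sample values are illustrative assumptions.

```python
# Illustrative sketch: infer camera orientation from trends in the image
# data. An object that grows (and a signal light that brightens) frame over
# frame implies the field of view faces the direction of travel; shrinking
# and dimming imply a rearward-facing camera.
def facing_forward(object_sizes, light_intensities):
    """Return True (forward), False (rearward), or None if cues disagree."""
    growing = all(b > a for a, b in zip(object_sizes, object_sizes[1:]))
    shrinking = all(b < a for a, b in zip(object_sizes, object_sizes[1:]))
    brightening = all(b > a for a, b in
                      zip(light_intensities, light_intensities[1:]))
    dimming = all(b < a for a, b in
                  zip(light_intensities, light_intensities[1:]))
    if growing and brightening:
        return True
    if shrinking and dimming:
        return False
    return None

print(facing_forward([40, 55, 80], [90, 120, 160]))   # True  (like camera on 100a)
print(facing_forward([80, 55, 40], [160, 120, 90]))   # False (like camera on 100c)
```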
[0109] The analysis processor 116 can use the determination of
which direction a camera is facing to direct the camera 106 to
change operational settings. For example, the analysis processor
116 can determine which direction the camera 106 of the vehicle
system 100c is facing using changing object sizes and/or light
intensities. The analysis processor 116 can direct this camera 106
where to focus, change the field of view, or the like, based on
this orientation. For example, if the image data obtained by the
camera 106 of the vehicle system 100a identifies damage to the rail
204 of the route 104, then the analysis processor 116 can direct
the camera 106 of the vehicle system 100c to change the focus,
field of view, resolution, or the like, of the camera 106 so that
the image data captures the rail 204. The analysis processor 116
could direct the camera 106 of the vehicle system 100c to focus in
on the rail 204 and exclude the rail 202 from the field of view in
the image data. As a result, the same track is examined by the
cameras on the leading and trailing vehicles.
[0110] FIG. 12 illustrates a flowchart of a method 400 for imaging
a route traveled by one or more vehicle systems according to one
embodiment. The method 400 may be used to obtain image data of a
route and to determine if one or more problems exist with the
route, such as damage to the route, a deteriorating health (e.g.,
condition) of the route, or the like. At least one embodiment of
the method 400 may be practiced using the imaging system 102 (shown
in FIG. 1) described herein.
[0111] At 402, image data of the route is obtained from a camera
onboard a vehicle system. For example, a camera that is positioned
to obtain video of an operator disposed on the vehicle system also
may obtain video of a portion of the route. This camera may be
installed to monitor actions (or lack thereof) of an operator for
liability or other purposes (e.g., in post-accident investigations
or reconstructions), but it also may obtain video of a portion of
the route, such as through windows of the vehicle system.
[0112] At 404, the image data is examined. As described above, the
image data can be examined in real time and/or after travel of the
vehicle system is complete. The image data can be examined to
identify locations and/or arrangements of components of the route,
such as the relative locations and/or spacing of rails of a track.
The image data optionally can be examined to monitor conditions of
other objects on or near the route, as described above.
[0113] At 406, a determination is made as to whether the image data
indicates a problem with the route. For example, the image data can
be examined to determine if the route is broken, bent, twisted, or
the like, and/or if other objects are on or near the route such
that the objects could cause problems for travel along the route
for the vehicle system obtaining the image data and/or another
vehicle system.
[0114] If the image data indicates a problem with the route, then
one or more corrective actions may need to be taken. As a result,
flow of the method 400 can proceed to 414. On the other hand, if
the image data does not indicate such a problem, then additional
image data (e.g., previously and/or subsequently acquired image
data of the same segment of the route) may be examined to determine
if the condition of the route is deteriorating, if foreign objects
(e.g., vegetation) are moving toward the route over time, if
ballast material needs cleaning and/or replacing, or the like. As a
result, flow of the method 400 can proceed to 408. Alternatively,
no further analysis is performed on this or other image data, and
the method 400 can terminate.
[0115] At 408, additional image data of the same segment of the
route can be obtained. This additional image data may be generated
by one or more cameras at different times than when the image data
is obtained at 402. For example, the additional image data can be
generated by cameras of imaging systems on other vehicle systems on
prior days, weeks, months, or years, and/or subsequent days, weeks,
months, or years.
[0116] At 410, the image data and additional image data are
compared with each other and examined. The image data and
additional image data are compared and examined in order to
determine if the image data and additional image data exhibit
changes in the route over time. For example, while one or a few
sets of image data acquired at one or more times may not indicate
damage to the route because the damage is relatively minor,
examining more image data acquired over longer periods of time may
reveal a change in locations of and/or damage to the route that a
smaller amount of image data, or image data obtained over shorter
time periods, would not.
[0117] At 412, a determination is made as to whether the image data
and the additional image data indicate a trend in the condition
(e.g., health) of the route. For example, the image data and
additional image data can be examined to determine if the route is
gradually becoming more damaged, if the route is gradually changing
location, if vegetation is growing toward and/or onto the route, or
the like. Alternatively, the image data acquired at different times
can be examined to identify damage to the route without identifying
a trend in the condition of the route.
[0118] If the image data and additional image data indicate a
trending problem with the route, then one or more corrective
actions may need to be taken. As a result, flow of the method 400
can proceed to 414. On the other hand, if the image data and the
additional image data do not indicate such a trending problem, then
flow of the method 400 may return to 402 so that additional image
data of the route may be obtained. Alternatively, no further
analysis is performed on this or other image data, and the method
400 can terminate.
[0119] At 414, one or more alarm signals are generated. The alarm
signals may be communicated from the analysis processor (which is
onboard and/or off-board a vehicle system) to an off-board facility
to request further inspection of the route and/or repair to the
route. Optionally, an alarm signal may be communicated to one or
more vehicle systems to warn the vehicle systems of the identified
problem with the route. In one aspect, the alarm signal can be
communicated to a scheduling system that generates schedules for
vehicle systems so that the scheduling system can alter the
schedules of the vehicle systems to avoid and/or slow down over the
section of the route identified as having the problem.
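The alarm-signal routing in this paragraph can be sketched as follows. The severity tiers, field names, and recipient identifiers are illustrative assumptions; the application does not specify a message format.

```python
def build_alarm(route_segment, condition, severity):
    """Assemble a hypothetical alarm message; the severity thresholds
    and recipient names are illustrative only."""
    recipients = ["maintenance_facility"]      # request inspection/repair
    if severity >= 2:
        recipients.append("nearby_vehicles")   # warn other vehicle systems
    if severity >= 3:
        recipients.append("scheduling_system")  # alter/slow schedules
    return {"segment": route_segment, "condition": condition,
            "recipients": recipients}

alarm = build_alarm("MP 112.4-112.6", "rail_misalignment", severity=3)
print(alarm["recipients"])
# ['maintenance_facility', 'nearby_vehicles', 'scheduling_system']
```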
[0120] In one example of the inventive subject matter described
herein, a system (e.g., a wayside device imaging system) includes a
digital camera and one or more analysis processors. The digital
camera is configured to be disposed in a first vehicle system to
generate image data within a field of view of the camera. The field
of view includes at least a portion of a cab of the first vehicle
system and one or more wayside devices disposed along a route being
traveled by the first vehicle system. The cab includes a space
where an operator of the first vehicle system is located during
travel of the first vehicle system. The one or more analysis
processors are configured to examine the image data generated by
the camera to identify a condition of the one or more wayside
devices. The condition includes at least one of damage to the one
or more wayside devices, a missing wayside device, deterioration of
the one or more wayside devices, or a change in terrain at or near
the one or more wayside devices.
[0121] In one aspect, the condition that is identified includes
only one of the damage to the one or more wayside devices, the
missing wayside device, the deterioration of the one or more
wayside devices, or the change in terrain at or near the one or
more wayside devices. Optionally, the condition that is identified
includes all of the damage to the one or more wayside devices, the
missing wayside device, the deterioration of the one or more
wayside devices, and the change in terrain at or near the one or
more wayside devices. Alternatively, the condition that is
identified includes two or three, but not all, of the damage to the
one or more wayside devices, the missing wayside device, the
deterioration of the one or more wayside devices, or the change in
terrain at or near the one or more wayside devices.
[0122] In another aspect, the condition that is identified includes
the damage to the route, the deteriorating condition of the route,
and the condition of the one or more wayside devices. The condition
of the one or more wayside devices can include damage to the one or
more wayside devices, a missing wayside device, deterioration of
the one or more wayside devices, and a change in terrain at or near
the one or more wayside devices.
[0123] Optionally, the condition that is identified includes damage
to the route and a deteriorating condition of the route, but does
not include damage to the one or more wayside devices, the missing
wayside device, the deterioration of the one or more wayside
devices, or the change in terrain at or near the one or more
wayside devices.
[0124] In another aspect, the condition that is identified includes
damage to one or more wayside devices, a missing wayside device,
deterioration of the one or more wayside devices, and a change in
terrain at or near the one or more wayside devices, but does not
include damage to the route or a deteriorating condition of the
route.
[0125] In one aspect, the digital camera is a high definition
camera.
[0126] In one aspect, the one or more analysis processors are
configured to be disposed onboard the first vehicle system for
examination of the image data.
[0127] In one aspect, the one or more analysis processors are
configured to identify the condition of the one or more wayside
devices based on at least one of an edge detection algorithm, pixel
metrics, an object detection algorithm, baseline image data, or a
pixel gradient in the image data.
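Of the techniques recited above, the pixel-gradient approach is the simplest to sketch: sharp gradients often mark edges such as rail boundaries or the outline of a wayside device. The flattened grayscale representation is an illustrative assumption.

```python
def pixel_gradients(image):
    """Horizontal pixel gradients for each row of a grayscale image.
    Large values indicate candidate edges (e.g., a rail or a wayside
    device against its background)."""
    return [[abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]
            for row in image]

image = [[10, 10, 200, 200],   # dark background, bright object
         [10, 10, 200, 200]]
print(pixel_gradients(image)[0])  # [0, 190, 0]
```

An edge-detection algorithm of the kind the paragraph mentions would typically threshold these gradients and link the surviving pixels into contours before comparing them against baseline image data.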
[0128] In one aspect, the one or more wayside devices include a
signaling light and the one or more analysis processors are
configured to identify a broken or missing light of the signaling
light based on the image data.
[0129] In one aspect, the one or more analysis processors are
configured to edit the image data acquired during a trip of the
first vehicle system to create edited image data that includes the
image data representative of the condition of the one or more
wayside devices and that does not include other image data.
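The editing step can be sketched as a filter that retains only the frames flagged as showing a condition of interest and discards the rest. The frame dictionary structure is hypothetical.

```python
def edit_image_data(frames):
    """Keep only frames flagged as showing a wayside-device condition,
    discarding the rest of the trip's image data."""
    return [f for f in frames if f["condition_detected"]]

trip = [{"id": 1, "condition_detected": False},
        {"id": 2, "condition_detected": True},
        {"id": 3, "condition_detected": False}]
print([f["id"] for f in edit_image_data(trip)])  # [2]
```

A design motivation for this step is to reduce the storage and transmission burden of a full trip's image data, which can be substantial for a high definition camera.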
[0130] In one aspect, the one or more analysis processors are
configured to determine a location of the first vehicle system when
the image data representative of the one or more wayside devices is
acquired. The one or more analysis processors are configured to
examine the image data representative of the one or more wayside
devices and to not examine the image data acquired at one or more
other locations.
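The location gating described in this paragraph can be sketched as selecting only the frames acquired near known wayside-device locations; frames acquired elsewhere are skipped entirely. Route-mile coordinates and the tolerance value are illustrative assumptions.

```python
def frames_to_examine(frames, device_locations, tolerance=0.05):
    """Select frames acquired within `tolerance` route miles of a known
    wayside-device location; all other frames are not examined."""
    return [f for f in frames
            if any(abs(f["location"] - loc) <= tolerance
                   for loc in device_locations)]

frames = [{"id": 1, "location": 10.00},
          {"id": 2, "location": 25.32},
          {"id": 3, "location": 25.40}]
print([f["id"] for f in frames_to_examine(frames, [25.30])])  # [2]
```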
[0131] In one aspect, the one or more analysis processors are
configured to examine the image data from two or more previous
trips of the first vehicle system and at least a second vehicle
system over a common segment of the route to identify the condition
of the one or more wayside devices.
[0132] In one aspect, the one or more analysis processors are
configured to determine a location of the first vehicle system when
the image data representative of at least one of damage to the
route, the deteriorating condition of the route, or the condition
of the one or more wayside devices is obtained. The one or more
analysis processors also can be configured to examine the image
data representative of at least one of the damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices at the location, but to not examine the
image data acquired at one or more other locations.
[0133] In another example of the inventive subject matter described
herein, a method (e.g., for imaging a wayside device) includes
generating image data within a field of view of a camera disposed
onboard a first vehicle system. The field of view includes at least
a portion of a cab of the first vehicle system and one or more
wayside devices disposed along a route being traveled by the first
vehicle system. The cab includes a space where an operator of the
first vehicle system is located during travel of the first vehicle
system. The method also includes examining (using one or more
analysis processors) the image data generated by the camera to
identify a condition of the one or more wayside devices. The
condition of the one or more wayside devices includes at least one
of damage to the one or more wayside devices, a missing wayside
device, deterioration of the one or more wayside devices, or a
change in terrain at or near the one or more wayside devices.
[0134] In one aspect, the image data is generated and examined
while the first vehicle system is moving along the route.
[0135] In one aspect, the condition of the one or more wayside
devices is identified based on at least one of an edge detection
algorithm, pixel metrics, an object detection algorithm, baseline
image data, or a pixel gradient in the image data.
[0136] In one aspect, the one or more wayside devices include a
signaling light, and the image data is examined to identify a broken
or missing light of the signaling light.
[0137] In one aspect, the method also includes editing the image
data acquired during a trip of the first vehicle system to create
edited image data that includes the image data representative of
the condition of the one or more wayside devices and that does not
include other image data.
[0138] In one aspect, the method also includes determining a
location of the first vehicle system when the image data
representative of the one or more wayside devices is acquired. The
image data representative of the one or more wayside devices is
examined based on the location, and the image data acquired at one
or more other locations is not examined.
[0139] In one aspect, the image data is examined from two or more
previous trips of the first vehicle system and at least a second
vehicle system over a common segment of the route to identify the
condition of the one or more wayside devices.
[0140] In another example of the inventive subject matter described
herein, another system (e.g., a rail vehicle imaging system)
includes a digital camera and one or more analysis processors. The
camera is configured to be disposed in a rail vehicle and to
generate image data within a field of view of the camera. The field
of view includes at least a portion of a cab of the rail vehicle
and one or more wayside devices along a track being traveled by the
rail vehicle. The cab includes a space where an operator of the
rail vehicle is located during travel of the rail vehicle. The one
or more analysis processors are configured to be disposed onboard
the rail vehicle and to examine the image data generated by the
camera to identify a condition of the one or more wayside devices.
The condition includes at least one of damage to the one or more
wayside devices, a missing wayside device, or a changing condition
of terrain at or near the one or more wayside devices.
[0141] In one aspect, the digital camera is a high definition
camera.
[0142] In one aspect, the one or more analysis processors are
configured to identify the condition of the one or more wayside
devices based on at least one of an edge detection algorithm, pixel
metrics, an object detection algorithm, baseline image data, or a
pixel gradient in the image data.
[0143] In one aspect, the one or more wayside devices include at
least one of an inspection wayside device that inspects the rail
vehicle as the rail vehicle moves past the inspection wayside
device or a signaling wayside device that communicates information
with the rail vehicle as the rail vehicle moves past the signaling
wayside device.
[0144] In one aspect, the one or more analysis processors are
configured to determine a location of the rail vehicle when the
image data representative of the condition of the one or more
wayside devices is imaged.
[0145] In one example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a first vehicle system and to generate
image data within a field of view of the camera. The field of view
includes at least a portion of a cab of the first vehicle system
and a portion of a route being traveled by the first vehicle
system. The cab includes a space where an operator of the first
vehicle system is located during travel of the first vehicle
system. The one or more analysis processors are configured to
examine the image data generated by the camera to identify at least
one of damage to the route or a deteriorating condition of the
route.
[0146] In one aspect, the digital camera is a high definition
camera.
[0147] In one aspect, the one or more analysis processors are
configured to be disposed onboard the first vehicle system for
examination of the image data.
[0148] In one aspect, the one or more analysis processors are
configured to identify the at least one of damage to the route or
the deteriorating condition of the route using at least one of edge
detection algorithms or pixel metrics.
[0149] In one aspect, the one or more analysis processors are
configured to identify the damage to the route as shifting of one
or more supporting bodies that connect rails of the route, bending
of the rails of the route, twisting of the rails of the route, or
spacing between the rails of the route that differs from a
designated distance.
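The rail-spacing check in this paragraph can be sketched as comparing a measured gauge against the designated distance. The pixel scale, the pixel positions of the two rails, and the 56.5-inch standard gauge used here are illustrative assumptions; a real system would derive the scale from camera calibration.

```python
def gauge_deviation(left_rail_px, right_rail_px, px_per_inch,
                    designated_in=56.5):
    """Deviation (in inches) of the measured rail spacing from the
    designated gauge, given the pixel positions of the two rails in a
    frame and a calibrated pixel scale. 56.5 in is standard gauge."""
    measured_in = (right_rail_px - left_rail_px) / px_per_inch
    return measured_in - designated_in

# Rails 1153 px apart at 20 px/inch -> 57.65 in, i.e., 1.15 in over gauge
print(round(gauge_deviation(100, 1253, 20.0), 2))  # 1.15
```

A deviation beyond a designated threshold (in either direction) would be flagged as damage to the route.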
[0150] In one aspect, the one or more analysis processors are
configured to edit the image data acquired during a trip of the
first vehicle system to create edited image data that includes the
image data representative of the at least one of damage to the
route or the deteriorating condition of the route and that does not
include other image data.
[0151] In one aspect, the one or more analysis processors are
configured to determine a location of the first vehicle system when
the image data representative of the at least one of damage to the
route or the deteriorating condition of the route is imaged.
[0152] In one aspect, the one or more analysis processors are
configured to examine the image data from two or more previous
trips of the first vehicle system and at least a second vehicle
system over a common segment of the route to identify the at least
one of damage to the route or the deteriorating condition of the
route.
[0153] In another example of the inventive subject matter described
herein, a method (e.g., an imaging method) includes generating
image data within a field of view of a camera disposed onboard a
first vehicle system. The field of view includes at least a portion
of a cab of the first vehicle system and a portion of a route being
traveled by the first vehicle system. The cab includes a space
where an operator of the first vehicle system is located during
travel of the first vehicle system. The method also includes
examining (using one or more analysis processors) the image data
generated by the camera to identify at least one of damage to the
route or a deteriorating condition of the route.
[0154] In one aspect, the image data is generated and examined
while the first vehicle system is moving along the route.
[0155] In one aspect, the at least one of damage to the route or
the deteriorating condition of the route is identified by using at
least one of edge detection algorithms or pixel metrics.
[0156] In one aspect, the damage to the route is identified by the
one or more analysis processors as shifting of one or more
supporting bodies that connect rails of the route, bending of the
rails of the route, twisting of the rails of the route, or spacing
between the rails of the route that differs from a designated
distance.
[0157] In one aspect, the method also includes editing the image
data acquired during a trip of the first vehicle system to create
edited image data that includes the image data representative of
the at least one of damage to the route or the deteriorating
condition of the route and that does not include other image
data.
[0158] In one aspect, the method also includes determining a
location of the first vehicle system when the image data
representative of the at least one of damage to the route or the
deteriorating condition of the route is imaged.
[0159] In one aspect, examining the image data includes examining
the image data from two or more previous trips of the first vehicle
system and at least a second vehicle system over a common segment
of the route to identify the at least one of damage to the route or
the deteriorating condition of the route.
[0160] In another example of the inventive subject matter described
herein, another system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a rail vehicle and to generate image
data within a field of view of the camera. The field of view
includes at least a portion of a cab of the rail vehicle and a
portion of a track being traveled by the rail vehicle. The cab
includes a space where an operator of the rail vehicle is located
during travel of the rail vehicle. The one or more analysis
processors are configured to be disposed onboard the rail vehicle
and to examine the image data generated by the camera to identify
at least one of damage to the track or a deteriorating condition of
the track.
[0161] In one aspect, the digital camera is a high definition
camera.
[0162] In one aspect, the one or more analysis processors are
configured to identify the at least one of damage to the track or
the deteriorating condition of the track using at least one of edge
detection algorithms or pixel metrics.
[0163] In one aspect, the one or more analysis processors are
configured to identify the damage to the track as bending of rails
of the track, twisting of the rails of the track, or spacing
between the rails of the track that differs from a designated
distance.
[0164] In one aspect, the one or more analysis processors are
configured to determine a location of the rail vehicle when the
image data representative of the at least one of damage to the track
or the deteriorating condition of the track is imaged.
[0165] In one example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a first vehicle system and to generate
image data within a field of view of the camera. The field of view
includes at least a portion of a cab of the first vehicle system
and a portion of a route being traveled by the first vehicle
system. The cab includes a space where an operator of the first
vehicle system is located during travel of the first vehicle
system. The one or more analysis processors are configured to
examine the image data generated by the camera to identify at least
one of damage to the route or a deteriorating condition of the
route.
[0166] In one aspect, the digital camera is a high definition
camera.
[0167] In one aspect, the one or more analysis processors are
configured to be disposed onboard the first vehicle system for
examination of the image data.
[0168] In one aspect, the one or more analysis processors are
configured to identify the at least one of damage to the route or
the deteriorating condition of the route using at least one of edge
detection algorithms or pixel metrics.
[0169] In one aspect, the one or more analysis processors are
configured to identify the damage to the route as bending of rails
of the route, twisting of the rails of the route, or spacing
between the rails of the route that differs from a designated
distance.
[0170] In one aspect, the one or more analysis processors are
configured to edit the image data acquired during a trip of the
first vehicle system to create edited image data that includes the
image data representative of the at least one of damage to the
route or the deteriorating condition of the route and that does not
include other image data.
[0171] In one aspect, the one or more analysis processors are
configured to determine a location of the first vehicle system when
the image data representative of the at least one of damage to the
route or the deteriorating condition of the route is imaged.
[0172] In one aspect, the one or more analysis processors are
configured to examine the image data from two or more previous
trips of the first vehicle system and at least a second vehicle
system over a common segment of the route to identify the at least
one of damage to the route or the deteriorating condition of the
route.
[0173] In another example of the inventive subject matter described
herein, a method (e.g., an imaging method) includes generating
image data within a field of view of a camera disposed onboard a
first vehicle system. The field of view includes at least a portion
of a cab of the first vehicle system and a portion of a route being
traveled by the first vehicle system. The cab includes a space
where an operator of the first vehicle system is located during
travel of the first vehicle system. The method also includes
examining (using one or more analysis processors) the image data
generated by the camera to identify at least one of damage to the
route or a deteriorating condition of the route.
[0174] In one aspect, the image data is generated and examined
while the first vehicle system is moving along the route.
[0175] In one aspect, the at least one of damage to the route or
the deteriorating condition of the route is identified by using at
least one of edge detection algorithms or pixel metrics.
[0176] In one aspect, the damage to the route is identified by the
one or more analysis processors as bending of rails of the route,
twisting of the rails of the route, or spacing between the rails of
the route that differs from a designated distance.
[0177] In one aspect, the method also includes editing the image
data acquired during a trip of the first vehicle system to create
edited image data that includes the image data representative of
the at least one of damage to the route or the deteriorating
condition of the route and that does not include other image
data.
[0178] In one aspect, the method also includes determining a
location of the first vehicle system when the image data
representative of the at least one of damage to the route or the
deteriorating condition of the route is imaged.
[0179] In one aspect, examining the image data includes examining
the image data from two or more previous trips of the first vehicle
system and at least a second vehicle system over a common segment
of the route to identify the at least one of damage to the route or
the deteriorating condition of the route.
[0180] In another example of the inventive subject matter described
herein, another system (e.g., an imaging system) includes a digital
camera and one or more analysis processors. The digital camera is
configured to be disposed in a rail vehicle and to generate image
data within a field of view of the camera. The field of view
includes at least a portion of a cab of the rail vehicle and a
portion of a track being traveled by the rail vehicle. The cab
includes a space where an operator of the rail vehicle is located
during travel of the rail vehicle. The one or more analysis
processors are configured to be disposed onboard the rail vehicle
and to examine the image data generated by the camera to identify
at least one of damage to the track or a deteriorating condition of
the track.
[0181] In one aspect, the digital camera is a high definition
camera.
[0182] In one aspect, the one or more analysis processors are
configured to identify the at least one of damage to the track or
the deteriorating condition of the track using at least one of edge
detection algorithms or pixel metrics.
[0183] In one aspect, the one or more analysis processors are
configured to identify the damage to the track as bending of rails
of the track, twisting of the rails of the track, or spacing
between the rails of the track that differs from a designated
distance.
[0184] In one aspect, the one or more analysis processors are
configured to determine a location of the rail vehicle when the
image data representative of the at least one of damage to the track
or the deteriorating condition of the track is imaged.
[0185] In another example of the inventive subject matter described
herein, a system (e.g., an imaging system) includes a digital
camera configured to be disposed in a first vehicle system. The
camera is configured to generate image data within a field of view
of the camera. The field of view includes at least a portion of a
cab of the first vehicle system and at least one of a portion of a
route being traveled by the first vehicle system or one or more
wayside devices disposed along the route being traveled by the
first vehicle system. The cab includes a space where an operator of
the first vehicle system is located during travel of the first
vehicle system. The system also can include one or more analysis
processors configured to examine the image data generated by the
camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices. The condition of the one or more wayside
devices includes at least one of damage to the one or more wayside
devices, a missing wayside device, deterioration of the one or more
wayside devices, or a change in terrain at or near the one or more
wayside devices.
[0186] In one aspect, the one or more analysis processors are
configured to identify at least one of the damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices based on at least one of an edge detection
algorithm, pixel metrics, an object detection algorithm, baseline
image data, or a pixel gradient in the image data.
[0187] In one aspect, the one or more analysis processors are
configured to edit the image data acquired during a trip of the
first vehicle system to create edited image data that includes the
image data representative of the at least one of the damage to the
route, the deteriorating condition of the route, or the condition
of the one or more wayside devices but that does not include other
image data.
[0188] In one aspect, the one or more analysis processors are
configured to determine a location of the first vehicle system when
the image data representative of at least one of damage to the
route, the deteriorating condition of the route, or the condition
of the one or more wayside devices is obtained. The one or more
analysis processors also can be configured to examine the image data
representative of the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one or
more wayside devices at that location, but to not examine the image
data acquired at one or more other locations.
[0189] In another example of the inventive subject matter described
herein, another method (e.g., an imaging method) includes
generating image data within a field of view of a camera disposed
onboard a first vehicle system. The field of view includes at least
a portion of a cab of the first vehicle system and at least one of
a portion of a route being traveled by the first vehicle system or
one or more wayside devices disposed along the route being traveled
by the first vehicle system. The cab includes a space where an
operator of the first vehicle system is located during travel of
the first vehicle system. The method also can include examining,
using one or more analysis processors, the image data generated by
the camera to identify at least one of damage to the route, a
deteriorating condition of the route, or a condition of the one or
more wayside devices. The condition includes at least one of damage
to the one or more wayside devices, a missing wayside device,
deterioration of the one or more wayside devices, or a change in
terrain at or near the one or more wayside devices.
[0190] In one aspect, the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices is identified based on at least one of an
edge detection algorithm, pixel metrics, an object detection
algorithm, baseline image data, or a pixel gradient in the image
data.
[0191] In one aspect, the method also includes editing the image
data acquired during a trip of the first vehicle system to create
edited image data that includes the image data representative of
the at least one of damage to the route, the deteriorating
condition of the route, or the condition of the one or more wayside
devices but does not include other image data.
[0192] In one aspect, the method also includes determining a
location of the first vehicle system when the image data
representative of the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices is generated. The image data that is
representative of the at least one of damage to the route, the
deteriorating condition of the route, or the condition of the one
or more wayside devices is examined based on the location, and the
image data acquired at one or more other locations is not
examined.
[0193] In another example of the inventive subject matter described
herein, another system (e.g., an imaging system) includes a digital
camera configured to be disposed in a rail vehicle. The camera is
configured to generate image data within a field of view of the
camera. The field of view includes at least a portion of a cab of
the rail vehicle and at least one of a portion of a track outside
of the rail vehicle or one or more wayside devices along the track
being traveled by the rail vehicle. The cab includes a space where
an operator of the rail vehicle is located during travel of the
rail vehicle. The system also includes one or more analysis
processors configured to be disposed onboard the rail vehicle and
to examine the image data generated by the camera to identify a
condition of at least one of the track or the one or more wayside
devices. The condition includes at least one of damage to the
track, damage to the one or more wayside devices, a missing wayside
device, or a changing condition of terrain at or near the one or
more wayside devices.
[0194] In one aspect, the one or more analysis processors are
configured to identify the condition of at least one of the track
or the one or more wayside devices based on at least one of an edge
detection algorithm, pixel metrics, an object detection algorithm,
baseline image data, or a pixel gradient in the image data.
[0195] In one aspect, the one or more analysis processors are
configured to determine a location of the rail vehicle when the
image data representative of the condition of at least one of the
track or the one or more wayside devices is imaged.
[0196] It is to be understood that the above description is
intended to be illustrative, and not restrictive. For example, the
above-described embodiments (and/or aspects thereof) may be used in
combination with each other. In addition, many modifications may be
made to adapt a particular situation or material to the teachings
of the inventive subject matter without departing from its scope.
While the dimensions and types of materials described herein are
intended to define the parameters of the inventive subject matter,
they are by no means limiting and are exemplary embodiments. Many
other embodiments will be apparent to one of ordinary skill in the
art upon reviewing the above description. The scope of the
inventive subject matter should, therefore, be determined with
reference to the appended claims, along with the full scope of
equivalents to which such claims are entitled. In the appended
claims, the terms "including" and "in which" are used as the
plain-English equivalents of the respective terms "comprising" and
"wherein." Moreover, in the following claims, the terms "first,"
"second," and "third," etc. are used merely as labels, and are not
intended to impose numerical requirements on their objects.
Further, the limitations of the following claims are not written in
means-plus-function format and are not intended to be interpreted
based on 35 U.S.C. § 112(f), unless and until such claim
limitations expressly use the phrase "means for" followed by a
statement of function void of further structure.
[0197] This written description uses examples to disclose several
embodiments of the inventive subject matter and also to enable a
person of ordinary skill in the art to practice the embodiments of
the inventive subject matter, including making and using any
devices or systems and performing any incorporated methods. The
patentable scope of the inventive subject matter is defined by the
claims, and may include other examples that occur to those of
ordinary skill in the art. Such other examples are intended to be
within the scope of the claims if they have structural elements
that do not differ from the literal language of the claims, or if
they include equivalent structural elements with insubstantial
differences from the literal languages of the claims.
[0198] The foregoing description of certain embodiments of the
inventive subject matter will be better understood when read in
conjunction with the appended drawings. To the extent that the
figures illustrate diagrams of the functional blocks of various
embodiments, the functional blocks are not necessarily indicative
of the division between hardware circuitry. Thus, for example, one
or more of the functional blocks (for example, processors or
memories) may be implemented in a single piece of hardware (for
example, a general purpose signal processor, microcontroller,
random access memory, hard disk, and the like). Similarly, the
programs may be stand-alone programs, may be incorporated as
subroutines in an operating system, may be functions in an
installed software package, and the like. The various embodiments
are not limited to the arrangements and instrumentality shown in
the drawings.
[0199] As used herein, an element or step recited in the singular
and preceded with the word "a" or "an" should be understood as not
excluding plural of said elements or steps, unless such exclusion
is explicitly stated. Furthermore, references to "one embodiment"
of the inventive subject matter are not intended to be interpreted
as excluding the existence of additional embodiments that also
incorporate the recited features. Moreover, unless explicitly
stated to the contrary, embodiments "comprising," "including," or
"having" an element or a plurality of elements having a particular
property may include additional such elements not having that
property.
* * * * *