U.S. patent application number 14/694336, for computer vision assisted work tool recognition and installation, was published by the patent office on 2016-10-27.
This patent application is currently assigned to Caterpillar Inc., which is also the listed applicant. The invention is credited to Joseph E. Forcash and Qi Wang.
United States Patent Application 20160312432 (Kind Code: A1)
Application Number: 14/694336
Family ID: 57147448
Published: October 27, 2016
Inventors: Wang, Qi; et al.
Computer Vision Assisted Work Tool Recognition and Installation
Abstract
A method for installing a work tool for a machine is provided.
The method includes detecting, at an electronic controller unit of
a machine, a work tool based upon a first input signal from a
sensor coupled to the electronic controller unit. The method
includes determining, at the electronic controller unit, a first
three-dimensional location of the work tool relative to the
machine. The method includes detecting, at the electronic
controller unit, an occlusion of the work tool. The method includes
determining, at the electronic controller unit, a second
three-dimensional location of the work tool upon the detecting of
the occlusion based on the first three-dimensional location. The
method includes controlling, at the electronic controller unit, a
motion of the machine for installing the work tool based upon the
second three-dimensional location.
Inventors: Wang, Qi (Pittsburgh, PA); Forcash, Joseph E. (Zelienople, PA)
Applicant: Caterpillar Inc., Peoria, IL, US
Assignee: Caterpillar Inc., Peoria, IL
Family ID: 57147448
Appl. No.: 14/694336
Filed: April 23, 2015
Current U.S. Class: 1/1
Current CPC Class: E02F 9/2012; E02F 3/96; E02F 9/265; E02F 9/262; E02F 9/2045 (all 20130101)
International Class: E02F 3/34 (20060101)
Claims
1. A method for installing a work tool for a machine, comprising:
detecting, at an electronic controller unit of a machine, a work
tool based upon a first input signal from a sensor coupled to the
electronic controller unit; determining, at the electronic
controller unit, a first three-dimensional location of the work
tool relative to the machine; detecting, at the electronic
controller unit, an occlusion of the work tool; determining, at the
electronic controller unit, a second three-dimensional location of
the work tool upon the detecting of the occlusion based on the
first three-dimensional location; and controlling, at the
electronic controller unit, a motion of the machine for installing
the work tool based upon the second three-dimensional location.
2. The method of claim 1, further comprising: classifying, at the
electronic controller unit, the detected work tool based upon a
match with a known work tool in a database; and selecting
automatically, at the electronic controller unit, an output mode of
the machine based upon the classifying.
3. The method of claim 2, wherein the controlling the motion of the
machine is performed based upon the selecting of the output
mode.
4. The method of claim 2, further comprising: determining, at the
electronic controller unit, a relative position of the machine with
respect to an attachment coupler of the work tool using a third
three-dimensional location of the attachment coupler, wherein the
controlling is performed based upon the determining.
5. The method of claim 1, further comprising: tracking, at the
electronic controller unit, the work tool based on the first input
signal from the sensor; and outputting, at an output device coupled
to the electronic controller unit, a three-dimensional scene of the
tracked work tool based upon at least one of the first
three-dimensional location or the second three-dimensional location
of the work tool, wherein the outputting includes displaying the
occlusion of the work tool.
6. The method of claim 5, wherein the tracking includes tracking a
linear motion, an angular motion, or both, of the machine or a
machine component.
7. The method of claim 1, further comprising: determining, at the
electronic controller unit, a dead-reckoning of the machine based
upon a second input signal from an inertial measurement unit
coupled to the electronic controller unit; and combining, at the
electronic controller unit, the first input signal and the second
input signal to update the second three-dimensional location of the
work tool upon the detecting of the occlusion.
8. The method of claim 1, further comprising: outputting
continuously, at the electronic controller unit, an updated
position of the work tool and an attachment coupler of the work
tool to a machine control system; installing, using the electronic
controller unit, the work tool to the machine at the attachment
coupler based on the updated position; and indicating, from the
electronic controller unit, to the machine control system that the
installing is complete.
9. The method of claim 1, wherein the detecting the occlusion of
the work tool includes detecting a machine component between the
sensor and the work tool, the machine component blocking a field of
view of the sensor.
10. A work tool installation system, comprising: a machine
attachable to a work tool, the machine including: a sensor; and an
electronic controller unit coupled to the sensor, the electronic
controller unit configured to: detect the work tool based upon a
first input signal from the sensor, determine a first
three-dimensional location of the work tool relative to the
machine, detect an occlusion of the work tool, determine a second
three-dimensional location of the work tool, when the occlusion is
detected, based on the first three-dimensional location, and
control a motion of the machine for installing the work tool based
upon the second three-dimensional location.
11. The work tool installation system of claim 10, wherein the
electronic controller unit is further configured to: classify the
detected work tool based upon a match with a known work tool in a
database, and select an output mode of the machine based upon the
match.
12. The work tool installation system of claim 11, wherein the
electronic controller unit is further configured to control the
motion of the machine based upon the output mode.
13. The work tool installation system of claim 11, wherein the
electronic controller unit is further configured to: determine a
relative position of the machine with respect to an attachment
coupler of the work tool using a third three-dimensional location
of the attachment coupler, wherein the electronic controller unit
is configured to control the motion of the machine based upon the
determined relative position.
14. The work tool installation system of claim 10, wherein the
electronic controller unit is further configured to: track the work
tool based on the first input signal from the sensor, and output,
at an output device controlled by the electronic controller unit, a
three-dimensional scene of the tracked work tool based upon at
least one of the first three-dimensional location or the second
three-dimensional location of the work tool, wherein the outputted
three-dimensional scene includes the occlusion of the work
tool.
15. The work tool installation system of claim 14, wherein the
electronic controller unit is configured to track a linear motion,
an angular motion, or both, of the machine or a machine
component.
16. The work tool installation system of claim 10, further
comprising: an inertial measurement unit on the machine coupled to
the electronic controller unit, wherein the electronic controller
unit is further configured to: determine a dead-reckoning of the
machine based upon a second input signal from the inertial
measurement unit, and combine the first input signal and the second
input signal to update the second three-dimensional location of the
work tool when the occlusion is detected.
17. The work tool installation system of claim 10, wherein the
electronic controller unit is further configured to: output
continuously an updated position of the work tool and an attachment
coupler of the work tool to a machine control system; install the
work tool to the machine at the attachment coupler based on the
updated position; and indicate to the machine control system that
the work tool is installed.
18. The work tool installation system of claim 10, wherein the
occlusion is detected by detecting a machine component between the
sensor and the work tool, the machine component blocking a field of
view of the sensor.
19. The work tool installation system of claim 10, wherein the
sensor is at least one camera on the machine.
20. An electronic controller unit of a machine comprising: a memory
including computer executable instructions for recognizing and
installing a work tool to a machine; and a processor coupled to the
memory and configured to execute the computer executable
instructions, the computer executable instructions when executed by
the processor cause the processor to: detect the work tool based
upon a first input signal from a sensor, determine a first
three-dimensional location of the work tool relative to the
machine, detect an occlusion of the work tool, determine a second
three-dimensional location of the work tool, when the occlusion is
detected, based on the first three-dimensional location, and
control a motion of the machine for installing the work tool based
upon the second three-dimensional location.
Description
TECHNICAL FIELD
[0001] This patent disclosure relates generally to attachment of
work tools to a machine, and more particularly, to a method and a
system for computer vision assisted work tool recognition and
installation for a machine.
BACKGROUND
[0002] Machines have multiple types of work tools or attachments
for different work purposes. For example, a wheel loader may use a
bucket for moving earth, and may use a fork for picking up pallets.
Changing work tools safely and quickly is one of the basic
qualifications of a good machine operator. However, a new operator may
require a long period of training to master this skill.
[0003] A typical challenge encountered by the machine operator
when changing the work tool is that additional manual operations may
be required to change the machine's output mode settings. Usually,
the machine operator has to select a proper
machine output mode for a specific work tool by manually pressing a
button on a control panel. Forgetting to press the button or
inadvertently selecting a wrong output mode may cause the machine
to malfunction.
[0004] Another challenge for the machine operator is a limited
front field of view of the work tool. The view of the work tool to
be installed or attached may be blocked by some mechanical parts on
a machine. For example, the hydraulic cylinders and mechanical
linkages in front of an operator cab of a wheel loader may block
the machine operator's view of a fork to be installed or attached
to the machine.
[0005] Yet another challenge faced by the machine operator is the
difficulty of aligning the machine with an attachment coupler of a
work tool located at a distance. When the work tool to be installed
is far from the machine, it is difficult for the machine operator to
manually align the attachment coupler of the work tool with the
machine accurately.
[0006] WO 2014046313 discusses capturing an image of an attachment
and recognizing the attachment using a database. However,
conventional systems and methods do not address the challenges
faced by the machine operator, for example, when the view of the
work tool is blocked.
[0007] Accordingly, there is a need to resolve these problems and
other problems related to conventional methods and systems used for
attaching work tools to machines, in order to reduce training time
for new operators and increase work productivity.
SUMMARY
[0008] In one aspect of this disclosure, a method for installing a
work tool for a machine is provided. The method includes detecting,
at an electronic controller unit of a machine, a work tool based
upon a first input signal from a sensor coupled to the electronic
controller unit. The method includes determining, at the electronic
controller unit, a first three-dimensional location of the work
tool relative to the machine. The method includes detecting, at the
electronic controller unit, an occlusion of the work tool. The
method includes determining, at the electronic controller unit, a
second three-dimensional location of the work tool upon the
detecting of the occlusion based on the first three-dimensional
location. The method includes controlling, at the electronic
controller unit, a motion of the machine for installing the work
tool based upon the second three-dimensional location.
[0009] In another aspect of this disclosure, a work tool
installation system is provided. The work tool installation system
includes a machine attachable to a work tool. The machine includes
a sensor and an electronic controller unit coupled to the sensor.
The electronic controller unit is configured to detect the work
tool based upon a first input signal from the sensor, determine a
first three-dimensional location of the work tool relative to the
machine, detect an occlusion of the work tool, determine a second
three-dimensional location of the work tool, when the occlusion is
detected, based on the first three-dimensional location, and
control a motion of the machine for installing the work tool based
upon the second three-dimensional location.
[0010] In yet another aspect of this disclosure, an electronic
controller unit of a machine is provided. The electronic controller
unit includes a memory and a processor. The memory includes
computer executable instructions for recognizing and installing a
work tool to a machine. The processor is coupled to the memory and
configured to execute the computer executable instructions, the
computer executable instructions when executed by the processor
cause the processor to detect the work tool based upon a first
input signal from a sensor, determine a first three-dimensional
location of the work tool relative to the machine, detect an
occlusion of the work tool, determine a second three-dimensional
location of the work tool, when the occlusion is detected, based on
the first three-dimensional location, and control a motion of the
machine for installing the work tool based upon the second
three-dimensional location.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] FIG. 1 illustrates a work tool installation system, in
accordance with an aspect of this disclosure.
[0012] FIG. 2 illustrates a view of a work tool from inside an
operator cab of a machine in the work tool installation system of
FIG. 1, in accordance with an aspect of this disclosure.
[0013] FIG. 3 illustrates the work tool installation system of FIG.
1 with the machine moved closer to the work tool, in accordance
with an aspect of this disclosure.
[0014] FIG. 4 illustrates a view of the work tool from inside the
operator cab with a view of the work tool partially occluded, in
accordance with an aspect of this disclosure.
[0015] FIG. 5 illustrates a schematic block diagram of the work
tool installation system of FIG. 1, in accordance with an aspect of
this disclosure.
[0016] FIG. 6 illustrates a first frame showing a three-dimensional
scene of the work tool installation system of FIG. 1, in accordance
with an aspect of this disclosure.
[0017] FIG. 7 illustrates a second frame showing a
three-dimensional scene of the work tool installation system of
FIG. 3, in accordance with an aspect of this disclosure.
[0018] FIG. 8 illustrates a flowchart for a method for installing a
work tool for a machine, in accordance with an aspect of this
disclosure.
DETAILED DESCRIPTION
[0019] Various aspects of this disclosure are related to addressing
the problems in the conventional machines and methods by using
computer-vision assisted work tool recognition and
installation.
[0020] Now referring to the drawings, where like reference numerals
refer to like elements, FIG. 1 illustrates a work tool installation
system 100, in accordance with an aspect of this disclosure. The
work tool installation system 100 includes a machine 102 and a work
tool 104 in an exemplary work environment. It will be appreciated
that the work tool installation system 100 may include a plurality
of machines and a plurality of work tools and the machine 102 and
the work tool 104 illustrated in FIG. 1 are by way of example only
and not by way of limitation. Further, the work tool installation
system 100 may include additional components, including but not
limited to, a base station in communication with the machine 102, a
satellite system in communication with the machine 102, an unmanned
aerial vehicle in communication with the machine 102, and the like,
to assist recognition and installation of the work tool 104 to the
machine 102.
[0021] The machine 102 may be a movable machine or a stationary
machine having movable parts. In this respect, the term "movable"
may refer to a motion of the machine 102, or a part thereof, along
linear Cartesian axes, and/or along angular, cylindrical, or
helical coordinates, and/or combinations thereof. Such motion of
the machine 102 may be continuous or discrete in time. For example,
the machine 102, and/or a part of the machine 102, may undergo a
linear motion, an angular motion or both. Such linear and angular
motion may include acceleration, rotation about an axis, or both.
By way of example only and not by way of limitation, the machine
102 may be an excavator, a paver, a dozer, a skid steer loader
(SSL), a multi-terrain loader (MTL), a compact track loader (CTL),
a compact wheel loader (CWL), a harvester, a mower, a driller, a
hammer-head, a ship, a boat, a locomotive, an automobile, a
tractor, or other machine to which the work tool 104 is
attachable.
[0022] In the example shown in FIG. 1, the machine 102 includes a
machine component 108, a sensor 110, an operator cab 112, a chassis
114, tires 116, and a hood 118. The machine component 108 is
attachable to the work tool 104 at an attachment coupler 106 of the
work tool 104. The operator cab 112 includes, among other
components, a steering system 124 to guide the machine 102 in
various spatial directions, and an output device 140. The operator
cab 112 may be suitably sized to accommodate a human operator.
Alternatively, the machine 102 may be controlled remotely from a
base station, in which case, the operator cab 112 may be smaller.
The steering system 124 may be a steering wheel or a joystick, or
other control mechanism to guide a motion of the machine 102, or
parts thereof. Further, the operator cab 112 may include levers,
knobs, dials, displays, alarms, etc. to facilitate operation of the
machine 102.
[0023] Under the hood 118, the machine 102 includes an electronic
controller unit 126, an inertial measurement unit (IMU) 128, and a
machine control system 130. The machine 102 may include other
components (e.g., as part of the chassis 114) such as transmission
systems, engine(s), motors, power system(s), hydraulic system(s),
suspension systems, cooling systems, fuel systems, exhaust systems,
ground engaging tools, anchor systems, propelling systems,
communication systems including antennas, Global Positioning
Systems (GPS), and the like (not shown) that are coupled to the
machine control system 130.
[0024] By way of example only and not by way of limitation, the
machine component 108 may be an excavator arm including hydraulic
cylinders and mechanical linkages, although other types of
mechanical parts may be utilized to attach the work tool 104 to the
machine 102. The mechanical linkages may include attachment
components compatible to mate with the attachment coupler 106. The
machine component 108 may be extendable, expandable, contractable,
rotatable, translatable radially or axially, or otherwise movable
by the machine 102 to couple to the work tool 104. For example, a
height and a tilt of the machine component 108 may be variable to
facilitate attachment at the attachment coupler 106. Once attached
to the work tool 104, the machine component 108 may be configured
to receive requisite power from the machine 102 to perform various
operations (e.g., digging earth) in the exemplary worksite using
the work tool 104.
[0025] In one aspect of this disclosure, the sensor 110 may be a
camera positioned on, inside, or above the operator cab 112.
Alternatively, the sensor 110 may be a camera positioned on the
machine component 108, e.g., near or at a front end of the machine
component 108 closest to the work tool 104, although the sensor 110
may be positioned at other locations on the machine 102. By way of
example only and not by way of limitation, the sensor 110 may be a
monocular camera, a stereo camera, an infrared camera, an array of
one or more types of cameras, an opto-acoustic sensor, a radar, a
laser based imaging sensor, or the like, or combinations thereof,
configured to assist recognition, detection, tracking, and
installation of the work tool 104.
[0026] The work tool 104 is attachable to the machine 102, for
example, to a linkage at an end portion of the machine component
108 via the attachment coupler 106. By way of example only and not
by limitation, the work tool 104 may be a bucket for moving earth,
a fork for lifting pallets, a harvester attachment, a drill head, a
hammer head, a compactor head, or any other type of implement
attachable to the machine 102. In this respect, the machine 102 may
be configured to be attachable not just to one type of the work
tool 104, but also to different types of the work tool 104, as well
as to a plurality of work tools at the same time. Depending on the
type of the work tool 104, the machine 102 may be configured to
operate in an output mode specific to the type of the work tool
104. An output mode of the machine 102 is specified by appropriate
electrical and mechanical parameters for operation of the work tool
104 when attached to the machine component 108. For example, an
output mode for a bucket is different from an output mode of a fork
in terms of the output power delivered to the work tool 104. If an
incorrect output mode is selected, or if the operator selects no
output mode when the work tool 104 is attached to the machine
component 108, the machine 102 may perform the job for which it was
deployed improperly, or not at all. Further, depending on the type of the work tool 104, the
attachment coupler 106 may be an attachment pin, a latch, a hook, a
ball/socket joint, or other types of attachment components that
make the work tool 104 couplable to the machine component 108 of
the machine 102. In one aspect, the work tool 104 may be
stationary. In another aspect, the work tool 104 may be mobile or
movable towards the machine 102. For example, another machine (not
shown) may be used to push the work tool 104 to match a motion of
the machine 102 and/or of the machine component 108.
[0027] As illustrated in FIG. 1, the work tool 104 is at an initial
position relative to the machine 102. The initial position may be
determined as a first three-dimensional location 120 of the work
tool 104 from the sensor 110. At this initial position, FIG. 2
illustrates a view of the work tool 104 from inside the operator
cab 112, in accordance with an aspect of this disclosure. As shown
in FIG. 2, a view of the work tool 104 is not blocked and the work
tool 104 is clearly visible without obstruction from the machine
component 108, e.g., to an operator in the operator cab 112.
[0028] As the machine 102 moves closer to the work tool 104, as
illustrated in FIG. 3, the initial position indicated by the first
three-dimensional location 120 changes. The work tool 104 is
now determined to be at a second three-dimensional location 220
from the machine 102. The machine 102 may move towards the work
tool 104 continuously or discretely. Regardless of the way the
machine 102 moves, the relative position of the work tool 104 and
the machine 102 is updated.
[0029] Referring back to FIG. 1, in one aspect, the attachment
coupler 106 is determined to be at a third three-dimensional
location 122 in the initial position. Again, as the machine 102
moves, the relative distance or the relative position between the
attachment coupler 106 and the machine 102 changes, using the
location of the sensor 110 as a reference.
[0030] For example, referring to FIG. 3, the attachment coupler 106
is at a fourth three-dimensional location 222 from the machine 102.
Further, the first three-dimensional location 120, the second
three-dimensional location 220, the third three-dimensional
location 122, and the fourth three-dimensional location 222 may be
measured between various points on the machine 102, the machine
component 108, the work tool 104, and the attachment coupler 106,
and the illustrated positions shown in FIGS. 1 and 3 are by way of
example only and not by way of limitation.
[0031] It will be appreciated that the terms "first", "second",
"third", and "fourth" used herein with respect to the initial,
intermediate, or final positions of the machine 102 and the machine
component 108 relative to the work tool 104 are for differentiating
purposes only and not for any particular priority in which such
relative positions between the machine 102, the machine component
108, and the work tool 104 are effectuated. Although illustrated as
linear distances (as indicated by respective straight lines), the
first three-dimensional location 120, the second three-dimensional
location 220, the third three-dimensional location 122, and the
fourth three-dimensional location 222, as well as other
intermediate three-dimensional locations for intermediate relative
positions of the machine 102 with respect to the work tool 104, may
be vectors expressible in one or more coordinate systems and stored
in a memory 508 (shown in FIG. 5) of the electronic controller unit
126.
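The disclosure leaves the vector representation unspecified; as an illustrative sketch only (the class name, sensor-centered frame, and coordinates below are assumptions, not from the patent), the three-dimensional locations 120, 122, 220, and 222 could be stored as Cartesian vectors measured from the sensor 110:

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Location3D:
    """A three-dimensional location in a hypothetical sensor-centered
    Cartesian frame (meters)."""
    x: float
    y: float
    z: float

    def offset_to(self, other: "Location3D") -> "Location3D":
        """Vector from this location to another (e.g., sensor to coupler)."""
        return Location3D(other.x - self.x, other.y - self.y, other.z - self.z)

    def distance_to(self, other: "Location3D") -> float:
        """Straight-line distance, like the lines drawn in FIGS. 1 and 3."""
        d = self.offset_to(other)
        return math.sqrt(d.x ** 2 + d.y ** 2 + d.z ** 2)

# Hypothetical coordinates: sensor at its own origin, with the work
# tool (cf. location 120) and attachment coupler (cf. location 122)
# several meters ahead of the machine.
sensor = Location3D(0.0, 0.0, 0.0)
work_tool = Location3D(6.0, 0.5, -1.0)
coupler = Location3D(6.2, 0.5, -0.8)
print(sensor.distance_to(work_tool))
```

As the machine moves, re-measuring these vectors yields the updated relative positions (cf. locations 220 and 222) described above.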
[0032] Referring to FIG. 4, another view of the work tool 104 from
inside the operator cab 112 is illustrated as the machine 102 moves
closer to the work tool 104, in accordance with an aspect of this
disclosure. The view of the work tool 104 illustrated in FIG. 4 is
occluded during operation of the machine 102, for example, by a
presence of the machine component 108. In one aspect, such an
occlusion of the work tool 104 may result in the attachment coupler
106 not being visible (fully or partially) to the operator of the
machine 102. In this respect, the term "occlusion" relates to a
partial or complete blocking of a view of the work tool 104 and/or
the attachment coupler 106 due to a presence of a component (e.g.,
the machine component 108) between the work tool 104 and/or the
attachment coupler 106 and the operator or the sensor 110 that
views the work tool 104 and/or the attachment coupler 106. It will
be appreciated that the occlusion of the work tool 104 and/or the
attachment coupler 106 may be due to the machine component 108, or
due to other components not belonging to the machine 102, or
both.
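The patent defines an occlusion but does not disclose how the electronic controller unit 126 decides that one has occurred. One plausible sketch (the visible-fraction heuristic, function names, and threshold below are assumptions, not from the disclosure) flags an occlusion when too little of the work tool's expected silhouette remains visible in the sensor's image:

```python
def visible_fraction(detected_area: float, expected_area: float) -> float:
    """Fraction of the work tool's expected image-plane area that the
    sensor currently sees (0.0 = fully hidden, 1.0 = fully visible)."""
    if expected_area <= 0.0:
        raise ValueError("expected_area must be positive")
    return max(0.0, min(1.0, detected_area / expected_area))

def is_occluded(detected_area: float, expected_area: float,
                threshold: float = 0.8) -> bool:
    """Declare an occlusion (partial or complete blocking) when less
    than `threshold` of the tool's expected silhouette is visible."""
    return visible_fraction(detected_area, expected_area) < threshold

# Example: a machine component hides roughly half of the tool's
# silhouette, so the detected area drops well below the expected area.
print(is_occluded(detected_area=0.05, expected_area=0.10))
```

Under this sketch, a blocking component such as the machine component 108 reduces the detected area and trips the threshold, whether the blocking is partial or complete.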
[0033] In one aspect of this disclosure, the machine control system
130 may include various hydraulic and electrical power systems
controlled by the electronic controller unit 126, based upon output
signals from the electronic controller unit 126 to the machine
control system 130. The machine control system 130 may include or
may be coupled to the steering system 124 configured to guide a
motion of the machine 102 and/or the machine component 108. In
another aspect, the machine control system 130, or a part thereof,
may be located remote from the machine 102, e.g., in a base station
physically separated from the machine 102. In this scenario, the
machine control system 130 may have a direct or indirect
communication link with the electronic controller unit 126 to
control the machine 102 for installing the work tool 104.
[0034] Referring to FIG. 5, a schematic diagram of the work tool
installation system 100 with the machine 102 including the
electronic controller unit 126 is illustrated, in accordance with
an aspect of this disclosure. The electronic controller unit 126 is
coupled to the sensor 110, the inertial measurement unit 128, the
machine control system 130, the output device 140, and the steering
system 124, as well as to other components of the machine 102 (not
shown).
[0035] In one aspect of this disclosure, the sensor 110 has a field
of view 502 within which the work tool 104 and/or the attachment
coupler 106 fall. During an occlusion of the work tool 104 and/or
the attachment coupler 106, as discussed, the machine component 108
may fall within the field of view 502 of the sensor 110 to
partially or fully block a view of the work tool 104 and/or the
attachment coupler 106. This may prevent the sensor 110 from
obtaining a full image of the work tool 104 and/or the attachment
coupler 106. In conventional systems, such an occlusion may slow down
or complicate attachment of the work tool 104 to the machine
component 108, and may require manual intervention that disrupts
the changing and installation process of the work tool 104. To
address this issue, the electronic controller unit 126 may
continuously receive an input signal 518 from the sensor 110 at an
input-output port 504 of the electronic controller unit 126. The
input signal 518 may include information regarding a current or an
updated three-dimensional location or position of the work tool 104
and/or the attachment coupler 106 relative to the machine 102
and/or the machine component 108. The electronic controller unit
126 may be configured to detect the work tool 104 and determine the
occlusion of the work tool 104 and/or the attachment coupler 106
based, at least partially, upon the information in the input signal
518. In another aspect, the electronic controller unit 126 is
coupled to the IMU 128 to receive an input signal 520 from the IMU
128. The input signal 520 may include data related to a
dead-reckoning of the machine 102.
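The fusion of the sensor signal 518 with the dead-reckoning signal 520 is not spelled out in the disclosure. As a minimal sketch, assuming relative positions are simple Cartesian vectors and the IMU reports machine displacement directly (both assumptions, not from the patent): while the tool is visible, the camera measurement is used as-is; while it is occluded, the last known relative location is propagated by subtracting the machine's own motion:

```python
class ToolLocationEstimator:
    """Tracks the work tool's location relative to the machine.

    While the camera sees the tool (cf. signal 518), its measurement is
    used directly. When the view is occluded, the last known relative
    location is propagated with IMU dead-reckoning (cf. signal 520): if
    the machine moves forward by `delta`, the tool's position relative
    to the machine moves by `-delta`.
    """

    def __init__(self, initial_location):
        self.location = list(initial_location)  # [x, y, z] in meters

    def update_from_camera(self, measured_location):
        """Direct update from the vision sensor while the tool is visible."""
        self.location = list(measured_location)

    def update_from_imu(self, machine_displacement):
        """Dead-reckoned update while the tool is occluded."""
        self.location = [p - d for p, d in
                         zip(self.location, machine_displacement)]

# Tool first seen 6 m ahead; then an occlusion occurs while the
# machine creeps forward 0.5 m twice (hypothetical numbers).
est = ToolLocationEstimator([6.0, 0.0, 0.0])
est.update_from_imu([0.5, 0.0, 0.0])
est.update_from_imu([0.5, 0.0, 0.0])
print(est.location)  # [5.0, 0.0, 0.0]
```

When the tool re-enters the field of view, a camera update replaces the dead-reckoned estimate, which is the "combining" of the two input signals referred to above.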
[0036] In one aspect of this disclosure, the electronic controller
unit 126 includes the input-output port 504, a processor 506, and
the memory 508 coupled to each other, for example, by an internal
bus (not shown). The electronic controller unit 126 may include
additional components known to one of ordinary skill in the art,
which components are not explicitly illustrated in FIG. 5. For
example, the electronic controller unit 126 may include a
programmable logic circuit (PLC), a timer/clocking circuit, heat
sinks, visual indicators (e.g., light emitting diodes), impedance
matching circuitry, internal buses, co-processors or monitor
processors, batteries and power supply units, power controller
chips, transceivers, wireless modules, satellite communication
processing modules, and embedded systems on various integrated
chips. In one aspect, the electronic controller unit 126 may be
separate from an engine controller unit (not shown). In an
alternative aspect, the electronic controller unit 126 may be
integrated with or may share space and processing resources with
the engine controller unit.
[0037] The input-output port 504 may be a single port or a
collection of ports. The input-output port 504 is configured to
transmit and receive various inputs and data from other parts of
the machine 102 and forward such inputs and data to the processor
506. In one aspect, the input-output port 504 may be two separate
ports, one configured to receive various input signals from various
parts of the machine 102 (e.g., the sensor 110, the IMU 128, etc.)
and another configured to output signals for display (e.g., on the
output device 140) or for control of the machine 102 (e.g., to the
machine control system 130). Alternatively, the functionalities of
inputting and outputting may be integrated into a single port
illustrated as the input-output port 504 in FIG. 5.
[0038] In one aspect, the processor 506 is a hardware device such
as an integrated circuit (IC) chip fabricated to implement various
features and functionalities of the aspects discussed herein. By
way of example only and not by way of limitation, the processor 506
may be fabricated using a Complementary Metal Oxide Semiconductor
(CMOS) fabrication technology. In one aspect, the processor 506 may
be implemented as an Application Specific Integrated Circuit
(ASIC), a Field Programmable Gate Array (FPGA), a System-on-a-Chip
(SOC), or the like. In another aspect, the processor 506 may
include components such as packaging, input and output pins, heat
sinks, signal conditioning circuitry, input devices, output
devices, processor memory components, cooling systems, power
systems and the like, which are not shown in FIG. 5. In one aspect,
the processor 506 is configured to execute various parts of a
method 800 illustrated in FIG. 8 by executing computer executable
instructions 510 in the memory 508. In yet another aspect, the
processor 506 may be a plurality of processors arranged, for
example, as a processing array.
[0039] The memory 508 may be implemented as a non-transitory
computer readable medium. By way of example only, the memory 508
may be a semiconductor based memory device including but not
limited to random access memory (RAM), read only memory (ROM),
Dynamic RAM, Programmable ROM, Electrically Erasable programmable
ROM (EEPROM), Static RAM, Flash memory, combinations thereof, or
other types of memory devices known to one of ordinary skill in the
art. In one aspect, the memory 508 is coupled to the processor 506
directly via a communication and signal bus. In one aspect, the
memory 508 may be made of or implemented using a non-transitory
computer readable storage medium on which the computer executable
instructions 510 reside. The computer executable instructions 510
when executed by the processor 506 cause the processor 506 to carry
out the features and functionalities of the various aspects of this
disclosure, such as those discussed with respect to FIG. 8. Such
non-transitory computer readable storage medium may include
semiconductor memory, optical memory, magnetic memory, mono- or
bi-stable circuitry (flip-flops, etc.) and the like, or
combinations thereof. Such non-transitory computer readable storage
medium excludes signals that are transitory.
[0040] The computer executable instructions 510 may be written in a
high-level or low-level programming language (e.g., C++) and compiled
for execution by the processor 506. In one aspect, the computer
executable instructions 510 may be executed remotely by a base
station, and results of such execution provided to the processor
506 for controlling an output of the machine 102 to install the
work tool 104 to the machine 102. In this respect, it will be
appreciated that the specific location of the computer executable
instructions 510 inside the memory 508 is by way of example only,
and not by way of limitation.
[0041] In one aspect, the memory 508 includes or is coupled to a
database 512. The database 512 includes images of a plurality of
work tools, including the work tool 104. Such images are saved as a
library of image files and computerized models in the database 512.
Such models may include or may be used to generate
three-dimensional and two-dimensional views of the plurality of
work tools attachable to the machine 102, including the work tool
104. Each such image or model in the database 512 includes an image
of the respective attachment couplers of the work tools. For
example, an image of the work tool 104 in the database 512 includes
an image of the attachment coupler 106. Such image files may be in
standard format (e.g., JPEG) known to those of ordinary skill in
the art. The database 512 may further include various numerical
parameters associated with one or more dimensions of the work tool
104 and the attachment coupler 106, as well as other identification
information associated with the work tool 104 and the attachment
coupler 106. In one aspect, the processor 506 may be able to
generate an image of the work tool 104 and the attachment coupler
106 based upon the numerical parameters of the work tool 104 and
the attachment coupler 106 stored in the database 512. Such images
and information may be continuously accessible to the processor 506
before, during, and after an occlusion of the work tool 104 and/or
the attachment coupler 106 in the field of view 502 of the sensor
110 occurs.
[0042] Referring to FIG. 6, a frame of a three-dimensional scene
600 captured by the sensor 110 and outputted by the processor 506
on the output device 140 is illustrated, in accordance with an
aspect of this disclosure. The three-dimensional scene 600 is a
reconstruction of an actual work environment in which the machine
102 and the work tool 104 are placed. The three-dimensional scene
600 is a real-time snapshot of the work environment as viewed by
the operator of the machine 102, for example, via the sensor 110 or
directly. For example, the three-dimensional scene 600 may be
associated with an initial relative position of the machine 102
with respect to the work tool 104, or vice-versa. In the
three-dimensional scene 600, the machine 102 is represented as a
machine image 602, the machine component 108 as a machine component
image 608, the work tool 104 as a work tool image 604, and the
attachment coupler 106 as an attachment coupler image 606 at a first
instance in time. In FIG. 6, the
sensor 110 has a clear view of the work tool 104 as illustrated by
the work tool image 604, and the attachment coupler 106 as
illustrated by the attachment coupler image 606.
[0043] Referring to FIG. 7, another frame corresponding to a
three-dimensional scene 700 captured by the sensor 110 and
outputted by the processor 506 on the output device 140 is
illustrated, in accordance with an aspect of this disclosure. The
three-dimensional scene 700 is another reconstruction of the actual
work environment in which the machine 102 and the work tool 104 are
placed at a second time instance subsequent to the first time
instance of FIG. 6. Similar to the three-dimensional scene 600, the
three-dimensional scene 700 is a real-time snapshot of the work
environment as viewed by the operator of the machine 102, for
example, via the sensor 110 or directly, at the second time
instance. In the three-dimensional scene 700, the machine 102 is
represented as a machine image 702 and the work tool 104 is
represented as a work tool image 704 at the second instance in
time. However, due to an occlusion caused by the machine component
108 represented as a machine component image 708, the attachment
coupler 106 is not visible to the operator or the sensor 110 in the
three-dimensional scene 700. As a result, the attachment coupler
image 606 previously shown on the output device 140 is at least
partially missing in FIG. 7.
[0044] It will be appreciated that the three-dimensional scene 600
and the three-dimensional scene 700 are two visual examples of the
machine 102 in operation as outputted on the output device 140, but
the output device 140 may continuously display a plurality of
three-dimensional scenes on a frame-by-frame basis as provided by
the processor 506 to the output device 140 based upon the input
signals (including the input signal 518) from the sensor 110. In
one aspect, the three-dimensional scene 600 and the
three-dimensional scene 700 may be provided on a display of a
remote operator of the machine 102 in a remote base station (not
shown) as a real-time video of the work scene in which the machine
102 and the work tool 104 are deployed. Such frame-by-frame
representation of the work environment of the machine 102 when used
for recognition and subsequent installation of the work tool 104
may be in a simulation format, or may be used as a simulation to
train operators of the machine 102 to install the work tool
104.
INDUSTRIAL APPLICABILITY
[0045] The present disclosure is applicable to accurately
recognizing work tools for installation to machines using
computer vision.
[0046] Machines have multiple types of work tools (attachments) for
different work purposes. For example, a wheel loader may use a
bucket for moving earth, and may use a fork for picking up pallets.
Changing work tools in a safe and quick fashion is one of the basic
qualifications of a good machine operator. However, a new operator
may require extensive training to master this skill.
[0047] A typical challenge encountered by the machine operator
during a changing of the work tool is that additional manual
operations may be required for changing the machine's output mode
settings. Usually, the machine operator has to select a proper
machine mode for a specific work tool by manually pressing a button
on a control panel. Forgetting to press the button or inadvertently
selecting a wrong mode may cause the machine to malfunction.
[0048] Another challenge for the machine operator is a limited
front field of view of the work tool. The view of the work tool to
be installed or attached may be blocked by one or more mechanical
parts on the machine. For example, the hydraulic cylinders and
mechanical linkages in front of an operator cab of a wheel loader
may block the machine operator's view of a fork to be installed or
attached to the machine.
[0049] Yet another challenge faced by the machine operator is the
difficulty of aligning an attachment coupler of a work tool at a
distance from the machine to which the work tool attaches. When the
work tool to be installed is at a distance from a machine (e.g., an
excavator), it is difficult for the machine operator to accurately
align the work tool with an attachment coupler due to the
distance.
[0050] Various aspects of this disclosure address the complex
problem of recognizing and installing or attaching the work tool
104 when there is an occlusion of the work tool 104 with respect to
the field of view 502 of the sensor 110, for example, due to the
machine component 108 being present between the sensor 110 and the
attachment coupler 106 of the work tool 104.
[0051] Referring to FIG. 8, there is illustrated a method 800 for
installing the work tool 104 for the machine 102, in accordance
with an aspect of this disclosure. FIG. 8 presents the method 800
as a flow diagram, although the method 800 may be understood using
other types of presentations such as process diagrams, graphs,
charts, equations, etc. In one aspect, one or more processes or
operations in the method 800 may be carried out by the electronic
controller unit 126 inside the machine 102. For example, the one or
more processes or operations may be carried out by the processor
506 inside the electronic controller unit 126, using various input
signals from the sensor 110 and/or the IMU 128, and by executing
the computer executable instructions 510 stored in the memory 508
of the electronic controller unit 126. As discussed, the input
signals from the sensor 110 and/or the IMU 128 may be received at
the electronic controller unit 126 and processed by the processor
506 while the machine 102 is in use or is in operation in a work
environment captured, for example, as the three-dimensional scene
600.
[0052] In another aspect, in the method 800, one or more processes
or operations, or sub-processes thereof, may be skipped or combined
as a single process or operation, and a flow of processes or
operations in the method 800 may be in any order not limited by the
specific order illustrated in FIG. 8. For example, one or more
processes or operations may be moved around in terms of their
respective orders, or may be carried out in parallel. The term
"flow" generally refers to a logical progression of operations in
an exemplary manner carried out by the processor 506 and/or other
components of the electronic controller unit 126. However, such a
flow is by way of example only and not by way of limitation, as, at
any given time, the flow may proceed along multiple operations or
processes of the method 800. Further, the method 800 may be carried out by
the electronic controller unit 126 for other types of work tools
attachable to the machine 102 and is not limited to the work tool
104. The method 800 may be implemented by the processor 506 in a
high level or a low level programming language (e.g., C++, assembly
language, etc.) using logic circuitry inside the electronic
controller unit 126 and by executing the computer executable
instructions 510 in the memory 508. Furthermore, communication
between the processor 506 and the database 512 may occur using
standard querying routines (e.g., Structured Query Language (SQL))
and one or more search algorithms may be implemented using the
computer executable instructions 510 by the processor 506 to search
the database 512. Such search algorithms may be in assembly
language or a high level programming language, as will be
appreciated by one of ordinary skill in the art, in view of this
disclosure.
[0053] The method 800 may begin in an operation 802 in which the
processor 506 is configured to capture a work scene in which the
machine 102 and the work tool 104 are deployed. Such capturing of
the work scene may be done as a three-dimensional capture, as
indicated, for example, in the three-dimensional scene 600 and the
three-dimensional scene 700. The three-dimensional capture of the
work scene may be based upon continuous inputs from the sensor 110
and/or the IMU 128. For example, when the sensor 110 is a monocular
or a stereo camera, the three-dimensional scene 600 may be captured
as an initial work scene. The sensor 110 may continue capturing the
work scene on a frame-by-frame basis by capturing the
three-dimensional scene 700 and subsequent three-dimensional
scenes. In one aspect, the three-dimensional scene 600 and the
three-dimensional scene 700 may be captured by the sensor 110 and
outputted by the processor 506 on the output device 140 (as
illustrated in FIGS. 6 and 7) as a time series of a plurality of
three-dimensional scenes. In one aspect, such three-dimensional
scenes may be viewable as a video by an operator of the machine
102.
[0054] In one aspect, the work scene captured by the sensor 110 may
not have a desired level of detail. In this scenario, the processor
506 may add various additional features of the work scene based
upon prior knowledge of the work scene, using the database 512 to
create the three-dimensional scene 600. For example, when the
sensor 110 is an infrared camera operating at night, the processor
506 may create a simulation of the work scene to add more details
(e.g., surrounding structures, color, etc.) to the captured work
scene. Such details may then be presented as part of a simulation
of the three-dimensional scene 600 and/or the three-dimensional
scene 700 on the output device 140. The operation 802 may be
initiated manually (e.g., by an operator) or automatically (e.g.,
every time the machine 102 is started).
[0055] In an operation 804, the processor 506 detects the work tool
104 in the captured work scene. The work scene may include a
plurality of objects around the machine 102. The processor 506 may
detect the work tool 104 based upon the input signal 518 from the
sensor 110. In one aspect, the input signal 518 may be provided
every 30 s to the processor 506, although the frequency at which the
input signal 518 is generated by the sensor 110 may be programmable
and variable. For example, the input signal 518 from the sensor 110
may include information related to a plurality of shapes detected
by the sensor 110 that are in the field of view 502 of the sensor
110. Upon receiving the input signal 518, the processor 506 may
send a query to the database 512 to obtain one or more dimensions
of the work tool 104 stored as a three-dimensional model of the
work tool 104 in the database 512, as well as three-dimensional
models of other work tools and objects that may correspond to the
plurality of shapes detected by the sensor 110. The processor 506
may detect the work tool 104 by extracting visual and/or geometric
features corresponding to computer vision feature descriptors. Such
feature descriptors may include, but are not limited to, a
histogram of oriented gradients, speeded-up robust features,
scale-invariant feature transform, and the like, or combinations
thereof.
[0056] In another aspect, the processor 506 may extract the
visual/geometric features of the objects from the input signal 518
and compare these features with the three-dimensional model library
in the database 512. If there is a match, the processor 506
determines that the work tool 104 has been detected.
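The extract-and-compare flow of the two preceding paragraphs can be sketched with a drastically simplified descriptor. The toy gradient-orientation histogram below stands in for the descriptors named in the disclosure (histogram of oriented gradients, SIFT, SURF); the tool names, images, and matching criterion are hypothetical illustrations only:

```python
import math

def orientation_histogram(image, bins=8):
    """Toy gradient-orientation histogram: a drastically simplified
    stand-in for descriptors such as HOG, SIFT, or SURF."""
    hist = [0.0] * bins
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[0]) - 1):
            gx = image[r][c + 1] - image[r][c - 1]   # horizontal gradient
            gy = image[r + 1][c] - image[r - 1][c]   # vertical gradient
            mag = math.hypot(gx, gy)
            if mag > 0.0:
                angle = math.atan2(gy, gx) % (2.0 * math.pi)
                hist[int(angle / (2.0 * math.pi) * bins) % bins] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]          # normalized histogram

def best_match(descriptor, library):
    """Return the library entry whose stored descriptor is nearest (L2)."""
    return min(library, key=lambda entry: sum(
        (a - b) ** 2 for a, b in zip(descriptor, entry["descriptor"])))

# Hypothetical model library: two synthetic silhouettes with different
# dominant edge orientations stand in for stored work tool models.
vertical_edge = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
horizontal_edge = [[0] * 6] * 2 + [[9] * 6] * 3
library = [
    {"name": "fork-pallet", "descriptor": orientation_histogram(vertical_edge)},
    {"name": "bucket-gp", "descriptor": orientation_histogram(horizontal_edge)},
]
match = best_match(orientation_histogram(vertical_edge), library)
# match["name"] is "fork-pallet"
```

A production system would of course use robust multi-scale descriptors and many keypoints per model; the sketch only conveys the extract-then-nearest-neighbor structure of the comparison.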
[0057] In yet another aspect, as part of the detection of the work
tool 104 by the processor 506, the work tool 104 may reflect an
output electromagnetic, acoustic, and/or optical signal from the
sensor 110. Such a reflected signal when captured back by the
sensor 110 is provided to the processor 506 as the input signal
518. The input signal 518 may include optical intensity variations
indicating a presence of the work tool 104. The processor 506
detects the work tool 104 based, for example, on the optical
intensity variations in the input signal 518. In contrast, when the
work tool 104 is not present or is not detected, the input signal
518 from the sensor 110 may be missing or may not have any
information regarding the presence of the work tool 104. In one
aspect, the processor 506 may detect the work tool 104 based upon an
input provided to the processor 506 by an operator of the machine
102. The operator may view the work tool 104 directly.
Alternatively, the operator may obtain information regarding a
presence of the work tool 104 on the output device 140. The
processor 506 may apply image processing to detect the work tool 104
in images captured by the sensor 110, e.g., when the sensor 110 is an
optical camera.
[0058] In an operation 806, the processor 506 carries out
classifying the work tool 104 detected in the operation 804. The
processor 506 obtains an image of the work tool 104 based upon the
input signal 518 from the sensor 110. The processor 506 may then
extract data associated with the physical features of the work tool
104. Such data may include, for example, dimensions and shape of
the work tool 104, determined, for example, in a manner similar to
the extraction of features of the work tool 104 for the detection
performed in the operation 804. The processor 506 may communicate
with the database 512 to obtain a match of the work tool 104 with a
known work tool whose information is stored in the database 512.
For example, the database 512 may store a library of computerized
models of a plurality of types of work tools that may be attached
to the machine 102, and one of such computerized models may match
the work tool 104, based upon the information in the input signal
518.
[0059] In one aspect, a search algorithm may be executed by the
processor 506 to search for data matching the data associated with
the work tool 104, as detected. For example, once a height, length,
and/or depth of the work tool 104 have been determined by the
processor 506 based upon extracting such information from the input
signal 518 of the sensor 110, the processor 506 may send a query to
the database 512 to determine whether the database 512 has work
tools satisfying the criteria corresponding to the dimensions of the
work tool 104 in the query. The database
512 may communicate back one or more matches of the work tool 104.
When there is more than one match, the processor 506 may apply
additional criteria to identify and accurately recognize the work
tool 104. For example, the database 512 may indicate to the
processor 506 that there are two work tools matching the dimensions
of the work tool 104.
[0060] However, the processor 506 may determine that only one of
the two matching work tools is appropriate for the machine 102
and/or the work environment in which the machine 102 is deployed.
Accordingly, the processor 506 classifies the work tool 104 by
selecting information regarding the correctly matched known work
tool provided by the database 512, and rejects the other matching
results from the database 512. In this respect, the work tool 104
is identified or recognized using computer vision. The term
"computer vision" may relate to processing information related to
the work tool 104 to determine a type, dimensions, and a
positioning of the work tool 104 relative to the machine 102, and
further, applying the processed information to assist installation
of the work tool using generated images of the work tool 104 (e.g.,
the work tool image 604 in FIG. 6).
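The dimension query with a tie-breaking criterion described above can be sketched with a small in-memory database. The schema, tool names, dimensions, and the "machine family" criterion below are illustrative assumptions, not values from this disclosure:

```python
import sqlite3

# In-memory stand-in for database 512; schema and rows are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE work_tools (
    name TEXT, height_m REAL, length_m REAL, depth_m REAL,
    machine_family TEXT)""")
db.executemany("INSERT INTO work_tools VALUES (?, ?, ?, ?, ?)", [
    ("bucket-gp",   1.2, 2.8, 1.1, "wheel-loader"),
    ("bucket-rock", 1.2, 2.8, 1.1, "excavator"),
    ("fork-pallet", 1.5, 1.4, 1.2, "wheel-loader"),
])

def classify(height, length, depth, machine_family, tol=0.05):
    """Query for tools whose dimensions match within a tolerance, then
    apply the machine family as the extra criterion when several match."""
    rows = db.execute(
        "SELECT name, machine_family FROM work_tools "
        "WHERE ABS(height_m - ?) <= ? AND ABS(length_m - ?) <= ? "
        "AND ABS(depth_m - ?) <= ?",
        (height, tol, length, tol, depth, tol)).fetchall()
    matches = [name for name, family in rows if family == machine_family]
    return matches[0] if matches else None

result = classify(1.2, 2.8, 1.1, "wheel-loader")  # -> "bucket-gp"
```

Here two tools match on dimensions alone, mirroring the two-match scenario above, and the extra criterion rejects the one inappropriate for the machine.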
[0061] In an operation 808, the processor 506 may select an output
mode of the machine 102 based upon the classifying carried out in
the operation 806. The output mode for the machine 102 may be set
automatically based upon the classifying of the work tool 104.
Alternatively, the processor 506 may output an indication to the
output device 140 based upon which an operator can select the
output mode of the machine 102. In one aspect, when the processor
506 automatically sets the output mode of the machine 102 based
upon the classifying, manual errors due to inadvertent erroneous
selection by the operator of the machine 102 may be avoided. The
output mode selection determines an amount of power to be
transmitted to the work tool 104, for example, by the machine
component 108 to which the work tool 104 is attached. Such an
output power is within a range to avoid potentially improper
functioning or potential damage to the work tool 104 and/or to the
work surface upon which the work tool 104 operates. For example,
based upon the output mode, the processor 506 may output signals to
the machine control system 130 to output appropriate electrical
current to one or more motors or hydraulic actuators of the machine
102, which motors or hydraulic actuators may then control a motion
of the machine 102 and/or the machine component 108.
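The automatic mode selection described in the operation 808 amounts to a lookup from the classified tool type to a bounded output setting. The classes, mode names, and current limits below are hypothetical placeholders:

```python
# Hypothetical mapping from recognized tool class to an output mode;
# the classes, mode names, and current limits are illustrative only.
OUTPUT_MODES = {
    "bucket": {"mode": "digging",  "max_hydraulic_current_a": 18.0},
    "fork":   {"mode": "lifting",  "max_hydraulic_current_a": 12.0},
    "broom":  {"mode": "sweeping", "max_hydraulic_current_a": 8.0},
}

def select_output_mode(tool_class):
    """Return the machine output mode for a classified tool, or None so
    a caller can fall back to prompting the operator for a selection."""
    return OUTPUT_MODES.get(tool_class)
```

Returning None for an unrecognized class models the alternative described above, in which the processor outputs an indication and lets the operator choose.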
[0062] In an operation 810, once the work tool 104 has been
detected by the processor 506, the processor 506 may determine the
first three-dimensional location 120 of the work tool 104. The
first three-dimensional location 120 may be an initial location or
position of the work tool 104 from the machine 102, for example,
when the work tool installation system 100 is initiated.
Alternatively, the first three-dimensional location 120 may refer
to any initial location or position of the work tool 104 with
respect to a calculation cycle or time slot of the processor 506
for which the method 800 is being implemented or carried out. For
example, the processor 506 may need to update a previous
determination of a calculated three-dimensional position of the
work tool 104. To do so, the processor 506 may restart calculating
and may use the first three-dimensional location 120 as a starting
point or starting value to determine subsequent three-dimensional
locations of the work tool 104 (e.g., the second three-dimensional
location 220).
[0063] In one aspect, the processor 506 determines the first
three-dimensional location 120 based upon a knowledge of a position
of the sensor 110 and determining a propagation time for a signal
emitted by the sensor 110 towards the work tool 104 to be reflected
back and received at the sensor 110. Upon reception of the
reflected signal, the sensor 110 provides the input signal 518 to
the processor 506. For example, the processor 506 may utilize such
a determination of the first three-dimensional location 120 when
the sensor 110 is an active perception sensor, such as a Lidar, a
time-of-flight three-dimensional camera, radar, and the like, or
combinations thereof.
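The propagation-time ranging described above reduces to halving the round-trip path. A minimal sketch, with illustrative sample numbers:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_range(round_trip_s, propagation_speed=SPEED_OF_LIGHT_M_S):
    """Range from a round-trip propagation time: the emitted signal
    travels to the target and back, so the one-way distance is half
    the total path length."""
    return propagation_speed * round_trip_s / 2.0

# Illustrative example: a LIDAR return 40 ns after emission
# corresponds to a target roughly 6 m away.
distance_m = tof_range(40e-9)
```

For an acoustic sensor, the same formula applies with the speed of sound substituted for the speed of light.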
[0064] However, when the sensor 110 is a passive perception sensor,
such as a monocular color camera, a monocular infrared camera, or a
binocular stereo camera, the processor 506 may apply computer
vision algorithms to determine the first three-dimensional location
120. For monocular cameras used as the sensor 110, the processor
506 may use computer vision technologies such as
"structure-from-motion" to generate, for example, the
three-dimensional scene 600 in the field of view 502 of the sensor
110. For binocular stereo cameras used as the sensor 110, the
processor 506 may use a stereo triangulation to generate, for
example, the three-dimensional scene 600 in the field of view 502
of the sensor 110. Once the three-dimensional scene 600 has been
reconstructed by the processor 506, the first three-dimensional
location 120 of the work tool 104 is calculated with respect to the
sensor 110. Since the location of the sensor 110 with respect to the
machine 102 is known (e.g., (x, y, z) coordinates, roll, pitch,
and yaw), the first three-dimensional location 120 of the work tool
104 with respect to the machine 102 may be obtained. It will be
appreciated by one of ordinary skill reading this disclosure that
the second three-dimensional location 220, the third
three-dimensional location 122, and the fourth three-dimensional
location 222, or any other three-dimensional locations in the work
tool installation system 100, may be determined by the processor
506 in a manner similar to the determination of the first
three-dimensional location 120 by the processor 506.
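The stereo triangulation mentioned above for binocular cameras follows the standard relation Z = fB/d. A minimal sketch with illustrative camera parameters (the focal length, baseline, and disparity values are not from this disclosure):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from stereo triangulation: Z = f * B / d, where f is the
    focal length in pixels, B the camera baseline in meters, and d the
    disparity of the same point between the left and right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 0.35 m baseline,
# 49 px disparity -> 5.0 m depth
depth_m = stereo_depth(700.0, 0.35, 49.0)
```

Recovering the full (x, y, z) of the work tool would additionally back-project the pixel coordinates through the camera intrinsics; the formula above gives the depth component on which that back-projection depends.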
[0065] In one aspect, the processor 506 may use a linear
relationship to determine the first three-dimensional location 120
of the work tool 104. By way of example only and not by way of
limitation, the processor 506 may know a three-dimensional position
of the sensor 110 on the machine 102 in Cartesian coordinates
(e.g., X, Y, Z values) and may obtain this information from the
memory 508. Further, the processor 506 may obtain physical
dimensions of the machine 102 and the machine component 108 and
determine the relative position of the work tool 104 with respect
to the machine 102 and the machine component 108.
[0066] In an operation 812, the processor 506 may determine the
third three-dimensional location 122 of the attachment coupler 106
of the work tool 104. In one aspect, the third three-dimensional
location 122 is determined based upon the classifying of the work
tool 104 carried out in the operation 806. Upon classifying, the
processor 506 may know where the attachment coupler 106 is located
on the work tool 104 based upon the data for the work tool 104
retrieved from the database 512. In this respect, the processor 506
does not have to rely upon explicit input signals from the sensor
110 regarding the third three-dimensional location 122 of the
attachment coupler 106. Rather, once the work tool 104 has been
recognized, based upon the classifying, the processor 506 uses the
information regarding the work tool 104 from the database 512 to
determine the third three-dimensional location 122 of the
attachment coupler 106.
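Inferring the coupler location from the classified model, as described above, is a frame transformation: the model library supplies the coupler's fixed offset in the tool's own frame, which is then mapped into the machine frame using the tool's pose. A yaw-only sketch with hypothetical offsets:

```python
import math

def coupler_position(tool_xyz, tool_yaw_rad, coupler_offset_xyz):
    """Locate the attachment coupler in the machine frame given the
    tool's pose and the coupler's fixed offset in the tool's own frame
    (the offset would come from the model library; only yaw rotation is
    considered in this simplified sketch)."""
    ox, oy, oz = coupler_offset_xyz
    c, s = math.cos(tool_yaw_rad), math.sin(tool_yaw_rad)
    return (tool_xyz[0] + c * ox - s * oy,
            tool_xyz[1] + s * ox + c * oy,
            tool_xyz[2] + oz)

# Illustrative case: tool 4 m ahead, rotated 90 degrees; coupler offset
# 0.5 m behind and 0.3 m above the tool origin in the tool's frame.
pos = coupler_position((4.0, 0.0, 0.0), math.pi / 2, (-0.5, 0.0, 0.3))
```

A full implementation would use the complete roll-pitch-yaw rotation, but the structure (known model offset composed with an estimated tool pose) is the same.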
[0067] In one aspect, such a determination of the relative position
of the attachment coupler 106 is part of the computer vision based
determination utilized by the processor 506. Further, such computer
vision assisted determination of the relative position of the
attachment coupler 106 may include determining a precise position
of the attachment coupler 106 on the work tool 104 using the
information obtained from the database 512 regarding the work tool
104. For example, when the work tool 104 is a bucket, an attachment
pin of the bucket and a location of the attachment pin may be
determined by the processor 506 upon detection and classification
of the work tool 104 (e.g., as carried out in the operations 804
and 806). Accordingly, the processor 506 may form a complete
three-dimensional positional view of the work tool 104 independent
of an actual physical view of the work tool 104. The
three-dimensional view of the work tool 104 may then be provided to
the output device 140 as an output by the processor 506. Such an
output may be represented as part of the three-dimensional scene
600 and the three-dimensional scene 700 illustrated in FIGS. 6 and
7, respectively.
[0068] In an operation 814, the processor 506 may continuously
track the work tool 104 and the attachment coupler 106 of the work
tool 104 using the input signal 518 from the sensor 110. The
tracking of the work tool 104 and/or the attachment coupler 106 may
be based upon determining a relative distance between the machine
102 and the attachment coupler 106 starting, for example, with the
third three-dimensional location 122 of the attachment coupler 106.
As the machine 102 moves toward the work tool 104, the processor
506 applies computer vision techniques, based upon image information
about the work tool 104 and the attachment coupler 106 from the
database 512, to accurately track the work tool 104 and/or the
attachment coupler 106 and to adjust and control the motion of the
machine 102.
[0069] In addition to the input signal 518, the processor 506 is
configured for the tracking based on the classification of the work
tool 104 carried out in the operation 806. Since the processor 506
knows the exact type of the work tool 104, the processor 506 can
determine the type and location of the attachment coupler 106 on
the work tool 104. Then, based upon a velocity of the machine 102,
and hence the sensor 110, the processor 506 can continuously track
where the attachment coupler 106 on the work tool 104 is located.
The tracking of the work tool 104 and/or the attachment coupler 106
may be viewable on the output device 140. For example, the work
tool 104 and/or the attachment coupler 106 as tracked may be
presented on the three-dimensional scene 600 for an operator of the
machine 102. Such tracking of the work tool 104 and the attachment
coupler 106 may be referred to as "visual odometry" in which a
velocity of the machine 102 is used in conjunction with the images
of the work tool 104 retrieved from the database 512 by the
processor 506 to determine an accurate three-dimensional
localization of the attachment coupler 106. Such three-dimensional
localization of the attachment coupler 106 may be indicated on the
output device 140, for example, as geographical coordinates and
directions.
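The velocity-based aspect of this tracking can be sketched as a simple relative-motion update: as the machine advances, a stationary coupler appears to move toward it by the machine's displacement. This is a simplification of full visual odometry, and the function name and sample values are illustrative only:

```python
def predict_coupler(last_position, machine_velocity, dt):
    """Predict the coupler's position relative to the machine after dt
    seconds: a stationary coupler appears to move opposite to the
    machine's own velocity (a simplification of full visual odometry,
    which would also estimate the velocity itself from image motion)."""
    vx, vy, vz = machine_velocity
    x, y, z = last_position
    return (x - vx * dt, y - vy * dt, z - vz * dt)

# Illustrative case: machine closing at 0.5 m/s; coupler last localized
# 3 m ahead and 0.4 m up -> predicted 2 m ahead after 2 s.
predicted = predict_coupler((3.0, 0.0, 0.4), (0.5, 0.0, 0.0), dt=2.0)
```

In a real system this prediction would be fused with each new frame's measurement, with the image-based localization correcting any drift in the velocity integration.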
[0070] In one aspect, such tracking may be carried out on a
frame-by-frame basis based on input signals (including the input
signal 518) from the sensor 110 received at the processor 506. For
example, the three-dimensional scene 600 may be considered as a
first frame on the output device 140 in which the work tool 104
and/or the attachment coupler 106 are presented and tracked.
Likewise, subsequent three-dimensional scenes (e.g., the
three-dimensional scene 700) may be presented as a time based
progression of the captured work scene. In each such frame, the
work tool 104 and/or the attachment coupler 106 are tracked as
indicated by the relative positions of the work tool 104 and/or the
attachment coupler 106 with respect to the machine 102. It will be
appreciated that the tracking of the work tool 104 and/or the
attachment coupler 106 on a frame-by-frame basis may be for visual
presentation of the simulated locations of the work tool 104 and/or
the attachment coupler 106 (directly correlated with the actual
physical locations of the work tool 104 and/or the attachment
coupler 106). However, for purposes of calculating the relative
positions of the machine 102 and the work tool 104 and/or the
attachment coupler 106 (e.g., using the first three-dimensional
location 120, etc.), such a visual presentation is an example only,
and not a limitation. For example, the processor 506 may determine
the relative positions based on numerical or tabular values that
are updated as the work tool 104 and/or the attachment coupler 106
are tracked.
[0071] In an operation 816, the processor 506 may detect an
occlusion of the work tool 104. Such occlusion may be partial
(e.g., as shown in FIG. 4) or may be complete. Further, the
occlusion of the work tool 104 may refer to complete or partial
blocking of a view of the attachment coupler 106. Such occlusion
may be viewable on the output device 140 as the three-dimensional
scene 700 in which the attachment coupler image 606 from FIG. 6 is
unavailable, or partially unavailable, to the operator of the
machine 102. By way of example only and not by way of limitation,
the occlusion of the work tool 104 may occur due to a presence of
the machine component 108 in the field of view 502 of the sensor
110 and/or the view of the operator of the machine 102. It will be
appreciated that such occlusion may occur due to a variety of
reasons in addition to or other than the presence of the machine
component 108. For example, the occlusion may occur due to poor
weather conditions, the presence of other physical objects blocking
the view of the work tool 104 and/or the attachment coupler 106, or
combinations thereof. In this respect, the term "occlusion" may
refer to a lack of availability of a view of the work tool 104 in
general, and of the attachment coupler 106, in particular, for any
reason. Such lack of availability of a clear view of the attachment
coupler 106 may make it difficult and time consuming to manually
install the work tool 104 to the machine component 108.
[0072] In one aspect, the occlusion may be detected by the
processor 506 based upon the continuous tracking performed in the
operation 814. For example, the processor 506 may provide the
three-dimensional scene 600 as a first frame on the output device
140. In the first frame on the three-dimensional scene 600, the
work tool 104 and the attachment coupler 106 are clearly visible
correspondingly as the work tool image 604 and the attachment
coupler image 606. The processor 506 may continue to provide a
plurality of such frames as the machine 102 moves towards the work
tool 104 to install the work tool 104.
[0073] However, in one frame of such a plurality of frames,
represented, for example, as the second frame in the
three-dimensional scene 700, the attachment coupler image 606 may be
partially or fully blocked and thus unavailable. In this second frame, the processor 506
detects that the work tool 104 has been occluded since the
attachment coupler image 606 is detected to be partially or fully
missing. The processor 506 carries out the occlusion detection
using the input signal 518 that informs the processor 506 about the
objects in the work scene in which the machine 102 is deployed. As
indicated by information in the input signal 518 to the processor
506, various machine components, including the machine component
108, are detected in three-dimensional space around the machine
102. Therefore, the processor 506 knows if one or more of the
machine components (e.g., the machine component 108) come into the
line of sight between the sensor 110 and the work tool 104, which
causes full occlusion or partial occlusion. The processor 506 keeps
tracking a previously detected object (e.g., the work tool 104) on
a frame-by-frame basis. Meanwhile, the processor 506 keeps
performing object detection (e.g., in the operation 804) and
classification on each frame (e.g., in the operation 806). If the
processor 506 finds that the work tool 104, as tracked, shows
different geometric features between two adjacent frames (e.g.,
between the first three-dimensional scene 600 and the second
three-dimensional scene 700), the processor 506 determines that an
occlusion of the work tool 104 has happened.
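By way of illustration only, the adjacent-frame comparison described above may be sketched as follows. The choice of geometric features (visible area, aspect ratio, contour length), the dictionary-based detection record, and the change tolerance are assumptions of this sketch, not limitations of the disclosure.

```python
def geometric_features(detection):
    """Reduce a tracked detection to a comparable feature tuple:
    visible area, aspect ratio, and visible contour length."""
    w = detection["width"]
    h = detection["height"]
    return (w * h, w / h, detection["contour_length"])

def is_occluded(prev_detection, curr_detection, tolerance=0.2):
    """Flag an occlusion when any geometric feature of the tracked
    work tool changes by more than `tolerance` (fractional change)
    between two adjacent frames."""
    prev = geometric_features(prev_detection)
    curr = geometric_features(curr_detection)
    return any(abs(c - p) / abs(p) > tolerance
               for p, c in zip(prev, curr) if p)
```

In this sketch, a tool whose visible area shrinks sharply between adjacent frames, while the machine component moves into the line of sight, would be flagged as occluded.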
[0074] In another aspect, the processor 506 may determine, based
upon the visual odometry performed in the operation 814, that the
machine component 108 is moving with a known velocity towards the
work tool 104 and/or the attachment coupler 106. Based upon the
first three-dimensional location 120, the second three-dimensional
location 220 and/or the third three-dimensional location 122
determined in the operations 810 and 812, as well as the type of
the work tool 104 classified in the operation 806, the processor
506 may calculate an exact location of the work tool 104 and the
attachment coupler 106. Accordingly, even during the occlusion
detected by the processor 506, the method 800 for installing the
work tool 104 to the machine 102 does not stop and the machine 102
continues moving towards the work tool 104 and the attachment
coupler 106 to install the work tool 104 to the machine 102. In yet
another aspect, the processor 506 may seek confirmation regarding
the occlusion from the operator of the machine 102, or may receive
an input from a human operator of the machine 102 that an occlusion
has been visually detected, in addition to or independently of the
detection of occlusion determined by the processor 506 itself in
the operation 816.
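By way of illustration only, propagating a last seen location through an occlusion using the machine's known velocity might resemble the following sketch; the tuple-based coordinates and the constant-velocity model are assumptions of the sketch.

```python
def predict_relative_position(last_position, machine_velocity, dt):
    """Constant-velocity prediction: as the machine moves toward the
    work tool, the tool's position relative to the machine shifts by
    -velocity * dt along each axis."""
    return tuple(p - v * dt
                 for p, v in zip(last_position, machine_velocity))
```

For example, a tool last seen 4 m ahead of a machine closing at 0.5 m/s would be predicted 3 m ahead after 2 s, even with the coupler fully occluded.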
[0075] In an operation 818, the processor 506 may update the
three-dimensional scene 600 based on the detected occlusion in the
operation 816. The three-dimensional scene 600, as updated to
reflect the detected occlusion of the work tool 104, may be
provided as the three-dimensional scene 700 outputted on the output
device 140. The processor 506 may further carry out the updating of
the three-dimensional scene 600 based upon information captured by
the sensor 110 in the operation 802 initially. For example, the
information captured by the sensor 110 may include static structures
(trees, buildings, etc.) in or around the work scene of the machine
102. The processor 506 may use such static structures, in addition
to the dynamically changing positions of the various objects in the
field of view 502, to update the three-dimensional scene 600.
[0076] In one aspect, the processor 506 may update the
three-dimensional scene 600 using the tracking carried out in the
operation 814. For example, the three-dimensional scene 700 may
show an updated or current relative position of the work tool 104
and/or the attachment coupler 106 with respect to the machine 102
and/or the machine component 108 based upon the tracking by the
processor 506. In another example, even after the occlusion has
been detected, the processor 506 may continue updating the
three-dimensional scene 600 using computer vision since the
processor 506 has knowledge of the physical features of the work
tool 104 and a last known location of the work tool 104 and/or the
attachment coupler 106, as well as the velocity of the machine 102,
among other parameters (e.g., additional inputs from the sensor
110, the IMU 128, GPS location of the machine 102, etc.).
[0077] In an operation 820, the processor 506 may determine a
dead-reckoning or deduced reckoning of the machine 102. The
processor 506 may receive a second input signal (e.g., the input
signal 520) from the IMU 128 coupled to the electronic controller
unit 126. The second input signal is used by the processor 506 to
determine a motion of the machine 102 in three-dimensional space.
As known, the IMU 128 may receive inputs from various inertial
sensors (not shown) attached to the machine 102 to generate the
second input signal to the processor 506. In one aspect, the
processor 506 may further utilize a satellite signal or a signal
from an unmanned aerial vehicle to corroborate the information in
the second input signal regarding the dead-reckoning of the machine
102. In another aspect, the IMU 128 may be optional for purposes of
this disclosure. However, using the second input signal from the
IMU 128, in addition to the visual odometry and computer vision
based tracking performed by the processor 506, may result in a more
robust estimate of the first three-dimensional location 120, the
second three-dimensional location 220, the third three-dimensional
location 122, and the fourth three-dimensional location 222 of the
work tool 104 and the attachment coupler 106, respectively, with
respect to the sensor 110 on the machine 102.
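By way of illustration only, a simple planar dead-reckoning integration of IMU-style samples (yaw rate and forward acceleration) might resemble the following sketch; the sample format and the Euler integration scheme are assumptions of the sketch, not the disclosed implementation.

```python
import math

def dead_reckon(samples, dt, x=0.0, y=0.0, heading=0.0, speed=0.0):
    """Integrate (yaw_rate, accel) IMU samples, each dt seconds apart,
    into a planar pose estimate (x, y, heading, speed)."""
    for yaw_rate, accel in samples:
        heading += yaw_rate * dt        # integrate turn rate
        speed += accel * dt             # integrate acceleration
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return (x, y, heading, speed)
```

Such an estimate drifts over time, which is why the disclosure corroborates it against visual odometry (and, optionally, satellite or aerial signals) rather than relying on it alone.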
[0078] In an operation 822, the processor 506 may perform a fusion
of the first input signal (e.g., the input signal 518) from the
sensor 110 and the second input signal (e.g., the input signal 520)
from the IMU 128 to generate a combined signal that provides a
current or most recent relative positioning between the machine 102
and the work tool 104. The combined signal may be generated by the
processor 506 by normalizing the first input signal and the second
input signal to a common format, and then verifying whether the
information regarding the motion of the machine 102 as deduced from
the visual odometry and the dead-reckoning are the same or within a
range of values.
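By way of illustration only, the normalization and cross-checking of the two motion estimates might resemble the following sketch; the per-axis agreement threshold and the simple averaging are assumptions of the sketch (a practical implementation might instead use, e.g., a Kalman filter).

```python
def fuse_motion(visual, inertial, max_disagreement=0.5):
    """Fuse a visual-odometry motion estimate with a dead-reckoning
    estimate, both already normalized to a common per-axis format.
    Returns the averaged estimate, or None when the two sources
    disagree beyond `max_disagreement` on any axis."""
    for v, i in zip(visual, inertial):
        if abs(v - i) > max_disagreement:
            return None  # inconsistent; caller may fall back to re-detection
    return tuple((v + i) / 2.0 for v, i in zip(visual, inertial))
```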
[0079] In an operation 824, based upon the fusion of the first and
the second input signals from the sensor 110 and the IMU 128,
respectively, the processor 506 determines an updated position of
the work tool 104 relative to the machine 102. For example, the
updated position of the work tool 104 may be obtained by
calculating the second three-dimensional location 220, after the
occlusion has been detected in the operation 816.
[0080] Likewise, in an operation 826, the processor 506 determines
the fourth three-dimensional location 222 of the attachment coupler
106 based upon the combined signal obtained in the operation 822.
For example, once an updated position of the work tool 104 is
determined by the processor 506, based upon one or more of the
classification of the work tool 104, the dead-reckoning on the
machine 102, and the third three-dimensional location 122 of the
attachment coupler 106, the processor 506 may determine the fourth
three-dimensional location 222 even after the occlusion occurs.
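By way of illustration only, deriving the coupler location from the updated work tool position and a classification-specific coupler offset might resemble the following sketch; the offset table, its values, and the tool-type names are assumptions of the sketch.

```python
# Illustrative, assumed offsets from a tool's reference point to its
# attachment coupler, keyed by the tool type found at classification.
COUPLER_OFFSETS = {
    "bucket": (0.0, 0.0, 0.5),
    "grapple": (0.25, 0.0, 0.75),
}

def coupler_position(tool_position, tool_type):
    """Add the classified tool type's known coupler offset to the
    tool's updated three-dimensional position."""
    offset = COUPLER_OFFSETS[tool_type]
    return tuple(p + o for p, o in zip(tool_position, offset))
```

Because the offset depends only on the classified tool type, the coupler location remains computable even when the coupler itself is out of view.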
[0081] In an operation 828, the processor 506 may output an updated
position of the work tool 104 and the attachment coupler 106 based
upon the operations 824 and 826 to the machine control system 130.
For example, the processor 506 may output the second
three-dimensional location 220 of the work tool 104 and the fourth
three-dimensional location 222 of the attachment coupler 106 as
part of the updated position of the work tool 104 and the
attachment coupler 106. In one aspect, the processor 506 may output
the updated position of the work tool 104 and/or the attachment
coupler 106 in the three-dimensional scene 700 to the operator of
the machine 102. Such outputting of the updated positions of the
work tool 104 and/or the attachment coupler 106 may be provided on
a continuous time basis as the machine 102 moves and the occlusion
of the work tool 104 and/or the attachment coupler 106 changes, or
even vanishes after a certain time has passed, without stopping the
machine 102 and without disrupting the process of installing the
work tool 104 to the machine 102.
[0082] In an operation 830, the processor 506 may control the
machine 102 and/or the machine component 108 based upon the updated
positions of the work tool 104 and the attachment coupler 106 using
the machine control system 130. Based upon the updated positions,
the machine control system 130 may provide a response signal to the
processor 506 to control a motion of the machine 102 and/or the
machine component 108. For example, the updated position may
require the machine 102 to move in a particular spatial direction
at a speed different from a current speed of the machine 102 upon
detecting the occlusion in the operation 816. Similarly, the
updated position of the work tool 104 and the attachment coupler
106 may require the machine component 108 to move in a specific
direction and speed different from a current direction in which the
machine component 108 was moving. For example, the machine
component 108 might have been moving at a particular angular
acceleration initially, prior to the occlusion. The machine control
system 130 may receive an output signal from the processor 506 to
reduce the angular acceleration with which the machine component
108 may be rotating upon detection of the occlusion. As a result,
the machine component 108 may be controlled to move in a more
accurate manner even though the attachment coupler 106 is partially
or fully occluded in the field of view 502 of the sensor 110. It
will be appreciated that the processor 506 may control the motion
of the machine 102 and/or the machine component 108 in other ways,
for example, along linear and/or circular directions, depending
upon the type of the work tool 104 and/or the attachment coupler
106.
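By way of illustration only, the reduction of commanded angular acceleration upon detection of an occlusion might resemble the following sketch; the scale factor is an assumption of the sketch.

```python
def commanded_angular_accel(nominal_accel, occluded, occluded_scale=0.5):
    """Reduce the machine component's commanded angular acceleration
    while the coupler is occluded, so motion tracks the predicted
    (rather than directly observed) coupler position more safely."""
    return nominal_accel * occluded_scale if occluded else nominal_accel
```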
[0083] In an operation 832, the processor 506 determines that the
machine component 108 and the attachment coupler 106 are at an
appropriate distance and relative orientation for the attachment
coupler 106 to couple to or attach to the machine component 108.
The processor 506 then outputs signals to the machine control
system 130 to attach the work tool 104 to the machine component 108
of the machine 102 at the attachment coupler 106. Based upon the
classifying of the work tool 104 and the attachment coupler 106 (in
operation 806), and the updated positions of the work tool 104 and
the attachment coupler 106 (in operations 824 and 826,
respectively), the processor 506 knows the dimensions and the
location of the attachment coupler 106 and installs the attachment
coupler 106, even when the attachment coupler 106 is occluded in
the field of view 502 of the sensor 110. Such installation may be
performed by the processor 506 outputting signals to the machine
control system 130 to open mechanical linkages on the machine
component 108 to couple to the attachment coupler 106.
[0084] In an operation 834, once the work tool 104 has been
installed to the machine component 108, the processor 506 may
output an indication of the installation of the work tool 104 to
the operator of the machine 102. Such an indication may be on the
output device 140, for example, as a visual, an audio, or an
audio-visual indication inside the operator cab 112 or to the
remote base station, or both.
[0085] In an operation 836, upon receiving the indication from the
processor 506 that the work tool 104 has been installed to the
machine 102, the operator may operate the machine 102, as
appropriate for the work site in which the machine 102 is deployed.
Such operating of the machine 102 may further include outputting
power from the machine 102 to the work tool 104 based upon the
output mode selected in the operation 808 specific to the type of
the work tool 104 as determined in the operation 806.
[0086] Various aspects of this disclosure aid the operator of the
machine 102 in the process of changing and attaching the work tool
104 when the work tool 104 and/or the attachment coupler 106 may be
temporarily or otherwise be out of view (or, occluded) from the
field of view 502 of the sensor 110. For example, when the sensor
110 is a camera, the work tool 104 may be out of the camera's view
due to the motion of the machine 102 and/or the machine component
108. This may occur, for example, when the camera is mounted on
top of the operator cab 112 and the work tool 104 gets close to the
machine 102 at an end of a work tool changing and installation
process. In one example, the hydraulic cylinders in the machine
component 108 in front of the operator cab 112 may partially block
the field of view 502, so that an operator of the machine
102 may not be able to determine the first three-dimensional
location 120 or the second three-dimensional location 220 of the
work tool 104.
[0087] To solve this complex problem, the processor 506 uses motion
tracking to continuously estimate the location of the machine 102
in three-dimensional space. Upon detection of an occlusion of the
work tool 104, the processor 506 can still calculate an updated
position of the work tool 104 using a last seen location of the
work tool 104 and the estimated motion of the machine 102
(received, for example, from the IMU 128). Similar to a
determination of the relative position of the work tool 104, the
processor 506 determines the relative position of the attachment
coupler 106 (e.g., an attachment pin) with respect to the machine
102 and/or the machine component 108 even after the occlusion
happens. As a result, various aspects of this disclosure ensure
that the occlusion does not disrupt the process of installing the
work tool 104 to the machine 102 at the attachment coupler 106.
[0088] It will be appreciated that the foregoing description
provides examples of the disclosed system and technique. However,
it is contemplated that other implementations of the disclosure may
differ in detail from the foregoing examples. All references to the
disclosure or examples thereof are intended to reference the
particular example being discussed at that point and are not
intended to imply any limitation as to the scope of the disclosure
more generally. All language of distinction and disparagement with
respect to certain features is intended to indicate a lack of
preference for those features, but not to exclude such from the
scope of the disclosure entirely unless otherwise indicated.
[0089] Recitation of ranges of values herein is merely intended to
serve as a shorthand method of referring individually to each
separate value falling within the range, unless otherwise indicated
herein, and each separate value is incorporated into the
specification as if it were individually recited herein. All
methods described herein can be performed in any suitable order
unless otherwise indicated herein or otherwise clearly contradicted
by context.
* * * * *