U.S. patent application number 09/810778, for an assembly line fluid filling system and method, was filed with the patent office on March 16, 2001 and published on September 4, 2003.
This patent application is currently assigned to American Controls, Inc. The invention is credited to Besler, David; Brassard, Louis; Corkill, Dean; Crees, Tristan S.; Czeranna, Hans W.; Dale, James; Hoffman, Bryan; Maass, Al H.; and Maass, Craig A.
United States Patent Application 20030164200 (Kind Code A1)
Czeranna, Hans W.; et al.
Published September 4, 2003
Application Number: 09/810778
Family ID: 27805603
Assembly line fluid filling system and method
Abstract
A method for fueling an automotive vehicle 26 on a conveyer line
28 is provided. The conveyer line 28 moves generally along a first
axis 170 and the method utilizes a robotic arm 34 with an end
effector 54 for fueling a fuel stem 90 of the vehicle 26. The
method includes determining a first position of the fuel stem 90
along the first axis 170. The method further includes moving the
robotic arm 34 to position the end effector 54 proximate to the
first position. The method further includes determining a
three-dimensional position of the fuel stem 90 utilizing a vision
system 164. The method further includes moving the end effector 54
proximate to the three-dimensional position to enable the end
effector 54 to mate with the fuel stem 90 and moving said robotic
arm 34 relative to said first axis 170 at a speed substantially
equal to a speed of said conveyer line 28. Finally, the method
includes fueling the fuel stem 90 with the end effector 54.
Inventors: Czeranna, Hans W. (Plymouth, MI); Hoffman, Bryan (Livonia, MI); Maass, Al H. (Bloomfield Hills, MI); Maass, Craig A. (Bloomfield Hills, MI); Dale, James (Coquitlam, CA); Corkill, Dean (Port Coquitlam, CA); Besler, David (Port Coquitlam, CA); Crees, Tristan S. (Burnaby, CA); Brassard, Louis (Port Moody, CA)
Correspondence Address: DYKEMA GOSSETT PLLC, 39577 Woodward Avenue, Suite 300, Bloomfield Hills, MI 48304-5086, US
Assignee: American Controls, Inc.
Family ID: 27805603
Appl. No.: 09/810778
Filed: March 16, 2001
Current U.S. Class: 141/1
Current CPC Class: B67D 7/0401 (2013-01-01); B67D 2007/0473 (2013-01-01); B62D 65/18 (2013-01-01)
Class at Publication: 141/1
International Class: B65B 003/04; B65B 001/04
Claims
We claim:
1. A method for assembly line fluid fill of a container in a
vehicle moving along the assembly line comprising the steps of: (A)
determining a position of an inlet of the container using a machine
vision system while the vehicle is moving along the assembly line;
(B) positioning a fluid fill delivery outlet to the inlet using a
robot; and (C) filling the container via the outlet while the
vehicle continues to move.
2. The method of claim 1 further comprising the step of moving the
delivery outlet in substantial synchronism with said inlet.
3. The method of claim 2 wherein said moving step includes the
substep of updating the position of the container inlet.
4. A method for fueling an automotive vehicle on a conveyer line,
said conveyer line moving generally along a first axis, said method
utilizing a robotic arm with an end effector for fueling a fuel
stem of said vehicle, said robotic arm having a workspace within
which the fuel stem must lie in order to be fueled, said method
comprising: determining a first position of said fuel stem along
said first axis; moving said robotic arm to position said end
effector proximate to said first position such that said fuel stem
lies within said workspace; determining a three-dimensional
position of said fuel stem utilizing a vision system; moving said
end effector proximate to said three-dimensional position to enable
said end effector to mate with said fuel stem and moving said
robotic arm relative to said first axis at a speed substantially
equal to a speed of said conveyer line; and, fueling said fuel stem
with said end effector.
5. The method of claim 4 wherein said step of determining a first
position of said fuel stem includes: storing a first encoder
position value indicative of a position of said vehicle when said
vehicle passes a predetermined position on said conveyer line;
storing a second encoder position value indicative of a position of
said vehicle after said vehicle has passed said predetermined
position; and, calculating said first position of said fuel stem
responsive to said first and second encoder position values.
6. The method of claim 5 wherein the step of storing the second
position includes detecting the presence of the vehicle.
7. The method of claim 6 wherein said step of storing said second
position value includes: monitoring a light beam being projected
across said conveyer line; and, determining when said light beam is
interrupted by said vehicle.
8. The method of claim 4 wherein said vision system includes first
and second cameras and said step of determining said
three-dimensional position of said fuel stem includes: generating a
first digital image of said workspace including said fuel stem
utilizing said first camera; simultaneously generating a second
digital image of said workspace including said fuel stem utilizing
said second camera; and, calculating said three-dimensional
position of said fuel stem with respect to a predetermined
coordinate system responsive to said first and second digital
images.
9. The method of claim 4 wherein said three-dimensional position is
a center point of said fuel stem with respect to a predetermined
coordinate system, said center point lying on a plane defined by
the sealing surface (typically an outer edge) of said fuel
stem.
10. The method of claim 4 wherein said step of moving said end
effector to said second position includes the steps of: monitoring
said speed of said conveyer line while said end effector is mated
with said fuel stem; and matching the speed and position of the end
effector with that of the fuel stem.
11. The method of claim 4 wherein said step of fueling further
includes: monitoring a force exerted on said end effector by said
fuel stem; and adjusting said position of said end effector
responsive to said force.
12. The method of claim 4 wherein said step of fueling said fuel
stem with said end effector includes: opening a fuel valve in said
end effector to supply fuel to said fuel stem; and, closing said
fuel valve after a predetermined amount of fuel is pumped into said
fuel stem.
13. The method of claim 4 further including retracting said end
effector from said fuel stem after a predetermined amount of fuel
is pumped into said fuel stem.
14. The method of claim 4 further including illuminating said fuel
stem.
15. A method for providing position data to a controller to enable
a robotic arm controlled by said controller to move to a position
of an object, said object lying within the workspace of said
robotic arm, said method utilizing first and second cameras
disposed at first and second coordinate systems, respectively, said
method comprising: generating a first digital image of said
workspace including said object utilizing said first camera;
searching said first digital image with a first image template to
determine a location in said first image where a first correlation
score between that portion of said image being searched and said
template is greater than a predetermined threshold; simultaneously
generating a second digital image of said workspace including said
object utilizing said second camera; searching said second digital
image with a second image template to determine a location in said
second image where a second correlation score between that portion
of said image being searched and said template is greater than a
predetermined threshold; calculating a three-dimensional position
of said object with respect to a predetermined coordinate system
responsive to said first and second digital images when said first
and second correlation scores are both greater than a threshold
correlation score; calculating a triangulation error of said
position; and, transferring position data indicative of said
three-dimensional position to said controller when said
triangulation error is less than a threshold error value.
16. The method of claim 15 wherein said object is a fuel stem and
said three-dimensional position is a center point of said fuel
stem.
17. The method of claim 15 wherein said step of calculating said
three-dimensional position includes: calculating a first direction
vector from an origin of said first coordinate system responsive to
said first digital image, said first vector pointing towards an
estimated first center point of said object; calculating a position
of said first coordinate system and an orientation of said first
direction vector relative to said predetermined coordinate system;
calculating a second direction vector from an origin of said second
coordinate system responsive to said second digital image, said
second direction vector pointing towards an estimated second center
point of said object; calculating a position of said second
coordinate system and an orientation of said second direction
vector relative to said predetermined coordinate system;
determining a first point along said first direction vector that is
closest to said second direction vector; determining a second point
along said second direction vector that is closest to said first
point; and, calculating a midpoint between said first and second
points to obtain said three-dimensional position of said
object.
18. The method of claim 15 further including illuminating said
object.
19. The method of claim 15 further including moving said robotic
arm to said three-dimensional position.
20. A fueling system for fueling an automotive vehicle on a
conveyer line, said conveyer line moving generally along a first
axis, comprising: a gantry having a carriage configured to move
generally parallel to said first axis; a robotic arm attached to
said carriage that moves with said carriage, said robotic arm
having an end effector configured to mate with a fuel stem on said
vehicle and to supply fuel to said fuel stem through a fuel hose; a
vision system including first and second cameras for iteratively
determining a three-dimensional position of said fuel stem relative
to a predetermined coordinate system; a robot controller configured
to command said carriage to move proximate said three-dimensional
position and to move said end effector to said position to mate
with said fuel stem, said robot controller being further configured
to move said robotic arm relative to said first axis at a speed
substantially equal to a speed of said conveyer line.
21. The fueling system of claim 20 further including a position
encoder operatively connected to said conveyer line.
22. The fueling system of claim 20 further including a light sensor
for detecting when said vehicle passes a predetermined location on
said conveyer line.
23. The fueling system of claim 20 further including a light for
illuminating said fuel stem.
24. The fueling system of claim 20 wherein said vision system
further includes a vision controller and a frame grabber, said
frame grabber retrieving digital images generated by said first and
second cameras, said vision controller configured to calculate said
three-dimensional position of said fuel stem responsive to said
digital images.
25. The fueling system of claim 20 wherein the robotic controller
includes means for positioning a joint of said robotic arm; means
for calculating a desired joint position; means for automatically
positioning a stop corresponding to said desired joint position
using a low powered motor and a self-locking mechanism; and means
for driving said joint against said stop using a pneumatic
actuator.
26. The fueling system of claim 20 further including: means for
detecting loss of or damage to said fuel hose; means for
providing said fuel hose with a tip having a size configured to
impair retraction through a boot of said end effector; means for
actuating said fuel hose using a pneumatic cylinder; means for
arranging a stroke of said pneumatic cylinder such that said stroke
remains when said tip has bottomed in said boot of said end
effector; means for providing a first limit switch which energizes
when said pneumatic cylinder has retracted to a first position
corresponding to said hose tip being bottomed in said boot; means
for providing a second limit switch which energizes when said
pneumatic cylinder is fully retracted to a second position beyond
said first position; and means for providing said robot controller
with logic to sense that said fuel hose has been lost or damaged
when said second limit switch is energized.
27. A method for providing updated kinematic parameters to a
controller to enable a robotic arm controlled by said controller to
compensate for dimensionally unstable portions of said robotic arm,
said method comprising the steps of: providing one or more markers
on an end effector of said robotic arm; positioning the markers
within a workspace of said robotic arm; using a vision system
associated with said robotic arm to determine respective
three-dimensional locations of said markers; and calculating
updated kinematic parameters by comparing the determined locations
of said markers returned by said vision system to respective
expected locations of said markers.
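The calibration update recited in claim 27 can be reduced to a minimal sketch (illustrative Python, not part of the patent disclosure). Here "calculating updated kinematic parameters" is simplified to an average translation residual between the vision-measured and expected marker locations; a full implementation would update all of the arm's kinematic parameters, and every name below is hypothetical.

```python
def update_tool_offset(expected, measured):
    """Estimate a translation correction for a dimensionally unstable
    end effector by averaging the residuals between vision-measured
    and expected marker locations (a simplified stand-in for full
    kinematic-parameter updating). Points are (x, y, z) tuples."""
    n = len(expected)
    return tuple(
        sum(m[i] - e[i] for e, m in zip(expected, measured)) / n
        for i in range(3)
    )
```

For example, if every marker is seen 0.1 units right and 0.2 units high of where the nominal kinematics predict, the returned correction is approximately (0.1, 0.0, 0.2).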
Description
REFERENCE TO A COMPUTER PROGRAM LISTING APPENDIX
[0001] A computer program listing appendix is contained on a
compact disc submitted herewith and hereby incorporated by reference.
One compact disc and one duplicate are submitted in accordance with 37
C.F.R. § 1.52(e), and each contains the following files:
FILE NAME      SIZE IN BYTES  DATE OF CREATION
Cal.rsp        1 KB           May 21, 1995
Cal_eval.c     12 KB          Jul. 15, 1995
Cal_main.c     72 KB          Sep. 12, 1995
Cal_main.h     10 KB          Jul. 15, 1995
Cal_tran.c     14 KB          Oct. 22, 1995
Cal_util.c     8 KB           May 14, 1995
Cc_cd.dat      9 KB           Oct. 28, 1995
Cc_cpcc.dat    1 KB           Oct. 28, 1995
Ccal.c         4 KB           Apr. 1, 1995
Ccal.log       4 KB           Oct. 28, 1995
Ccal.run       1 KB           Apr. 2, 1995
Ccal_fo.c      4 KB           Apr. 1, 1995
Changes.txt    6 KB           Oct. 28, 1995
Csyn.c         7 KB           May 21, 1995
Dpmpar.c       7 KB           Apr. 1, 1995
Dpmpar.f       6 KB           Feb. 15, 1995
Ecal.c         4 KB           May 17, 1995
Ecal.log       3 KB           Oct. 28, 1995
Ecal.run       1 KB           Oct. 28, 1995
Ecalmain.c     23 KB          Oct. 15, 1995
Ecccpcc.dat    1 KB           Oct. 28, 1995
Encccpcc.dat   1 KB           Oct. 28, 1995
Enorm.c        4 KB           Apr. 1, 1995
Enorm.f        4 KB           Mar. 25, 1994
F2c.h          5 KB           Feb. 25, 1992
F2c.ps         138 KB         Oct. 20, 1995
Faq.txt        12 KB          Oct. 28, 1995
Fdjac2.c       5 KB           Apr. 1, 1995
Fdjac2.f       4 KB           Mar. 25, 1994
Gasdev.c       2 KB           Jul. 17, 1995
Ic2wc.c        3 KB           Apr. 1, 1995
Index.txt      3 KB           Oct. 28, 1995
Lmdif.c        17 KB          Apr. 1, 1995
Lmdif.f        16 KB          Mar. 25, 1994
Lmpar.c        10 KB          Apr. 1, 1995
Lmpar.f        9 KB           Mar. 25, 1994
Makefile.bor   3 KB           Oct. 23, 1995
Makefile.unx   3 KB           Oct. 20, 1995
Matrix.c       12 KB          Jul. 15, 1995
Matrix.h       1 KB           Jul. 15, 1995
Minpack.rsp    1 KB           May 14, 1995
Ncc_cd.dat     27 KB          Oct. 28, 1995
Ncc_cpcc.dat   1 KB           Oct. 28, 1995
Nccal.c        4 KB           May 20, 1995
Nccal.log      4 KB           Oct. 28, 1995
Nccal.run      1 KB           Apr. 2, 1995
Nccal_fo.c     4 KB           May 20, 1995
Ncsyn.c        7 KB           May 21, 1995
Notes.txt      6 KB           Oct. 28, 1995
Qrfac.c        7 KB           Apr. 1, 1995
Qrfac.f        6 KB           Mar. 25, 1994
Qrsolv.c       8 KB           Apr. 1, 1995
Qrsolv.f       7 KB           Mar. 25, 1994
Wc2ic.c        3 KB           Apr. 1, 1995
Xfd2xfu.c      4 KB           Jul. 15, 1995
BACKGROUND OF THE INVENTION
[0002] 1. Technical Field
[0003] This invention relates to a robotic assembly line filling
system.
[0004] 2. Description of the Related Art
[0005] Referring to FIG. 1, a known system 10 for fueling an
automotive vehicle 12 is shown. The vehicle 12 is moved along an
assembly line via a conveyer line 14. When the vehicle 12
progresses into a vehicle fueling area an operator 16 inserts a
fuel nozzle 18 into a fuel stem (not shown) to fuel the vehicle 12.
Because the conveyer line 14 and the vehicle 12 are moving, a
gantry 20 and a carriage 22 are utilized to move the fuel nozzle 18
along with the vehicle 12. After insertion of the nozzle 18, a
predetermined amount of fuel is pumped into the fuel stem of the
vehicle 12. Thereafter, the operator 16 may remove the nozzle 18
from the fuel stem or the nozzle 18 may be automatically removed
from the fuel stem. The known fueling system 10, however, has a
drawback in that the operator 16 must manually insert the fuel
nozzle 18 into the fuel stem. Labor and associated manufacturing
costs are increased.
[0006] U.S. Pat. No. 4,708,175 issued to Janashak et al. discloses
a robot that fills a container mounted on a vehicle with a fluid.
The vehicle is mounted on a conveyer line that moves into a work
cell where the robot is located. The robot utilizes a vision system
to determine the position of the inlet of the container. The robot
then moves a robotic arm to the position of the inlet to fill the
container with fluid, using gauge holes as visual target points.
While Janashak et al. appear to disclose that the system may be
used to fill a moving container provided the robot is capable of
tracking the moving vehicle, no description of how this can be
accomplished is provided. In particular, Janashak et al. do not
disclose how to compensate for motions of the vehicle relative to
the assembly line conveyor that cannot be detected by an encoder
connected to an assembly line conveyor, do not disclose how to
compensate for vehicle to vehicle variations in the location of the
fuel stem relative to other parts of the vehicle, nor do Janashak
et al. disclose how to accomplish such filling of a container while
the vehicle moves without first touching the vehicle or attaching
anything to the vehicle in order to facilitate detection of the
container inlet or, finally, how to coordinate such motion to fill
the container without constraining the motion of the vehicle in an
uncharacteristic fashion (i.e., without stopping them, stabilizing
them, etc. where such motion is only a requirement of the fueling
system and not typical of the conveyor itself). Applicants therefore
assume that Janashak et al. teach no more than that the conveyor line
must stop in order to fill the container. Stopping a
conveyer line results in an increased time to complete the vehicle
build, which results in increased manufacturing and labor
costs.
[0007] There is thus a need for a fluid filling system and method
that reduces and/or minimizes one or more of the above-identified
deficiencies.
SUMMARY OF THE INVENTION
[0008] One advantage of the present invention is that the vehicles
on the assembly line need not be stopped to have a fluid container,
such as a fuel tank, filled.
[0009] The present invention, broadly, provides a system and method
for filling a container with a fluid in a vehicle moving along an
assembly line. The fluid can be, for example only, fuel, coolant,
windshield washer fluid, and the like. The method includes the
steps of determining a position of an inlet of the container (as
the vehicle is moving) using a machine vision system. The next step
involves moving a fluid fill outlet to the container inlet using a
robot. Finally, the container is filled via the outlet while the
vehicle continues to move.
[0010] In a preferred embodiment, a fueling system in accordance
with the present invention includes a gantry having a carriage
configured to move generally parallel to a first axis (i.e., the
conveyor line on which the vehicle moves). The fueling system
further includes a robotic arm attached to the carriage that moves
with the carriage. The robotic arm has an end effector configured
to mate with a fuel stem on the vehicle and to supply fuel to the
fuel stem. The fueling system further includes a vision system
including first and second cameras for iteratively determining a
three-dimensional position of the fuel stem relative to a
predetermined coordinate system. The position is preferably a
center point of the fuel stem lying on a plane defined by the
sealing surface (typically an outer edge) of the fuel stem.
Finally, the fueling system includes a robot controller configured
to command the carriage to move proximate to the position of the
fuel stem and to move the end effector to the three-dimensional
position to mate with the fuel stem. The robot controller is
further configured to move the robotic arm relative to the first
axis at a speed substantially equal to a speed of the conveyer
line.
[0011] A method is also provided for fueling an automotive vehicle
on a conveyer line. The method utilizes a robotic arm with an end
effector for fueling a fuel stem of the vehicle. The method
includes determining a first position of the fuel stem along a
first axis. The method further includes moving the robotic arm to
position the end effector proximate to the first position. The
method further includes determining a three-dimensional position of
the fuel stem utilizing a vision system. The method further
includes moving the end effector proximate to the three-dimensional
position to enable the end effector to mate with the fuel stem and
moving the robotic arm relative to the first axis at a speed
substantially equal to a speed of the conveyer line. Finally, the
method includes fueling the fuel stem with the end effector.
[0012] A further method for providing position data to a controller
is also provided. The method provides position data to the
controller to enable a robotic arm controlled by the controller to
move to a position of an object. The method utilizes first and
second cameras disposed at first and second camera coordinate
systems, respectively. The method in a preferred embodiment
includes simultaneously acquiring a first digital image and a
second digital image of a workspace including the object utilizing
the first camera and the second camera, respectively. The method
further includes searching the first digital image with a first
fuel stem template to determine a location in the first image where
a first correlation score between that portion of the image being
searched and the template is greater than a predetermined
threshold. The method further includes searching the second digital
image with a second fuel stem template to determine a location in
the second image where a second correlation score between that
portion of the second image being searched and the template is
greater than a predetermined threshold. The method further includes
calculating a three-dimensional position of the object with respect
to a predetermined coordinate system. The position is calculated
responsive to the locations in the first and second digital images
where the correlation scores were above the threshold. The method
further includes calculating a triangulation error of the position.
In a more preferred embodiment, the method further includes
transferring position data indicative of the calculated position to
the controller when the fuel stem matches are found in both images
and the triangulation error is less than a threshold error
value.
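The triangulation step summarized above (and recited in more detail in claim 17) admits a compact sketch: each camera contributes a ray toward its estimated center point, the closest points between the two rays are found, and their midpoint is reported along with the gap between them as the triangulation error. This is illustrative Python, not the patent's implementation; ray origins and directions are assumed to already be expressed in the predetermined coordinate system.

```python
import math

def triangulate(p1, d1, p2, d2):
    """Estimate a 3-D point from two camera rays, each given as an
    origin p and direction d (3-tuples in a common coordinate system).
    Returns (midpoint, triangulation_error) via the closest-points
    construction: the point on ray 1 nearest ray 2, the point on ray 2
    nearest that point, their midpoint, and the distance between them."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    sub = lambda u, v: tuple(a - b for a, b in zip(u, v))
    add = lambda u, v: tuple(a + b for a, b in zip(u, v))
    scale = lambda u, k: tuple(a * k for a in u)

    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, t))       # point on ray 1 closest to ray 2
    q2 = add(p2, scale(d2, s))       # point on ray 2 closest to q1
    mid = scale(add(q1, q2), 0.5)    # reported three-dimensional position
    err = math.dist(q1, q2)          # triangulation error
    return mid, err
```

A controller following the method would accept `mid` only when `err` is below the threshold error value.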
[0013] The fueling system for fueling an automotive vehicle on a
conveyer line and the method related thereto represent a
significant improvement over conventional fueling systems and
methods. In particular, the inventive fueling system allows the
automatic fueling of a fuel stem without the need for an operator,
resulting in labor savings. Further, the inventive fueling system
can track the position of the fuel stem on a moving vehicle (on a
conveyer line) and move an end effector to mate with the fuel stem.
Thus, the vehicle can be fueled without stopping the conveyer line
resulting in decreased manufacturing costs and increased
manufacturing efficiency.
[0014] These and other features and advantages of this invention
will become apparent to one skilled in the art from the following
detailed description and the accompanying drawings illustrating
features of this invention by way of example.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] FIG. 1 is a schematic of a known fueling system for an
automotive vehicle on a conveyer line.
[0016] FIG. 2 is a schematic of a fueling system in accordance with
the present invention.
[0017] FIG. 3 is a schematic of a gantry and a robotic arm shown in
FIG. 2.
[0018] FIG. 4 is a side view of the gantry and the robotic arm
shown in FIG. 3.
[0019] FIG. 5 is an enlarged perspective view of the robotic arm
shown in FIG. 3.
[0020] FIG. 6 is a side view of the robotic arm in its kinematic
zero position (all joint variables equal to zero).
[0021] FIG. 7 is a schematic showing the coordinate systems
utilized to control the position of the robotic arm.
[0022] FIG. 8 is an exploded schematic of an end effector of the
robotic arm shown in FIG. 5.
[0023] FIG. 9 is a block diagram of a robotic control system utilized
by the inventive fueling system.
[0024] FIGS. 10A-10C are schematics illustrating the tracking
variables utilized for positioning the robotic arm.
[0025] FIG. 10D is a tracking equation utilized to position the
robotic arm along the J1 axis.
[0026] FIG. 11 is a schematic illustrating a triangulation
technique utilized by the vision system to determine the three
dimensional position of the fuel stem.
[0027] FIG. 12 is a schematic illustrating a two-dimensional
digital image coordinate system.
[0028] FIG. 13 is a schematic illustrating a two-dimensional camera
sensor coordinate system.
[0029] FIG. 14 is a schematic illustrating a three-dimensional
camera coordinate system.
[0030] FIG. 15 is a schematic illustrating equations that define
the intrinsic camera model.
[0031] FIG. 16 is a schematic illustrating a force vector applied
to the end effector.
[0032] FIG. 17 is a schematic illustrating a Jacobian Transpose
relationship utilized for force feedback control of the robotic
arm.
[0033] FIG. 18 is a schematic illustrating an inverse kinematics
model utilized by the robot control system to position and orient
an end effector.
[0034] FIG. 19 is a schematic of a fuel stem.
[0035] FIG. 20 is a schematic of a docking path of an end effector
with a fuel stem.
[0036] FIGS. 21A-21G are flowcharts illustrating the modes of
operation for the fueling system in accordance with the present
invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0037] Referring now to the drawings wherein like reference
numerals are used to identify identical components in the various
views, FIG. 2 illustrates a fueling system 24 for fueling an
automotive vehicle 26 on a conveyer line 28. As shown, the vehicle
26 is on a carriage 30 moving on the conveyer line 28. It should be
understood, however, that the inventive fueling system 24 may be
configured to operate with any known type of conveyer line,
including for example, overhead conveyer lines (see FIG. 1) and
floor mounted conveyer lines. An advantage of the fueling system 24
is that the vehicle 26 may be fueled without stopping the vehicle
26 on the conveyer line 28. The inventive fueling system 24
includes a gantry 32, a robotic arm 34, and a robot control system
36.
[0038] Referring to FIG. 3, the gantry 32 is provided to move the
robotic arm 34 substantially parallel to the conveyer line 28. The
gantry 32 includes a frame 38, a carriage 40, and a motor 42. The
frame 38 may be supported by legs (not shown) or may be mounted to
ceiling supports. The motor 42 is mounted on the gantry 32 and is
operatively connected to a drive belt (not shown) that is connected
to the carriage 40. Thus, rotational movement of a rotor (not
shown) of the motor 42 causes the drive belt to move the carriage
40 along the J1 axis in either a forward direction (to the right in
FIG. 2) or a backward direction (to the left in FIG. 2). The motor
42 is electrically connected to a motor driver 44 (see FIG. 9) that
is controlled by the robot controller 46. Thus, the controller 46
can selectively control the position of the carriage 40 and the
robotic arm 34 along a J1 axis. The motor 42 further includes an
internal encoder 48 (not shown in FIG. 1). The encoder 48 generates
a position value J1_VAL indicative of the position of the carriage
40 that is received by the robot controller 46.
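Combining the encoder value described above with the light-beam trigger of claim 5, the first-axis position of the fuel stem can be estimated as in this sketch (illustrative Python; the scale factor, trigger location, and stem offset are hypothetical parameters, not values from the patent):

```python
def stem_position(enc_at_trigger, enc_now, counts_per_meter,
                  trigger_x, stem_offset):
    """Estimate the fuel stem position along the first axis from line
    encoder counts. enc_at_trigger is latched when the vehicle breaks
    the light beam at trigger_x; stem_offset is the stem's nominal
    distance from the detected vehicle edge. Positions in meters."""
    travel = (enc_now - enc_at_trigger) / counts_per_meter  # line travel
    return trigger_x + travel + stem_offset
```

The vision system then refines this coarse first-axis estimate into a full three-dimensional position once the stem is within the camera workspace.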
[0039] Referring to FIG. 3, the robotic arm 34 is provided to fuel
the automotive vehicle 26. As shown, the robotic arm 34 is
connected to the carriage 40 and moves with the carriage 40 along
the J1 axis. Prior to fueling the vehicle 26, the robotic arm 34 is
rotated to a desired angle about the J2 axis which matches the
orientation of a fuel stem 90 in the plane of J2 (see FIG. 10A) of
the vehicle 26. In a preferred embodiment, the robotic arm 34 is
automatically rotated to the desired J2 angle. In an alternate
embodiment, the J2 angle could be manually fixed. Referring to FIG.
5, the robotic arm 34 includes a manipulator arm 50, a pitch arm
52, an end effector 54, a camera arm 56, and a light 58.
[0040] The manipulator arm 50 is provided to support the pitch arm
52 and the end effector 54. Referring to FIGS. 4 and 5, the
manipulator arm 50 includes a frame 60, a motor 62, a linear
actuator 64, and a joint 66. As shown, the frame 60 is connected to
a first end of the pitch arm 52 via the joint 66. The motor 62 is
provided to drive the linear actuator 64 to thereby move the pitch
arm 52 about the J3 axis. In particular, the linear actuator 64
converts rotary movement of a rotor (not shown) of the motor 62
into linear movement of the upper end of a rod 68. The lower end of
the rod 68 is connected to the pitch arm 52 via a pin joint 70.
Thus, linear actuation of the upper end of the rod 68 causes
rotational motion of the pitch arm 52 about the J3 axis.
[0041] Referring to FIGS. 5 and 9, the motor 62 is electrically
connected to a motor driver 72 that is controlled by the robot
controller 46. Thus, the controller 46 can selectively control the
position of the linear actuator 64 to move the pitch arm 52 to a
desired rotational angle about the J3 axis. Further, an encoder 74
is operatively mounted on the joint 66. The encoder 74 generates a
position value .theta.3_VAL indicative of a rotational angle of the
pitch arm 52 about the J3 axis that is received by the robot
controller 46. Still further, an encoder 76 is operatively mounted
between the carriage 40 and the manipulator arm 50. The encoder 76
generates a position value .theta.2_VAL indicative of a rotational
angle of the manipulator arm 50 about the J2 axis that is received
by the robot controller 46.
[0042] Referring to FIG. 4, the pitch arm 52 is provided to support
the end effector 54. The pitch arm 52 includes an L-shaped frame
78, a pneumatic cylinder 80, and a joint 82. As illustrated, a
second end of the pitch arm 52 is connected to the end effector 54
via the joint 82. The pneumatic cylinder 80 is mounted on the pitch
arm 52 and has a cylinder rod 84 connected to the end effector 54.
Further, the cylinder 80 is connected to a pneumatic control valve
86 (see FIG. 9) which is selectively controlled by the robot
controller 46. The valve 86 may be a dual solenoid three-position
open or closed center valve having an extend solenoid (not shown)
and a retract solenoid (not shown). When the extend solenoid is
energized (and the retract solenoid is de-energized), the cylinder
80 extends the rod 84 to rotate the end effector 54 to a desired
rotational angle about the J4 axis as defined by an adjustable hard
stop. The hard stop connected to the cylinder rod is adjusted prior
to moving the actuator (which only makes full motion movements).
This may be done manually, but is preferably done automatically.
This method of controlling the joint position takes advantage of
the speed, power, and safety (i.e., in proximity to gasoline) of a
pneumatic actuator while avoiding the difficulty in mid-positioning
pneumatic actuators, and particularly avoiding the lack of
stiffness exhibited by a mid-positioned pneumatic actuator. The
hard stop is preferably moved by a low power electric or pneumatic
motor using a self-locking mechanism such as a worm gear drive or a
self-locking lead screw so that the power required to move the stop
is less than the power required to move the joint. When the retract
solenoid is energized (and the extend solenoid is de-energized) the
cylinder 80 retracts the rod 84 to rotate the end effector 54 back
to a home position about the J4 axis. Referring to FIGS. 5 and 9,
an encoder 88 is operatively mounted on the joint 82. The encoder
88 generates a position value .theta.4_VAL indicative of a
rotational angle of the pitch arm 52 about the J4 axis that is
received by the robot controller 46.
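The hard-stop positioning method described above may be sketched as a simplified model (the class and function names below are hypothetical and are not part of the disclosed embodiment):

```python
class StopMotor:
    """Low-power, self-locking drive (e.g., worm gear) for the hard stop."""
    def __init__(self):
        self.position = 0.0

    def move_to(self, angle):
        # Self-locking: the stop holds its position without continuous power.
        self.position = angle


class Cylinder:
    """Pneumatic cylinder that only makes full-stroke moves."""
    def __init__(self):
        self.angle = 0.0

    def extend_to_stop(self, stop):
        # A full-power, full-speed stroke ends when the rod lands on the stop.
        self.angle = stop.position

    def retract(self):
        # Retract solenoid energized: return to the home position about J4.
        self.angle = 0.0


def move_j4(stop, cyl, target_angle):
    # Adjust the stop first (this requires far less power than moving the
    # joint), then make the full pneumatic stroke against it.
    stop.move_to(target_angle)
    cyl.extend_to_stop(stop)
```

The design choice is visible in the sketch: the actuator never has to hold a mid-stroke position, so the joint keeps the stiffness of a fully extended pneumatic cylinder.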
[0043] The end effector 54 is provided to mate with the fuel stem
90 of the automotive vehicle 26. Referring to FIG. 8, the end
effector 54 includes a rodless cylinder 92, a mounting bracket 94,
a guide 96, a manifold 98, a fuel hose 100, a boot 102, a front
housing 104, load cells 106, 108, 110, 112, pneumatic cylinder 114,
fuel valves 116, 118, elbows 120, 122, 124, 126, 128, 130, pipe
tees 132, 134, 136, 138, check valves 140, 142, vapor fittings 144,
146, 148, 150, and a position potentiometer 152.
[0044] The rodless cylinder 92 is provided to move the mounting
bracket 94 and the remaining components of the end effector 54
along the J5 axis. In particular, the cylinder 92 is provided to
move the boot 102 against the fuel stem 90. The cylinder 92 has a
slidable plate 154 that may be extended or retracted along the J5
axis. Referring to FIGS. 8 and 9, the cylinder 92 may be
operatively connected to a pneumatic control valve 156 that is
controlled by the robot controller 46. As shown, the plate 154 is
attached to the mounting bracket 94.
[0045] Referring to FIG. 8, the mounting bracket 94 is provided to
support the remaining components of the end effector 54 (excluding
the rodless cylinder 92). As shown, the guide 96 is attached to
bracket 94 to allow axial movement of the manifold 98 and the fuel
hose 100 relative to the bracket 94 and the boot 102.
[0046] The manifold 98 is provided to direct fuel from either fuel
valve 116 or fuel valve 118 through the fuel hose 100. Further, the
pneumatic cylinder 114 is provided to selectively move the manifold
98 (discussed in greater detail below) along the guide 96 relative
to the bracket 94.
[0047] The fuel valve 116 may receive fuel (e.g., gasoline) from
the elbow 122 which is connected to a first fuel line (not shown).
When the valve 116 is open, fuel is supplied to the manifold 98 and
to the fuel hose 100. When the fuel valve 116 is closed, the fuel
may be recirculated through elbow 120 for cooling by cooling
equipment (not shown). The valve 116 is selectively controlled by
the robot controller 46.
[0048] The fuel valve 118 may receive fuel (e.g., diesel fuel) from
the elbow 126 which is connected to a second fuel line (not shown).
When the valve 118 is open, fuel is supplied to the manifold 98 and
to the fuel hose 100. When the fuel valve 118 is closed, no fuel is
supplied to the manifold 98. The valve 118 is also selectively
controlled by the robot controller 46.
[0049] The boot 102 is provided to mate with the fuel stem 90
during fueling of the stem 90. The boot 102 may be constructed of a
resilient material such as rubber or plastic and includes a hollow
body portion 103 and a boot seal 101. The boot seal 101 is
configured to seal against a top edge of the fuel stem 90 to
prevent fuel and fuel vapor from escaping from the stem 90 during
fueling. As shown, the boot 102 is attached to the front housing
104. The boot is also electrically connected to the front housing,
and provides a conductive path which grounds the robot to the
vehicle upon insertion.
[0050] The fuel hose 100 is provided to be inserted into the fuel
stem 90 beneath the no-lead insert (not shown) after the boot 102
has mated with the stem 90. The fuel hose 100 is constructed of a
flexible plastic and extends from an outlet (not shown) on the
manifold 98 and through the housing 104 and the boot 102. As shown,
the pneumatic cylinder 114 is provided to move the manifold 98 and
the fuel hose 100 along the J5 axis to insert the hose 100 into the
stem 90. Referring to FIGS. 8 and 9, the cylinder 114 is
operatively connected to a pneumatic control valve 158 that is
selectively controlled by the robot controller 46.
[0051] The check valves 140, 142 are provided to purge manifold 98
and fuel hose 100 of fuel. When both fuel valves 116, 118 are
closed, an air supply (not shown) may apply air through both of the
check valves 140, 142 to force any fuel remaining in the manifold
98 and the fuel hose 100 into the fuel stem 90.
[0052] The vapor recovery fittings 144, 146, 148, 150, pipe tee
138, and elbow 130 may be connected together for fuel vapor
recovery as known by those skilled in the art.
[0053] The load cells 106, 108, 110, 112 are provided to measure a
force applied to the end effector 54 by the fuel stem 90 during
fueling. The load cells 106, 108, 110, 112 are conventional in the
art and are mounted between the front housing 104 and the mounting
bracket 94. Thus, the front housing 104 transmits force exerted on
the boot 102 to the load cells 106, 108, 110, 112. Referring to
FIG. 9, the load cells 106, 108, 110, 112 are electrically
connected to the robot controller 46. The controller 46 may receive
signals generated by the load cells 106, 108, 110, 112 and
calculate a force vector for force feedback control of the robotic
arm 34 during fueling.
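The force-vector calculation from the four load-cell signals might be sketched as follows. The cell layout, sign conventions, and function name are assumptions for illustration; the patent does not disclose the resolution geometry:

```python
def force_vector(v1, v2, v3, v4, half_width, half_height):
    """Resolve four corner load-cell readings (V_L1..V_L4, assumed already
    calibrated to newtons) into a net axial force and two tipping moments.
    Assumed layout: 1=top-left, 2=top-right, 3=bottom-left, 4=bottom-right
    on the mounting bracket 94, each cell sensing force normal to the
    bracket (i.e., transmitted from the boot 102 via the front housing 104)."""
    fz = v1 + v2 + v3 + v4                      # net force along the boot axis
    mx = half_height * ((v1 + v2) - (v3 + v4))  # moment about the bracket x axis
    my = half_width * ((v2 + v4) - (v1 + v3))   # moment about the bracket y axis
    return fz, mx, my
```

With equal readings the load is centered (both moments vanish); an imbalance between cell pairs indicates an off-axis force the controller 46 can correct.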
[0054] Referring to FIGS. 4 and 8, during the process of mating the
boot 102 with the fuel stem 90, the end effector 54 is rotated
about the joint 82 to a calculated rotation angle .theta..sub.4.
The rodless cylinder 92 then extends the remaining components of
the end effector 54 along the J5 axis to mate the boot 102 with the
fuel stem 90. Finally, the pneumatic cylinder 114 extends the fuel
hose 100 along the J5 axis to enter the fuel stem 90 beneath the
no-lead insert (not shown).
[0055] Referring to FIG. 5, the camera arm 56 is provided to house
cameras 182, 184. As shown, the camera arm 56 is attached to a side
of the manipulator arm 50.
[0056] The light 58 is provided to illuminate the fuel stem 90 on
the vehicle 26 to allow the vision system 164 to recognize the fuel
stem 90 regardless of ambient lighting conditions. The light 58 may
comprise a fluorescent metal halide fixture (e.g., a Class I Div. 2
rated fixture in a constructed embodiment). Referring to FIG. 7,
the coordinate systems utilized by the robot controller 46 to
position the robotic arm 34 (and the end effector 54) are shown. In
particular, a coordinate system CS0 represents a home position of
the carriage 40. The origin of the coordinate system CS0 lies on
the J1 axis. A coordinate system CS1 is located on the carriage 40
and accordingly moves with the carriage 40. The coordinate system
CS1 is utilized extensively for tracking the fuel stem 90 along the
J1 axis, as discussed in greater detail below. The coordinate
system CS2 is utilized for positioning the robotic arm 34 about the
J2 axis. Coordinate system CS2 is also the reference coordinate
system for the vision system, which uses it to locate and orient
the camera coordinate systems and to report object position and
orientation vectors. The coordinate system CS3 is utilized for
positioning the pitch arm 52 about the J3 axis. Further, the
coordinate system CS4 is utilized for positioning the end effector
54 about the J4 axis. Finally, the coordinate system CS5 is
utilized for positioning the boot 102 along the J5 axis.
[0057] Referring to FIG. 9, a robot control system 36 is provided.
The robot control system 36 includes a light sensor 160, a conveyer
line encoder 162, a robot controller 46, a vision system 164,
encoders 48, 74, 76, 88, the position potentiometer 152, load cells
106, 108, 110, 112, motor drivers 44, 72, control valves 86, 156,
fuel valves 116, 118, and the fuel hose extend valve 158.
[0058] Referring to FIG. 2, the light sensor 160 is provided to
detect when the vehicle 26 enters a vehicle fueling area on an
assembly line. The light sensor 160 is conventional in the art and
includes a light transmitter and a light receiver in package 166 to
be used with a corresponding reflector 168. The package 166 is
oriented on a first side of the conveyer line 28 to project a light
beam, in one embodiment, across the conveyer line 28. In the
illustrated embodiment, the reflector 168 is oriented and
positioned on an opposite side of the conveyer line 28 to receive
the light beam. When the vehicle 26 enters the fueling area, the
vehicle 26 blocks the light beam from being reflected by the
reflector 168. In response, the receiver in package 166 generates a
trigger signal V.sub.TRIG that is received by the robot controller
46. In the case where the beam is reflected by the vehicle, it is
the presence (not absence) of a reflected beam that generates the
trigger signal. In any event, the position of the light sensor 160
is designated as the light sensor trigger position hereinafter (see
FIG. 10A).
[0059] In an alternate embodiment V.sub.TRIG could be generated by
any other sufficiently repeatable mechanism (e.g., mechanical
switch, ultrasonic sensor, bar code reader, etc.) to reliably
indicate the presence of the next vehicle to the robot
controller.
[0060] Referring to FIGS. 2 and 9, the conveyer line encoder 162 is
provided to generate an encoder count CL_VAL indicative of a
current encoder count with respect to the conveyer line 28. The
robot controller 46 utilizes two encoder counts to determine a
gross displacement distance along the axis 170 of the vehicle 26
(and the fuel stem 90) from the light sensor trigger position.
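The gross displacement derived from the two encoder counts can be expressed as follows (the scale factor and names are assumptions; the patent does not state the encoder resolution):

```python
def gross_displacement(cl_trig, cl_val, meters_per_count):
    """Distance the vehicle 26 (and fuel stem 90) has travelled along the
    axis 170 since the light-sensor trigger, computed from the encoder
    count latched at V_TRIG and the current count CL_VAL."""
    return (cl_val - cl_trig) * meters_per_count
```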
[0061] Referring to FIG. 9, the robot controller 46 is provided to
control the carriage 40 and the robotic arm 34 for fueling the
automotive vehicle 26. The robot controller 46 has a communication
bus 172 for communicating with the vision controller 174. Further,
robot controller 46 also has a communication bus 176 for
communicating with a supervisory PLC 178 which may control the
operation of the fuel pumping and metering equipment and various
safety devices. The PLC 178 is not a component of the robot control
system 36 and is shown only for illustrative purposes. As shown,
the robot controller 46 is electrically connected to the conveyer
line encoder 162, the light sensor 160, the load cells 106, 108,
110, 112, and the vision controller 174. The robot controller 46
receives encoder counts (from the conveyer line encoder 162), and
the signals V.sub.TRIG, V.sub.L1, V.sub.L2, V.sub.L3, V.sub.L4, and
a three-dimensional position P.sub.e of the fuel stem 90 from the
vision controller 174--to control the position of the robotic arm
34. The method for determining the position of the fuel stem 90
will be explained in greater detail below. The controller 46 is
also electrically connected to encoders 48, 74, 76, 88 and the
position potentiometer 152. Thus, the controller 46 receives the
joint position values J1_VAL, .theta.2_VAL, .theta.3_VAL,
.theta.4_VAL, J5_VAL and is able to calculate a current position of
the end effector 54. The controller 46 is also electrically
connected to motor drivers 44, 72, control valves 86, 156, 158,
fuel valves 116, 118, and generates control signals to control the
foregoing devices. The controller 46 further includes a
programmable memory for implementing the various modes of operation
of the fueling system 24 which will be described in greater detail
below.
[0062] Referring to FIG. 18, the robot controller 46 utilizes an
inverse kinematic model of the robotic arm 34 to position the arm
34. In particular, the inverse kinematic model is a set of
equations that allow the joint variables (i.e., .theta.3 and J5) of
the robotic arm 34 to be calculated in order to place the end
effector 54 in a desired position and orientation. Inverse
kinematic models are utilized extensively in robotic applications
and can readily be determined by one skilled in the art.
Accordingly, the underlying inverse kinematic model equations will
not be discussed in any further detail. As shown, one set of inputs
x, y, z represent the desired position P.sub.e of the end of boot
102 with respect to a predetermined coordinate system, such as
coordinate system CS1. The inputs x, y, z (i.e., point P.sub.e) are
determined by the vision system 164 while tracking the fuel stem 90
and may be stored in the variables CARRIAGE_TO_STEMX,
CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ in the robot controller 46.
The input a.sub.x, a.sub.y, a.sub.z, is a unit vector representing
the orientation of the fuel stem 90 (see FIG. 20) with respect to
the coordinate system CS1. Because the orientation of the fuel
stem 90 does not vary a large amount when the vehicle 26 moves on
the conveyer line 28, the orientation of the fuel stem 90 may be a
stored value. In particular, unit vector a.sub.x, a.sub.y, a.sub.z,
may be stored as VID data in non-volatile memory (e.g., hard disk)
of the vision controller and is readily determined by one skilled
in the art for a specific fuel stem 90. The DH constants (see FIGS.
6 and 7) are geometric dimensions relating to the robotic arm 34
and are stored in the non-volatile memory (e.g., hard disk) of the
robot controller 46. Finally, the angles .theta.2 and .theta.4
represent desired positions of the robotic arm 34 about the J2 axis
and J4 axis, respectively, for fueling the fuel stem 90. The angles
.theta.2 and .theta.4 may also be stored in the non-volatile memory
(e.g., hard disk) of the robot controller 46 or may be calculated
from the unit vector a.sub.x, a.sub.y, a.sub.z.
[0063] Referring to FIG. 9, the vision system 164 is provided to
generate a three-dimensional position value (i.e., point P.sub.e)
of a fuel stem 90 with respect to a predetermined coordinate system
(e.g., CS1). The vision system 164 includes the vision controller
174, the frame grabber 180, inner camera 182, and outer camera
184.
[0064] The vision controller 174 is provided to calculate a
three-dimensional position P.sub.e of the fuel stem 90. In
particular, the robot controller 46 may request a position of the
fuel stem 90 from the controller 174 via the bus 172. In response,
the controller 174 may calculate the position P.sub.e and return
the position P.sub.e to the controller 46. The vision controller
174 may calculate the position of the fuel stem 90 responsive to
two digital images generated by the cameras 182, 184--which will be
explained in greater detail below.
[0065] The robot controller 46 may also request vehicle
identification data (VID) from the vision controller 174. The VID
data may be stored in non-volatile memory (e.g., hard disk) of the
vision controller 174. In particular, a user or an assembly line
controller may input a VID number identifying the particular
vehicle type, via an input device or a serial bus (not shown), to
the robot controller 46. The robot controller 46 may transmit the
VID number to the vision controller 174, which retrieves a VID
record from its hard disk. The VID record may be transmitted to the
robot controller 46 and maintained in a random access memory (RAM)
of the vision controller 174. The VID record contains vehicle
dependent information used by the robot controller 46 to track the
position of the fuel stem 90. In particular, the VID record may
include the following information:
[0066] a.sub.x, a.sub.y, a.sub.z, unit vector defining orientation
of the fuel stem 90 with respect to the coordinate system CS1;
[0067] HOME_TO_STEM=distance from a home position of the carriage
40 to a position P.sub.0 on the J1 axis (projection of the fuel
stem center onto z0) when the vehicle 26 crosses the light sensor
trigger position (see FIG. 10A);
[0068] DESIRED_CARRIAGE_TO_STEMY=desired distance measured with
respect to y1 from the carriage 40 to a position P.sub.1 on the J1
axis for the end effector 54 to mate with the fuel stem 90;
[0069] DIGITAL_IMAGE_TEMPLATE1=digital image template of the fuel
stem 90 for the inner camera 182;
[0070] DIGITAL_IMAGE_TEMPLATE2=digital image template of the fuel
stem 90 for the outer camera 184;
[0071] APPROACH_ANGLE=the angular offset in the vertical plane
between z5 and the stem axis 243.
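For illustration, the VID record above might be represented as the following structure. The field names follow the variables in the text, while the types and units are assumptions:

```python
from dataclasses import dataclass


@dataclass
class VIDRecord:
    """Vehicle-dependent data retrieved by VID number from the vision
    controller 174 and used by the robot controller 46 for stem tracking."""
    ax: float                            # unit vector defining the fuel stem
    ay: float                            # orientation with respect to CS1
    az: float
    home_to_stem: float                  # HOME_TO_STEM (assumed meters)
    desired_carriage_to_stem_y: float    # DESIRED_CARRIAGE_TO_STEMY
    digital_image_template1: bytes       # template for the inner camera 182
    digital_image_template2: bytes       # template for the outer camera 184
    approach_angle: float                # offset between z5 and stem axis 243
```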
[0072] The information contained in the VID record may be
determined by one skilled in the art. In a preferred embodiment,
the information is measured and stored by the vision controller 174
and the robot controller 46. In an alternate embodiment, the
information could be determined using conventional survey
equipment.
[0073] The vision controller 174 may utilize commercially available
vision software for template matching to determine if cameras 182
and 184 are viewing the fuel stem 90. In a constructed embodiment,
the commercially available vision software comprised Matrox Imaging
Library, Version 6.0 sold by Matrox Electronic Systems Ltd. of
Dorval, Quebec, Canada. In particular, the conventional software
searches the first digital image of the workspace, acquired by
camera 182 and frame grabber 180, for a location in which the
correlation score with the first digital image template (i.e.,
DIGITAL_IMAGE_TEMPLATE1) is higher than a pre-determined acceptance
level. If such an image location is found then there is a fuel stem
match in the first digital image. The conventional software
searches the second digital image of the workspace, acquired by
camera 184 and frame grabber 180, for a location in which the
correlation score with the second digital image template (i.e.,
DIGITAL_IMAGE_TEMPLATE2) is higher than a pre-determined acceptance
level. If such an image location is found then there is a fuel stem
match in the second digital image. If there is a fuel stem match in
both digital images, i.e., if both cameras 182, 184 are viewing the
fuel stem 90, then the vision controller 174 further proceeds to
calculate a three-dimensional point P.sub.e corresponding to the
center of the fuel stem. The software methodology for calculating
the point P.sub.e will be explained in greater detail below.
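The two-camera gating logic of the preceding paragraph, in which triangulation proceeds only when both templates match above the acceptance level, can be sketched as follows (match_fn stands in for the commercial template-matching routine; all names are hypothetical):

```python
def find_stem(match_fn, image1, template1, image2, template2, accept_level):
    """Search each digital image for its template and return the two match
    locations only when both correlation scores clear the acceptance level;
    otherwise return None and no triangulation is attempted.
    match_fn(image, template) -> (correlation_score, (x, y))."""
    score1, loc1 = match_fn(image1, template1)   # inner camera 182 image
    score2, loc2 = match_fn(image2, template2)   # outer camera 184 image
    if score1 > accept_level and score2 > accept_level:
        return loc1, loc2   # fuel stem matched in both views
    return None
```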
[0074] The frame grabber 180 is provided to simultaneously trigger
the cameras 182, 184 and to simultaneously digitize the first and
second analog images acquired by the cameras 182, 184,
respectively, and to transmit the first and second digital images
to the vision controller 174. The frame grabber 180 is conventional
in the art and may comprise a multi-channel frame grabber capable
of simultaneously triggering and digitizing images coming from at
least two monochrome progressive scan analog cameras. In a
constructed embodiment, frame grabber 180 comprises a Matrox
Meteor-II/MC Frame Grabber manufactured by Matrox Electronic Systems Ltd.
of Dorval, Quebec, Canada. The frame grabber 180 may receive a
retrieve signal V.sub.TR from vision controller 174. In response,
the frame grabber 180 may generate retrieve signals V.sub.TRI and
V.sub.TRO to simultaneously trigger the cameras 182, 184 to acquire
the first and second analog images, respectively, of the workspace
186. The cameras 182, 184 may simultaneously begin to transfer the
first and second analog images to the frame grabber 180, which may
simultaneously digitize them and may further transfer the first and
the second digital images to the vision controller 174.
[0075] The first and second cameras 182, 184 are used to acquire
the first and the second analog images, respectively, of the
workspace 186. The cameras 182, 184 are conventional in the art and
may comprise monochrome CCD cameras having a progressive scan
capability and a trigger shutter mode capability. In a constructed
embodiment, each of cameras 182, 184 comprised an XC-55 Progressive
Scan Camera Module manufactured by Sony Electronics Inc. of Itasca,
Ill. As previously discussed, the cameras 182, 184 may acquire
first and second analog images of the workspace responsive to the
trigger signals V.sub.TRI and V.sub.TRO, respectively.
[0076] Referring to FIG. 5, the cameras 182, 184 are mounted within
the camera arm 56. The camera arm 56 is provided to serve a number
of functions. It acts as an enclosure which protects the cameras
182, 184 and their lenses (not shown) from dust, drips, and
tampering. The camera arm is mounted to the manipulator arm 50
using precision shoulder screws (not shown) to guarantee precise
and repeatable alignment. This allows camera arms 56 to be factory
calibrated but interchangeable in the field. Finally, the camera
arm is provided to position the cameras 182, 184 such that their
fields of view cover the workspace 186 (see FIG. 4). The workspace
186 is defined to be the volume of space containing the various
possible positions in which the robotic arm 34 and the end
effector 54 are capable of servicing the fuel stem 90. Thus, when
the cameras 182, 184 are disposed proximate to the fuel stem 90,
such that the fuel stem 90 lies within the workspace 186, the
vision controller 174 is able to determine a three-dimensional
position P.sub.e of the fuel stem 90.
Mathematical Background for the Vision System
[0077] Before explaining the various modes of operation of the
fueling system 24, the methodology and mathematics for determining
a three-dimensional position P.sub.e of the fuel stem 90 will now
be explained. In order to determine the position P.sub.e, a
mathematical camera model of each of the cameras 182, 184 is
utilized. These mathematical models are identical except for the
numerical values of parameters measured after the cameras 182,184
have been mounted in the camera arm 56 and the aperture and focus
of their lenses have been adjusted. For the purposes of simplicity
and clarity, only the mathematical model for camera 182 will be
further described hereinbelow.
[0078] The camera model comprises two parts: an intrinsic camera
model and an extrinsic camera model. Referring to FIGS. 12 and 14,
the intrinsic camera model specifies the relationship between a
point (i.e., point (x.sub.i,y.sub.i)) with respect to a digital
image coordinate system and a direction in space (i.e., unit vector
D.sub.i) with respect to a camera coordinate system CS.sub.ci. The
intrinsic camera model for camera 182 depends on the internal
geometric, electronic and optical characteristics of camera 182 and
on the electronic characteristics of the frame grabber 180 which
will be discussed in greater detail below.
[0079] During run time, the extrinsic camera model of each of the
cameras 182, 184 specifies the position and orientation of the
camera coordinate system CS.sub.c (i.e., coordinate system
CS.sub.ci for the camera 182 and coordinate system CS.sub.co for
the camera 184) with respect to a predetermined coordinate system
that is fixed with respect to the cameras, such as CS2. In a
constructed embodiment, the extrinsic camera model parameters are
measured in the factory and are stored in non-volatile memory
(e.g., hard disk) of the vision controller with respect to the
camera arm 56 mounting frame. During run time the vision controller
174 always supplies position and orientation vectors with respect
to CS2 but the robot controller 46 always converts and uses these
vectors with respect to CS1. Because each extrinsic camera
model--which is a coordinate system transformation matrix--may be
readily determined by one skilled in the art, only the intrinsic
camera model will be discussed.
[0080] The intrinsic camera model for camera 182 will now be
discussed. The intrinsic camera model comprises several equations
(discussed hereinafter) that are used to determine a direction
vector D.sub.i pointing in space to the center point P.sub.i of the
fuel stem 90. Referring to FIGS. 12, 13, and 14, the intrinsic
camera model utilizes three coordinate systems: a digital image
coordinate system, a camera sensor coordinate system, and a camera
coordinate system. Referring to FIG. 12, the two-dimensional
digital image coordinate system defined by the axes X.sub.i and
Y.sub.i is illustrated. The digital image coordinate system is
defined by a plurality of rows and columns of pixels 185. Further,
the Principal Point (C.sub.x, C.sub.y) is the point projected onto
the digital image coordinate system that corresponds to the optical
axis of a camera lens (not shown) of the camera 182. Further, the
image point (x.sub.i, y.sub.i) corresponds to the center of the
fuel stem 90 and is generated by the commercially available vision
software previously discussed.
[0081] Referring to FIG. 13, the two-dimensional camera sensor
coordinate system defined by the axes X.sub.s and Y.sub.s is
illustrated. The CCD camera 182 has a sensor plane 188 comprising a
plurality of rows and columns of CCD sensor cells 187. The center
of the fuel stem 90 is imaged on the sensor point (x.sub.s,
y.sub.s) (coordinates with respect to the camera sensor coordinate
system) and after digitization by the frame grabber 180, this point
corresponds to the image point (x.sub.i, y.sub.i) (coordinates with
respect to the digital image coordinate system).
[0082] Referring to FIG. 14, the three-dimensional camera
coordinate system CS.sub.ci is partially illustrated in two
dimensions. The camera coordinate system CS.sub.ci is defined by
the axes X.sub.ci, Y.sub.ci, Z.sub.ci and has an origin O.sub.ci that
corresponds to the center of a lens (not shown) of the camera 182.
The origin O.sub.ci is also located on the optical axis of the
camera 182. As shown, the unit direction vector D.sub.i is
determined with respect to the camera coordinate system
CS.sub.ci.
[0083] Before discussing the intrinsic camera model equations, the
parameters used in the equations will be set forth. The parameters
include:
f=effective focal length of the camera lens (meters);
S.sub.x=camera aspect ratio;
C.sub.x=x image coordinate (pixel) of the Principal Point;
C.sub.y=y image coordinate (pixel) of the Principal Point;
K.sub.1=second degree radial distortion coefficient of the camera lens;
d.sub.x=width of a sensor cell (meters);
d.sub.y=height of a sensor cell (meters);
N.sub.cx=number of sensor cells on a row of the camera sensor;
N.sub.cy=number of sensor cells on a column of the camera sensor;
N.sub.ix=number of pixels on an image row.
[0084] The parameters d.sub.x, d.sub.y, N.sub.cx, N.sub.cy may be
supplied by the manufacturer of the camera 182, and the parameter
N.sub.ix is a known parameter of the frame grabber 180. The
parameters f, S.sub.x, C.sub.x, C.sub.y, K.sub.1 may be
readily determined by one skilled in the art utilizing an algorithm
set forth in the following publication: "A Versatile Camera
Calibration Technique for High-Accuracy 3D Machine Vision Metrology
Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of
Robotics and Automation Vol RA-3, No 4, August 1987--which is
incorporated by reference in its entirety.
[0085] Referring to FIG. 15, camera model equations 1-8 are used to
map the image point (x.sub.i, y.sub.i) with respect to a digital
image coordinate system to a sensor point (x.sub.s, y.sub.s) with
respect to a camera sensor coordinate system and to further
calculate an undistorted sensor point (x.sub.u, y.sub.u) with
respect to the camera sensor coordinate system, from which is
calculated a direction vector D.sub.i with respect to the camera
coordinate system CS.sub.ci. In particular, equations (1), (2),
(7), and (8) are used to map the image point (x.sub.i, y.sub.i) to
the sensor point (x.sub.s, y.sub.s)--where light from a point
P.sub.i of the fuel stem 90 is projected. Referring to FIG. 14,
sensor point (x.sub.s, y.sub.s) is shown on the sensor plane 188.
It should be understood, however, that camera 182 (like all
conventional cameras) radially distorts projected light beams.
Accordingly, the equation (3) is utilized to calculate a radial
distortion factor k for the camera 182 to compensate for the radial
distortion of camera 182. Next, equations (4) and (5) are utilized
to calculate the undistorted sensor point (x.sub.u, y.sub.u)
(coordinates with respect to the camera sensor coordinate system)
utilizing the radial distortion factor k. Referring to FIG. 14, the
point (x.sub.u, y.sub.u) is shown on the sensor plane 188. The
preferred embodiment uses an improved version of Tsai's algorithm
that was developed by Reg Willson and that is freely available on
the Internet at the following URL:
http://www.cs.cmu.edu/afs/cs.cmu.edu/user/rgw/www/TsaiCode.html,
as also shown in the computer program listing Appendix on a compact
disc submitted herewith.
[0086] Referring to FIG. 15, the equation (6) is used to calculate
the unit vector D.sub.i which is centered at the origin O.sub.ci
and points toward the center point P.sub.i of the fuel stem 90. The
numerator of the equation (6) represents a vector pointing toward
the center point P.sub.i. The denominator of equation (6)
represents the magnitude of the vector D.sub.i. Thus, the vector
D.sub.i represents a unit vector.
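Assuming the equations of FIG. 15 follow the standard Tsai formulation referenced above (the figure itself is not reproduced here, so this mapping is an assumption), the computation of the unit vector D.sub.i might be sketched as:

```python
import math


def pixel_to_direction(xi, yi, f, sx, cx, cy, k1, dx, dy, ncx, nix):
    """Map an image point (x_i, y_i) to the unit direction vector D_i in the
    camera coordinate system CS_ci, per the standard Tsai intrinsic model."""
    # Equations (1), (2), (7), (8) (assumed form): digital image point to
    # sensor point (x_s, y_s), in meters on the sensor plane 188.
    xs = (xi - cx) * dx * ncx / (nix * sx)
    ys = (yi - cy) * dy
    # Equation (3) (assumed form): radial distortion factor k.
    k = 1.0 + k1 * (xs * xs + ys * ys)
    # Equations (4), (5): undistorted sensor point (x_u, y_u).
    xu, yu = xs * k, ys * k
    # Equation (6): unit vector from the lens center O_ci toward P_i; the
    # numerator is (x_u, y_u, f), the denominator its magnitude.
    norm = math.sqrt(xu * xu + yu * yu + f * f)
    return (xu / norm, yu / norm, f / norm)
```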
[0087] The foregoing camera model equations are valid if the
following criteria regarding the camera 182, the lens (not shown)
and the frame grabber 180 are true. First, the optical axis of the
camera 182 is presumed to be perpendicular to the sensor plane 188.
Second, the lens is focused so that an object in the camera
workspace 186 is in focus. Third, the focus of the lens is fixed.
In other words, a zoom lens is not used. Fourth, the digitization
of a digital image by the frame grabber 180 is accurate.
[0088] It should be understood that the intrinsic camera model for
camera 184 may also be defined by the equations (1)-(8) utilizing
the ten parameters f, S.sub.x, C.sub.x, C.sub.y, K.sub.1, d.sub.x,
d.sub.y, N.sub.cx, N.sub.cy, N.sub.ix determined for the camera
184. Referring to FIG. 11, the camera 184 may have an origin
O.sub.co with a coordinate system CS.sub.co. Generally, to
determine a position of an object, an observer first establishes a
stereo match of the object and then uses triangulation to determine
a distance to the object with respect to the observer. The vision
system 164 works in a similar manner. When the cameras 182, 184 and
the frame grabber 180 generate first and second digital images,
respectively, of the workspace 186 the conventional vision software
(discussed above), within the vision controller 174, performs image
template matching and generates first and second image match
locations. We will hereafter refer to the first image match
location as the inner image location and to the second image match
location as the outer image location.
[0089] The conventional vision software determines an inner image
point (i.e., x.sub.i, y.sub.i) and an outer image point (not shown)
that are representative of the center point of the fuel stem 90 in
the first and second digital images, respectively. The inner image
point (i.e., x.sub.i, y.sub.i) is utilized to calculate the
direction vector D.sub.i as explained above. Similarly, the outer
image point is utilized to calculate the direction vector D.sub.o.
It should be understood that the direction vector Do may be
calculated utilizing the equations (1)-(8) discussed above. The two
direction vectors D.sub.i and D.sub.o are utilized in conjunction
with the extrinsic camera model of the cameras 182, 184 to
triangulate an estimated position P.sub.e of the fuel stem 90 with
respect to the coordinate system CS2. This position with respect to
CS2 is transmitted by the vision controller 174 to the robot
controller 46 which transforms it with respect to CS1.
[0090] Referring to FIG. 11, to triangulate the position of the
estimated triangulation point P.sub.e with respect to the
coordinate system CS2, the position and orientation of the inner
and outer camera coordinate systems CS.sub.co and CS.sub.ci with
respect to coordinate system CS2, i.e., the extrinsic camera models
of cameras 182, 184 must be known. The transformation of a first
coordinate system (i.e., CS.sub.co or CS.sub.ci) to a second
coordinate system (i.e., CS2) is well known in the art and will not
be discussed in any further detail. The vision controller 174
calculates an inner triangulation point P.sub.i along the line
{O.sub.ci, D.sub.i} that is the closest point to the line
{O.sub.co, D.sub.o}. The controller 174 then calculates an outer
triangulation point P.sub.o that is the closest point on the line
{O.sub.co, D.sub.o} to the inner triangulation point P.sub.i.
Finally, the controller 174 calculates the midpoint between the
inner triangulation point P.sub.i and the outer triangulation point
P.sub.o, which is the triangulated point P.sub.e. As previously
discussed, the triangulated point P.sub.e is the estimated position
of a center point of the fuel stem 90 with respect to the
coordinate system CS2. Further, the vision controller 174
calculates a triangulation error which is the distance between
points P.sub.e and P.sub.i. If the triangulation error is less than
a threshold error value, the vision controller 174 transmits the
position P.sub.e to the robot controller 46. In particular, the
position P.sub.e may be transmitted to the controller 46 utilizing
the following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY,
CARRIAGE_TO_STEMZ.
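The closest-point triangulation of paragraphs [0089]-[0090] can be sketched as follows. This is a minimal illustration, not the application's implementation; the function and variable names are ours, and the rays are assumed to be non-parallel:

```python
import numpy as np

def triangulate(o_i, d_i, o_o, d_o):
    """Estimate a 3D point from two camera rays.

    o_i, o_o: ray origins (inner/outer camera centers) expressed in CS2.
    d_i, d_o: direction vectors of the inner and outer rays.
    Returns (p_e, error): midpoint estimate and triangulation error,
    the latter being the distance between p_e and the inner point p_i.
    """
    # Solve for parameters t_i, t_o minimizing the distance between
    # the points (o_i + t_i*d_i) and (o_o + t_o*d_o) on the two rays.
    w = o_i - o_o
    a, b, c = d_i @ d_i, d_i @ d_o, d_o @ d_o
    d, e = d_i @ w, d_o @ w
    denom = a * c - b * b              # zero only if the rays are parallel
    t_i = (b * e - c * d) / denom
    t_o = (a * e - b * d) / denom
    p_i = o_i + t_i * d_i              # inner triangulation point P_i
    p_o = o_o + t_o * d_o              # outer triangulation point P_o
    p_e = 0.5 * (p_i + p_o)            # midpoint: estimated stem position P_e
    return p_e, np.linalg.norm(p_e - p_i)
```

As in the application, the position would be reported only when the returned error is below a threshold error value.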
Background for Force Feedback
[0091] During fueling of the fuel stem 90, force feedback control
may be utilized to correct the position of the robotic arm 34. In
particular, if a force vector applied to the end effector 54
exceeds a threshold magnitude, the end effector 54 may be moved to
reduce the force vector to a desired magnitude. The desired
magnitude may comprise a zero value or a non-zero value depending
upon the amount of force needed to seal the end effector 54 against
the fuel stem 90.
[0092] Referring to FIG. 16, a boot 102 of the end effector 54 is
illustrated. Further, the coordinate system CS5 is shown on the
boot 102 which corresponds to a contact point of a fuel stem 90
with the boot 102. When the boot 102 mates with the fuel stem 90, a
force vector FV is applied to the boot 102 with force components
f.sub.x5, f.sub.y5 and f.sub.z5 along the X5, Y5 and Z5 axes,
respectively. It should be noted that the Y5 axis extends outwardly
from the page. The undesirable force f.sub.y5 results from an error
in position for the robotic arm 34 along the J1 axis. The
undesirable force f.sub.x5 results from an error in position of the
robotic arm 34 about the J3 axis. As shown, the load cells 106,
108, 110, 112 may be utilized to determine a force vector applied
to the front housing 104. Thereafter, the force vector FV and the
component forces f.sub.x5 and f.sub.y5 may be readily determined
utilizing simple force vector equations well known to those skilled
in the art.
[0093] The forces f.sub.x5, f.sub.y5 and f.sub.z5 applied to the
boot 102 create (i) an undesirable force f.sub.1 on the carriage 40
along the J1 axis and (ii) an undesirable torque .tau..sub.3 on the
pitch arm 52 about the J3 axis. Referring to FIG. 17, the force
f.sub.1 and the torque .tau..sub.3 may be calculated utilizing a
Jacobian Transpose equation. In particular, the Jacobian Transpose
equation utilizes (i) the geometric DH constants g, h, k of the
robotic arm 34 (see FIG. 7), (ii) the current joint values
.theta..sub.2, .theta..sub.3, .theta..sub.4, J5, and (iii) the
forces f.sub.x5, f.sub.y5, f.sub.z5--to calculate the force f.sub.1
and the torque .tau..sub.3. The Jacobian Transpose equation may be
readily determined by one skilled in the art.
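The general principle of paragraph [0093], mapping a tip force through the Jacobian transpose to equivalent joint loads, can be sketched as below. This is a generic two-joint illustration under assumed geometry (a prismatic joint standing in for J1 and a revolute joint for J3, with an assumed link length L and angle theta); it is not the actual Jacobian of the robotic arm 34, which depends on the DH constants g, h, k:

```python
import numpy as np

def joint_loads(jacobian, tip_force):
    """Map a force applied at the end effector to equivalent joint
    loads (forces on prismatic joints, torques on revolute joints)
    using the Jacobian transpose relation: loads = J^T * f."""
    return jacobian.T @ tip_force

# Illustrative 2-DOF case (assumed, not the arm's real geometry):
# a prismatic joint along y (like J1) and a revolute pitch joint
# (like J3) with link length L at angle theta.
L, theta = 0.8, np.deg2rad(30.0)
# Rows: tip (x, y) velocity; columns: (J1 rate, J3 rate).
J = np.array([[0.0, -L * np.sin(theta)],
              [1.0,  L * np.cos(theta)]])
f_tip = np.array([2.0, 5.0])         # tip forces f_x, f_y at the boot
f1, tau3 = joint_loads(J, f_tip)     # carriage force and pitch-arm torque
```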
[0094] The force f.sub.1 and the torque .tau..sub.3 are utilized to
calculate displacement error values for the carriage 40 along the
J1 axis and the pitch arm 52 about the J3 axis. Because the boot
102 is compliant, it can be modeled as a spring utilizing the
spring equation:
Force=k*d, wherein:
[0095] d=displacement of the boot 102; and
[0096] k=the spring constant of the boot 102.
[0097] Thus, an error in force or torque (i.e., f.sub.1 and
.tau..sub.3) is proportional to an error in displacement (i.e.,
displacement d) of the robotic arm 34 along the J1 and J3 axes. In
a constructed embodiment, the force f.sub.1 is input into a first
PID controller (implemented in software) that calculates a
displacement error value .DELTA..sub.J1 for the carriage 40 on the
J1 axis. Similarly, the torque .tau..sub.3 is input into a second
PID controller that calculates an angular displacement error value
.DELTA..sub..theta.3 for the pitch arm 52 about the J3 axis. As
previously discussed, the values .DELTA..sub.J1 and
.DELTA..sub..theta.3 may be utilized by the robot controller 46 to
correct the position of the carriage 40 and the pitch arm 52,
respectively, to reduce the undesirable forces applied to the end
effector 54 during fueling of the fuel stem 90.
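The force-to-displacement correction of paragraphs [0094]-[0097] can be sketched with a minimal software PID loop. The gains and sample time below are illustrative assumptions, not values from the application:

```python
class PID:
    """Minimal PID controller that converts a force (or torque) error
    into a displacement correction, as one PID instance does for the
    J1 force f1 and a second does for the J3 torque tau3."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Unwanted carriage force f1 -> displacement correction delta_J1.
# Gains are placeholders; real gains would be tuned on the machine.
pid_j1 = PID(kp=0.002, ki=0.0005, kd=0.0001, dt=0.01)
delta_j1 = pid_j1.update(error=12.0)   # e.g. 12 N of unwanted force
```

Because the boot behaves as a spring (Force=k*d), a proportional gain near 1/k alone already yields a usable correction; the integral and derivative terms tighten steady-state error and damping.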
Background for Controlling the Position of the Robotic Arm Along
the J1 Axis
[0098] The robot controller 46 utilizes three types of sensor
feedback to position the robotic arm 34 along the J1 axis. The
types of sensor feedback include a line encoder feedback, a vision
system feedback, and a force feedback. Referring to FIG. 2, the
line encoder feedback includes encoder counts from the conveyer
line encoder 162 that are utilized to determine a gross position of
the fuel stem 90. The encoder counts are also indicative of the
speed of the conveyer line 28. Referring to FIG. 11, the vision
system feedback includes a three-dimensional position P.sub.e of
the fuel stem 90 with respect to the coordinate system CS1.
Finally, the force feedback includes a measured force exerted on
the end effector 54 by the fuel stem 90 during fueling of the fuel
stem 90.
[0099] Referring to FIGS. 10A and 10D, the robot controller 46
moves the robotic arm 34 along the J1 axis using the tracking
equation (9). The equation (9) utilizes a STEM_DISPLACEMENT
variable that corresponds to the distance that the fuel stem 90
(and the vehicle 26) have traveled along conveyer axis 170 since
the vehicle 26 passed the light sensor trigger position. The
STEM_DISPLACEMENT variable is calculated using the following
equation:
STEM_DISPLACEMENT=(current encoder count-first encoder count)*CF,
wherein:
[0100] first encoder count=encoder count from conveyer line encoder
162 when the V.sub.TRIG signal is generated;
[0101] CF=conversion factor for converting an encoder count to a
distance along the conveyer line axis 170.
[0102] Accordingly, the STEM_DISPLACEMENT variable is updated to a
new value whenever the current encoder count is updated by the
conveyer line encoder 162.
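The STEM_DISPLACEMENT computation of paragraphs [0099]-[0102] reduces to a one-line conversion; the sketch below uses our own names, and the conversion factor shown is an assumed example value:

```python
def stem_displacement(current_count, first_count, cf):
    """Distance the fuel stem (and vehicle) has traveled along the
    conveyer axis since the vehicle passed the light-sensor trigger.
    cf converts one encoder count to a distance along the axis."""
    return (current_count - first_count) * cf

# e.g. 2500 counts since the V_TRIG latch, at an assumed 0.1 mm/count
d = stem_displacement(current_count=12500, first_count=10000, cf=0.1)
```

The value is recomputed each time the conveyer line encoder reports a new count, so it tracks the vehicle continuously.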
[0103] The equation (9) also utilizes the DESIRED_CARRIAGE_TO_STEMY
constant obtained from the VID record. The
DESIRED_CARRIAGE_TO_STEMY constant represents the desired distance
along the J1 axis from the origin of coordinate system CS1 (on the
carriage 40) to a point P1 directly across from the fuel stem
90--to allow the end effector 54 to mate with the fuel stem 90.
[0104] The equation (9) also utilizes the HOME_TO_STEM constant
obtained from the VID record. The HOME_TO_STEM constant represents
the distance along the J1 axis from the origin of coordinate system
CS0 to a point P0 directly across from the fuel stem 90--when the
vehicle 26 passes the light sensor trigger position.
[0105] The equation (9) also utilizes a VISION_TRACKING_ERROR
variable. As previously discussed, the vision system 164 calculates
a three-dimensional position P.sub.e of the fuel stem 90 with
respect to a coordinate system CS1. Further, the vision system 164
transfers the position P.sub.e to the robot controller 46 using the
following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY,
CARRIAGE_TO_STEMZ. Because the CARRIAGE_TO_STEMY variable
represents a current distance from the carriage 40 (and coordinate
system CS1) to the fuel stem 90 along the J1 axis, the variable can
be used to calculate the VISION_TRACKING_ERROR. In particular, the
VISION_TRACKING_ERROR variable may be calculated using the
following equation:
VISION_TRACKING_ERROR=(CARRIAGE_TO_STEMY-DESIRED_CARRIAGE_TO_STEMY)
[0106] Referring to FIG. 10A, the carriage 40 (and coordinate
system CS1) is perfectly positioned along the J1 axis to allow
engagement of the end effector 54 and the fuel stem 90. As shown,
the CARRIAGE_TO_STEMY distance returned by the vision system 164 is
equal to the DESIRED_CARRIAGE_TO_STEMY. Accordingly, the
VISION_TRACKING_ERROR value equals a zero value.
[0107] Referring to FIG. 10B, the carriage 40 (and coordinate
system CS1) are positioned too far in front of the fuel stem 90
along the J1 axis to allow engagement of the end effector 54 and
the fuel stem 90. Accordingly, the VISION_TRACKING_ERROR is equal
to a negative number (i.e., a negative value along the Y1 axis)
which decreases the COMMANDED_J1_POSITION value. In response, the
carriage 40 and the robotic arm 34 are moved to a position along
the J1 axis corresponding to the new COMMANDED_J1_POSITION.
[0108] Referring to FIG. 10C, the carriage 40 (and coordinate
system CS1) are positioned too far behind a desired position on the
J1 axis to allow engagement of the end effector 54 and the fuel
stem 90. Accordingly, the VISION_TRACKING_ERROR is equal to a
positive number (i.e., a positive value along the Y1 axis) which
increases the COMMANDED_J1_POSITION variable. In response, the
carriage 40 and the robotic arm 34 are moved to a position along
the J1 axis corresponding to the new COMMANDED_J1_POSITION.
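The sign behavior described for FIGS. 10A-10C can be sketched as follows. Equation (9) itself appears earlier in the application, so the composition of its terms below is our plausible reading of the variables named in this section, not the confirmed equation:

```python
def vision_tracking_error(carriage_to_stemy, desired_carriage_to_stemy):
    """Zero when the carriage is perfectly positioned (FIG. 10A),
    negative when it is too far ahead of the stem (FIG. 10B), and
    positive when it lags the desired position (FIG. 10C)."""
    return carriage_to_stemy - desired_carriage_to_stemy

def commanded_j1_position(home_to_stem, stem_disp,
                          desired_carriage_to_stemy,
                          vision_err, force_err):
    """Assumed composition of the terms named for tracking equation
    (9); the exact form and sign conventions are in the application."""
    return (home_to_stem + stem_disp - desired_carriage_to_stemy
            + vision_err + force_err)
```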
[0109] The equation (9) also utilizes a FORCE_FEEDBACK_ERROR
variable. As previously discussed, the robot controller 46 monitors
a force exerted on the robotic arm 34 by the fuel stem 90 during
the fueling of the stem 90. Further, the controller 46 calculates a
displacement error value .DELTA..sub.J1 to correct the position of
the carriage 40 along the J1 axis responsive to the force. Note
that the force control also compensates for misalignments in the
vertical direction using the J3 axis. When the boot 102 is mated
with the fuel stem 90, the FORCE_FEEDBACK_ERROR variable is set
equal to the calculated displacement error value .DELTA..sub.J1.
When the boot 102 is not mated with the fuel stem 90, the
FORCE_FEEDBACK_ERROR is set equal to a zero value.
Modes of Operation of the Fueling System
[0110] The modes of operation of the fueling system 24 in
accordance with the present invention will now be discussed.
Referring to FIG. 21A, the modes of operation include a power up
mode 190, an auto mode 192, and a manual mode 194. The various
modes of operation are implemented in software that is stored in
the ROM of the robot controller 46.
[0111] During the power up mode 190, the robot controller 46 and
the vision controller 174 establish communication with each other
via the communication bus 172. Further, power is applied to the
motor drivers 44, 72.
[0112] Next, the robot controller 46 advances to a step 191 that
prompts the user of the fueling system 24 to select between the
different modes of system operation. As discussed, the modes of
operation include auto mode 192 or manual mode 194. The robot
controller 46 also provides for the option to shut down the fueling
system 24. The modes of operation may be chosen using a GUI and a
pointing device (not shown), a push-button based operator panel
(not shown), or similar controls on the supervisory PLC 178.
[0113] When the user selects the auto mode 192 of operation, the
software modules comprising the auto mode 192 are executed to
implement the method for fueling a vehicle 26 in accordance with
the present invention. Referring to FIG. 21B, the auto mode 192
includes a stow module 196, an idle module 198, a gross position
module 200, a track fuel stem module 202, an insert module 204, a
fuel module 206, a purge fuel module 208, and an extract module
210.
[0114] Referring to FIGS. 2 and 4, the stow module 196 performs
steps to move the carriage 40 to a home position (i.e., origin of
the coordinate system CS0) along the J1 axis. Further, the joints
66, 82 are moved to predetermined stowed positions.
[0115] Referring to FIG. 21C, the idle module 198 includes a step
212 of determining if a vehicle trigger signal V.sub.TRIG was
received. Referring to FIG. 10A, when the vehicle 26 passes the
light sensor trigger position, a continuously emitted light beam is
not detected by the light sensor 160. In response, the light sensor
160 generates the signal V.sub.TRIG which is received by the robot
controller 46. Referring to FIG. 21C, if the signal V.sub.TRIG is
received, the module 198 advances to a step 214 which stores a
first encoder count from the conveyer line encoder 162. Thereafter,
the module 198 is exited and the auto mode 192 advances to the
gross position module 200. Alternately, if the signal V.sub.TRIG is
not received, the module 198 iteratively performs the step 212
until a vehicle 26 is detected by the light sensor 160.
[0116] Referring to FIG. 21D, the gross position module 200
includes a step 216 of requesting a VID record from the vision
controller 174. As previously discussed, the VID record contains
vehicle dependent information for tracking the fuel stem 90. In a
preferred embodiment, unit vector a.sub.x, a.sub.y, a.sub.z is
obtained from the VID record, and is then used to calculate the
optimum J2 and J4 angles. The J2 angle and J4 hard stop are
adjusted during the gross position module 200. In an alternate
embodiment, the J2 and J4 angles are automatically calculated but
are manually adjusted off line prior to automatic operation.
[0117] The module 200 further includes a step 218 which iteratively
calculates a gross position of the fuel stem 90 with respect to the
J1 axis. Referring to FIG. 10, the step 218 utilizes the tracking
equation (9) to calculate the COMMANDED_J1_POSITION which also
represents the gross position of the fuel stem 90. Further, both
the VISION_TRACKING_ERROR and the FORCE_FEEDBACK_ERROR are equal to
a zero value during the step 218.
[0118] Referring to FIG. 21D, the module 200 further includes a
step 220 of moving the robotic arm 34 along the J1 axis to position
the end effector 54 proximate to the gross position of the fuel
stem 90. Further, the module 200 moves the robotic arm 34 (and the
end effector 54) at a speed substantially equal to the speed of the
conveyer line 28. During the step 220, the cameras 182, 184 on the
robotic arm 34 should be positioned along J1 such that the field of
view of the cameras 182, 184 covers the fuel stem 90.
[0119] The module 200 further includes a step 222 of triggering the
frame grabber 180. In particular, the robot controller 46 generates
a signal V.sub.TR that causes the frame grabber 180 to transfer
first and second digital images of the fuel stem 90 from the
cameras 182, 184 to the vision controller 174.
[0120] The module 200 further includes a step 224 of requesting a
three-dimensional position P.sub.e of the fuel stem 90 from the
vision controller 174. As previously discussed, the vision
controller 174 performs template matching on the first and second
digital images and calculates first and second correlation scores,
respectively. Further, the controller 174 returns the position
P.sub.e if the first and second correlation scores are above a
threshold correlation score and a triangulation error of the
position P.sub.e is below a predetermined triangulation error.
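The acceptance gate of step 224 can be sketched as a simple predicate. The threshold values shown are illustrative assumptions; the application leaves the actual thresholds unspecified here:

```python
def accept_position(score_inner, score_outer, tri_error,
                    min_score=0.8, max_tri_error=2.0):
    """Return True when both template-match correlation scores clear
    the threshold correlation score AND the triangulation error is
    below the predetermined limit (threshold values are assumed)."""
    return (score_inner >= min_score
            and score_outer >= min_score
            and tri_error <= max_tri_error)
```

Only when this predicate holds does the vision controller return the position P.sub.e; otherwise the module re-triggers the frame grabber and tries again.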
[0121] The module 200 further includes a step 226 which determines
whether a three-dimensional position P.sub.e was received from the
vision controller 174. If the position P.sub.e was received, the
module 200 is exited and the auto mode 192 advances to the track
fuel stem module 202. Alternately, if the position P.sub.e was not
received, the module 200 returns to step 222.
[0122] Referring to FIG. 21B, the auto mode 192 advances to the
track fuel stem module 202 after module 200. Referring to FIG. 21E,
the module 202 includes a step 228 of triggering the frame grabber
180 to obtain first and second digital images of the fuel stem 90
generated by the cameras 182, 184, respectively.
[0123] The module 202 further includes a step 230 that requests a
three-dimensional position P.sub.e of the fuel stem 90 from the
vision controller 174.
[0124] The module 202 further includes a step 232 which determines
whether a three-dimensional position P.sub.e of the fuel stem 90
was received by the robot controller 46. If the position P.sub.e
was received, the module 202 advances to the step 234. Otherwise,
the module 202 returns to the step 228.
[0125] The module 202 further includes a step 234 of calculating
the VISION_TRACKING_ERROR. As previously discussed, the
VISION_TRACKING_ERROR is utilized in the equation (9) to more
accurately position the robotic arm 34 along the J1 axis relative
to the fuel stem 90. It should be understood that the
VISION_TRACKING_ERROR is iteratively calculated in the track fuel
stem module 202 and the insert module 204 so long as the fuel stem
90 is viewed by both cameras 182, 184.
[0126] The module 202 further includes a step 236 which moves the
robotic arm 34 along the J1 axis proximate the fuel stem 90
responsive to the COMMANDED_J1_POSITION.
[0127] Referring to FIG. 21B, the auto mode 192 advances to the
insert module 204 after the module 202. Referring to FIG. 21F, the
module 204 includes the steps 238, 240, 242, 244, 246. The step 238
calculates joint variables .theta..sub.3 and J5 using the inverse
kinematic model previously discussed. The step 240 moves the pitch
arm 52 about the J3 axis to the calculated angle .theta..sub.3. The
step 242 moves the end effector 54 about the J4 axis to a
predetermined angle .theta..sub.4.
[0128] The step 244 moves the boot 102 along the J5 axis a distance
J5 to allow the boot 102 to mate with the fuel stem 90. Referring
to FIG. 20, during the steps 238, 240, 242, the end effector 54 is
positioned to allow the step 244 to move the end effector 54 along
a docking line toward the fuel stem 90. As shown, the docking line
extends between a point P.sub.b on the nozzle axis 245 to the fuel
stem point P.sub.e. Further, the docking line forms an approach
angle .theta..sub.D with respect to the stem axis 243. Finally, the
step 246 moves the fuel hose 100 into the fuel stem 90.
[0129] Referring to FIG. 21B, the auto mode 192 advances to the
fuel module 206 after the module 204. Referring to FIG. 21G, the
fuel module 206 includes the steps 248, 250, 252. The step 248
calculates the FORCE_FEEDBACK_ERROR which is utilized by the
controller 46 to calculate the COMMANDED_J1_POSITION of the
carriage 40 (and the robotic arm 34) along the J1 axis. The step
250 opens a fuel valve 116 in the end effector 54 to supply fuel to
the fuel stem 90. The step 252 closes the fuel valve 116 after a
predetermined amount of fuel is pumped into the fuel stem 90.
[0130] Referring to FIG. 21B, the auto mode 192 advances to the
purge fuel module 208 after the module 206. Referring to FIG. 8,
the purge fuel module 208 performs steps to apply air pressure to
the check valves 140, 142 to force any residual fuel in the
manifold 98 and the fuel hose 100 into the fuel stem 90.
[0131] The auto mode 192 advances to the extract module 210 after
the module 208 which extracts the fuel hose 100 from the fuel stem
90 and moves the robotic arm 34 to a predetermined retract
position.
[0132] Referring to FIG. 21B, the auto mode 192 after exiting
module 210 advances to a step 254 which determines if another
vehicle 26 has passed the light sensor trigger position. If another
vehicle 26 is detected, the auto mode 192 advances to the gross
position module 200. Otherwise, the auto mode 192 advances to the
stow module 196.
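The auto-mode sequencing of FIGS. 21B-21G, including the step 254 branch, can be sketched as a small dispatch table. The module names come from the application; the loop structure itself is our illustration:

```python
# Module order per FIG. 21B; names taken from modules 196-210.
AUTO_SEQUENCE = ["stow", "idle", "gross_position", "track_fuel_stem",
                 "insert", "fuel", "purge_fuel", "extract"]

def next_module(current, vehicle_detected=False):
    """After extract, branch per step 254: go straight to the gross
    position module if another vehicle has passed the light sensor
    trigger position; otherwise return to the stow module."""
    if current == "extract":
        return "gross_position" if vehicle_detected else "stow"
    i = AUTO_SEQUENCE.index(current)
    return AUTO_SEQUENCE[(i + 1) % len(AUTO_SEQUENCE)]
```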
Kinematic Parameters Update
[0133] There is a self-test operating mode during which the robot
controller 46 places the robotic arm 34 such that the two robot tip
markers TM0 and TM1 as shown in FIG. 20 are within the field of
view of the cameras 182, 184. The vision controller 174 then
acquires a first digital image and a second digital image from the
cameras 182, 184, and the frame grabber 180, and locates the first
and second image locations of the tip marker 0 "TM0" and of the tip
marker 1 "TM1." Using a triangulation method already described for
the localization of the fuel stem point, it calculates the 3D
positions of both tip markers with respect to CS2. This information
may be used by the robot controller 46 to periodically update the
kinematic parameters which vary due to dimensional instability of
the elastomeric boot hose 103.
Limit Switch Retract
[0134] Pneumatic cylinder 114 is provided to extend the filler hose
100 through the boot 102 in order to pump fuel into the filler neck
from a point below the no-lead insert. When the filler hose 100 is
retracted, the cylinder 114 is prevented from reaching its full
stroke due to the fact that the tip of the filler hose is too large
to fit through the boot 102. Should the tip of the hose 100 be
lost, the cylinder 114 can be fully retracted. This configuration
allows the robot to sense if the hose 100 has been lost or damaged
by simply using two limit switches on the cylinder 114 retract
stroke, one of which is triggered when the hose 100 is retracted
far enough to reach the boot, and one of which is triggered when
the cylinder 114 is fully retracted.
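The two-switch sensing logic of paragraph [0134] reduces to a short decision; the status labels below are ours, not terms from the application:

```python
def hose_status(boot_switch, full_retract_switch):
    """Infer the filler hose condition from the two limit switches on
    the cylinder 114 retract stroke. A fully retracted cylinder is
    only possible if the hose tip (normally too large to pass the
    boot) has been lost or damaged."""
    if full_retract_switch:
        return "hose lost or damaged"    # nothing blocked the full stroke
    if boot_switch:
        return "hose retracted, intact"  # tip stopped at the boot
    return "hose extended"
```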
[0135] The inventive fueling system 24 and the method related
thereto represent a significant improvement over conventional
fueling systems and methods. In particular, the inventive fueling
system 24 allows the automatic fueling of an automotive vehicle 26
without the need for an operator. Thus, the inventive fueling
system 24 and method result in labor savings. Further, the fueling
system 24 can move an end effector 54 to mate with the fuel stem 90
while the vehicle 26 is moving. Thus, the vehicle 26 can be fueled
without stopping the conveyer line 28 resulting in manufacturing
cost savings and increased assembly line efficiency.
[0136] While the invention has been particularly shown and
described with reference to the preferred embodiments thereof, it
is well understood by those skilled in the art that various changes
and modifications can be made in the invention without departing
from the spirit and the scope of the invention.
* * * * *