U.S. patent application number 17/556083 was filed with the patent office on 2022-06-23 for line scanner having target-tracking and geometry-tracking modes.
The applicant listed for this patent is FARO Technologies, Inc. Invention is credited to Paul C. Atwell, John Lucas Creachbaum, Nitesh Dhasmana, Fabiano Kovalski, Andrea Riccardi, William E. Schoenfeldt, Marco Torsello, and Christopher Michael Wilson.
Application Number: 17/556083
Publication Number: 20220198704
Document ID: /
Family ID: 1000006109716
Filed Date: 2022-06-23

United States Patent Application 20220198704
Kind Code: A1
Atwell; Paul C.; et al.
June 23, 2022

LINE SCANNER HAVING TARGET-TRACKING AND GEOMETRY-TRACKING MODES
Abstract
A handheld three-dimensional (3D) measuring system operates in a
target mode and a geometry mode. In the target mode, a target-mode
projector projects a first line of light onto an object, and a
first illuminator sends light to markers on or near the object. A
first camera captures an image of the first line of light and the
illuminated markers. In the geometry mode, a geometry-mode
projector projects onto the object a first multiplicity of lines,
which are captured by the first camera and a second camera. One or
more processors determines 3D coordinates in the target mode and
the geometry mode.
Inventors: Atwell; Paul C. (Lake Mary, FL); Creachbaum; John Lucas (Deltona, FL); Dhasmana; Nitesh (Lake Mary, FL); Kovalski; Fabiano (Sanford, FL); Riccardi; Andrea (Rezzato, IT); Schoenfeldt; William E. (Oviedo, FL); Torsello; Marco (Brescia, IT); Wilson; Christopher Michael (Lake Mary, FL)

Applicant:
Name | City | State | Country
FARO Technologies, Inc. | Lake Mary | FL | US

Family ID: 1000006109716
Appl. No.: 17/556083
Filed: December 20, 2021
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
63130006 | Dec 23, 2020 |
Current U.S. Class: 1/1
Current CPC Class: G06T 7/74 20170101; H04N 13/239 20180501; H04N 5/23229 20130101; H04N 5/2256 20130101; G01C 11/02 20130101; G06T 2207/10012 20130101; H04N 5/2254 20130101; G06T 2207/30204 20130101
International Class: G06T 7/73 20060101 G06T007/73; H04N 5/232 20060101 H04N005/232; H04N 5/225 20060101 H04N005/225; G01C 11/02 20060101 G01C011/02
Claims
1. A system comprising: a first target-mode projector operable to
project a first line of light onto an object; a first illuminator operable to illuminate first markers on or near the object; a first
geometry-mode projector operable to project a first multiplicity of
lines onto the object; a first camera operable to capture a first
target-mode image of the first line of light on the object and the
illuminated first markers; the first camera further operable to
receive a first geometry-mode image of the projected first
multiplicity of lines; a second camera operable to receive a second
geometry-mode image of the projected first multiplicity of lines,
the first geometry-mode projector being a different distance from
the second camera than from the first camera; and one or more
processors operable to determine first three-dimensional (3D)
coordinates of points on the object based at least in part on the
captured first target-mode image, the one or more processors being
further operable to determine second 3D coordinates of points on
the object based at least in part on the captured first
geometry-mode image and the captured second geometry-mode
image.
2. The system of claim 1 wherein: the one or more processors are
further operable to determine 3D coordinates of points on the
object in a global frame of reference by combining the first 3D
coordinates obtained during a first interval of time with the first
3D coordinates obtained during a second interval of time, the
combining based at least in part on the second 3D coordinates
obtained during the first interval of time and the second 3D
coordinates obtained during the second interval of time.
3. The system of claim 2 further comprising: a second geometry-mode
projector operable to project onto the object a second multiplicity
of lines of light; the first camera further operable to capture a
third geometry-mode image of the projected second multiplicity of
lines; the second camera further operable to capture a fourth geometry-mode image of the projected second multiplicity of lines, the second geometry-mode projector being a different distance from the second camera than from the first camera; the one or more processors further
operable to determine the second 3D coordinates of points on the
object further based at least in part on the third geometry-mode
image, the fourth geometry-mode image, and the second multiplicity
of lines from the second geometry-mode projector.
4. The system of claim 3 further comprising: an auxiliary
target-mode projector selected from the group consisting of the
first target-mode projector and a second target-mode projector, the
auxiliary target-mode projector being operable to project a second
line of light onto the object; a second illuminator operable to
illuminate second markers with light; the second camera further
operable to capture a second target-mode image of the projected
second line of light and the illuminated second markers; and the
one or more processors further operable to determine the first
three-dimensional (3D) coordinates of points on the object further
based at least in part on the captured second target-mode
image.
5. A system comprising: a photogrammetry camera having a first
enclosure, the photogrammetry camera operable to capture a
plurality of images of markers from a corresponding plurality of
poses of the photogrammetry camera, the first enclosure being sized
for handheld operation; a wearable unit having a computing unit and
a battery, the wearable unit including a second enclosure
physically separated from the first enclosure but electrically
connected to the first enclosure by some combination of wired and
wireless connections; and a first collection of processors included
within some combination of the first enclosure, the wearable unit,
and external computers, the first collection of processors operable
to determine in a global frame of reference three-dimensional (3D)
coordinates of the markers based at least on the plurality of
images.
6. The system of claim 5 further comprising: a third enclosure
coupled to a first projector, a first illuminator, and a first
camera, wherein the third enclosure is sized for handheld
operation, the first projector operable to project a first line of
light onto an object, the first illuminator operable to illuminate
markers on or near the object, the first camera operable to capture
a first image at a first time and a second image at a second time,
each of the first image and the second image including the first
line of light and some of the illuminated markers; the wearable
unit operable to be detached from the first enclosure and attached
to the third enclosure; and a second collection of processors
included within some combination of the third enclosure, the
wearable unit, and external computers, the second collection of
processors operable to determine in the global frame of reference
3D coordinates of points on the object based at least in part on
the first image and the second image.
7. A method comprising: with a three-dimensional (3D) measuring
device having an image sensor, projecting a pattern of light onto
an object; with the image sensor, capturing an image of the
projected pattern of light, the captured image having pixel values
each based at least in part on a selection among two or more
conversion gains; reading out the selected pixel values from the
image sensor; and with a processor, determining 3D coordinates of
points on the object based at least in part on the projected
pattern of light and the read-out pixel values.
8. A method comprising: with a three-dimensional (3D) measuring
device having an image sensor, projecting a pattern of light onto
an object; with the image sensor, capturing an image of the
projected pattern of light, the captured image having pixel values
based at least in part on a change in pixel response at each of a
plurality of compression break points; reading out the pixel values from the image sensor; and with a processor,
determining 3D coordinates of points on the object based at least
in part on the projected pattern of light and the read-out pixel
values.
9. A system comprising: a projector operable to project a line of
light onto an object; a camera having a lens and an image sensor,
the camera operable to capture an image of the projected line of
light, the image sensor having a shorter side and a longer side,
the longer side being aligned parallel to the projected line of
light, the image sensor being set to a mode enabling direct read
out of pixel values along a direction of the shorter side of the
image sensor; and one or more processors operable to determine
three-dimensional (3D) coordinates of points on the object based at
least in part on the projected line of light and the captured image
of the projected line of light, the 3D coordinates being directly
calculated based on the enabled read-out of the pixel values along
the shorter side of the image sensor.
10. A system comprising: a first enclosure coupled to a first
projector, a first illuminator, and a first camera, wherein the
first enclosure is sized for handheld operation; a first projector
operable to project a first line of light onto an object; a first
illuminator operable to illuminate markers on or near the object; a
first camera operable to capture a first image at a first time and
a second image at a second time, each of the first image and the
second image including the first line of light and some of the
illuminated markers; a lens adapter assembly operable for attachment to the first camera, the lens adapter assembly operable to change the optical magnification of the first camera; and one or more processors included within some combination of the first enclosure, a wearable unit, and external computers, the one or
more processors operable to determine in a global frame of
reference three-dimensional (3D) coordinates of points on the
object based at least in part on the first image and the second
image.
11. The system of claim 10 further comprising: a kinematic attachment mechanism for repeatably reattaching the lens adapter assembly to the first camera; illuminator elements affixed to the lens
adapter assembly; and an electrical interconnection for passing an
electrical signal from the first enclosure to the illuminator
elements.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is a nonprovisional application of U.S.
Provisional Application Ser. No. 63/130,006 filed on Dec. 23, 2020,
the contents of which are incorporated by reference herein.
BACKGROUND
[0002] The present disclosure relates to a coordinate measuring
system, which may include, for example, a line scanner rigidly or
removably affixed to an articulated arm coordinate measuring
machine (AACMM) or a hand-held line scanner.
[0003] A line scanner includes one or more projectors that emit one
or more lines of light captured in images by one or more cameras.
The relative positions of at least some of the cameras are known
relative to at least some of the projectors. One or more processors
coupled to the line scanner determines three-dimensional (3D)
coordinates of points on objects illuminated by the projected lines
of light.
[0004] Portable articulated arm coordinate measuring machines
(AACMMs) have found widespread use in the manufacturing or
production of parts where there is a need to verify the dimensions
of the part rapidly and accurately during various stages of the
manufacturing or production (e.g., machining) of the part. Portable
AACMMs represent a vast improvement over known stationary or fixed,
cost-intensive, and relatively difficult-to-use measurement
installations, particularly in the amount of time it takes to
perform dimensional measurements of relatively complex parts.
Typically, a user of a portable AACMM simply guides a probe along
the surface of the part or object to be measured.
[0005] A probe such as a tactile probe or a laser line probe (LLP),
defined as a line scanner in the form of a probe, is used to
measure 3D coordinates of points on an object. A tactile probe
typically includes a small spherical probe tip that is held in
contact with a point to be measured. An LLP, typically held away
from the object, emits a line of light that intersects the object.
A camera captures an image of the projected light on the object,
and a processor evaluates the captured image to determine
corresponding 3D coordinates of points on the object surface.
[0006] In some cases, the LLP on the AACMM may be removed from the
AACMM and used in a handheld mode to measure 3D coordinates of
points on an object. Alternatively, the LLP may be designed for use
entirely in a handheld mode without the possibility of attachment
to an AACMM.
[0007] An LLP or handheld line scanner uses the principle of
triangulation to determine 3D coordinates of points on an object
relative to the LLP coordinate system (frame of reference). When
attached to an AACMM, the pose of the LLP is determined based
partly on the readings obtained by angular encoders attached to
rotating joints of the LLP. When the LLP is used in a handheld mode
detached from an AACMM, a different method is needed to register the
multiple 3D coordinates obtained as the LLP is moved from place to
place. In one approach, markers affixed to an object are used to
assist in registering the multiple 3D coordinates to a global frame
of reference.
[0008] Today, when handheld line scanners are used, it is common to
attach adhesive markers to an object under test. Imaging such
markers with a stereo camera as the handheld scanner is moved
provides a way to register 3D coordinates as the handheld scanner
is moved from point to point. Although this approach is effective,
it requires that adhesive markers be attached before measurement
and removed after measurement.
[0009] There is also a need today for handheld scanners and
photogrammetry cameras that can be moved from point to point
without being constrained by electrical wires run from the handheld
scanners to an external computer.
[0010] A further need is for handheld scanners that provide high dynamic range (HDR) images without the requirement for intermediate computations that slow measurement.
[0011] Accordingly, while existing handheld line scanners are
suitable for their intended purposes, there remains a need for
improvement.
BRIEF DESCRIPTION
[0012] According to one aspect of the present disclosure, a system
comprises: a first target-mode projector operable to project a
first line of light onto an object; a first illuminator operable to illuminate first markers on or near the object; a first geometry-mode
projector operable to project a first multiplicity of lines onto
the object; a first camera operable to capture a first target-mode
image of the first line of light on the object and the illuminated
first markers; the first camera further operable to receive a first
geometry-mode image of the projected first multiplicity of lines; a
second camera operable to receive a second geometry-mode image of
the projected first multiplicity of lines, the first geometry-mode
projector being a different distance from the second camera than
from the first camera; and one or more processors operable to
determine first three-dimensional (3D) coordinates of points on the
object based at least in part on the captured first target-mode
image, the one or more processors being further operable to
determine second 3D coordinates of points on the object based at
least in part on the captured first geometry-mode image and the
captured second geometry-mode image.
[0013] According to a further aspect of the present disclosure, a
system comprises: a photogrammetry camera having a first enclosure,
the photogrammetry camera operable to capture a plurality of images
of markers from a corresponding plurality of poses of the
photogrammetry camera, the first enclosure being sized for handheld
operation; a wearable unit having a computing unit and a battery,
the wearable unit including a second enclosure physically separated
from the first enclosure but electrically connected to the first
enclosure by some combination of wired and wireless connections;
and a first collection of processors included within some
combination of the first enclosure, the wearable unit, and external
computers, the first collection of processors operable to determine
in a global frame of reference three-dimensional (3D) coordinates
of the markers based at least on the plurality of images.
[0014] According to a further aspect of the present disclosure, a
method comprises: with a three-dimensional (3D) measuring device
having an image sensor, projecting a pattern of light onto an
object; with the image sensor, capturing an image of the projected
pattern of light, the captured image having pixel values each based
at least in part on a selection among two or more conversion gains;
reading out the selected pixel values from the image sensor; and
with a processor, determining 3D coordinates of points on the
object based at least in part on the projected pattern of light and
the read-out pixel values.
[0015] According to a further aspect of the present disclosure, a
system comprises: a projector operable to project a line of light
onto an object; a camera having a lens and an image sensor, the
camera operable to capture an image of the projected line of light,
the image sensor having a shorter side and a longer side, the
longer side being aligned parallel to the projected line of light,
the image sensor being set to a mode enabling direct read out of
pixel values along a direction of the shorter side of the image
sensor; and one or more processors operable to determine
three-dimensional (3D) coordinates of points on the object based at
least in part on the projected line of light and the captured image
of the projected line of light, the 3D coordinates being directly
calculated based on the enabled read-out of the pixel values along
the shorter side of the image sensor.
[0016] According to a further aspect of the present disclosure, a method comprises: with a three-dimensional (3D) measuring device having an image sensor, projecting a pattern of light onto an object; with the image sensor, capturing an image of the projected pattern of light, the captured image having pixel values based at least in part on a change in pixel response at each of a plurality of compression break points; reading out the pixel values from the image sensor; and with a processor, determining 3D coordinates of points on the object based at least in part on the projected pattern of light and the read-out pixel values.
[0017] According to a further aspect of the present
disclosure, a system comprises: a first enclosure coupled to a
first projector, a first illuminator, and a first camera, wherein
the first enclosure is sized for handheld operation; a first
projector operable to project a first line of light onto an object;
a first illuminator operable to illuminate markers on or near the
object; a first camera operable to capture a first image at a first
time and a second image at a second time, each of the first image
and the second image including the first line of light and some of
the illuminated markers; a lens adapter assembly operable for attachment to the first camera, the lens adapter assembly operable to change the optical magnification of the first camera; and one or more processors included within some combination of the first enclosure, a wearable unit, and external computers, the one or
more processors operable to determine in a global frame of
reference three-dimensional (3D) coordinates of points on the
object based at least in part on the first image and the second
image.
[0018] These and other advantages and features will become more
apparent from the following description taken in conjunction with
the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The subject matter, which is regarded as the invention, is
particularly pointed out and distinctly claimed in the claims at
the conclusion of the specification. The foregoing and other
features, and advantages of the invention are apparent from the
following detailed description taken in conjunction with the
accompanying drawings in which:
[0020] FIG. 1 is an isometric view of a portable AACMM according to
an embodiment of the present disclosure;
[0021] FIG. 2 is an isometric view of an LLP affixed to the end of
an AACMM according to an embodiment of the present disclosure;
[0022] FIG. 3 is an isometric view of an LLP detached from the
AACMM according to an embodiment of the present disclosure;
[0023] FIG. 4 is a front view of an LLP affixed to the end of an
AACMM according to an embodiment of the present disclosure;
[0024] FIG. 5 is a schematic representation of the LLP emitting a
line of light to illustrate the principle of triangulation
according to an embodiment of the present disclosure;
[0025] FIG. 6 is an exploded isometric view of an LLP affixed to
the end of an AACMM according to an embodiment of the present
disclosure;
[0026] FIG. 7 is a second exploded isometric view of the LLP
affixed to the end of the AACMM according to an embodiment of the
present disclosure;
[0027] FIG. 8 is an isometric view of a removable LLP affixed to an
AACMM according to an embodiment of the present disclosure;
[0028] FIG. 9 is a close-up isometric view of the removable LLP and
the end of the AACMM according to an embodiment of the present
disclosure;
[0029] FIG. 10 is an isometric view of a handheld LLP according to
an embodiment of the present disclosure;
[0030] FIG. 11 is an isometric view of the handheld LLP, further
showing two emitted planes of light, according to an embodiment of
the present disclosure;
[0031] FIG. 12 is a schematic representation of a handheld LLP
showing possible connections to optional accessory elements
including a wearable computer, a desktop computer, and mobile
display according to an embodiment of the present disclosure;
[0032] FIG. 13A is an isometric view of a handheld line scanner
operable in a target tracking mode and a geometry mode according to
an embodiment of the present disclosure;
[0033] FIG. 13B illustrates a light pattern emitted by a handheld
scanner in a first mode of operation according to an embodiment of
the present disclosure;
[0034] FIGS. 13C, 13D illustrate light patterns emitted by the
handheld scanner in a second mode of operation according to an
embodiment of the present disclosure;
[0035] FIG. 13E is a schematic representation of a handheld line
scanner operable in a target tracking mode and a geometry mode with
possible connections to optional accessory elements including a
wearable computer, a desktop computer, and a mobile display
according to an embodiment of the present disclosure;
[0036] FIG. 13F is an isometric view of a handheld photogrammetry
camera according to an embodiment of the present disclosure;
[0037] FIG. 13G is a schematic representation of a photogrammetry
camera with possible connections to optional accessory elements
including a wearable computer, a desktop computer, and a mobile
display according to an embodiment of the present disclosure;
[0038] FIG. 14 is a block diagram showing electronics within the
handheld portion of the scanning system according to an embodiment
of the present disclosure;
[0039] FIG. 15 is a block diagram showing electrical components
within a wearable computer and other system components according to
an embodiment of the present disclosure;
[0040] FIG. 16 is a block representation of a method for
determining 3D coordinates of points on an object according to an
embodiment of the present disclosure;
[0041] FIGS. 17A, 17B, 17C are plots illustrating the relationship
between input data and output data at a pixel of an image sensor
for high gain, low-gain, and combined gain modes, respectively,
according to an embodiment of the present disclosure;
[0042] FIG. 18A illustrates use of multiple compression break
points to obtain high dynamic range according to an embodiment of
the present disclosure;
[0043] FIG. 18B is a description of a method for using multiple
compression points to obtain high dynamic range;
[0044] FIG. 19 is an image that illustrates how the ability to select
between vertical and horizontal readout provides many advantages to
3D measuring systems in some cases;
[0045] FIG. 20A is a perspective view of a stereo camera and stand
according to an embodiment of the present disclosure;
[0046] FIG. 20B is a schematic representation of a handheld 3D
measuring device with a collection of reflectors or light sources
for imaging by the stereo camera of FIG. 20A according to an
embodiment of the present disclosure;
[0047] FIGS. 21A, 21B are schematic representations of two cameras
connected to a processor according to an embodiment;
[0048] FIG. 21C is a schematic representation of a handheld
measuring device with a collection of reflectors or light sources
for imaging by the stereo camera of FIGS. 21A, 21B according to an
embodiment of the present disclosure; and
[0049] FIGS. 22A-22C show perspective views of a camera in
accordance with an embodiment.
[0050] The detailed description explains embodiments of the
disclosure, together with advantages and features, by way of
example with reference to the drawings.
DETAILED DESCRIPTION
[0051] Improvements described herein below include systems and
methods that reduce the need for application and removal of
adhesive markers. Another improvement is in providing ways to move
handheld scanners and photogrammetric cameras for measurement of
large objects without being constrained by wires. Further
improvements include methods to obtain high-dynamic range (HDR)
images with reduced need for intermediate computations that slow
measurements.
[0052] FIG. 1 illustrates, in isometric view, an articulated arm
coordinate measurement machine (AACMM) 10 according to various
embodiments of the present disclosure, the AACMM being one type of
coordinate measuring machine. In an embodiment, a first segment 50
and a second segment 52 are connected to a base 20 on one end and a
measurement device on the other end. In an embodiment, the
measurement device is a tactile-probe assembly 90.
[0053] In an embodiment illustrated in FIG. 1, the AACMM 10
includes seven rotational elements; hence the AACMM 10 is
referred to as a seven-axis AACMM. In other embodiments, the AACMM
10 is a six-axis AACMM. The seven-axis AACMM 10 of FIG. 1 includes
first-axis assembly 60, second-axis assembly 61, third-axis
assembly 62, fourth-axis assembly 63, fifth-axis assembly 64,
sixth-axis assembly 65, and seventh-axis assembly 66. In an
embodiment, a tactile-probe assembly 90 and a handle 91 are
attached to the seventh-axis assembly. Each of the axis assemblies
may provide either a swivel rotation or a hinge rotation. In the
embodiment illustrated in FIG. 1, the first-axis assembly 60
provides a swivel rotation about an axis aligned to a mounting
direction of the base 20. In an embodiment, the second-axis
assembly 61 provides a hinge rotation about an axis perpendicular
to the first segment 50. The combination of the first-axis assembly
60 and the second-axis assembly 61 is sometimes colloquially
referred to as a shoulder 12 since in some embodiments the possible
motions of the shoulder 12 of the AACMM 10 resemble the motions
possible with a human shoulder.
[0054] In the embodiment illustrated in FIG. 1, the third-axis
assembly 62 provides a swivel rotation about an axis aligned to the
first segment 50. The fourth-axis assembly 63 provides a hinge
rotation about an axis perpendicular to second segment 52. The
fifth-axis assembly 64 provides a swivel rotation about an axis
aligned to the second segment 52. The combination of the third-axis
assembly 62, the fourth-axis assembly 63, and the fifth-axis
assembly 64 is sometimes colloquially referred to as an elbow 13
since in some embodiments the possible motions of the elbow 13 of
the AACMM 10 resemble the motions possible with a human elbow.
[0055] In the embodiment illustrated in FIG. 1, the sixth-axis
assembly provides a hinge rotation about an axis perpendicular to
the second segment 52. In an embodiment, the AACMM 10 further
comprises a seventh-axis assembly, which provides a swivel rotation
of probe assemblies (e.g., probe 90) attached to the seventh axis.
The sixth-axis assembly 65, or the combination of the sixth-axis
assembly 65 and the seventh-axis assembly 66, is sometimes
colloquially referred to as a wrist 14 of the AACMM 10. The wrist
14 is so named because in some embodiments it provides motions like
those possible with a human wrist. The combination of the shoulder
12, first segment 50, elbow 13, second segment 52, and wrist 14
resembles in many ways a human arm from human shoulder to human
wrist. In some embodiments, the number of axis assemblies associated with each of the shoulder, elbow, and wrist differs from the number shown in FIG. 1. It is possible, for example, to move the third-axis assembly 62 from the elbow 13 to the shoulder 12, thereby increasing the number of axis assemblies in the shoulder to three and reducing the number of axis assemblies in the elbow to two. Other axis combinations are also possible.
[0056] FIG. 2 shows an isometric view of an LLP 200 coupled to the
seventh-axis assembly 66. The LLP 200 includes the camera 220 and
the projector 210. In an embodiment, the LLP 200 further includes
the handle 91. The seventh-axis assembly 66 includes the
seventh-axis housing/yoke 202. Attached to the seventh-axis
assembly 66 is tactile-probe assembly 90, which includes the probe
tip 92.
[0057] In FIG. 3, the handle 91 includes wires that send electrical
signals from handle buttons 93 through the handle-to-arm connector
94. In an embodiment, high-speed signals obtained from the camera 220 of the LLP 200 pass through the handle-to-arm connector 94 to electronics further within the AACMM. In an embodiment, the LLP 200 includes
the projector 210, which is separated by a baseline distance from
the camera 220. A processor within the system performs a
triangulation calculation to determine 3D coordinates of points
illuminated by a line of light or other features or targets seen on
the object.
[0058] FIG. 4 shows the line 400 defining a plane of the beam of
light emitted by the projector 210 according to an embodiment. As
seen in the front view of FIG. 4, the beam resides in a vertical
plane. From a side view, however, the beam of light 400 is seen to
be expanding as it moves away from the LLP 200.
[0059] FIG. 5 shows a schematic illustration of elements of an LLP
500, including a projector 520 and a camera 540. FIG. 5 is a
schematic illustration of the LLP 200 when viewed from the top with
the LLP 500 looking toward object surfaces 510A, 510B. Because of
the change in viewpoint, the camera 220 is to the left of the projector 210 in FIG. 4, while the equivalent camera 540 is to the right of the projector 520 in FIG. 5 in the changed viewpoint. The
projector 520 includes a source pattern of light 521 and a
projector lens 522. The projector lens 522 includes a projector
perspective center and a projector optical axis that passes through
the projector perspective center. In the exemplary system of FIG.
5, a central ray 524 of the beam of light coincides with the
projector optical axis. The camera 540 includes a camera lens 534
and a photosensitive array 2641. The camera lens 534 has a camera
lens optical axis 536 that passes through a camera lens perspective
center 537. In the exemplary LLP 500, the camera lens optical axis
536 and the projector optical axis are both perpendicular to a
plane that encompasses the line of light 523 projected by the
source pattern of light 521. In other words, the plane that
encompasses all the lines of light 523 is in the direction
perpendicular to the plane of the paper of FIG. 5. The line of
light 523 strikes an object surface, which at a first distance from
the projector is object surface 510A and at a second distance from
the projector is object surface 510B. The line of light 523
intersects the object surface 510A (in the plane of the paper) at a
point 526, and it intersects the object surface 510B (in the plane
of the paper) at a point 527. For the case of the intersection
point 526, a ray of light travels from the point 526 through the
camera lens perspective center 537 to intersect the photosensitive
array 2641 at an image point 2646. For the case of the intersection
point 527, a ray of light travels from the point 527 through the
camera lens perspective center 537 to intersect the photosensitive
array 2641 at an image point 2647. By noting the position of the
intersection point relative to the position of the camera lens
optical axis 536, the distance from the camera (and projector) to
the object surface can be determined using the principles of
triangulation, which typically rely on the "baseline" distance
between the perspective centers of the projector 520 and the camera
540. The distance from the projector to other points projected by
the line of light 523 onto the object, that is points on the line
of light that do not lie in the plane of the paper of FIG. 5, may
likewise be found using the principles of triangulation.
[0060] In the embodiment of FIGS. 6, 7, the end assembly 600 is
coupled to an LLP 605 by a first accessory interface 650 and a
second accessory interface 655. In an embodiment, the latch arm 660
is rotated to allow the coupling assembly 650, 655 to lock the LLP
605 in place, thereby connecting the LLP 605 to the end assembly
600 both electrically and mechanically. The LLP 605 includes a
projector 610 and a camera 620.
[0061] In an embodiment, an accessory noncontact 3D measuring
device 800 may be attached to the AACMM 10 as illustrated in FIGS.
8, 9 or detached from the AACMM as illustrated in FIG. 10. In FIG.
8, the noncontact 3D measuring device 800 is attached to the AACMM
10, which further includes a probe tip 92 for contact 3D
measurement. In an embodiment, the device 800 is attached to the
first accessory interface 650. FIG. 9 shows elements in the device
800, including device body 810, first camera 820A, second camera
820B, and projector assembly 850. In an embodiment, the projector
assembly includes two illuminators that project planes of laser
light.
[0062] FIG. 10 shows the noncontact 3D measuring device, such as a
line scanner 1000, detached from the AACMM 10. The noncontact 3D
measuring device 1000 includes the cameras 820A, 820B, and projector
assembly 850 described in FIGS. 8, 9. It further includes a handle
1010 and optional light-emitting diodes (LEDs) 822A, 822B.
[0063] FIG. 11 shows the noncontact 3D measuring device 1000 in a
mode of operation in which a plane of laser light is emitted from
each of light sources 1110A, 1110B. In an embodiment, each of the
light sources 1110A, 1110B emits light at a different wavelength.
In an embodiment, the camera 820A has an optical coating that
passes the wavelength of the light 1110A and blocks the wavelength
of the light 1110B. In contrast, the camera 820B has an optical
coating that passes the wavelength of the light 1110B and blocks
the wavelength of the light 1110A. In an embodiment, both cameras
820A, 820B pass the wavelengths emitted by the LEDs 822A, 822B so
that markers illuminated by the LEDs 822A, 822B are visible to both
cameras.
[0064] FIG. 12 shows several possible accessories that may be used
with the 3D measuring device 1000. In an embodiment, the 3D
measuring device attaches to a wearable unit 1200 that includes a
computing unit 1205 and a battery 1210. In an embodiment, the
battery 1210 is rechargeable and removable. In an embodiment, the
wearable unit receives a signal over a USB or Ethernet cable 1215.
Ethernet is a family of computer networking technologies first
standardized in 1985 as IEEE 802.3. Ethernet that supports 1
gigabit per second is often referred to as Gigabit Ethernet. Higher
speed Ethernet versions with multi-gigabit bandwidth such as 2.5G,
5G, and 10G are becoming increasingly common. In embodiments, the
cable 1215 carries one of Gigabit Ethernet, 2.5G, 5G, and 10G.
First released in 1996, the USB standard is maintained by the USB
Implementers Forum. There are several versions of USB from the
initial USB 1.0 that operates at 1.5 Mbps to USB4 that operates at
40 Gbps, with intermediate versions having intermediate data rates.
Data may be sent from the wearable unit 1200 to an external
computer 1220, which might be a desktop computer or a computer
network. Connection from the wearable unit 1200 may be made through
cable 1230, through wireless connection 1235, or through a
removable memory storage device. Connection may alternatively be
made between the 3D measuring device 1000 and the external computer
1220.
[0065] Captured data may be displayed using a mobile display unit
1240. In an embodiment, the mobile display 1240 is magnetically
attached to the rear side of the 3D measuring device 1000. The
mobile phone may receive power from the 3D measuring device 1000,
which in turn may receive power from the battery 1210 or external
computer 1220. The mobile display 1240 may communicate with the
wearable unit 1200 through wireless connection 1245 or through a
cable from the wearable device. Alternatively, captured data may be
displayed using a monitor 1222 provided to operate in conjunction
with the external computer 1220.
[0066] FIG. 13A shows a handheld 3D measuring device 1300 (e.g., a
photogrammetric camera or line scanner) in which a shaft 1305
provides a handle for an operator 1302. The 3D measuring system
1300 illustrated in FIG. 13A may be operated in a target tracking
mode or a geometry tracking mode, according to a selection made by
the operator. FIG. 13A illustrates features applicable to both
modes.
[0067] FIGS. 13A, 13B illustrate the target tracking mode. In this
mode, light source 1310A emits a plane of light at the first
wavelength. This light is captured by the camera 1320A as the line
1330A. At the same time, light source 1310B emits a plane of light
at a second wavelength. This light is captured by the camera 1320B
as the line 1330B. In an embodiment, the first wavelength is
different than the second wavelength. At the same time, LEDs 1322A,
1322B emit light at a different third wavelength to illuminate
reflective markers 1330C placed on or near the object under test.
The first camera 1320A includes optical elements coated to pass the
first and third wavelengths, while the second camera 1320B includes
optical elements coated to pass the second and third wavelengths.
Hence each of the cameras 1320A, 1320B sees one of the two
projected lines of laser light as well as the illuminated
reflective markers 1330C. The lines of light imaged by the cameras
1320A, 1320B are processed to determine the 3D coordinates of
illuminated points on the object within the frame of reference of
the 3D measuring device 1300. The reflective markers 1330C
imaged by the cameras 1320A, 1320B are processed to determine the
3D coordinates of the markers 1330C in successive frames. This
enables the 3D coordinates determined for the lines 1330A, 1330B to
be tracked (registered) over successive frames.
[0068] In the geometry tracking mode illustrated in FIG. 13C, light
source 1312A emits multiple parallel planes of light 1340A at a
fourth wavelength. The fourth wavelength is different than the
first wavelength, second wavelength, and third wavelength. The
first camera 1320A and the second camera 1320B both include
elements coated to pass the fourth wavelength, and hence both
cameras 1320A, 1320B see the projected lines 1340A. Because the
optical axis of the camera 1320A is more closely aligned to the
optical axis of the projector 1312A than to the optical axis of the
projector 1312B, the projected lines of light from the projector
1312A will tend to sweep more slowly across the image sensor as the
distance to the object changes than will the projected lines of
light from the projector 1312B. The difference in these lines of
light as seen by the cameras 1320A, 1320B enables the identity of
each line to be uniquely determined. The process of identifying
which projected lines correspond to which imaged lines is referred to as "disambiguation" of the lines. In an embodiment, a method used
for doing this disambiguation is described in Willomitzer et al.,
"Single-shot three-dimensional sensing with improved data density,"
in Applied Optics, Jan. 20, 2015, pp. 408-417. Further improvement
in the geometry tracking mode is possible by further projecting
multiple planes of light 1340B with the projector 1312B. In an
embodiment, the patterns 1340A, 1340B are alternately
projected.
[0069] As illustrated in FIGS. 13C, 13D, the projected multiple
planes of light appear as lines of light 1340A, 1340B when striking
a planar surface. Deviations in the imaged lines of light 1340A,
1340B from perfect straightness indicates that the surface being
measured is not perfectly planar. Deviations resulting from edges,
dips, or bulges can be detected and correlated from shot to shot to
determine the amount and direction of movement in each frame. An
advantage of the geometry tracking mode compared to target tracking
mode is faster measurements since adhesive markers do not need to
be applied or removed.
[0070] In the embodiment illustrated in FIG. 13E, the 3D measuring device 1300 is powered by a battery 1210 within a wearable unit 1200. In an embodiment, the power connector 1216 is conveniently disconnected from a handheld scanner such as the scanner 1000, 1300 and plugged into the scanner handle to provide power to the 3D measuring device. In an embodiment, computing unit 1205 is used to process images obtained by the 3D measuring device 1300 of target markers affixed on or near the object under test. Computing unit 1205 may also cooperate with an external or networked computer 1220 to process target images. In an embodiment, the mobile display 1240 is used to provide instructions or information on preferred positions and orientations of the 3D measuring device 1300 in capturing images. In addition, in an embodiment, the mobile display 1240 displays the captured data. In an embodiment, the mobile display
1240 is magnetically attached to the rear side of the 3D measuring
device 1300. The mobile phone may receive power from the 3D
measuring device 1300, which in turn may receive power from the
battery 1210 or external computer 1220. The mobile display 1240 may
communicate with the wearable unit 1200 through wireless connection
1245 or through a cable from the wearable device. Alternatively,
captured data may be displayed using a monitor 1222 provided to
operate in conjunction with the external computer 1220.
[0071] A photogrammetric camera 1350 shown in FIG. 13F may be used
in combination with a handheld line scanner such as the scanner
1300. The photogrammetric camera 1350 includes a camera assembly
1360, which includes a camera lens, an image sensor, and electronics.
Surrounding the camera assembly 1360 is a collection of light
sources 1370 such as light emitting diodes (LEDs). In an
embodiment, the photogrammetric camera further includes a handle
1380 having control buttons 1382, 1384. In an embodiment, the
photogrammetric camera is used with scale bars or other scaled
objects to provide scale in the captured images. In an embodiment,
the light sources 1370 illuminate the object, which may include
target reflectors or markers 1330C like those shown in FIG. 13B.
Markers 1330C may also be placed on the scale bars. In an
embodiment, markers 1330C are placed over a relatively large area
on the object. The photogrammetry camera 1350 captures images of
the object and scale bars from a variety of positions and
perspectives. Software is then used to perform a least-squares fit
(or other optimization procedure) to determine the 3D coordinates
of the markers in space over the relatively large area of the
object. This enables the handheld line scanner 1300, which may
measure over a relatively small area at a time, to be accurately
registered over a much larger area. If the photogrammetric camera
1350 is used with the scanner 1300 in the geometry tracking mode
illustrated in FIGS. 13C, 13D, the photogrammetric camera may be
used to measure natural features such as edges or corners to
provide registration assistance for a handheld line scanner such as
the scanner 1300.
[0072] In the embodiment illustrated in FIG. 13F, the
photogrammetric camera 1350 is powered by a battery, which may for
example be inserted into the handle 1380. In an alternative
embodiment illustrated in FIG. 13G, the photogrammetric camera 1350
is powered by a battery 1210 within the wearable unit 1200. In an
embodiment, the power connector 1216 is conveniently disconnected
from a handheld scanner such as the scanner 1000, 1300 and plugged
into the handle 1380 to provide power to the photogrammetric
camera. In an embodiment, computing unit 1205 is used to process
images obtained by the photogrammetric camera 1350 of target
markers affixed on or near the object under test. Computing unit
1205 may also cooperate with an external or networked computer 1220
to process target images. In an embodiment, the mobile display 1240
is used to provide instructions or information on preferred
positions and orientations of the photogrammetric camera 1350 in
capturing images.
[0073] FIG. 14 is a block diagram illustrating exemplary
electronics 1400 within a handheld line scanner such as the
handheld line scanner 1000 or 1300. Processing for images captured
by each of the two image sensors 1410A, 1410B within the line
scanner is carried out by a corresponding field programmable gate
arrays (FPGAs) 1415A, 1415B and double date rate 4 synchronous
dynamic random-access memory (DDR4 SDRAM or simply DDR4) 1418A,
1418B. Printed circuit boards (PCBs) 1420A, 1420B provide direct
current (DC) electrical power to components in the electronics
1400. For example, voltages may be provided at 0.9, 1.2, 1.8, 2.5,
3.0, 3.3 and 5 volts. Laser drivers 1430 provide current to lasers
1432 or other light sources that emit lines or other patterns of
light. LED drivers 1434 provide current to LED ring PCBs 1436.
Interface PCB 1440 provides an electrical interface to components
outside of electronics 1400. The PCBs 1420A, 1420B also provide
electrical power to the button PCB 1442, status LEDs 1444, inertial
measurement units (IMUs) 1450, buffers/translators 1452,
temperature sensors 1454, and fans 1456. An environmental recorder
1460 records environmental events and is supplied electrical power
by battery 1462 to record such events even when power from AC power
mains is not available to electronics 1400. For example, the
environmental recorder may record high-g shocks measured by the
IMUs 1450 during shipping.
[0074] FIG. 15 shows electronics 1500 within the exemplary wearable
unit 1200 (FIG. 12). In an embodiment, a handheld 3D measurement
device such as 1000 or 1300 sends data over a USB-C cable 1505,
which can transfer data at up to 10 Gbps to the wearable unit 1200.
Data arrives at a first industrial USB connector 1515A within a
power distribution PCB 1510. The data is transferred to the USB hub
1520, which in an embodiment is a USB 3.2 Gen 2 hub capable of
transferring data at 10 Gbps. Electrical power is delivered to the
USB hub 1520 from a battery charger 1542 (via DC/DC converter 1523
for example) that may receive electrical power from either a
19-volt DC line 1540 or from either of two batteries 1545. In an
embodiment, the batteries 1545 are removable and rechargeable. The
battery charger 1542 sends some DC power to the USB hub 1520, which
distributes DC power upstream to the handheld unit (such as 1000 or
1300) according to the instructions of the power controller 1522.
The battery charger 1542 also sends some DC power downstream
through the DC power output connector 1552 through the cable 1527
to the DC power input connector 1554, which distributes power as
required in the components of a System on a Chip (SoC) 1530. Data
is passed from the USB hub 1520 to a second industrial USB
connector 1515B and through a second USB-C cable 1525 to a USB
SuperSpeed+ port 1526 affixed to the SoC 1530. In an embodiment,
the SoC 1530 is an Intel Next Unit of Computing (NUC) device. In an
embodiment, the SoC 1530 is interfaced to Wi-Fi 1532, Ethernet
1534, and a USB SuperSpeed flash drive 1537. In an embodiment,
Wi-Fi 1532 sends wireless signals 1533 to a mobile phone display
1240. Wi-Fi is a trademark of the non-profit Wi-Fi Alliance. Wi-Fi
devices, which are compliant with the IEEE 802.11 standard, are
used for local wireless network connectivity applications such as
to the mobile display 1240 and external computer 1220. In an
embodiment, Ethernet 1534 is a Gigabit Ethernet (GbE) that sends
signals at 1 Gbit per second over wires (cables) 1535 to an
external computer 1220. Ethernet, which is compliant with the IEEE
802.3 standard, is used for wired network connectivity applications. In an
embodiment, scan data is saved on a USB SuperSpeed flash drive 1537
via USB port 1536. The Universal Serial Bus (USB) is an industry
standard maintained by the USB Implementers Forum. USB is designed
to provide power as well as data communications. USB-C SuperSpeed+
provides data transfer at 10 Gbps. The battery charger 1542 not only delivers DC power from the batteries 1545 to the system when required, it also charges the batteries 1545 when power is being supplied by the DC power line 1540.
[0075] To improve accuracy of determined 3D coordinates of points
measured on an object by a 3D measuring device such as 1000 or
1300, it is desirable to increase the dynamic range of the imaged
lines of laser light as much as possible. When dynamic range is
large, the 3D measuring device can capture bright reflections
without saturation and faint reflections without excessive noise.
One method of increasing dynamic range was described in commonly
owned U.S. patent application Ser. No. 17/073,923 (hereafter Faro
'923) filed on Oct. 19, 2020 (Attorney Docket FA0989US4), the
contents of which are incorporated by reference herein. This method
uses a photosensitive array having a selectable conversion gain
(CG), where CG is defined as the voltage produced per electron (e)
in a pixel electron well. For example, a pixel having a CG=130
µV/e produces a 1.3-volt signal in response to 10,000 electrons
in its electron well. A CG is said to be selectable when any of two
or more CGs can be selected. According to one method described in
Faro '923, high and low CGs are alternately selected, and the
signal obtained for the preferred CG is chosen.
[0076] For the electronics illustrated in FIG. 14, a potential
disadvantage of the selectable gain method of Faro '923 is that
more computations are required of electronics such as the FPGAs
1415A, 1415B. The added computations result in increased power
consumption, increased system weight, and added expense to obtain
the desired high dynamic range. In an embodiment described in Faro
'923, the gain settings are alternated between high gain and low
gain, the pixel values are alternately read out, and one of the two
read-out values is selected for each pixel. Using this approach, high dynamic range is achieved, but at the cost of increased power consumption, added system weight, and added expense.
[0077] A method that provides high dynamic range without increasing
power consumption, system weight, or expense is illustrated in the
method 1600 of FIG. 16. An element 1610 includes, with a 3D
measuring device having an image sensor, projecting a pattern of
light onto an object. An element 1612 includes, with the image
sensor, capturing an image of the projected pattern of light, the
captured image having pixel values each based at least in part on a
selection among two or more pixel conversion gains. An element 1614
includes reading out the selected pixel values from the image
sensor. An element 1616 includes, with a processor, determining 3D
coordinates of points on the object based at least in part on the
projected pattern of light and the read-out pixel values.
[0078] FIGS. 17A, 17B, 17C illustrate an embodiment of the method
1600 illustrated in FIG. 16. In each of FIGS. 17A, 17B, 17C, the
horizontal axis 1702 of each graph represents input data, which is
to say the electrical signal (for example, in microvolts) generated
in response to electrons in the pixel well. As an example of low
and high CG modes, the high CG might be CG_high = 130 µV/e while the low CG might be CG_low = 30 µV/e. Corresponding
numbers of electrons in a full pixel well might then be 10,000
electrons for the high CG case and 40,000 electrons for the low CG
case. Corresponding noise levels might be 2 electrons for the high
CG case and 9 electrons for the low CG case. In some embodiments,
the combining of low CG and high CG within the image sensor is
accomplished through the use of a dual-ADC (analog-to-digital
converter).
[0079] FIG. 17A shows a pixel response curve for the high CG case.
For this case, the horizontal axis 1702 may be considered to
equivalently represent either the number of photons striking the
well or the number of electrons stored in the well. The pixel
output data represented by the vertical axis 1704 may be given in
voltage. For the case in which light is faint so that relatively
few photons reach the pixel well, pixels remain below the
saturation limit 1708 while having the advantage of a relatively
low readout noise (2 electrons in this example). For the case in
which the light level is above the saturation limit 1708, the
output response saturates, which is to say that the output voltage
of the well levels off to a saturation output level 1710.
[0080] FIG. 17B shows a pixel response curve for the low CG case.
For this case, the horizontal axis 1712 represents the number of
photons striking the well or the number of electrons stored in the
well. The pixel output data represented by the vertical axis 1714
may be given, for example, in voltage. For the case in which light
is strong so that relatively many photons reach the pixel well,
saturation is avoided. Even though the readout noise is relatively
larger in this case compared to the high CG case, the
signal-to-noise ratio is still relatively good.
[0081] FIG. 17C illustrates an embodiment for combining the results
of the high CG and low CG data to obtain a high dynamic range
(HDR). For this case, the horizontal axis 1702 represents input
data and the vertical axis 1724 represents the output data. FIG.
17C illustrates a method for combining the high gain response curve
1706 with the low gain response curve 1716 to obtain a
composite response curve that includes an extended region 1726 that
results in an HDR response. For input data having a level above the
saturation limit 1708, the captured input data 1710 to the right of
the saturation limit 1708 is increased by the ratio of high CG to
low CG. This causes the input data obtained for the curve 1716
below the saturation limit 1708 to be increased in a movement 1730
by the amount 1740, which when converted to bits is referred to as
the bit extension. Since the signal-to-noise ratio is approximately
the same for the high CG and low CG, the dynamic range is improved
approximately by the bit extension, resulting in HDR. As shown in
FIG. 17C, the bit extension 1740 seamlessly extends the range of
output values in the extended region 1726 to obtain the HDR.
[0082] In another embodiment illustrated in FIG. 18A, image sensors
such as the sensors 1410A, 1410B use a method of gradation
compression to obtain HDR, enabling a scanning 3D measuring device
such as 1000 or 1300 to measure both relatively very dim and very
bright reflections. In an embodiment, the image sensors 1410A,
1410B are set to have a plurality of compression break points such
as the points/levels 1812, 1822. As in the discussion of FIGS. 17A,
17B, 17C, the horizontal axis 1802 in FIG. 18A represents input
data, which is to say the electrical signal (for example, in
microvolts) generated in response to electrons in the pixel well.
The pixel output data represented by the vertical axis 1804 may
also be given in voltage. In an embodiment, for input data between 0
and the level 1812, with corresponding output data between 0 and the
level 1816,
gradation compression is not applied to the input data, resulting
in the response curve 1814. For input data in the region between
1812 and 1822, the gain is reduced or compressed, resulting in a
smaller slope in the response curve 1824. For input data in the
region between 1822 and 1832 (having output data corresponding
to level 1826), the gain is further reduced or compressed,
resulting in a still smaller slope in the response curve 1834. The
maximum level of the resulting output data is given by the
line/level 1840. For example, in a representative image sensor, the
level 1840 might correspond to 12 bits (or 4095). Without
compression, the signals may be considered small signals covering
the range 1818, medium signals that further cover the range 1828,
or large signals that further cover the range 1838. In effect, the
maximum signal without compression 1836 is compressed to the level
1832. Hence, as illustrated in FIG. 18A, the method of gradation
compression increases dynamic range.
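In effect, the response of FIG. 18A is piecewise linear with
progressively smaller slopes. The following minimal Python sketch
illustrates the idea; the break points and slopes are illustrative
assumptions, not the levels 1812, 1822 of the figure.

    import numpy as np

    BREAKS = [0.0, 0.25, 0.60]   # assumed input levels where the slope changes
    SLOPES = [1.0, 0.40, 0.10]   # assumed gain in each segment

    def compress(x):
        # Map a linear input signal through the piecewise-linear response.
        y = 0.0
        for i, (b, s) in enumerate(zip(BREAKS, SLOPES)):
            seg_end = BREAKS[i + 1] if i + 1 < len(BREAKS) else np.inf
            y += s * np.clip(x - b, 0.0, seg_end - b)
        return y

    # Small, medium, and large signals all fit in a reduced output range.
    for x in (0.2, 0.5, 2.0):
        print(f"input {x:.2f} -> output {compress(x):.3f}")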
[0083] FIG. 18B describes elements in a method 1850 for using
gradation compression to increase dynamic range. An element 1860
includes, with a 3D measuring device having an image sensor,
projecting a pattern of light onto an object. An element 1862
includes, with the image sensor, capturing an image of the projected
pattern of light, the captured image having pixel values based at
least in part on a change in pixel response at a plurality of
compression break points. An element 1864 includes reading out the
selected pixel values from the image sensor. An element 1866
includes, with a processor, determining 3D coordinates of points on
the object based at least in part on the projected pattern of light
and the read-out of pixel values.
[0084] As illustrated in FIG. 4, in a typical case, an emitted
laser line 400 is projected perpendicular to a line
connecting the projector 210 to the camera 220. In other words, for
a line scanner held as in FIG. 4, the line is vertical rather than
horizontal. To collect a relatively large number of data points on
the scanned object, it is customary to align the projected laser
line 400 to the long side of the image sensor within the camera.
Ordinarily, image sensors are shown in landscape view having the
long side of the image sensor along the horizontal direction, which
is the reverse of the actual direction of the image sensor as it
would be aligned in FIG. 4. Hence, in FIG. 19, the row numbers
change along the horizontal axis and the column numbers change
along the vertical axis. In prior art line scanners such as the
line scanner 200 in FIG. 4, processing of the data from the image
sensor is carried out a row at a time, starting with the first row
within the scan region and ending with the last row N in the region.
However, this row-at-a-time order is transposed relative to the order
in which the data are needed for the calculation. In FIG. 19, a
movement from
left to right, corresponding to a changing row number, corresponds
to a changing distance to the object under test. In other words,
for the geometry shown in FIG. 19, calculations need to be carried
out a column at a time rather than a row at a time. To make this
possible, in the past, it has been necessary to store much more
data than is used in the calculation of the centroid of the imaged
line of laser light 1910 along each projected line column. In an
embodiment, the image sensor 1900 can be set to read in either
vertical or horizontal mode, thereby greatly simplifying the
calculation of the 3D coordinate of each point on the projected
laser line. Advantages gained by selecting the better of the
horizontal or vertical directions include: (1) reduced data storage
requirements, (2) simpler algorithms for calculating 3D
coordinates, and (3) better processor utilization.
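By way of illustration, a column-at-a-time centroid calculation may
be sketched as follows; the array name, threshold, and function name
are assumptions for the example. Choosing axis=0 yields one subpixel
centroid per column, matching the geometry of FIG. 19, while axis=1
yields one per row.

    import numpy as np

    def line_centroids(image, axis=0, threshold=10):
        # Intensity-weighted centroid of the laser stripe, one value
        # per column (axis=0) or per row (axis=1).
        img = np.asarray(image, dtype=float)
        img = np.where(img >= threshold, img, 0.0)   # suppress background
        idx = np.arange(img.shape[axis])
        weights = img.sum(axis=axis)
        coords = np.tensordot(idx, img, axes=([0], [axis]))
        with np.errstate(invalid="ignore", divide="ignore"):
            return np.where(weights > 0, coords / weights, np.nan)

When the readout direction matches the projected line, each column can
be processed as it arrives, rather than buffering the full frame
before transposing it.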
[0085] Binning is a procedure in which multiple values are combined
into a single "bin." For example, an image processor that supports
2×2 binning will report signal levels obtained from pixel
groups that are 2 pixels wide and 2 pixels high. A potential
disadvantage in the use of binning is a reduction in image
resolution, but potential advantages include (1) higher speed, (2)
reduced processing, (3) faster data transfer, (4) higher
signal-to-noise ratio in some cases, and (5) reduced speckle.
[0086] In an embodiment, 2×2 binning is used. With this type of
binning, a square formed of two vertical pixels and two horizontal
pixels is treated as a block, with the values of the four pixels
summed together. For this case, speed and data transfer are both
increased by a factor of four. Signal-to-noise ratio is expected to
increase when signal levels are low. Such low signal levels might
result, for example, from materials such as shiny or transparent
materials having low reflectance. With 2×2 binning, the
signal level received by the binned pixels is expected to increase
by a factor of 4, which in most cases will cause the
signal-to-noise ratio to increase significantly. Binning is also
expected to decrease speckle relative to the signal level captured
by the binned pixels. To further speed measurement and reduce
processing, binning may be combined with windowing, which is to say
selecting a region of interest (ROI) within a pixel array. The use
of windowing with line scanners is described in the commonly owned
U.S. patent application Faro '923, discussed hereinabove.
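A software model of 2×2 binning combined with windowing may be
sketched as follows. On actual image sensors both operations are
typically performed on-chip; the frame size and ROI bounds here are
hypothetical.

    import numpy as np

    def bin2x2(image):
        # Sum each 2x2 block of pixels into a single binned value.
        img = np.asarray(image)
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # trim odd edges
        return img[:h, :w].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

    def window(image, row0, row1, col0, col1):
        # Select a region of interest before binning to cut data volume.
        return image[row0:row1, col0:col1]

    frame = np.random.default_rng(0).integers(0, 256, (1024, 1280))
    roi = window(frame, 200, 600, 300, 900)  # hypothetical ROI around the line
    binned = bin2x2(roi)                     # 4x fewer values, ~4x the signal
    print(roi.shape, "->", binned.shape)     # (400, 600) -> (200, 300)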
[0087] In an embodiment illustrated in FIG. 20B, a self-registering
3D measuring system 2050 includes a 3D measurement device such as
handheld scanner 1000 or 1300 and a collection of visible targets
2060, which in embodiments are adhesive reflectors or LEDs. In an
embodiment, the collection of visible targets 2060 is coupled to a
frame 2062, which is removably attached to the handheld scanner
such as 1000, 1300. In other embodiments, the visible targets 2060
are directly affixed to the handheld scanner 1000, 1300 with
connector elements 2064. The self-registering 3D measuring system
2050 may be directly connected to an external computer 1220 such as
a workstation computer or networked computer. Alternatively, the
self-registering 3D measuring system 2050 may be affixed to a
wearable unit 1200 that includes a computing unit 1205 and a battery
1210, connected as shown in FIG. 12.
[0088] As shown in FIG. 20A, in an embodiment, a viewing system
2000 includes a stereo camera assembly 2005 and a stand assembly
2025. In an embodiment, the stereo camera assembly 2005 includes a
first camera 2010A, a second camera 2010B, and a connecting element
2012, the first camera 2010A and the second camera 2010B being
separated by a baseline distance 2020. The stand assembly 2025
includes a mounting structure 2030, a base 2040, and optional
wheels 2042. In some embodiments, the stand assembly is a tripod.
In other embodiments, the stand assembly is an instrument stand. In
some embodiments, the first camera 2010A and the second camera
2010B are independently mounted, with the baseline distance between
them adjustable according to the selected mounting arrangement. In an
embodiment, the stereo camera captures images of the visible
targets 2060 as the 3D measuring system 2050 is moved from place to
place. One or more processors, which may include some combination
of the self-registering 3D measuring system 2050, the computing
unit 1205, and the external computing system 1220, determine the 3D
movement from frame to frame based on matching of the visible targets
2060 between successive frames. With this method, the lines
1330A, 1330B from the projectors 1310A, 1310B or any other patterns
projected by 3D measuring devices such as 1000, 1300 can be tracked
as the 3D measuring system is moved from point to point. By
coupling the visible targets 2060 to the 3D measuring device such
as 1000, 1300, accurate measurement of 3D coordinates of an object
is provided without requiring the placing or removing of reflective
targets.
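The patent does not name the registration algorithm; one standard
choice for recovering rigid motion from matched target positions is
the Kabsch (orthogonal Procrustes) method, sketched below under that
assumption.

    import numpy as np

    def rigid_transform(p, q):
        # Find rotation R and translation t with q ~= R @ p + t, where
        # p and q are (N, 3) arrays of the same targets observed in two
        # successive frames.
        p, q = np.asarray(p, float), np.asarray(q, float)
        pc, qc = p.mean(axis=0), q.mean(axis=0)
        h = (p - pc).T @ (q - qc)                 # 3x3 cross-covariance
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflection
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t = qc - r @ pc
        return r, t

Chaining the per-frame transforms registers all scans into a common
coordinate system.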
[0089] As shown in FIGS. 21A, 21B, 21C, camera systems 2100A, 2100B
capture images of the visible targets 2060 on a 3D measuring system
2050 and use those captured images to track the pose (position and
orientation) of the handheld scanner 1300 as it is moved from
position to position by an operator 1302. The camera systems 2100A,
2100B take the place of the cameras 2010A, 2010B in FIG. 20A. In an
embodiment, electrical signals from the cameras 2100A, 2100B are
sent over a wired or wireless communication channel 2140 to a
computing system (processor) 2130 that calculates the 3D
coordinates. To perform this calculation, the computing system 2130
needs to know the relative pose (position and orientation) of the
two cameras 2110A, 2110B. In an embodiment, the relative pose of
the two cameras 2110A, 2110B is determined by performing a
compensation procedure in the field. An exemplary compensation
procedure involves capturing a pattern on an artifact such as a dot
plate. Such an artifact may be moved to a plurality of positions
and orientations and the cameras 2110A, 2110B used to capture
images in each case. Optimization methods such as bundle adjustment
are then used to determine the relative pose of the cameras 2110A,
2110B. Cameras 2110A, 2110B include optical imaging systems 2112A,
2112B having lenses, image sensors, and processing electronics. In
an embodiment, the lenses within optical imaging systems 2112A,
2112B are zoom lenses that enable magnification of the visible
targets 2060 on the 3D measuring system 2050. The cameras 2110A,
2110B may be mounted on any sort of mounting stands 2120A, 2120B,
for example, on tripods, instrument stands, or other structures
within a factory.
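Once the relative pose of the cameras 2110A, 2110B is known from the
compensation procedure, a 3D target coordinate follows from
intersecting the two viewing rays. The sketch below uses the standard
midpoint method for skew rays; the ray representation and names are
assumptions for illustration.

    import numpy as np

    def triangulate(origin_a, dir_a, origin_b, dir_b):
        # Return the midpoint of the shortest segment between two rays.
        oa, ob = np.asarray(origin_a, float), np.asarray(origin_b, float)
        da = np.asarray(dir_a, float)
        db = np.asarray(dir_b, float)
        da /= np.linalg.norm(da)
        db /= np.linalg.norm(db)
        w = oa - ob
        a, b, c = da @ da, da @ db, db @ db
        d, e = da @ w, db @ w
        denom = a * c - b * b          # near zero for almost-parallel rays
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
        pa = oa + s * da               # closest point on ray A
        pb = ob + t * db               # closest point on ray B
        return (pa + pb) / 2.0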
[0090] In some cases, it is desirable to have a greater optical
magnification than that provided by the lenses in the cameras of the
handheld 3D measuring devices such as 1300 or 1000. A greater
magnification covers a smaller region of the object in each captured
image, but it provides greater detail, which enables greater 3D
measurement accuracy and resolution. In contrast, a
lesser magnification covers a larger region of the object in each
captured image, which enables measurements to be made faster but
with less resolution. A method to enable magnification to be
quickly changed while using the same basic 3D measurement assembly
is illustrated in FIGS. 22A, 22B, 22C.
[0091] FIG. 22A is an exploded view of a handheld 3D measuring device
2300 with attachable adapter lens assemblies 2350A, 2350B. The device
2300 includes a housing 2302, cameras
2320A, 2320B, light projectors 2310A, 2310B, 2312A, 2312B, recessed
illuminator LEDs 2322A, 2322B, first kinematic elements 2330, first
magnets 2340, and electrical pin receptacles 2342. Each adapter
lens assembly 2350A, 2350B includes a housing 2352, adapter lens
elements 2360, and illuminator LEDs 2370. Additional elements on
the rear side of the adapter lens assembly 2350A are shown in FIG.
22C. These include second kinematic elements 2380, second magnets
2382, and electrical pin connectors 2384. In the exemplary
embodiment of FIG. 22C, second kinematic elements 2380 are
cylinders and first kinematic elements 2330 are a pair of spherical
surfaces. Each of the three first kinematic elements 2330 contacts a
corresponding one of the three second kinematic elements 2380. In
general, kinematic connectors like those shown in FIGS. 22B, 22C
enable the adapter
lens assembly 2350A or 2350B to be detached and then reattached
with a high degree of repeatability in the resulting position and
orientation. The first magnets 2340 are made to magnetically attach
to corresponding second magnets 2382. The electrical pin connectors
2384 plug into electrical pin receptacles 2342, thereby providing
electricity to power the illuminator LEDs 2370.
[0092] While the invention has been described in detail in
connection with only a limited number of embodiments, it should be
readily understood that the invention is not limited to such
disclosed embodiments. Rather, the invention can be modified to
incorporate any number of variations, alterations, substitutions,
or equivalent arrangements not heretofore described, but which are
commensurate with the spirit and scope of the invention.
Additionally, while various embodiments of the invention have been
described, it is to be understood that aspects of the invention may
include only some of the described embodiments. Accordingly, the
invention is not limited by the foregoing description but is only
limited by the scope of the appended claims.
* * * * *