U.S. patent application number 13/886981 was filed with the patent office on May 3, 2013, and published on November 28, 2013, for an ultrasound scanning system. This patent application is currently assigned to MASSACHUSETTS INSTITUTE OF TECHNOLOGY. The applicant listed for this patent is MASSACHUSETTS INSTITUTE OF TECHNOLOGY. Invention is credited to Brian W. Anthony and Shih-Yu Sun.
United States Patent Application: 20130317365
Kind Code: A1
Anthony; Brian W.; et al.
November 28, 2013
ULTRASOUND SCANNING SYSTEM
Abstract
A three dimensional ultrasound scanning system includes devices
to capture an acquisition state for a scan including quantitative
data about the conditions under which the scan was obtained. The
conditions may include pose and contact force of an ultrasound
probe in relation to a target. This data may be used to obtain a
three dimensional reconstructed volume of a target.
Inventors: Anthony; Brian W.; (Cambridge, MA); Sun; Shih-Yu; (Cambridge, MA)
Applicant: MASSACHUSETTS INSTITUTE OF TECHNOLOGY (US)
Assignee: MASSACHUSETTS INSTITUTE OF TECHNOLOGY (Cambridge, MA)
Family ID: 49513088
Appl. No.: 13/886981
Filed: May 3, 2013
Related U.S. Patent Documents
Application Number: 61642315; Filing Date: May 3, 2012
Current U.S. Class: 600/459
Current CPC Class: A61B 8/13 20130101; A61B 8/4472 20130101; A61B 8/463 20130101; A61B 8/4254 20130101; A61B 8/4263 20130101; A61B 8/4209 20130101; A61B 8/483 20130101; A61B 8/4245 20130101; A61B 8/4444 20130101; A61B 8/54 20130101; A61B 8/461 20130101
Class at Publication: 600/459
International Class: A61B 8/00 20060101 A61B008/00; A61B 8/13 20060101 A61B008/13; A61B 8/08 20060101 A61B008/08
Claims
1. A system for ultrasound scanning comprising: a handheld
ultrasound probe comprising at least one transducer to capture a
plurality of ultrasound images of a target through a skin surface in
a scan; a fiducial marker with predetermined dimensions applied to
the skin surface; a camera mechanically coupled to the handheld
ultrasound probe, the camera positioned to record a digital image
of the skin surface during the scan when the handheld ultrasound
probe is placed for use against the skin surface; and a processor
in communication with the handheld ultrasound probe and the camera,
the processor configured to perform the steps of: identifying the
fiducial marker in two or more digital images from the camera;
establishing a world coordinate system using the predetermined
dimensions of the fiducial marker and the two or more digital
images; estimating a three dimensional pose of the handheld
ultrasound probe with respect to the two or more images; and
aligning the plurality of ultrasound images in the world coordinate
system to obtain a reconstructed volume of the target.
2. The system of claim 1 further comprising a lighting source
mechanically coupled to the handheld ultrasound probe and
positioned to illuminate the skin surface when the handheld
ultrasound probe is placed for use against the skin surface.
3. The system of claim 1 further comprising a display to display
one of the plurality of ultrasound images.
4. The system of claim 1 further comprising a display to display
one of the two or more digital images.
5. The system of claim 1 further comprising a memory to store data
from the handheld ultrasound probe and the camera.
6. The system of claim 5 wherein the memory is a remote memory
storage device.
7. The system of claim 6 further comprising a wireless interface
for coupling the handheld ultrasound probe in a communicating
relationship with the remote memory storage device.
8. A method for obtaining a three dimensional reconstructed volume
of a target using a handheld ultrasound probe comprising:
mechanically coupling a camera to the handheld ultrasound probe,
the camera positioned to capture a digital image of a skin surface
of a target when the handheld ultrasound probe is placed for use
against the skin surface; applying a fiducial marker with
predetermined dimensions to the skin surface; concurrently
obtaining an ultrasound image from the handheld ultrasound probe
and a digital image from the camera in a scan of the target,
wherein the digital image includes the fiducial marker;
establishing a world coordinate system using the predetermined
dimensions of the fiducial marker; estimating a three-dimensional
pose of the handheld ultrasound probe with respect to the skin
surface for the digital image; and aligning the ultrasound images
in the world coordinate system.
9. The method of claim 8 further comprising aligning a plurality of
ultrasound images to the world coordinate system and reconstructing
a volume of the target from the plurality of ultrasound images.
10. The method of claim 8 further comprising mechanically coupling
a lighting source to the ultrasound probe, the lighting source
positioned to illuminate the skin surface when the handheld
ultrasound probe is placed for use against the skin surface.
11. The method of claim 8 wherein the step of estimating a three
dimensional pose of the handheld ultrasound probe includes
computing an image-to-world homography matrix.
12. The method of claim 8 wherein the fiducial marker is a
square.
13. The method of claim 12 wherein establishing the world
coordinate system comprises selecting the four corners of the
square.
14. The method of claim 8 further comprising wirelessly
transmitting ultrasound data from the handheld ultrasound probe to
a remote storage facility.
15. The method of claim 8 further comprising displaying at least
one ultrasound image obtained from the handheld ultrasound
probe.
16. The method of claim 8 further comprising displaying at least
one digital image recorded by the camera.
17. A system comprising: a handheld ultrasound probe including one
or more ultrasound transducers to capture an ultrasound image of a
target through a skin surface; a sensor system configured to obtain
a pose of the handheld ultrasound probe; a force sensor configured
to obtain a pressure applied by the handheld ultrasound probe to
the skin surface; and a memory that stores an acquisition state for
an ultrasound image, the acquisition state including the pose and
the pressure.
18. The system of claim 17, further comprising a fiducial marker
adhered to the skin surface and having one or more predetermined
features, wherein the sensor system includes a camera and a
processor programmed to identify the one or more predetermined
features and calculate the pose of the handheld ultrasound probe
using the one or more predetermined features and an image from the
camera that includes the fiducial marker.
19. The system of claim 17 wherein the sensor system includes at
least one inertial sensor.
20. The system of claim 17 wherein the sensor system includes an
external electromechanical system coupled to the handheld
ultrasonic probe for tracking a pose of the handheld ultrasonic
probe through direct measurements.
21. The system of claim 17 wherein the sensor system includes an
external optical system.
22. The system of claim 17 wherein the force sensor includes a
pressure transducer coupled to the handheld ultrasound probe and
configured to sense an instantaneous contact force between the
handheld ultrasound probe and the skin.
23. The system of claim 17 further comprising a lighting source
mechanically coupled to the handheld ultrasound probe and
positioned to illuminate the skin surface when the handheld
ultrasound probe is placed for use against the skin surface.
24. The system of claim 17, further comprising: a linear drive
system mechanically coupled to the handheld ultrasound transducer
and including a control input, the linear drive system responsive
to a control signal received at the control input to translate the
ultrasound transducer along an actuation axis; and a controller
electronically coupled to the force sensor and the control input of
the linear drive system, the controller including processing
circuitry configured to generate the control signal to the control
input in a manner that maintains a substantially constant
predetermined contact force between the ultrasound transducer and
the target.
25. A method comprising: capturing an ultrasound image with a
handheld ultrasound probe; capturing a pose of the handheld
ultrasound probe substantially concurrently with the ultrasound
image; capturing a contact force of the handheld ultrasonic probe
substantially concurrently with the ultrasound image; and storing
the ultrasound image and an acquisition state for the ultrasound
image, the acquisition state including the pose and the contact
force.
26. The method of claim 25 further comprising capturing a plurality
of ultrasound images.
27. The method of claim 26 further comprising normalizing the
ultrasound images to a common contact force.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Prov. App. No.
61/642,315 filed on May 3, 2012, the entire content of which is
hereby incorporated by reference.
[0002] This application is also related to U.S. patent application
Ser. No. 12/972,461 filed on Dec. 18, 2010, which claims the
benefit of U.S. Prov. App. No. 61/287,886 filed Dec. 18, 2009. Each
of the foregoing applications is hereby incorporated by reference
in its entirety.
FIELD OF THE INVENTION
[0003] The invention generally relates to ultrasound imaging.
BACKGROUND
[0004] While medical ultrasound images can be quickly and cheaply
obtained from a handheld ultrasound imaging device, this type of
imaging generally suffers from a lack of accurate information
concerning the conditions under which the scan was captured. As a
result, two-dimensional ultrasound images from handheld probes are
generally limited in use to a qualitative evaluation of the imaged
tissue.
[0005] There remains a need for an ultrasound imaging device that
captures an acquisition state for a scan including quantitative
data about the conditions under which the scan was obtained such as
a position of a scanner or a contact force applied when an image is
obtained.
SUMMARY
[0006] An ultrasound scanning system captures an acquisition state
for a scan including quantitative data about the conditions under
which the scan was obtained. The conditions may include pose and
contact force of an ultrasound probe in relation to a target. This
data may be used, for example, to normalize image data in a variety
of ways and to return to a previously recorded acquisition state
for more useful comparisons to historical data.
[0007] In one aspect, a system disclosed herein includes a handheld
ultrasound probe comprising at least one transducer to capture a
plurality of ultrasound images of a target through a skin surface
in a scan; a fiducial marker with predetermined dimensions applied
to the skin surface; a camera mechanically coupled to the handheld
ultrasound probe, the camera positioned to record a digital image
of the skin surface during the scan when the handheld ultrasound
probe is placed for use against the skin surface; and a processor
in communication with the handheld ultrasound probe and the
camera.
[0008] The processor may be configured to perform the steps of:
identifying the fiducial marker in two or more digital images from
the camera; establishing a world coordinate system using the
predetermined dimensions of the fiducial marker and the two or more
digital images; estimating a three dimensional pose of the handheld
ultrasound probe with respect to the two or more images; and
aligning the plurality of ultrasound images in the world coordinate
system to obtain a reconstructed volume of the target.
[0009] The system may also include a lighting source mechanically
coupled to the handheld ultrasound probe and positioned to
illuminate the skin surface when the ultrasound probe is placed for
use against the skin surface.
[0010] The system may include a display to display one of the
plurality of ultrasound images and/or one of the two or more
digital images. The system may also include a memory, such as a
remote memory storage device, to store data from the handheld
ultrasound probe and the camera. The system may further include a
wireless interface for coupling the handheld ultrasound probe in a
communicating relationship with the remote memory storage device.
[0011] In another aspect, a method disclosed herein includes
mechanically coupling a camera to the handheld ultrasound probe,
the camera positioned to capture a digital image of a skin surface
of a target when the handheld ultrasound probe is placed for use
against the skin surface; applying a fiducial marker with
predetermined dimensions to the skin surface; concurrently
obtaining an ultrasound image from the handheld ultrasound probe
and a digital image from the camera in a scan of the target,
wherein the digital image includes the fiducial marker;
establishing a world coordinate system using the predetermined
dimensions of the fiducial marker; estimating a three-dimensional
pose of the handheld ultrasound probe with respect to the skin
surface for the digital image; and aligning the ultrasound images
in the world coordinate system.
[0012] The method may include aligning a plurality of ultrasound
images to the world coordinate system and reconstructing a volume
of the target from the plurality of ultrasound images. The method
may include mechanically coupling a lighting source to the
ultrasound probe, the lighting source positioned to illuminate the
skin surface when the handheld ultrasound probe is placed for use
against the skin surface. The step of estimating a three
dimensional pose of the handheld ultrasound probe may include
computing an image-to-world homography matrix. The fiducial marker
may be a square. The step of establishing the world coordinate
system may include selecting the four corners of the square. The
method may further include wirelessly transmitting ultrasound data
from the handheld ultrasound probe to a remote storage facility.
The method may also include displaying at least one ultrasound
image obtained from the handheld ultrasound probe and/or displaying
at least one digital image recorded by the camera.
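The image-to-world homography mentioned above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: it uses the standard four-point direct linear transform to map the four imaged corners of a square fiducial of known side length onto the world plane that the marker defines.

```python
import numpy as np

# Illustrative sketch (not the patent's implementation): recover the
# image-to-world homography from the four imaged corners of a square
# fiducial marker of known side length lying on the skin surface.
def image_to_world_homography(img_corners, side):
    # World coordinates of the square's corners, taken counterclockwise
    # from an origin at one corner; the marker defines the world plane.
    world = [(0.0, 0.0), (side, 0.0), (side, side), (0.0, side)]
    A, b = [], []
    for (x, y), (u, v) in zip(img_corners, world):
        # Each correspondence contributes two rows of the DLT system,
        # with the homography normalized so that h33 = 1.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    # Map a pixel coordinate into world coordinates on the marker plane.
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

With the marker's side length known, any pixel on the skin surface can then be expressed in millimeters in the world frame, which is what allows ultrasound frames from different probe poses to be aligned in a common coordinate system.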
[0013] In another aspect, a system disclosed herein includes a
handheld ultrasound probe including one or more ultrasound
transducers to capture an ultrasound image of a target through a
skin surface; a sensor system configured to obtain a pose of the
handheld ultrasound probe; a force sensor configured to obtain a
pressure applied by the handheld ultrasound probe to the skin
surface; and a memory that stores an acquisition state for an
ultrasound image, the acquisition state including the pose and the
pressure.
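The acquisition state described above pairs each ultrasound frame with the pose and pressure under which it was captured. A minimal data layout might look like the following sketch; the field names are hypothetical, since the patent does not prescribe a storage format.

```python
from dataclasses import dataclass, field
import time

import numpy as np

# Minimal sketch of an acquisition-state record (field names are
# hypothetical; the patent does not prescribe a storage format).
@dataclass
class AcquisitionState:
    pose: np.ndarray          # e.g., a 4x4 probe-to-world transform
    contact_force_n: float    # instantaneous contact force, in newtons
    timestamp: float = field(default_factory=time.time)

@dataclass
class TaggedFrame:
    image: np.ndarray         # the ultrasound frame itself
    state: AcquisitionState   # conditions under which it was acquired
```

Storing the state alongside each frame is what later enables normalization across frames and returning the probe to a previously recorded acquisition state.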
[0014] The system may include a fiducial marker adhered to the skin
surface and having one or more predetermined features. The sensor
system may include a camera and a processor programmed to identify
the one or more predetermined features and calculate the pose of
the handheld ultrasound probe using the one or more predetermined
features and an image from the camera that includes the fiducial
marker. The sensor system may include at least one inertial sensor.
The sensor system may include an external electromechanical system
coupled to the handheld ultrasonic probe for tracking a pose of the
handheld ultrasonic probe through direct measurements. The sensor
system may include an external optical system. The force sensor may
include a pressure transducer coupled to the handheld ultrasound
probe and configured to sense an instantaneous contact force
between the handheld ultrasound probe and the skin. The system may
include a lighting source mechanically coupled to the handheld
ultrasound probe and positioned to illuminate the skin surface when
the handheld ultrasound probe is placed for use against the skin
surface.
[0015] The system may include a linear drive system mechanically
coupled to the handheld ultrasound transducer and including a
control input, the linear drive system responsive to a control
signal received at the control input to translate the ultrasound
transducer along an actuation axis; and a controller electronically
coupled to the force sensor and the control input of the linear
drive system, the controller including processing circuitry
configured to generate the control signal to the control input in a
manner that maintains a substantially constant predetermined
contact force between the ultrasound transducer and the target.
[0016] In another aspect, a method disclosed herein includes
capturing an ultrasound image with a handheld ultrasound probe;
capturing a pose of the handheld ultrasound probe substantially
concurrently with the ultrasound image; capturing a contact force
of the handheld ultrasonic probe substantially concurrently with
the ultrasound image; and storing the ultrasound image and an
acquisition state for the ultrasound image, the acquisition state
including the pose and the contact force. The method may include
capturing a plurality of ultrasound images. The method may include
normalizing the ultrasound images to a common contact force.
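The patent does not specify the normalization algorithm, so the following is a deliberately simple stand-in: assuming the imaged tissue response varies smoothly with contact force, a frame at a reference force can be approximated by linearly blending the two recorded frames whose contact forces bracket it.

```python
import numpy as np

# Simple illustrative stand-in for force normalization (the patent does
# not specify the algorithm): linearly blend the two frames whose
# recorded contact forces bracket the reference force.
def normalize_to_force(frames, f_ref):
    # frames: list of (contact_force_newtons, image), sorted by force.
    forces = [f for f, _ in frames]
    if not forces[0] <= f_ref <= forces[-1]:
        raise ValueError("reference force outside recorded range")
    for (f0, img0), (f1, img1) in zip(frames, frames[1:]):
        if f0 <= f_ref <= f1:
            t = 0.0 if f1 == f0 else (f_ref - f0) / (f1 - f0)
            return (1 - t) * np.asarray(img0, float) + t * np.asarray(img1, float)
```

A production system would more likely model tissue deformation directly, but the interpolation above illustrates why the stored acquisition state (the per-frame force) is the enabling ingredient.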
BRIEF DESCRIPTION OF THE FIGURES
[0017] The invention and the following detailed description of
certain embodiments thereof may be understood by reference to the
following figures:
[0018] FIG. 1 is a perspective view of a handheld ultrasound probe
control device.
[0019] FIG. 2 is a schematic view of a handheld ultrasound
probe.
[0020] FIG. 3 is a flowchart of a process for force-controlled
acquisition of ultrasound images.
[0021] FIG. 4 shows a lumped parameter model of the mechanical
system of a probe as described herein.
[0022] FIG. 5 is a flowchart depicting operating modes of a
force-controlled ultrasound probe.
[0023] FIG. 6 shows a process for ultrasound image processing.
[0024] FIG. 7 is a schematic view of an ultrasound scanning
system.
[0025] FIG. 8 is a flowchart for a process for obtaining a
reconstructed volume of a target using a handheld ultrasound
probe.
[0026] FIG. 9 is a flowchart for a process for capturing an
acquisition state for an ultrasound scan.
[0027] FIG. 10 shows a fiducial marker.
[0028] FIG. 11 shows a system with a workflow for using acquisition
states.
DETAILED DESCRIPTION
[0029] The techniques described herein enable real-time control of
the contact force between an ultrasound probe and a target, such as
a patient's body. This allows ultrasound technicians to take ultrasound measurements of the target at a fixed or variably controlled contact force, as desired. This also facilitates measurement, tracking,
and/or control of the contact force in a manner that permits
enhanced, quantitative analysis and subsequent processing of
ultrasound image data.
[0030] FIG. 1 is a perspective view of a handheld ultrasound probe
control device. The device 100 may include a frame 118 adapted to
receive a probe 112, a linear drive system 122 that translates the
frame 118 along an actuation axis 114, a sensor 110 such as a force
sensor, a torque sensor, or some combination of these, and a
controller 120.
[0031] The probe 112 can be of any known type or construction. The
probe 112 may, for example, include a handheld ultrasound probe used
for medical imaging or the like. More generally, the probe 112 may
include any contact scanner or other device that can be employed in
a manner that benefits from the systems and methods described
herein. Thus, one advantage of the device 100 is that a standard
off-the-shelf ultrasound medical probe can be retrofitted for use as a force-controlled ultrasound probe in a relatively inexpensive way;
i.e., by mounting the probe 112 in the frame 118. Medical
ultrasound devices come in a variety of shapes and sizes, and the
frame 118 and other components may be adapted for a particular
size/shape of probe 112, or may be adapted to accommodate varying sizes and/or shapes. In another aspect, the probe 112 may be
integrated into the frame 118 or otherwise permanently affixed to
or in the frame 118.
[0032] In general, a probe 112 such as an ultrasound probe includes
an ultrasound transducer 124. The construction of suitable
ultrasound transducers is generally well known, and a detailed
description is not required here. In one aspect, an ultrasound
transducer includes piezoelectric crystals or similar means to
generate ultrasound waves and/or detect incident ultrasound. More
generally, any suitable arrangement for transmitting and/or
receiving ultrasound may be used as the ultrasound transducer 124.
Still more generally, other transceiving mechanisms or transducers
may also or instead be used to support imaging modalities other
than ultrasound.
[0033] The frame 118 may include any substantially rigid structure
that receives and holds the probe 112 in a fixed position and
orientation relative to the frame 118. The frame 118 may include an
opening that allows an ultrasound transducer 124 of the probe 112
to contact a patient's skin or other surface through which
ultrasound images are to be obtained. Although FIG. 1 shows the
probe 112 held within the frame 118 between two plates (a front
plate 128 bolted to a larger plate 130 on the frame 118) arranged
to surround a handheld ultrasound probe and securely affix the
probe to the frame 118, any suitable technique may also or instead
be employed to secure the probe 112 in a fixed relationship to the
frame 118. For example, the probe 112 may be secured with a press
fit, hooks, screws, anchors, adhesives, magnets, or any combination
of these and other fasteners. More generally, the frame 118 may include any structure or combination of structures suitable for securely retaining the probe 112 in a fixed positional relationship relative to the frame 118.
[0034] In one aspect, the frame 118 may be adapted for handheld
use, and more particularly adapted for gripping by a technician in
the same orientation as a conventional ultrasound probe. Without
limitation, this may include a trunk 140 or the like for gripping
by a user that extends axially away from the ultrasound transducer
124 and generally normal to the contact surface of the transducer
124. Stated alternatively, the trunk 140 may extend substantially
parallel to the actuation axis 114 and be shaped and sized for
gripping by a human hand. In this manner, the trunk 140 may be
gripped by a user in the same manner and orientation as a typical
handheld ultrasound probe. The linear drive system 122 may
advantageously be axially aligned with the trunk 140 to permit a
more compact design consistent with handheld use. That is, a
ballscrew or similar linear actuator may be aligned to pass through
the trunk 140 without diminishing or otherwise adversely affecting
the range of linear actuation.
[0035] The linear drive system 122 may be mounted on the device 100
and may include a control input electronically coupled to the
controller 120. The linear drive system 122 may be configured to
translate the probe 112 along an actuation axis 114 in response to
a control signal from the controller 120 to the control input of
the linear drive system 122. Although the linear drive system 122
is depicted by way of example as a motor 102 and a linear actuator
104, any system capable of linearly moving the probe 112 can be
employed. For example, the linear drive system 122 can include a
mechanical actuator, hydraulic actuator, pneumatic actuator,
piezoelectric actuator, electro-mechanical actuator, linear motor,
telescoping linear actuator, ballscrew-driven linear actuator, and
so on. More generally, any actuator or combination of actuators
suitable for use within a grippable, handheld form factor such as
the trunk 140 may be suitably employed as the linear drive system
122. In some implementations, the linear drive system 122 is
configured to have a low backlash (e.g., less than 3 μm) or no
backlash in order to improve positional accuracy and
repeatability.
[0036] The ability of the probe 112 to travel along the actuation
axis 114 permits the technician some flexibility in hand placement
while using the device 100. In some implementations, the probe 112
can travel up to six centimeters along the actuation axis 114,
although greater or lesser ranges of travel may be readily
accommodated with suitable modifications to the linear actuator 104
and other components of the device 100.
[0037] The motor 102 may be electrically coupled to the controller
120 and mechanically coupled in a fixed positional relationship to
the linear actuator 104. The motor 102 may be configured to drive
the linear actuator 104 in response to control signals from the
controller 120, as described more fully below. The motor 102 can
include a servo motor, a DC stepper motor, a hydraulic pump, a
pneumatic pump, and so on.
[0038] The sensor 110, which may include a force sensor and/or a
torque sensor, may be mechanically coupled to the frame 118, such
as in a fixed positional relationship to sense forces/torques
applied to the frame 118. The sensor 110 may also be electronically
coupled to the controller 120, and configured to sense a contact
force between the probe 112 and a target surface (also referred to
herein simply as a "target") such as a body from which ultrasound
images are to be captured. As depicted, the sensor 110 may be
positioned between the probe 112 and the back plate of the frame
118. Other deployments of the sensor 110 are possible, so long as
the sensor 110 is capable of detecting the contact force (for a
force sensor) between the probe 112 and the target surface.
Embodiments of the sensor 110 may also or instead include a
multi-axis force/torque sensor, a plurality of separate force
and/or torque sensors, or the like.
[0039] The force sensor may be mechanically coupled to the
ultrasound probe 112 and configured to obtain a pressure applied by
the ultrasound probe 112 to the skin surface or a target 136. The
force sensor may include a pressure transducer coupled to the
ultrasound probe 112 and configured to sense an instantaneous
contact force between the handheld ultrasound probe 112 and the
skin.
[0040] The sensor 110 can provide output in any known form, and
generally provides a signal indicative of forces and/or torques
applied to the sensor 110. For example, the sensor 110 can produce
analog output such as a voltage or current proportional to the
force or torque detected. Alternatively, the sensor 110 may produce
digital output indicative of the force or torque detected.
Moreover, digital-to-analog or analog-to-digital converters (not
shown) can be deployed at any point between the sensors and other
components to convert between these modes. Similarly, the sensor
110 may provide radio signals (e.g., for wireless configurations),
optical signals, or any other suitable output that can characterize
forces and/or torques for use in the device 100 described
herein.
[0041] The controller 120 generally includes processing circuitry
to control operation of the device 100 as described herein. The
controller 120 may receive signals from the sensor 110 indicative
of force/torque, and may generate a control signal to a control
input of the linear drive system 122 (or directly to the linear
actuator 104) for maintaining a given contact force between the
ultrasound probe 112 and the target, as described more fully below.
The controller 120 may include analog or digital circuitry,
computer program code stored in a non-transitory computer-readable
storage medium, and so on. Embodiments of the controller 120 may
employ pure force control, impedance control, contact
force-determined position control, and so on.
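As one concrete illustration of pure force control (a sketch, not the patent's controller; the gains and signal names are hypothetical), a discrete proportional-integral loop can convert the force error into a drive command for the linear actuator:

```python
# Sketch of a discrete proportional-integral force controller. Gains,
# sample time, and signal conventions are hypothetical examples.
class ForceController:
    def __init__(self, kp=0.5, ki=0.1, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, f_measured, f_target):
        # Positive output drives the probe toward the target surface,
        # increasing contact force; negative output retracts it.
        error = f_target - f_measured
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral
```

Each control cycle, the measured contact force from the sensor 110 is compared against the setpoint and the resulting command is sent to the control input of the linear drive system 122.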
[0042] The controller 120 may be configured with preset limits
relating to operational parameters such as force, torque, velocity,
acceleration, position, current, etc. so as to immediately cut
power from the linear drive system 122 when any of these
operational parameters exceed the preset limits. In some
implementations, these preset limits are determined based on the
fragility of the target. For example, one set of preset limits may
be selected where the target is a healthy human abdomen, another
set of preset limits may be selected where the target is a human
abdomen of an appendicitis patient, etc. In addition, preset limits
for operational parameters may be adjusted to accommodate
discontinuities such as initial surface contact or termination of
an ultrasound scan (by breaking contact with a target surface).
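The preset-limit cut-off described above amounts to a watchdog check on each sensor reading. A sketch follows; the limit values are examples only, since the patent selects limits based on the fragility of the target.

```python
# Sketch of the preset-limit safety check. Limit values are examples
# only; the patent selects limits based on the fragility of the target.
PRESET_LIMITS = {
    "force_n": 20.0,        # maximum contact force
    "velocity_mm_s": 50.0,  # maximum carriage velocity
    "current_a": 2.0,       # maximum motor current
}

def exceeded_limits(readings, limits=PRESET_LIMITS):
    # Returns the parameters that exceed their preset limit; any
    # non-empty result would trigger an immediate power cut-off.
    return [name for name, value in readings.items()
            if abs(value) > limits.get(name, float("inf"))]
```

Swapping in a different limits dictionary per scan corresponds to the patent's selection of one set of limits for a healthy abdomen and another for a more fragile target.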
[0043] In some implementations, the device 100 includes a
servo-motor-driven ballscrew linear actuator comprising a MAXON
servo motor (EC-Max #272768) (motor 102) driving an NSK MONOCARRIER
compact ballscrew actuator (linear actuator 104). A MINI40 six-axis
force/torque sensor (sensor 110) from ATI INDUSTRIAL AUTOMATION,
which simultaneously monitors all three force and all three torque
axes, may be mounted to the carriage of the actuator, and a TERASON
5 MHz ultrasound transducer (ultrasound transducer 124) may be
mounted to the force/torque sensor.
[0044] The vector from a geometric origin of the sensor 110 to an
endpoint at the ultrasound transducer 124 that contacts a patient can be used to
map the forces and torques at the sensor 110 into the contact
forces and torques seen at the probe/patient interface. In some
implementations, it is possible to maintain a set contact force
with a mean error of less than 0.2% and, in a closed-loop system,
maintain a desired contact force with a mean steady state error of
about 2.1%, and attain at least 20 Newtons of contact force. More
generally, in one embodiment a steady state error of less than 3%
was achieved for applied forces ranging from one to seven
Newtons.
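The mapping described above is the standard translation of a wrench between reference points: moving the reference from the sensor origin to the probe tip at offset r leaves the force unchanged and shifts the torque by the moment of that force about the new point. A minimal sketch:

```python
import numpy as np

# Standard wrench translation (a sketch of the mapping described above):
# moving the reference point from the sensor origin to the probe tip at
# offset r (in the sensor frame) leaves the force unchanged and shifts
# the torque by -r x F.
def wrench_at_tip(f_sensor, tau_sensor, r):
    f = np.asarray(f_sensor, float)
    tau = np.asarray(tau_sensor, float) - np.cross(np.asarray(r, float), f)
    return f, tau
```

For example, a 1 N lateral force sensed 10 cm behind the tip contributes a 0.1 N·m torque at the contact point even when the sensor itself reads zero torque.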
[0045] Other sensors (indicated generically as a second sensor 138)
may be included without departing from the scope of this invention.
For example, a second sensor 138 such as an orientation sensor or
the like may be included, which may be operable to independently
detect at least one of a position and an orientation of the device
100, such as to track location and/or orientation of the device 100
before, during, and after use. This data may help to further
characterize operation of the device 100. A second sensor 138 such
as a range or proximity detector may be employed to anticipate an
approaching contact surface and place the device 100 in a state to
begin an ultrasound scan. For example, a proximity sensor may be
operable to detect a proximity of the ultrasound transducer 124 to
a subject (e.g., the target surface). One or more inertial sensors
may be included in the device 100. Suitable inertial sensors
include, for example, inertial sensors based on MEMS technology
such as accelerometers and gyroscopes, or any other device or
combination of devices that measure motion. More generally, any of
a variety of sensors known in the art may be used to augment or
supplement operation of the device 100 as contemplated herein.
[0046] The ultrasound probe may further include a lighting source for illuminating the skin surface when the handheld ultrasound probe is placed for use against the skin surface. For example, the lighting source may be mechanically coupled to the handheld ultrasound probe and positioned to illuminate the skin surface during use of the ultrasound probe. The lighting source may be part of the sensor system of the ultrasound probe, or the lighting source may be a separate device directed toward the ultrasound probe.
Suitable lighting sources include an LED light or any other light
capable of illuminating the skin surface during ultrasound
scanning.
[0047] Another sensor that may be included in the device 100 is a
camera 132. The camera 132 may be positioned to record a digital
image of the skin surface 136 during an ultrasound scan when the
handheld ultrasound probe 112 is placed for use against the skin
surface 137 of the target 136. The camera 132 also may be
positioned to obtain a pose of the handheld ultrasound probe 112 as
the ultrasound transducer 124 scans the target 136. The camera 132
may be mechanically coupled to the ultrasound transducer 124. In
one aspect, the camera 132 may be rigidly mounted to the ultrasound
transducer 124 and directed toward the skin surface 137 (when
positioned for use) in order to capture images of the skin surface
137 and/or a fiducial marker 134 adhered to the skin surface 137. In another
aspect, the camera 132 may be mounted separate from the ultrasound
probe 112 and directed toward an area of use of the ultrasound
probe 112 so that the camera 132 can capture images of the
ultrasound probe 112 in order to derive pose information directly
from images of the ultrasound probe 112. Suitable cameras 132 may
for example include any commercially available digital camera or
digital video camera designed to capture images of sufficient
quality for use as contemplated herein.
[0048] The ultrasound probe 112 may have an integral structure with
various components coupled directly to a body thereof, or one or
more of the various functions of one or more of the components of
the ultrasound probe may be distributed among one or more
independent devices. For example, the camera, the lighting source,
and any other sensors may be integrated into the ultrasound probe
or they may be separate from the ultrasound probe, along with
suitable communications and control systems where coordination of
function is desired between the probe and the external
components.
[0049] The ultrasound probe 112 may be used to capture an
ultrasound image of a target 136 through a skin surface 137. A
fiducial marker 134 with predetermined dimensions may be applied to
the skin surface 137 of the target 136 that is to be scanned by the
ultrasound probe 112. The fiducial marker 134 may have any desired
dimension or shape, such as a square, a rectangle, a circle, and/or
any other regular, irregular, and/or random shape and/or pattern.
In one embodiment, the fiducial marker 134 may be a 3 mm × 3 mm
square. The fiducial marker 134 may be made of a thin material.
Suitable materials include, but are not limited to, any materials
that will not obstruct the transducer from obtaining an ultrasound
scan of the target 136. The fiducial marker 134 may be adhered to
the skin surface 137 of a target 136 using any suitable methods
and/or any suitable adhesives. In another aspect, the fiducial
marker 134 may be stamped, inked or otherwise applied to the skin
surface using ink or any other suitable, visually identifiable
marking material(s).
[0050] FIG. 2 is a schematic depiction of a handheld
force-controlled ultrasound probe. The probe 200, which may be a
force-controlled ultrasound probe, generally includes a sensor 110,
a controller 120, a linear drive system 122, and an ultrasound
transducer 124 as described above.
[0051] In contrast to the probe 112 mounted in the device 100 as
described in FIG. 1, the probe 200 of FIG. 2 may have the sensor
110, controller 120, and linear drive system 122 integrally mounted
(as opposed to mounted in a separate device 100) in a single device
to provide a probe 200 with an integral structure. In FIG. 2, the
components are all operable to gather ultrasound images at measured
and/or controlled forces and torques, as described above with
reference to FIG. 1. More generally, the various functions of the
above-described components may be distributed across several
independent devices in various ways (e.g., an ultrasound probe with
integrated force/torque sensors but external drive system, an
ultrasound probe with an internal drive system but external control
system, etc.). In one aspect, a wireless handheld probe 200 may be
provided that transmits sensor data and/or ultrasound data
wirelessly to a remote computer that captures data for subsequent
analysis and display. All such permutations are within the scope of
this disclosure.
[0052] The ultrasound transducer 124 can include a medical
ultrasonic transducer, an industrial ultrasonic transducer, or the
like. Like the ultrasound probe 112 described above with reference
to FIG. 1, it will be appreciated that a variety of embodiments of
the ultrasound transducer 124 are possible, including embodiments
directed to non-medical applications such as nondestructive
ultrasonic testing of materials and objects and the like, or more
generally, transducers or other transceivers or sensors for
capturing data instead of or in addition to ultrasound data. Thus,
although reference is made to an "ultrasound probe" in this
document, the techniques described herein are more generally
applicable to any context in which the transmission of energy
(e.g., sonic energy, electromagnetic energy, thermal energy, etc.)
from or through a target varies as a function of the contact force
between the energy transmitter and the target.
[0053] Other inputs/sensors may be usefully included in the probe
200. For example, the probe 200 may include a limit switch 202 or
multiple limit switches 202. These may be positioned at any
suitable location(s) to detect limits of travel of the linear drive
system 122, and may be used to prevent damage or other malfunction
of the linear drive system 122 or other system components. The
limit switch(es) may be electronically coupled to the controller
120 and provide a signal to the controller 120 to indicate when the
limit switch 202 detects an end of travel of the linear drive
system along the actuation axis. The limit switch 202 may include
any suitable electro-mechanical sensor or combination of sensors
such as a contact switch, proximity sensor, range sensor, magnetic
coupling, and so forth.
[0054] The probe 200 may also or instead include one or more user
inputs 204. These may be physically realized by buttons, switches,
dials, or the like on the probe 200. The user inputs 204 may be
usefully positioned in various locations on an exterior of the
probe 200. For example, the user inputs 204 may be positioned where
they are readily finger-accessible while gripping the probe 200 for
a scan. In another aspect, the user inputs 204 may be positioned
away from usual finger locations so that they are not accidentally
activated while manipulating the probe 200 during a scan. The user
inputs 204 may generally be electronically coupled to the
controller 120, and may support or activate functions such as
initiation of a scan, termination of a scan, selection of a current
contact force as the target contact force, storage of a current
contact force in memory for subsequent recall, or recall of a
predetermined contact force from memory. Thus, a variety of
functions may be usefully controlled by a user with the user inputs
204.
[0055] A memory 210 may be provided to store ultrasound data from
the ultrasound transducer and/or sensor data acquired from any of
the sensors during an ultrasound scan. The memory 210 may be
integrally built into the probe 200 to operate as a standalone
device, or the memory 210 may include remote storage, such as in a
desktop computer, network-attached storage, or other device with
suitable storage capacity. In one aspect, data may be wirelessly
transmitted from the probe 200 to the memory 210 to permit wireless
operation of the probe 200. The probe 200 may include any suitable
wireless interface 220 to accommodate such wireless operation, such
as for wireless communications with a remote storage device (which
may include the memory 210). The probe 200 may also or instead
include a wired communications interface for serial, parallel, or
networked communication with external components.
[0056] A display 230 may be provided, which may receive wired or
wireless data from the probe 200. The display 230 and memory 210
may be a display and memory of a desktop computer or the like, or
may be standalone accessories to the probe 200, or may be
integrated into a medical imaging device that includes the probe
200, memory 210, display 230 and any other suitable hardware,
processor(s), and the like. The display 230 may display ultrasound
images obtained from the probe 200 using known techniques. The
display 230 may also or instead display a current contact force or
instantaneous contact force measured by the sensor 110, which may
be superimposed on a corresponding ultrasound image or in another
display region of the display 230. Other useful information, such
as a target contact force, an actuator displacement, or an
operating mode, may also or instead be usefully rendered on the
display 230 to assist a user in obtaining ultrasound images.
[0057] A processor 250 may also be provided. In one aspect, the
processor 250, memory 210, and display 230 are a desktop or laptop
computer. In another aspect, these components may be separate, or
some combination of these. For example, the display 230 may be a
supplemental display provided for use by a doctor or technician
during an ultrasound scan. The memory 210 may be a network-attached
storage device or the like that logs ultrasound images and other
acquisition state data. The processor 250 may be a local or remote
computer provided for post-scan or in-scan processing of data. In
general, the processor 250 and/or a related computing device may
have sufficient processing capability to perform the quantitative
processing described below. For example, the processor 250 may be
configured to process an image of a subject from the ultrasound
transducer 124 of the probe 200 to provide an estimated image of
the subject at a predetermined contact force of the ultrasound
transducer. This may, for example, be an estimate of the image at
zero Newtons (no applied force), or an estimate of the image at
some positive value (e.g., one Newton) selected to normalize a
plurality of images from the ultrasound transducer 124. Details of
this image processing are provided below by way of example with
reference to FIG. 6.
[0058] FIG. 3 is a flowchart of a process for force-controlled
acquisition of ultrasound images. The process 300 can be performed,
e.g., using a handheld ultrasound probe 112 mounted in a device
100, or a handheld ultrasound probe 200 with integrated force
control hardware.
[0059] As shown in step 302, the process 300 may begin by
calibrating the force and/or torque sensors. The calibration step
serves to minimize (or ideally, eliminate) errors associated with
the weight of the ultrasound probe or the angle at which the
sensors are mounted with respect to the ultrasound transducer, and
may be performed using a variety of calibration techniques known in
the art.
[0060] To compensate for the mounting angle, the angle between the
sensor axis and the actuation axis may be independently measured
(e.g., when the sensor is installed). This angle may be
subsequently stored for use by the controller to combine the
measured forces and/or torques along each axis into a single
vector, using standard coordinate geometry (e.g., for a mounting
angle θ, scaling the appropriate measured forces by sin(θ) and
cos(θ) prior to combining them).
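By way of a non-limiting illustration, this projection might be sketched as follows (the function name and argument layout are hypothetical, not part of the disclosure):

```python
import math

def actuation_axis_force(f_axial, f_lateral, mount_angle_rad):
    """Combine forces measured along a tilted sensor's axes into a single
    force along the actuation axis, using the mounting angle recorded at
    installation (illustrative helper, not the disclosure's controller)."""
    return (f_axial * math.cos(mount_angle_rad)
            + f_lateral * math.sin(mount_angle_rad))
```

With a zero mounting angle the sensor's axial reading passes through unchanged, as expected.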
[0061] To compensate for the weight of the ultrasound probe, a
baseline measurement may be taken, during a time at which the
ultrasound probe is not in contact with the target. Any measured
force may be modeled as due either to the weight of the ultrasound
probe, or bias inherent in the sensors. In either case, the
baseline measured force may be recorded, and may be subtracted from
any subsequent force measurements. Where data concerning
orientation of the probe is available, this compensation may also
be scaled according to how much the weight is contributing to a
contact force normal to the contact surface. Thus for example an
image from a side (with the probe horizontal) may have no
contribution to contact force from the weight of the probe, while
an image from a top (with the probe vertical) may have the entire
weight of the probe contributing to a normal contact force. This
variable contribution may be estimated and used to adjust
instantaneous contact force measurements obtained from the
probe.
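The baseline and orientation-scaled weight compensation might be sketched as follows, assuming the sensor bias and probe weight have been separated during a no-contact baseline (names and the cosine scaling convention are illustrative assumptions):

```python
import math

def compensate(raw_force, sensor_bias, probe_weight, tilt_rad):
    """Sketch of the compensation in paragraph [0061]. sensor_bias and
    probe_weight (Newtons) come from a no-contact baseline; tilt_rad is
    the probe axis angle from vertical, so the weight's contribution to
    the normal contact force scales with cos(tilt_rad): the full weight
    contributes when the probe is vertical, none when it is horizontal."""
    weight_normal = probe_weight * math.cos(tilt_rad)
    return raw_force - sensor_bias - weight_normal
```
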
[0062] As shown in step 304, a predetermined desired force may be
identified. In some implementations, the desired force is simply a
constant force. For example, in imaging a human patient, a constant
force of less than or equal to 20 Newtons is often desirable for the
comfort and safety of the patient.
[0063] In some implementations, the desired force may vary as a
function of time. For example, it is often useful to "poke" a
target in a controlled manner, and acquire images of the target as
it deforms during or after the poke. The desired force may also or
instead include a desired limit (minimum or maximum) to manually
applied force. In some implementations, the desired force may
involve a gradual increase of force given by a function F(t) to a
force F_max at a time t_max, and then a symmetric reduction of force
until the force reaches zero. Such a function is often referred to
as a "generalized tent map," and may be given by the function
G(t) = F(t) if t < t_max, and G(t) = F_max − F(t − t_max) for
t ≥ t_max. When F is a linear function, the graph of G(t) resembles
a tent, hence the name. In another aspect, a desired force function may involve
increasing the applied force by some function F(t) for a specified
time period until satisfactory imaging (or patient comfort) is
achieved, and maintaining that force thereafter until completion of
a scan. The above functions are given by way of example. In
general, any predetermined force function can be used.
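The generalized tent map described above might be sketched as:

```python
def tent_force(t, f_of_t, t_max):
    """Generalized tent map G(t) from paragraph [0063]: the force
    follows F(t) up to t_max, then decreases symmetrically back toward
    zero. F_max is F evaluated at t_max."""
    f_max = f_of_t(t_max)
    if t < t_max:
        return f_of_t(t)
    return f_max - f_of_t(t - t_max)
```

With a linear F(t) = 2t and t_max = 5, for example, the commanded force rises to 10 and falls symmetrically back to zero at t = 10.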
[0064] As shown in step 306, the output from the force and/or
torque sensors may be read as sensor inputs to a controller or the
like.
[0065] As shown in step 308, these sensor inputs may be compared to
the desired force function to determine a force differential. In
some implementations, the comparison can be accomplished by
computing an absolute measure such as the difference of the sensor
output with the corresponding desired sensor output. Similarly, a
relative measure such as a ratio of output to the desired output
can be computed. Many other functions can be used.
[0066] As shown in step 310, a control signal may be generated
based on the comparison of actual-to-desired sensor outputs (or,
from the perspective of a controller/processor, sensor inputs). The
control signal may be such that the linear drive system is
activated in such a way as to cause the measured force and/or
torque to be brought closer to a desired force and/or torque at a
given time. For example, if a difference between the measured force
and the desired force is computed, then the drive system can
translate the probe with a force whose magnitude is proportional to
the difference, and in a direction to reduce or minimize the
difference. Similarly, if a ratio of the desired force and measured
force is computed, then the drive system can translate the probe
with a force whose magnitude is proportional to one minus this
ratio.
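The two comparisons described above might be sketched as a single command function; the gain constant and the exact ratio convention are assumptions, since the disclosure leaves them open:

```python
def drive_command(measured, desired, gain, use_ratio=False):
    """Control-signal sketch for paragraph [0066]. In difference mode
    the commanded drive force is proportional to (desired - measured),
    acting to shrink the error; in ratio mode its magnitude is
    proportional to one minus the measured/desired ratio."""
    if not use_ratio:
        return gain * (desired - measured)
    return gain * (1.0 - measured / desired)
```
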
[0067] More generally, any known techniques from control theory can
be used to drive the measured force towards the desired force.
These techniques include linear control algorithms,
proportional-integral-derivative ("PID") control algorithms, fuzzy
logic control algorithms, etc. By way of example, the control
signal may be damped in a manner that avoids sharp movements of the
probe against a patient's body. In another aspect, a closed-loop
control system may be adapted to accommodate ordinary variations in
a user's hand position. For example, a human hand typically has
small positional variations with an oscillating frequency of about
four Hertz to about twenty Hertz. As such, the controller may be
configured to compensate for an oscillating hand movement of a user
at a frequency between four Hertz and thirty Hertz or any other
suitable range. Thus, the system may usefully provide a time
resolution finer than twenty Hertz or thirty Hertz, accompanied by
an actuation range within the time resolution larger than typical
positional variations associated with jitter or tremors in an
operator's hand.
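As one illustration of the PID option named above (gains and time step are placeholders, not values from this disclosure):

```python
class PIDController:
    """Textbook discrete PID loop, one of the control algorithms listed
    in paragraph [0067]. Appropriate gains and damping for a given probe
    would come from the mechanical model, not from this sketch."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```
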
[0068] As shown in step 312, the ultrasound probe can acquire an
image, a fraction of an image, or more than one image. It will be
understood that this may generally occur in parallel with the force
control steps described above, and images may be captured at any
suitable increment independent of the time step or time resolution
used to provide force control. The image(s) (or fractions thereof)
may be stored together with contact force and/or torque information
(e.g., instantaneous contact force and torque) applicable during
the image acquisition. In some implementations, the contact force
and/or torque information includes all the information produced by
the force and/or torque sensors, such as the moment-by-moment
output of the sensors over the time period during which the image
was acquired. In some implementations, other derived quantities can
be computed and stored, such as the average or mean contact force
and/or torque, the maximum or minimum contact force and/or torque,
and so forth.
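The derived quantities named above might be computed as follows (the record layout is illustrative):

```python
from statistics import mean

def summarize_acquisition(frames):
    """frames is a list of (image, force_samples) pairs as stored in
    step 312, each image paired with the moment-by-moment force sensor
    output recorded while it was acquired. Returns the per-frame
    derived quantities named in paragraph [0068]."""
    return [{"image": image,
             "mean_force": mean(forces),
             "max_force": max(forces),
             "min_force": min(forces)}
            for image, forces in frames]
```
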
[0069] It will be understood that the steps of the above methods
may be varied in sequence, repeated, modified, or deleted, or
additional steps may be added, all without departing from the scope
of this disclosure. By way of example, the step of identifying a
desired force may be performed a single time where a constant force
is required, or continuously where a time-varying applied force is
desired. Similarly, measuring contact force may include measuring
instantaneous contact force or averaging a contact force over a
sequence of measurements during which an ultrasound image is
captured. In addition, operation of the probe in clinical settings
may include various modes of operation each having different
control constraints. Some of these modes are described below with
reference to FIG. 5. Thus, the details of the foregoing will be
understood as non-limiting examples of the systems and methods of
this disclosure.
[0070] FIG. 4 shows a lumped parameter model of the mechanical
system of a probe as described herein. While a detailed
mathematical derivation is not provided, and the lumped model
necessarily abstracts away some characteristics of an ultrasound
probe, the model of FIG. 4 provides a useful analytical framework
for creating a control system that can be realized using the
controller and other components described above to achieve
force-controlled acquisition of ultrasound images.
[0071] In general, the model 400 characterizes a number of lumped
parameters of a controlled-force probe. The physical parameters for
an exemplary embodiment are as follows. M_u is the mass of the
ultrasound probe and mounting hardware, which may be 147 grams.
M_c is the mass of a frame that secures the probe, which may be
150 grams. M_s is the mass of the linear drive system, which may be
335 grams. k_F/T is the linear stiffness of a force sensor, which
may be 1.1×10^5 N/m. k_e is the target skin stiffness, which may be
845 N/m. b_e is the viscous damping coefficient of the target, which
may be 1500 N·s/m. k_t is the user's total limb stiffness, which may
be 1000 N/m. b_t is the user's total limb viscous damping
coefficient, which may be 5000 N·s/m. b_c is the frame viscous
damping coefficient, which may be 0 N·s/m. k_C is the stiffness of
the linear drive system, which may be 3×10^7 for an exemplary
ballscrew and nut drive. K_T is the motor torque constant, which may
be 0.0243 N·m/A. β_b is the linear drive system viscous damping,
which may be 2×10^-4 for an exemplary ballscrew and motor rotor.
L is the linear drive system lead, which may be 3×10^-4 for an
exemplary ballscrew. J_tot is the moment of inertia, which may be
1.24×10^-6 kg·m^2 for an exemplary ballscrew and motor rotor.
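For convenience, the exemplary values above might be collected as follows; the series-spring deflection at the end is a toy use of two of the parameters, not the disclosure's control derivation:

```python
# Exemplary lumped parameters from paragraph [0071], in SI units where
# the text gives them (masses converted from grams to kilograms).
LUMPED_PARAMS = {
    "M_u": 0.147,      # probe and mounting hardware mass, kg
    "M_c": 0.150,      # frame mass, kg
    "M_s": 0.335,      # linear drive system mass, kg
    "k_FT": 1.1e5,     # force sensor linear stiffness, N/m
    "k_e": 845.0,      # target skin stiffness, N/m
    "b_e": 1500.0,     # target viscous damping, N*s/m
    "k_t": 1000.0,     # user's total limb stiffness, N/m
    "b_t": 5000.0,     # user's total limb viscous damping, N*s/m
    "b_c": 0.0,        # frame viscous damping, N*s/m
    "k_C": 3e7,        # linear drive stiffness
    "K_T": 0.0243,     # motor torque constant, N*m/A
    "beta_b": 2e-4,    # linear drive viscous damping
    "L": 3e-4,         # linear drive lead
    "J_tot": 1.24e-6,  # moment of inertia, kg*m^2
}

def static_sensor_skin_deflection(force_N, p=LUMPED_PARAMS):
    """Toy illustration: the force sensor and skin act as springs in
    series, so a static contact force produces this total deflection."""
    k_series = 1.0 / (1.0 / p["k_FT"] + 1.0 / p["k_e"])
    return force_N / k_series
```
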
[0072] Using these values, the mechanical system can be
mathematically modeled, and a suitable control relationship for
implementation on the controller can be determined that permits
application of a controlled force to the target surface by the
probe. Stated differently, the model may be employed to relate
displacement of the linear drive system to applied force in a
manner that permits control of the linear drive system to achieve
an application of a controlled force to the target surface. It will
be readily appreciated that the lumped model described above is
provided by way of illustration and not limitation. Variations may
be made to the lumped model and the individual parameters of the
model, either for the probe described above or for probes having
different configurations and characteristics, and any such model
may be usefully employed provided it yields a control model
suitable for implementation on a controller as described above.
[0073] FIG. 5 is a flowchart depicting operating modes of a
force-controlled ultrasound probe. While the probe described above
may be usefully operated in a controlled-force mode as discussed
above, use of the handheld probe in clinical settings may benefit
from a variety of additional operating modes for varying
circumstances such as initial contact with a target surface or
termination of a scan. Several useful modes are now described in
greater detail.
[0074] In general, the process 500 includes an initialization mode
510, a scan initiation mode 520, a controlled-force mode 530, and a
scan termination mode 540, ending in termination 550 of the process
500.
[0075] As shown in step 510, an initialization may be performed on
a probe. This may include, for example, powering on various
components of the probe, establishing a connection with remote
components such as a display, a memory, and the like, performing
any suitable diagnostic checks on components of the probe, and
moving a linear drive system to a neutral or ready position, which
may for example be at a mid-point of a range of movement along an
actuation axis.
[0076] As shown in step 522, the scan initiation mode 520 may begin
by detecting a force against the probe using a sensor, such as any
of the sensors described above. In general, prior to contact with a
target surface such as a patient, the sensed force may be at or
near zero. In this state, it would be undesirable for the linear
drive system to move to a limit of actuation in an effort to
achieve a target controlled force. As such, the linear drive system
may remain inactive and in a neutral or ready position during this
step.
[0077] As shown in step 524, the controller may check to determine
whether the force detected in step 522 is at or near a
predetermined contact force such as the target contact force for a
scan. If the detected force is not yet at (or sufficiently close
to) the target contact force, then the initiation mode 520 may
return to step 522 where an additional force measurement is
acquired. If the force detected in step 522 is at or near the
predetermined contact force, the process 500 may proceed to the
controlled-force mode 530. Thus, a controller disclosed herein may
provide an initiation mode in which a linear drive system is placed
in a neutral position and a force sensor is measured to monitor an
instantaneous contact force, the controller transitioning to
controlled-force operation when the instantaneous contact force
meets a predetermined threshold. The predetermined threshold may be
the predetermined contact force that serves as the target contact
force for controlled-force operation, or the predetermined
threshold may be some other limit such as a value sufficiently
close to the target contact force so that the target contact force
can likely be readily achieved through actuation of the linear
drive system. The predetermined threshold may also or instead be
predictively determined, such as by measuring a change in the
measured contact force and extrapolating (linearly or otherwise) to
estimate when the instantaneous contact force will equal the target
contact force.
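The predictive extrapolation described above might be sketched from two force samples as follows (the two-sample linear form is one simple choice; the disclosure permits other extrapolations):

```python
def predicted_crossing_time(t0, f0, t1, f1, target):
    """Linearly extrapolate from two contact-force samples (t0, f0) and
    (t1, f1) to estimate when the force will reach `target`, per the
    predictive threshold of paragraph [0077]. Returns None if the
    force is not rising."""
    slope = (f1 - f0) / (t1 - t0)
    if slope <= 0:
        return None
    return t1 + (target - f1) / slope
```
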
[0078] As shown in step 532, the controlled-force mode 530 may
begin by initiating controlled-force operation, during which a
control system may be executed in the controller to maintain a
desired contact force between the probe and a target, all as
generally discussed above.
[0079] While in the controlled-force mode 530, other operations may
be periodically performed. For example, as shown in step 534, the
current contact force may be monitored for rapid changes. In
general, a rapid decrease in contact force may be used to infer
that a probe operator has terminated a scan by withdrawing the
probe from contact with a target surface. This may be, for example,
a step decrease in measured force to zero, or any other pattern of
measured force that deviates significantly from expected values
during an ongoing ultrasound scan. If there is a rapid change in
force, then the process 500 may proceed to the termination mode
540. It will be appreciated that this transition may be terminated
where the force quickly returns to expected values, and the process
may continue in the controlled-force mode 530 even where there are
substantial momentary variations in measured force. As shown in
step 536, limit detectors for a linear drive system may be
periodically (or continuously) monitored to determine whether an
actuation limit of the linear drive system has been reached. If no
such limit has been reached, the process 500 may continue in the
controlled-force mode 530 by proceeding for example to step 537. If
an actuation limit has been reached, then the process may proceed
to termination 550 where the linear drive system is disabled. It
will be appreciated that the process 500 may instead proceed to the
termination mode 540 to return the linear drive system to a neutral
position for future scanning.
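A minimal sketch of the rapid-decrease inference, with an assumed smoothing window and drop threshold (tuning values are not specified in the disclosure):

```python
def scan_withdrawn(force_history, window=5, drop_fraction=0.8):
    """Infer probe withdrawal (paragraph [0079]) when the latest
    contact-force sample has fallen by more than drop_fraction
    relative to the average of the preceding `window` samples."""
    if len(force_history) <= window:
        return False
    recent = force_history[-(window + 1):-1]
    baseline = sum(recent) / window
    return force_history[-1] < (1.0 - drop_fraction) * baseline
```

Comparing against a short moving average rather than a single prior sample helps tolerate the momentary variations mentioned above.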
[0080] As shown in step 537, a contact force, such as a force
measured with any of the force sensors described above, may be
displayed in a monitor or the like. It will be appreciated that the
contact force may be an instantaneous contact force or an average
contact force for a series of measurements over any suitable time
interval. The contact force may, for example, be displayed
alongside a target contact force or other data. As shown in step
538, ultrasound images may be displayed using any known technique,
which display may be alongside or superimposed with the force data
and other data described above.
[0081] As shown in step 542, when a rapid force change or other
implicit or explicit scan termination signal is received, the
process 500 may enter a scan termination mode 540 in which the
linear drive system returns to a neutral or ready position using
any suitable control algorithm, such as a controlled-velocity
algorithm that returns to a neutral position (such as a mid-point
of an actuation range) at a constant, predetermined velocity. When
the linear drive system has returned to the ready position, the
process 500 may proceed to termination as shown in step 550, where
operation of the linear drive system is disabled or otherwise
terminated.
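The constant-velocity return to neutral might be sketched as a simple stepping loop (the generator interface and step size are illustrative):

```python
def return_to_neutral(position, neutral, speed, dt):
    """Step the drive toward the neutral position at a constant,
    predetermined velocity, as in the termination mode of paragraph
    [0081]. Yields each commanded position until neutral is reached."""
    step = speed * dt
    while abs(position - neutral) > step:
        position += step if neutral > position else -step
        yield position
    yield neutral
```
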
[0082] Thus, it will be appreciated that a method or system
disclosed herein may include operation in at least three distinct
modes to accommodate intuitive user operation during initiation of
a scan, controlled-force scanning, and controlled-velocity exit
from a scanning mode. Variations to each mode will be readily
envisioned by one of ordinary skill in the art and are intended to
fall within the scope of this disclosure. Thus, for example any one
of the modes may be entered or exited by explicit user input. In
addition, the method may accommodate various modes of operation
using the sensors and other hardware described above. For example
the controlled-force mode 530 may provide for user selection or
input of a target force for controlled operation using, e.g., any
of the user inputs described above.
[0083] More generally, the steps described above may be modified,
reordered, or supplemented in a variety of ways. By way of example,
the controlled-force mode of operation may include a
controlled-velocity component that limits a rate of change in
position of the linear drive system. Similarly, the
controlled-velocity mode for scan termination may include a
controlled-force component that checks for possible recovery of
controlled-force operation while returning the linear drive system
to a neutral position. All such variations, and any other
variations that would be apparent to one of ordinary skill in the
art, are intended to fall within the scope of this disclosure.
[0084] In general, the systems described above facilitate
ultrasound scanning with a controlled and repeatable contact force.
The system may also provide a real time measurement of the applied
force when each ultrasound image is captured, thus permitting a
variety of quantitative analysis and processing steps that can
normalize images, estimate tissue elasticity, provide feedback to
recover a previous scan state, and so forth. Some of these
techniques are now described below in greater detail.
[0085] FIG. 6 shows a process 600 for ultrasound image
processing.
[0086] As shown in step 602, the process may begin with capturing a
plurality of ultrasound images of an object such as human tissue.
In general, each ultrasound image may contain radio frequency echo
data from the object, and may be accompanied by a contact force
measured between an ultrasound transducer used to obtain the
plurality of ultrasound images and a surface of the object. The
contact force may be obtained using, e.g., any of the hand-held,
controlled force ultrasound scanners described above or any other
device capable of capturing a contact force during an ultrasound
scan. The contact force may be manually applied, or may be
dynamically controlled to remain substantially at a predetermined
value. It will be appreciated that the radio frequency echo data
may be, for example, A-mode or B-mode ultrasound data, or any other
type of data available from an ultrasound probe and suitable for
imaging. More generally, the techniques described herein may be
combined with any force-dependent imaging technique (and/or
contact-force-dependent imaging subject) to facilitate quantitative
analysis of resulting data.
[0087] As shown in step 604, the process 600 may include estimating
a displacement of one or more features between two or more of the
ultrasound images to provide a displacement estimation. A variety
of techniques are available for estimating pixel displacements in
two-dimensional ultrasound images, such as B-mode block-matching,
phase-based estimation, RF speckle tracking,
incompressibility-based analysis, and optical flow. In one aspect,
two-dimensional displacement estimation may be based on an
iterative one-dimensional displacement estimation scheme, with
lateral displacement estimation performed at locations found in a
corresponding axial estimation. As described for example in U.S.
Provisional Application No. 61/429,308 filed on Jan. 3, 2011 and
incorporated herein by reference in its entirety, coarse-to-fine
template-matching may be performed axially, with normalized
correlation coefficients used as a similarity measure. Subsample
estimation accuracy may be achieved with curve fitting. Regardless
of how estimated, this step generally results in a two-dimensional
characterization (e.g., at a feature or pixel level) of how an
image deforms from measurement to measurement.
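A pure-Python sketch of template matching with a normalized correlation coefficient as the similarity measure follows; the integer-sample search shown here would be followed by the subsample curve fitting mentioned above:

```python
def normalized_correlation(template, signal, offset):
    """Normalized correlation coefficient between a 1-D template and a
    window of `signal` at `offset` -- the similarity measure used for
    the axial template matching described in paragraph [0087]."""
    n = len(template)
    window = signal[offset:offset + n]
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((a - mt) * (b - mw) for a, b in zip(template, window))
    den = (sum((a - mt) ** 2 for a in template)
           * sum((b - mw) ** 2 for b in window)) ** 0.5
    return num / den if den else 0.0

def best_match(template, signal):
    """Integer-sample displacement maximizing the correlation."""
    return max(range(len(signal) - len(template) + 1),
               key=lambda k: normalized_correlation(template, signal, k))
```
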
[0088] It will be understood that feature tracking for purposes of
displacement estimation may be usefully performed on a variety of
different representations of ultrasound data. Brightness mode (or
"B-mode") ultrasound images provide a useful visual representation
of a transverse plane of imaged tissue, and may be used to provide
the features for which displacement in response to a known contact
force is tracked. Similarly, elastography images (such as stiffness
or strain images) characterize such changes well, and may provide
two-dimensional images for feature tracking.
[0089] As shown in step 606, the process 600 may include estimating
an induced strain field from the displacement. In general,
hyperelastic models for mechanical behavior work well with subject
matter such as human tissue that exhibits significant nonlinear
compression. A variety of such models are known for characterizing
induced strain fields. One such model that has been usefully
employed with tissue phantoms is a second-order polynomial model
described by the strain energy function:
U = \sum_{i+j=1}^{2} C_{ij} (\bar{I}_1 - 3)^i (\bar{I}_2 - 3)^j + \sum_{i=1}^{2} \frac{1}{D_i} (J_{el} - 1)^{2i}   [Eq. 1]

where U is the strain energy per unit volume, \bar{I}_1 and \bar{I}_2
are the first and second deviatoric strain invariants, respectively,
and J_{el} is the elastic volume strain. The variables C_{ij}
are the material parameters with units of force per unit area,
and the variables D_i are compressibility coefficients that are
set to zero for incompressible materials. Other models are known in
the art, and may be usefully adapted to estimation of a strain
field for target tissue as contemplated herein.
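As a purely numerical illustration, the strain energy of Eq. 1 may be evaluated as follows. The function and its argument layout are hypothetical, and in any real use the coefficients C_{ij} and D_i would come from material characterization rather than the placeholder values shown.

```python
def strain_energy(C, D, I1bar, I2bar, Jel):
    """Evaluate the second-order polynomial strain energy of Eq. 1.
    C: dict mapping (i, j) -> C_ij for exponent pairs with i + j in {1, 2}
    D: sequence of compressibility coefficients D_i (entries of zero are
       skipped, as for incompressible materials)."""
    U = sum(c * (I1bar - 3.0) ** i * (I2bar - 3.0) ** j
            for (i, j), c in C.items())
    U += sum((1.0 / d) * (Jel - 1.0) ** (2 * (k + 1))
             for k, d in enumerate(D) if d != 0.0)
    return U
```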
[0090] As shown in step 608, the process 600 may include creating a
trajectory field that characterizes a displacement of the one or
more features according to variations in the contact force. This
may include characterizing the relationship between displacement
and contact force for the observed data using least-square curve
fitting with polynomial curves of the form:
x_{i,j}(f) = \sum_{k=0}^{N} \alpha_{i,j,k} f^k   [Eq. 2]

y_{i,j}(f) = \sum_{k=0}^{N} \beta_{i,j,k} f^k   [Eq. 3]

where x_{i,j} and y_{i,j} are the lateral and axial
coordinates, respectively, of a pixel located at position (i,j)
of a reference image, and \alpha and \beta are the parameter sets
determined in a curve fitting procedure. The contact force is f,
and N denotes the order of the polynomial curves. Other
error-minimization techniques and the like are known for
characterizing such relationships, many of which may be suitably
adapted to the creation of a trajectory field as contemplated
herein.
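The least-squares fitting of Eqs. 2 and 3 for a single tracked feature may be sketched with standard polynomial fitting. This sketch assumes NumPy's least-squares `polyfit` in place of whatever solver an implementation would use; the function names are illustrative.

```python
import numpy as np

def fit_trajectory(forces, xs, ys, order=2):
    """Least-squares fit of the polynomial curves of Eqs. 2 and 3 for one
    tracked feature: returns coefficient sets (alpha, beta), highest
    order first, relating contact force to lateral/axial position."""
    return np.polyfit(forces, xs, order), np.polyfit(forces, ys, order)

def trajectory_at(alpha, beta, f):
    """Predicted (lateral, axial) position of the feature at force f."""
    return float(np.polyval(alpha, f)), float(np.polyval(beta, f))
```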
[0091] With a trajectory field established for a subject, a variety
of useful real-time or post-processing steps may be performed,
including without limitation image correction or normalization,
analysis of tissue changes over time, registration to data from
other imaging modalities, feedback and guidance to an
operator/technician (e.g., to help obtain a standard image), and
three-dimensional image reconstruction. Without limiting the range
of post-processing techniques that might be usefully employed,
several examples are now discussed in greater detail.
[0092] As shown in step 610, post-processing may include
extrapolating the trajectory field to estimate a location of the
one or more features at a predetermined contact force, such as to
obtain a corrected image. The predetermined contact force may, for
example, be an absence of applied force (i.e., zero Newtons), or
some standardized force selected for normalization of multiple
images (e.g., one Newton), or any other contact force for which a
corrected image is desired, either for comparison to other images
or examination of deformation behavior. With the relationship
between contact force and displacement provided from step 608,
location-by-location (e.g., feature-by-feature or pixel-by-pixel)
displacement may be determined for an arbitrary contact force using
Eqs. 2 and 3 above, although it will be appreciated that the useful
range for accurate predictions may be affected by the range of
contact forces under which actual observations were made.
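The extrapolation of step 610 then amounts to evaluating the fitted polynomials of Eqs. 2 and 3 at the target force for every location. A minimal sketch follows; the (H, W, order + 1) coefficient layout is an assumption for illustration only.

```python
import numpy as np

def correct_to_force(alpha_field, beta_field, f_target=0.0):
    """Extrapolate a per-pixel trajectory field to a predetermined
    contact force (e.g. zero Newtons), returning the corrected lateral
    and axial coordinate of every pixel of the reference image.
    alpha_field, beta_field: (H, W, order + 1) coefficient arrays from
    the fits of Eqs. 2 and 3, highest order first."""
    H, W, _ = alpha_field.shape
    x = np.empty((H, W))
    y = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            x[i, j] = np.polyval(alpha_field[i, j], f_target)
            y[i, j] = np.polyval(beta_field[i, j], f_target)
    return x, y
```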
[0093] As shown in step 612, post-processing may include
registering an undistorted image to an image of an object obtained
using a different imaging modality. Thus ultrasound results may be
registered to images from, e.g., x-ray imaging, x-ray computed
tomography, magnetic resonance imaging ("MRI"), optical coherence
tomography, positron emission tomography, and so forth. In this
manner, elastography data that characterizes compressibility of
tissue may be registered to other medical information such as
images of bone and other tissue structures.
[0094] As shown in step 614, post-processing may include comparing
an undistorted image to a previous undistorted image of an object.
This may be useful, for example, to identify changes in tissue
shape, size, elasticity, and composition over a period of time
between image captures. By normalizing a contact force or otherwise
generating corrected or undistorted images, a direct comparison can
be made from one undistorted image to another undistorted image
captured weeks, months, or years later.
[0095] As shown in step 616, post-processing may also or instead
include capturing multiple undistorted images of a number of
transverse planes of an object such as human tissue. Where these
images are normalized to a common contact force, they may be
registered or otherwise combined with one another to obtain a
three-dimensional image of the object. The resulting
three-dimensional image(s) may be further processed, either
manually or automatically (or some combination of these), for
spatial analysis such as measuring a volume of a specific tissue
within the object, or measuring a shape of the tissue.
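As one illustration of such spatial analysis, a tissue volume may be estimated from a reconstructed volume by voxel counting. This is a simplistic thresholding sketch under assumed voxel dimensions; real segmentation of a specific tissue would be considerably more involved.

```python
import numpy as np

def tissue_volume_mm3(volume, threshold, voxel_size_mm):
    """Estimate the physical volume of tissue of interest in a
    reconstructed 3D volume by counting voxels whose intensity exceeds
    a segmentation threshold and scaling by the voxel dimensions."""
    dx, dy, dz = voxel_size_mm
    return float((volume > threshold).sum()) * dx * dy * dz
```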
[0096] Still more generally, any post-processing for improved
imaging, diagnosis, or other analysis may be usefully performed
based on the quantitative characterizations of elastography
described above. For example, an ultrasound image of an artery may
be obtained, and by measuring an amount of compression in the
artery in response to varying contact forces, blood pressure may be
estimated. Similarly, by permitting reliable comparisons of
time-spaced data, better diagnosis/detection of cancerous tissue
can be achieved. Any such ultrasound imaging applications that can
be improved with normalized data can benefit from the inventive
concepts disclosed herein.
[0097] FIG. 7 is a schematic view of an ultrasound scanning system
700. The system 700 may be used to capture an acquisition state for
a handheld ultrasound probe 702, such as any of the devices
described above, including quantitative data about the conditions
under which the scan was obtained. The conditions may include,
e.g., a contact force of the handheld ultrasound probe 702, a pose
of the ultrasound probe 702 in relation to a target 704, and/or any
other data that might be useful in interpreting or further
processing image data. In one aspect, data on the contact force
and/or pose may be used to obtain a three dimensional reconstructed
volume of a target.
[0098] The system 700 generally includes the handheld ultrasound
probe 702 to capture one or more ultrasound images of a target 704
through a skin surface in a scan, a fiducial marker 708 applied to
the skin surface of the target 704, and a camera 710. While a
single fiducial marker 708 is depicted, it will be understood that
any number of fiducial markers, which may have identical or
different features, may be used.
[0099] The ultrasound probe 702 may include an ultrasound imaging
system 712 that includes the at least one transducer 706 and a
memory 714. The memory 714 may be provided to store ultrasound data
from the ultrasound transducer 706 and/or sensor data acquired from
any of the sensors during an ultrasound scan. The ultrasound probe
702 may also include a force control system 716.
[0100] The force control system 716 may include a force sensor 718, a
linear drive system 720, an input 722, and a controller 724, as
described above. The force sensor 718 may include a pressure
transducer or the like configured to obtain a pressure applied by
the handheld ultrasound probe 702 to the skin surface. The linear
drive system 720 may be mechanically coupled to the handheld
ultrasound transducer 706. The linear drive system 720 may include
a control input 722 or be electronically coupled to the control
input 722. The linear drive system 720 may be responsive to a
control signal received at the control input 722 to translate the
ultrasound transducer 706 along an actuation axis 114 as shown in
FIG. 1. The controller 724 may be electronically coupled to the
force sensor 718 and a control input of the linear drive system
720. The controller 724 may include processing circuitry configured
to generate the control signal to the control input 722 in a manner
that maintains a substantially constant predetermined contact force
between the ultrasound transducer 706 and the target 704, or a
contact force that varies in a predetermined manner, all as
discussed above.
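The controller's behavior of maintaining a substantially constant contact force can be illustrated with a single proportional-control update of the linear drive. This is a sketch only: the disclosed controller may use any control law, and the gain and travel limits shown here are arbitrary assumptions.

```python
def force_control_step(f_measured, f_target, position_mm, gain=0.5,
                       travel=(0.0, 20.0)):
    """One proportional-control update for the linear drive: advance the
    transducer when the measured contact force is below target, retract
    when above, clamped to the actuator's travel limits."""
    position_mm += gain * (f_target - f_measured)  # mm per Newton of error
    lo, hi = travel
    return min(max(position_mm, lo), hi)
```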
[0101] The ultrasound probe 702 may also include a sensor system
726 configured to obtain a pose of the ultrasound probe 702. The
sensor system 726 may include the camera 710 and a lighting source
728, e.g., to capture images of the fiducial marker 708 and/or other
visible features to obtain camera motion data. In another aspect,
the sensor system 726 may include other sensors 730 such as one or
more inertial sensors, range finding sensors (such as sonic,
ultrasonic, or infrared range finding subsystems), or any other
circuitry or combination of circuitry suitable for tracking
relative positions of the ultrasound probe 702 and/or the target
704.
[0102] The system 700 may also include a processor 732 in
communication with the handheld ultrasound probe 702 and/or
sub-systems thereof, either individually or through a common
interface 736 such as a wired or wireless interface for the
ultrasound probe 702. It will be appreciated that a wide range of
architectures are possible for control and data acquisition for the
system 700, including for example, a processor on the ultrasound
probe, a processor remote from the ultrasound probe coupled
directly to one or more subsystems of the ultrasound probe, and
various combinations of these. As such, the logical depiction of
systems in FIG. 7 should be understood as illustrative only, and
any arrangement of components and/or allocation of processing
suitable for an ultrasound imaging system as contemplated herein
may be used without departing from the scope of this disclosure. By
way of non-limiting example, the processor 732 on the ultrasound
probe 702 may be a controller or the like that provides a
programming interface for the force control system 716, ultrasound
imaging system 712, and/or sensor system 726, with system control
provided by a remote processor (such as the processor 742) through
the interface 736. In another aspect, the processor 732 on the
ultrasound probe 702 may be a microprocessor programmed to control
all aspects of the ultrasound probe 702 directly, with the remote
processor 742 providing only supervisory control such as initiating
a scan or managing/displaying received scan data.
[0103] The processor 732 may be programmed to identify one or more
predetermined features (such as the fiducial marker 708 and/or
other features on the skin surface of the target 704) and calculate
a pose of the handheld ultrasound probe 702 using the one or more
predetermined features in an image from the camera 710. In this
manner, a number of camera images, each associated with one of a
number of ultrasound images, may be used to align the number of
ultrasound images in a world coordinate system. The ultrasound
images, when so aligned, may be combined to obtain a reconstructed
volume of the target 704.
[0104] The system 700 may also include one or more external sensor
systems 734. The external sensor systems 734 may be integral with
or separate from the sensor system 726. The external sensor system
734 may include an external electromechanical system coupled to the
handheld ultrasonic probe for tracking a pose of the handheld
ultrasonic probe 702 through direct measurements, as an alternative
to or in addition to image based camera motion data. The external
sensor system 734 may also or instead include an external optical
system, or any other sensors used for tracking a pose of the
ultrasound probe 702.
[0105] The system 700 also may include an interface 736, such as a
wireless interface, for coupling the handheld ultrasound probe 702
in a communicating relationship with a memory 738, a display 740,
and a processor 742. The memory 738, the display 740, and the
processor 742 may be separate components or they may be integrated
into a single device such as a computer. Where a wireless interface
is used, various techniques may be employed to provide a
data/control channel consistent with medical security/privacy
constraints, and/or to reduce or eliminate interference with and/or
from other wireless devices.
[0106] The display 740 may include one or more displays or monitors
for displaying one or more ultrasound images obtained from the
ultrasound probe 702 and/or one or more digital images recorded by
the camera 710, along with any other data related to a current or
previous scan.
[0107] The memory 738 may be used to store, e.g., data from the
handheld ultrasound probe 702 and the camera 710. The memory 738
may also store data from other devices of the system 700 such as
the sensors. The memory 738 may store an acquisition state for an
ultrasound image. The acquisition state may for example include the
pose and the pressure of the ultrasound transducer 706 during
scanning, or data for recovering any of the foregoing. The memory
738 may also or instead include a fixed or removable mass storage
device for archiving scans and accompanying data. The memory 738
may be a remote memory storage device or the memory may be
associated with a computer containing the processor 742. The
interface 736 may include a wireless interface coupling the
handheld ultrasound probe 702 in a communicating relationship with
the remote storage device, processor, and/or display.
[0108] The system 700 may include any other hardware or software
useful for the various functions described above. Also, the system
700 may be used for other applications including, for example,
pathology tracking, elastography, data archiving or retrieval,
imaging instructions, user guides, and so forth.
[0109] FIG. 8 is a flowchart for a process 800 for obtaining a
reconstructed volume of a target using a handheld ultrasound
probe.
[0110] As shown in step 802, a camera and ultrasound probe may be
provided. The camera, which may be any of the cameras described
above, may be mechanically coupled in a fixed relationship to the
handheld ultrasound probe in an orientation such that the camera is
positioned to capture a digital image of a skin surface of a target
when the handheld ultrasound probe is placed for use against the
skin surface.
[0111] A lighting source may also be mechanically coupled to the
ultrasound probe and positioned to illuminate the skin surface, and
more particularly an area of the skin surface where the digital
image is captured, when the handheld ultrasound probe is placed for
use against the skin surface.
[0112] As shown in step 804, a fiducial marker with predetermined
features such as predetermined dimensions may be applied to the
skin surface of a target. The fiducial marker may be placed in any
suitable location. In one embodiment, the fiducial marker is
preferably positioned at or near the location where the ultrasound
probe will contact the surface so that the camera has a clear view
of the fiducial marker.
[0113] As shown in step 806, an ultrasound image of the skin
surface may be obtained from the handheld ultrasound probe. This
may include data on contact force, or any other data available from
the ultrasound system and useful in subsequent processing.
[0114] As shown in step 808, the camera may capture a digital image
of the target, or the skin surface of the target. The digital image
may include features of the skin surface and/or the fiducial marker
where the fiducial marker is within a field of view of the camera.
The steps of obtaining an ultrasound image from the handheld
ultrasound probe and obtaining a digital image from the camera may
be usefully performed substantially simultaneously so that the
digital image is temporally correlated to the ultrasound image. In
general, the ultrasound probe may capture one or more ultrasound
images and the camera may capture one or more digital images in
order to provide a sequence of images forming a scan. At least one
of the digital images may include the fiducial marker.
[0115] Although not depicted, a variety of supporting steps may
also or instead be performed as generally described above. The
method may include wirelessly transmitting ultrasound data from the
handheld ultrasound probe to a remote storage facility. The method
may include displaying at least one ultrasound image obtained from
the handheld ultrasound probe and/or displaying at least one
digital image recorded by the camera. The images may be displayed
on one or more monitors, and may be displayed during a scan and/or
after a scan. In one aspect, where a sequence of images is
obtained, a time stamp or other sequential and/or chronological
indicator may be associated with each image (or image pair,
including the digital image from the camera and the ultrasound
image from the probe). In this manner, a sequence of images may be
replayed or otherwise processed in a manner dependent on
sequencing/timing of individual images.
[0116] As shown in step 809, a camera pose may be determined. By
way of example and not limitation, this may be accomplished using
motion estimation, which may be further based on a fiducial marker
placed upon the skin surface of a target. While the emphasis in the
following description is on motion estimation using a fiducial, it
will be understood that numerous techniques may be employed to
estimate or measure a camera pose, and any such techniques may be
adapted to use with the systems and methods contemplated herein
provided they can recover motion with suitable speed and accuracy
for the further processing described. Several examples are noted
above, and suitable techniques may include, e.g., mechanical
instrumentation of the ultrasound probe, or image-based or other
external tracking of the probe.
[0117] As shown in step 810, a digital image may be analyzed to
detect a presence of a fiducial marker. Where the fiducial marker
is detected, the process 800 may proceed to step 814 where motion
estimation is performed and the camera pose recovered using the
fiducial marker. As shown in step 812, where no fiducial marker is
detected, motion estimation may be performed using any other
visible features of the skin surface captured by the camera.
However determined, the camera pose may be stored
along with other data relating to a scan. It will be understood
that the "camera pose" referred to herein may be a position and
orientation of the actual digital camera, or any other pose related
thereto, such as any point within, on the exterior of, or external
to the ultrasound probe, provided the point can be consistently
related (e.g., by translation and/or rotation) to the visual images
captured by the digital camera.
[0118] Once a world coordinate system is established (which may be
arbitrarily selected or related to specific elements of the
ultrasound system and/or the target), a three-dimensional motion of
the handheld ultrasound probe with respect to the skin surface for
the digital image may be estimated and the pose may be expressed
within the world coordinates. World coordinates of a point,
X = [X Y Z 1]^T, and the corresponding image coordinates,
x = [x y 1]^T, may be related by the camera projection. This
relationship may be expressed as:

[x y 1]^T = K [r_1 r_2 r_3 t] [X Y Z 1]^T   [Eq. 4]

where K is the 3×3 projection matrix of the camera that
incorporates the intrinsic parameters of the camera. The projection
matrix may be obtained through intrinsic calibration of the camera.
The rotation matrix R and the translation vector t describe the
geometric relationship between the world coordinate system and the
image coordinate system, and r_1, r_2, and r_3 are the
column vectors of R.
[0119] World coordinates of points on a planar structure may be
related to image coordinates by a 3×3 planar homography matrix. For
points on a planar structure, Z = 0 and the relationship may be
expressed as:

[x y 1]^T = K [r_1 r_2 t] [X Y 1]^T   [Eq. 5]
[0120] The image coordinates in different perspectives of a planar
structure may also be related by a 3×3 planar homography
matrix. An image-to-world homography matrix may then be used to
show the relationship between the camera images (the image
coordinate system) and the ultrasound images (a world coordinate
system). The homography matrix may be expressed by the formula
x' = Hx, where x' = [x' y' 1]^T and x = [x y 1]^T are
corresponding points in the two coordinate systems, and H is the
homography matrix that maps point x to x'. H has eight degrees of
freedom in this homogeneous representation, and H can be determined
from at least four corresponding points. H may be written as a
9-vector h = [h_11, h_12, h_13, h_21, h_22, h_23,
h_31, h_32, h_33]^T. With n corresponding points,
x_i' and x_i for i = 1, 2, 3, . . . , n, this vector may be
found from the formula Ah = 0, where A is a 2n×9
matrix:
A = \begin{bmatrix}
x_1 & y_1 & 1 & 0 & 0 & 0 & -x_1 x_1' & -y_1 x_1' & -x_1' \\
0 & 0 & 0 & x_1 & y_1 & 1 & -x_1 y_1' & -y_1 y_1' & -y_1' \\
\vdots & & & & & & & & \vdots \\
x_n & y_n & 1 & 0 & 0 & 0 & -x_n x_n' & -y_n x_n' & -x_n' \\
0 & 0 & 0 & x_n & y_n & 1 & -x_n y_n' & -y_n y_n' & -y_n'
\end{bmatrix}   [Eq. 6]
[0121] The solution h is the unit eigenvector of the matrix A^T A
with the minimum eigenvalue. The homography matrix may be estimated
by any suitable method including, for example, using a RANSAC
(Random Sample Consensus) algorithm.
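The direct linear transform solution of Ah = 0 may be sketched as follows, taking the right singular vector of A for the smallest singular value, which is equivalent to the minimum-eigenvalue eigenvector of A^T A. This is a generic DLT sketch, not the patented code, and omits the point normalization a robust implementation would add.

```python
import numpy as np

def dlt_homography(src, dst):
    """Estimate H with dst ~ H src from n >= 4 correspondences by
    stacking the 2n x 9 matrix A of Eq. 6 and solving Ah = 0: h is the
    right singular vector of A for the smallest singular value (the
    minimum-eigenvalue eigenvector of A^T A)."""
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp, -xp])
        rows.append([0, 0, 0, x, y, 1, -x * yp, -y * yp, -yp])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # remove the free homogeneous scale
```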
[0122] If the image to world homography matrix and the projection
matrix of the camera are known, then the camera pose in the world
coordinate system may be calculated for each image i, where i=1, 2,
3, . . . n. Thus, column vectors r.sub.1, r.sub.2, and t can be
calculated. Vector r.sub.3 may be expressed as the formula
r.sub.3=r.sub.1.times.r.sub.2 since R is a rotation matrix.
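The recovery of r_1, r_2, and t, completed by r_3 = r_1 × r_2, may be illustrated as follows. This sketch assumes H = K [r_1 r_2 t] up to scale, per Eq. 5, and a noise-free homography; with noisy data a further orthonormalization of R would typically be applied.

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover the camera pose from a planar homography, assuming
    H ~ K [r1 r2 t] (Eq. 5): scale K^-1 H so its first column is a unit
    vector, read off r1, r2, t, and complete the rotation with
    r3 = r1 x r2."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if lam * M[2, 2] < 0:  # choose the scale sign giving positive depth
        lam = -lam
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t
```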
[0123] After the points or other features on the fiducial marker
are selected, the corresponding features may be identified for each
digital image where the fiducial marker is present. In general, the
fiducial marker may be visible in at least one of the plurality of
digital images (and not necessarily the first image), although it
is also possible to recover camera pose entirely based on features
other than the fiducial marker. The planar homography matrix from
image i to the previous image may be calculated. The
correspondences between consecutive images may be extracted using
any suitable technique such as scale-invariant feature transform
(SIFT).
[0124] As shown in step 816, the ultrasound image(s) may be aligned
in the world coordinate system. The process 800 may also include a
step of calibrating the ultrasound probe so that ultrasound images
can be converted to camera coordinates and/or the world coordinate
system, thus permitting the ultrasound image(s) to be registered in
a common coordinate system. For each ultrasound image, image
coordinates may be converted to world coordinates using the
corresponding estimates of camera pose. Transformations derived
from ultrasound calibration may also be used to convert the image
coordinates to the world coordinates. A plurality of ultrasound
images may be aligned to the world coordinate system based on the
corresponding camera poses. The resulting ultrasound images may be
further processed in three dimensions, for example to obtain a
shape and/or volume of the target, or features and/or objects
within the target from the plurality of ultrasound images that have
been aligned in the world coordinate system. Thus for example, a
shape or volume of tissue, a tumor, or any other object within the
target may be calculated in three-dimensions based on the
registered ultrasound images, particularly where each image is
normalized as described above to a single contact force.
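The conversion of ultrasound image coordinates to world coordinates using the calibration transform and an estimated camera pose may be sketched as below. The 4×4 calibration-matrix convention and the camera-to-world sense of R and t are assumptions for illustration; an implementation would use whatever conventions its calibration procedure produces.

```python
import numpy as np

def ultrasound_to_world(pixels, T_calib, R, t):
    """Map ultrasound pixel coordinates into the world frame: a 4x4
    calibration transform T_calib (from ultrasound calibration) takes
    homogeneous pixel coordinates [u, v, 0, 1] into the camera frame,
    and the camera-to-world pose (R, t) places them in world
    coordinates."""
    pts = []
    for (u, v) in pixels:
        p_cam = T_calib @ np.array([u, v, 0.0, 1.0])
        pts.append(R @ p_cam[:3] + t)
    return np.asarray(pts)
```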
[0125] A processor in communication with the ultrasound probe and
the camera may be configured to perform the above-described steps
including: identifying the fiducial marker in two or more digital
images from the camera; establishing a world coordinate system
using the predetermined dimensions of the fiducial marker and the
two or more digital images; estimating a three dimensional pose of
the handheld ultrasound probe with respect to the two or more
images; and aligning the plurality of ultrasound images in the
world coordinate system to obtain a reconstructed volume of the
target. It will be readily appreciated that the steps may be
performed any number of times for any number of camera images
and/or ultrasound images, and processing may be performed as new
images are acquired, or in a post processing step according to the
capabilities of the system or the preferences of a user.
[0126] FIG. 9 is a flowchart of a process for capturing an
acquisition state for an ultrasound scan.
[0127] As shown in step 902, the process 900 may begin with
capturing an ultrasound image with a handheld ultrasound probe such
as any of the probes described above. The steps of the process 900
may in general be performed a single time, or may be repeated any
number of times according to the intended use(s) of the acquisition
state data.
[0128] As depicted generally in step 903, the process 900 may
include capturing acquisition state data. While various specific
types of acquisition state data are described below, it will be
appreciated that more generally any data related to the state of an
ultrasound probe, the target, the surrounding environment, and so
forth may be usefully acquired as acquisition state data and
correlated to one or more ultrasound images. Thus for example, the
acquisition state may include location, orientation,
velocity (in any of the foregoing), acceleration, or any other
intrinsic or extrinsic characteristics of the camera, the
ultrasound probe, the target, or combinations of the foregoing.
Similarly, an acquisition state may include environmental factors
such as temperature, humidity, air pressure, or the like, as well
as operating states of hardware, optics, and so forth. Still more
generally, anything that can be monitored or sensed and usefully
employed in an ultrasound imaging system as contemplated herein may
be used as an acquisition state.
[0129] As shown in step 904, capturing acquisition state data may
include capturing a pose of the handheld ultrasound probe, which
may be performed substantially concurrently with the step of
obtaining the ultrasound image, and may include any of the
techniques described above. This step may also or instead employ
other techniques for measuring position and orientation of an
ultrasound probe with respect to a skin surface. For example, the
present process may be combined with techniques involving the use
of accelerometers or gyroscopes, or techniques involving the level
of speckle dissimilarity, decorrelation or other disparities
between consecutive scans.
[0130] As shown in step 906, capturing acquisition state data may
also or instead include capturing a contact force or pressure of
the handheld ultrasonic probe, which may be performed substantially
concurrently with the step of obtaining the ultrasound image. This
may for example be an instantaneous contact force measured between
the probe and a surface being imaged, or an average force over a
predetermined interval, or some other measure of pressure or the
like between the surface and the probe.
[0131] As shown in step 908, the process 900 may include
normalizing the ultrasound images to a common contact force. As
shown in step 910, the process 900 may include storing ultrasound
image data and any acquisition state data (e.g., pose and contact
force) temporally or otherwise associated therewith. As shown in
step 912, the process 900 may include displaying one or more
ultrasound images and/or acquisition state data or the like. As
shown in step 914, the process 900 may include aligning the
ultrasound image(s) in a world coordinate system, such as by using
the acquisition state data. As shown in step 916, the process 900
may include reconstructing a volume of the target, or an object or
feature within the target, using a plurality of ultrasound images.
This may be accompanied by a display of the reconstructed volume.
More generally, any use of the various types of acquisition state
data described above for control, image enhancements, or other
visualizations and/or data processing may be incorporated into the
process 900.
[0132] FIG. 10 shows a fiducial marker that may be used with the
techniques described above. The fiducial marker 1000 may be applied
to a skin surface of a target as a sticker, or using ink or any
other suitable marking technique. In one aspect, the fiducial
marker 1000 may be black, or any other color or combination of
colors easily detectable using digital imaging techniques. In
another aspect, the fiducial marker 1000 may be formed of a
material or combination of materials transparent to ultrasound so
as not to interfere with target imaging. The fiducial marker 1000
may also have a variety of shapes and/or sizes. In one embodiment,
the fiducial marker 1000 may include an inner square 1002 of about
three millimeters in height, and one or more outer squares 1004 of
about one millimeter in height on the corners of the inner square
1002. In this manner, several corner features 1006 are created for
detection using an imaging device.
[0133] While the systems and methods above describe specific
embodiments of acquisition states and uses of same, it will be
understood that acquisition state data may more generally be
incorporated into an ultrasound imaging workflow, such as to
provide operator feedback or enhance images. Thus in one aspect
there is disclosed herein techniques for capturing an acquisition
state for an ultrasound scan and using the acquisition state to
either control an ultrasound probe (such as with force feedback as
described above) or to provide user guidance (such as to direct a
user through displayed instructions or tactile feedback to a
previous acquisition state for a patient or other target). The
improved workflows possible with an ultrasound probe that captures
acquisition state are generally illustrated in the following
figure.
[0134] FIG. 11 shows a system with a workflow for using acquisition
states. In general, an ultrasound probe 1102 such as any of the
devices described above may capture image data and acquisition
state data during an ultrasound scan. In one aspect, this data may
be fed directly to a user such as an ultrasound technician during a
scan through a user interface 1104. For example, ultrasound images
and/or acquisition state data may be displayed on a display of a
computer or the like while a scan is being performed. The data such
as image data 1106 and acquisition state data 1108 may also or
instead be stored in a memory for archival purposes or further
processing.
[0135] In another aspect, machine intelligence may be applied in a
variety of manners to augment a scanning process, as depicted
generally as a machine intelligence system 1110 which may include a
processor, memory, software, and so forth. For example, acquisition
state data concerning, e.g., a pose of the ultrasound probe may be
used to create a graphical representation of a scanner relative to
a target, and the graphical representation may be depicted in the
user interface 1104 showing a relative position of the ultrasound
probe to the target in order to provide visual feedback to the user
concerning orientation. As another example, contact force may be
displayed as a numerical value, and/or an ultrasound image with a
normalized contact force may be rendered for viewing by the user
during the scan. In another aspect, a desired acquisition state may
be determined (e.g., provided by a user to the computer), and the
machine intelligence may create instructions for the user that can
be displayed during the scan to steer the user toward the desired
acquisition state. This may be a state of diagnostic significance,
and/or a previous acquisition state from a current or historical
scan. In another aspect, the desired acquisition state 1112, or
portions thereof, may be transmitted as control signals to the
ultrasound probe. For example, control signals for an instantaneous
contact force may be communicated to a force-controlled ultrasound
device such as the device described above. This may also or instead
include scan parameters such as frequency, array beamforming,
steering, and focusing.
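One simple way the machine intelligence might derive a user instruction from the gap between the current and desired acquisition states is sketched below; the thresholds, dictionary keys, and instruction wording are assumptions for illustration, not the disclosed implementation:

```python
import math


def guidance(current, desired, force_tol=0.5, pos_tol=2.0):
    """Return a textual instruction steering the user from the current
    acquisition state toward a desired one (illustrative sketch only).

    `current` and `desired` are dicts with a "force" value in newtons
    and a "position" triple in mm; tolerances are arbitrary examples.
    """
    # Correct contact force first, since it affects image appearance.
    df = desired["force"] - current["force"]
    if abs(df) > force_tol:
        return "press harder" if df > 0 else "ease off pressure"
    # Then correct probe position toward the desired pose.
    dx = [d - c for d, c in zip(desired["position"], current["position"])]
    dist = math.sqrt(sum(v * v for v in dx))
    if dist > pos_tol:
        return f"move probe {dist:.1f} mm toward the target pose"
    return "hold position"
```

In a force-controlled probe such as the device described above, the force branch could instead be emitted as a control signal setpoint rather than a displayed instruction.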
[0136] In addition, the capability of capturing a multi-factor
acquisition state including, e.g., contact force and position
permits enhancements to analysis and diagnostic use of an
ultrasound system. For example, analysis may include elastography,
image normalization, three-dimensional reconstruction (e.g., using
normalized images), volume and/or shape analysis, and so forth.
Similarly, diagnostics may be improved, or new diagnostics created,
based upon the resulting improved ultrasound images as well as
normalization of images and accurate assessment of an acquisition
state. All such uses of an ultrasound system having acquisition
state capabilities, feedback control capabilities, and machine
intelligence as contemplated herein are intended to fall within the
scope of this disclosure.
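One simple reading of building a reconstruction from normalized images, as mentioned above, is to retain only frames captured near a reference contact force, so that all frames reflect comparably deformed tissue. The sketch below assumes frames carry a recorded "force" value; the tolerance and data layout are illustrative:

```python
def select_normalized(frames, target_force, tol=0.25):
    """Keep only frames acquired within `tol` newtons of a reference
    contact force, so downstream three-dimensional reconstruction
    uses comparably deformed tissue (one simple interpretation of
    force normalization; thresholds are arbitrary examples)."""
    return [f for f in frames if abs(f["force"] - target_force) <= tol]
```

Frames rejected by such a filter need not be discarded; with their acquisition states archived, they remain available for elastography or other force-dependent analyses.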
[0137] It will be appreciated that many of the above systems,
devices, methods, processes, and the like may be realized in
hardware, software, or any combination of these suitable for the
data processing, data communications, and other functions described
herein. This includes realization in one or more microprocessors,
microcontrollers, embedded microcontrollers, programmable digital
signal processors or other programmable devices or processing
circuitry, along with internal and/or external memory. This may
also, or instead, include one or more application specific
integrated circuits, programmable gate arrays, programmable array
logic components, or any other device or devices that may be
configured to process electronic signals. It will further be
appreciated that a realization of the processes or devices
described above may include computer-executable code created using
a structured programming language such as C, an object oriented
programming language such as C++, or any other high-level or
low-level programming language (including assembly languages,
hardware description languages, and database programming languages
and technologies) that may be stored, compiled or interpreted to
run on one of the above devices, as well as heterogeneous
combinations of processors, processor architectures, or
combinations of different hardware and software. At the same time,
processing may be distributed across devices such as the handheld
probe and a remote desktop computer or storage device, or all of
the functionality may be integrated into a dedicated, standalone
device including without limitation a wireless, handheld ultrasound
probe. All such permutations and combinations are intended to fall
within the scope of the present disclosure.
[0138] In other embodiments, disclosed herein are computer program
products comprising computer-executable code or computer-usable
code that, when executing on one or more computing devices (such as
the controller described above), performs any and/or all of the
steps described above. The code may be stored in a computer memory
or other non-transitory computer readable medium, which may be a
memory from which the program executes (such as internal or
external random access memory associated with a processor), a
storage device such as a disk drive, flash memory or any other
optical, electromagnetic, magnetic, infrared or other device or
combination of devices. In another aspect, any of the processes
described above may be embodied in any suitable transmission or
propagation medium carrying the computer-executable code described
above and/or any inputs or outputs from same.
[0139] It should further be appreciated that the methods above are
provided by way of example. Absent an explicit indication to the
contrary, the disclosed steps may be modified, supplemented,
omitted, and/or re-ordered without departing from the scope of this
disclosure.
[0140] The method steps of the invention(s) described herein are
intended to include any suitable method of causing such method
steps to be performed, consistent with the patentability of the
following claims, unless a different meaning is expressly provided
or otherwise clear from the context. So for example performing the
step of X includes any suitable method for causing another party
such as a remote user or a remote processing resource (e.g., a
server or cloud computer) to perform the step of X. Similarly,
performing steps X, Y and Z may include any method of directing or
controlling any combination of such other individuals or resources
to perform steps X, Y and Z to obtain the benefit of such
steps.
[0141] While particular embodiments of the present invention have
been shown and described, it will be apparent to those skilled in
the art that various changes and modifications in form and details
may be made therein without departing from the spirit and scope of
this disclosure and are intended to form a part of the invention as
defined by the following claims, which are to be interpreted in the
broadest sense allowable by law.
* * * * *