U.S. patent application number 17/031375, for fluorescence based flow imaging and measurements, was published by the patent office on 2021-01-07.
The applicant listed for this patent is SCINOVIA, CORP. Invention is credited to David S. Cohen and James Bradley Sund, Sr.
Publication Number | 20210000352 |
Application Number | 17/031375 |
Filed Date | 2020-09-24 |
Publication Date | 2021-01-07 |
![](/patent/app/20210000352/US20210000352A1-20210107-D00000.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00001.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00002.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00003.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00004.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00005.png)
![](/patent/app/20210000352/US20210000352A1-20210107-D00006.png)
United States Patent Application
Application Number | 20210000352 |
Kind Code | A1 |
Inventors | Sund, Sr., James Bradley; et al. |
Publication Date | January 7, 2021 |
FLUORESCENCE BASED FLOW IMAGING AND MEASUREMENTS
Abstract
Fluorescence based tracking of a light-emitting marker in a
bodily fluid stream is conducted by: providing a light-emitting
marker into a fluid stream; establishing field of view monitoring
by placing a sensor, such as a high-speed camera, at a region of
interest; recording image data of light emitted by the marker at
the region of interest; determining time characteristics of the
light output of the marker traversing the field of view; and
calculating flow characteristics based on the time characteristics.
Further, a velocity vector map may be generated using a
cross-correlation technique, leading and falling edge
considerations, subtraction, and/or thresholding.
Inventors: | Sund, Sr., James Bradley (Raleigh, NC); Cohen, David S. (Chapel Hill, NC) |

Applicant:
| Name | City | State | Country | Type |
| SCINOVIA, CORP. | Sheridan | WY | US | |

Appl. No.: | 17/031375 |
Filed: | September 24, 2020 |
Related U.S. Patent Documents

| Application Number | Filing Date | Patent Number |
| 15863338 | Jan 5, 2018 | 10813563 |
| PCT/US2016/041045 | Jul 6, 2016 | |
| 62189126 | Jul 6, 2015 | |
Current U.S. Class: | 1/1 |
International Class: | A61B 5/0275 20060101 A61B005/0275; A61B 5/00 20060101 A61B005/00; G01F 1/00 20060101 G01F001/00; G06T 7/246 20060101 G06T007/246; G06T 7/254 20060101 G06T007/254; A61B 5/026 20060101 A61B005/026; A61M 5/00 20060101 A61M005/00; G06T 5/00 20060101 G06T005/00; G06T 5/20 20060101 G06T005/20; G06T 7/00 20060101 G06T007/00 |
Claims
1. A system for fluorescence based tracking of a light-emitting
marker in a bodily fluid stream, the system comprising: a delivery
apparatus configured to provide a light-emitting marker into the
bodily fluid stream; a camera configured to monitor a region of
interest traversed by the bodily fluid stream; and a computing
device configured to: record motion video data generated by the
camera; determine time characteristics of the recorded data; and
calculate flow characteristics based on the time
characteristics.
2. The system according to claim 1, wherein the computing device is
further configured to: divide the motion video data into kernels;
identify which of the kernels receive some portion of the
light-emitting marker using an intensity threshold; compute, for
each identified kernel, an intensity signal data set comprising
information of mean light intensity versus time; perform smoothing
on each intensity signal data set; and calculate a lag time between
the intensity signal data sets of neighboring identified kernels
using cross-correlation.
3. The system according to claim 2, wherein the computing device is
further configured to: using a spatial resolution and the lag time,
calculate velocity vectors; sum the velocity vectors of neighboring
kernels to create a resultant velocity vector; and generate a
velocity map from the resultant velocity vectors for all
kernels.
4. The system according to claim 3, wherein the computing device
performs smoothing on each intensity signal data set by time window
averaging or by using a filter.
5. The system according to claim 3, wherein the computing device is
further configured to: for each particular identified kernel, find
segments in which a slope of the intensity signal data set rises
for a minimum consecutive number of frames or falls for a minimum
consecutive number of frames, which segments occur when a leading
edge or falling edge of a portion of the light-emitting marker
passes through the identified kernel; search the intensity signal
data sets of neighboring identified kernels for a rising or falling
segment of similar length; and calculate a lag time between
segments in the particular identified kernel and segments in the
neighboring identified kernels.
6. The system according to claim 3, wherein the computing device is
further configured to: calculate a difference frame by subtracting
a frame of the motion video data from a consecutive frame of the
motion video data; apply a threshold to the difference frame to
eliminate pixels therein below a specified intensity value;
calculate a pixel size of a remaining blob in the difference frame
in a direction of bodily fluid flow; calculate a size of the
remaining blob using the pixel size and a spatial resolution; and
calculate a velocity by using a distance traveled by the remaining
blob and a time between frames.
7. The system according to claim 3, wherein the computing device is
further configured to: create a logical frame in which a respective
indicator for each pixel can be set as true or false; set the
indicators of the identified pixels as true; set the indicators of
all other pixels as false; calculate a difference frame by
subtracting a first logical frame from a second logical frame such
that the difference frame comprises pixels that reached the
specified threshold after a time of the first logical frame; find
length in pixels of the remaining blob in the difference frame in a
direction of bodily fluid flow; convert the length in pixels of the
difference frame to physical distance using the spatial resolution;
and calculate velocity by dividing the physical distance by a time
between frames.
8. A method of fluorescence based tracking of a light-emitting
marker in a fluid stream, the method comprising: monitoring, with a
camera, a region of interest traversed by the fluid stream into which
a light-emitting marker has been introduced; recording motion video
data generated by the camera; dividing the motion video data into
frames each comprising pixels; identifying which of the pixels
receive some portion of the light-emitting marker using an
intensity threshold; calculating a difference frame by subtracting
a frame of the motion video data from a consecutive frame of the
motion video data; and applying a threshold to the difference frame
to eliminate pixels therein below a specified intensity value.
9. The method according to claim 8, further comprising: calculating
a pixel size of a remaining blob in the difference frame in a
direction of fluid flow; calculating a size of the remaining blob
using the pixel size and a spatial resolution; and calculating a
velocity by using a distance traveled by the remaining blob and a
time between frames.
10. The method according to claim 9, further comprising: creating a
logical frame in which a respective indicator for each pixel can be
set as true or false; setting the indicators of the identified
pixels as true; setting the indicators of all other pixels as
false; calculating a difference frame by subtracting a first
logical frame from a second logical frame such that the difference
frame comprises pixels that reached the specified threshold after a
time of the first logical frame; finding length in pixels of a
remaining blob in the difference frame in a direction of fluid
flow; converting the length in pixels of the difference frame to
physical distance using the spatial resolution; and calculating
velocity by dividing the physical distance by a time between
frames.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation of U.S. Non-Provisional
patent application Ser. No. 15/863,338, entitled "FLUORESCENCE
BASED FLOW IMAGING AND MEASUREMENTS," filed on Jan. 5, 2018, which
is a continuation of International Patent Application Serial No.
PCT/US2016/041045, entitled "FLUORESCENCE BASED FLOW IMAGING AND
MEASUREMENTS," filed on Jul. 6, 2016, which claims the benefit of
priority of U.S. Provisional Patent Application No. 62/189,126,
entitled "FLUORESCENCE BASED FLOW IMAGING AND MEASUREMENTS," filed
on Jul. 6, 2015. Each above-referenced application is incorporated
herein in its entirety by this reference.
TECHNICAL FIELD
[0002] The present disclosure relates to fluorescence based imaging
and measurements. More particularly, the present disclosure relates
to determining flow characteristics such as velocity in bodily
vessels such as blood vessels.
BACKGROUND
[0003] Fluorescent markers have been used for basic imaging of
bodily structures, but improvements are needed in determining flow
characteristics in such bodily fluids as blood.
SUMMARY
[0004] This summary is provided to introduce in a simplified form
concepts that are further described in the following detailed
descriptions. This summary is not intended to identify key features
or essential features of the claimed subject matter, nor is it to
be construed as limiting the scope of the claimed subject
matter.
[0005] According to at least one embodiment, a method of
fluorescence based tracking of a light-emitting marker in a bodily
fluid stream includes: providing a light-emitting marker into a
bodily fluid stream; monitoring, with a sensor, a region of
interest traversed by the bodily fluid stream; recording data
generated by the sensor; determining time characteristics of the
recorded data; and calculating flow characteristics based on the
time characteristics.
[0006] In at least one example, the sensor includes a camera, and
the recorded data comprises motion video data.
[0007] In at least one example, the method further includes:
dividing the motion video data into kernels; identifying which of
the kernels receive some portion of the light-emitting marker using
an intensity threshold; computing, for each identified kernel, an
intensity signal data set including information of mean light
intensity versus time; performing smoothing on each intensity
signal data set; calculating a lag time between the intensity
signal data sets of neighboring identified kernels using
cross-correlation; using a spatial resolution and the lag time,
calculating velocity vectors; summing the velocity vectors of
neighboring kernels to create a resultant velocity vector; and
generating a velocity map from the resultant velocity vectors for
all kernels.
[0008] In at least one example, performing smoothing on each
intensity signal data set includes time window averaging.
[0009] In at least one example, performing smoothing on each
intensity signal data set includes using a filter.
[0010] In at least one example, performing smoothing on
each intensity signal data set includes using a Gaussian
filter.
[0011] In at least one example, the method further includes:
dividing the motion video data into kernels; identifying which of
the kernels receive some portion of the light-emitting marker using
an intensity threshold; computing, for each identified kernel, an
intensity signal data set including information of mean light
intensity versus time; performing smoothing on each intensity
signal data set; for each particular identified kernel, finding
segments in which a slope of the intensity signal data set rises
for a minimum consecutive number of frames or falls for a minimum
consecutive number of frames, which segments occur when a leading
edge or falling edge of a portion of the light-emitting marker
passes through the identified kernel; searching the intensity
signal data sets of neighboring identified kernels for a rising or
falling segment of similar length; calculating a lag time between
segments in the particular identified kernel and segments in the
neighboring identified kernels; using a spatial resolution and the
lag time, calculating velocity vectors; summing the velocity
vectors of neighboring kernels to create a resultant velocity
vector; and generating a velocity map from the resultant velocity
vectors for all kernels.
[0012] In at least one example, performing smoothing on each
intensity signal data set includes time window averaging. In at
least one example, performing smoothing on each intensity signal
data set includes using a filter. In at least one example,
performing smoothing on each intensity signal data set includes
using a Gaussian filter.
[0013] In at least one example, the method further includes:
calculating a difference frame by subtracting a frame of the motion
video data from a consecutive frame of the motion video data;
applying a threshold to the difference frame to eliminate pixels
therein below a specified intensity value; calculating a pixel size
of a remaining blob in the difference frame in a direction of blood
flow; calculating a size of the remaining blob using the pixel size
and a spatial resolution; and calculating a velocity by using a
distance traveled by the remaining blob and a time between frames.
[0014] In at least one example, the method further includes:
dividing the motion video data into frames each including pixels;
identifying which of the pixels receive some portion of the
light-emitting marker using an intensity threshold; creating a
logical frame in which a respective indicator for each pixel can be
set as true or false; setting the indicators of the identified
pixels as true; setting the indicators of all other pixels as
false; calculating a difference frame by subtracting a first
logical frame from a second logical frame such that the difference
frame includes pixels that reached the specified threshold after a
time of the first logical frame; finding length in pixels of the
remaining blob in the difference frame in a direction of blood
flow; converting the length in pixels of the difference frame to
physical distance using the spatial resolution; and calculating
velocity by dividing the physical distance by a time between
frames.
[0015] According to at least one embodiment, a system for
fluorescence based tracking of a light-emitting marker in a bodily
fluid stream includes: a delivery apparatus configured to provide a
light-emitting marker into a bodily fluid stream; a sensor
configured to monitor a region of interest traversed by the bodily
fluid stream; and a computing device configured to: record data
generated by the sensor; determine time characteristics of the
recorded data; and calculate flow characteristics based on the time
characteristics.
[0016] In at least one example, the sensor includes a camera, and
the recorded data includes motion video data.
[0017] In at least one example, the computing device is further
configured to: divide the motion video data into kernels; identify
which of the kernels receive some portion of the light-emitting
marker using an intensity threshold; compute, for each identified
kernel, an intensity signal data set including information of mean
light intensity versus time; perform smoothing on each intensity
signal data set; calculate a lag time between the intensity signal
data sets of neighboring identified kernels using
cross-correlation; using a spatial resolution and the lag time,
calculate velocity vectors; sum the velocity vectors of neighboring
kernels to create a resultant velocity vector; and generate a
velocity map from the resultant velocity vectors for all
kernels.
[0018] In at least one example, the computing device performs
smoothing on each intensity signal data set by time window
averaging. In at least one example, the computing device performs
smoothing on each intensity signal data set by using a Gaussian
filter.
[0019] In at least one example, the computing device is further
configured to: divide the motion video data into kernels; identify
which of the kernels receive some portion of the light-emitting
marker using an intensity threshold; compute, for each identified
kernel, an intensity signal data set including information of mean
light intensity versus time; perform smoothing on each intensity
signal data set; for each particular identified kernel, find
segments in which a slope of the intensity signal data set rises
for a minimum consecutive number of frames or falls for a minimum
consecutive number of frames, which segments occur when a leading
edge or falling edge of a portion of the light-emitting marker
passes through the identified kernel; search the intensity signal
data sets of neighboring identified kernels for a rising or falling
segment of similar length; calculate a lag time between segments in
the particular identified kernel and segments in the neighboring
identified kernels; use a spatial resolution and the lag time to
calculate velocity vectors; sum the velocity vectors of neighboring
kernels to create a resultant velocity vector; and generate a
velocity map from the resultant velocity vectors for all
kernels.
[0020] In at least one example, the computing device is further
configured to: calculate a difference frame by subtracting a frame
of the motion video data from a consecutive frame of the motion
video data; apply a threshold to the difference frame to eliminate
pixels therein below a specified intensity value; calculate a pixel
size of a remaining blob in the difference frame in a direction of
blood flow; calculate a size of the remaining blob using the pixel
size and a spatial resolution; and calculate a velocity by using a
distance traveled by the remaining blob and a time between
frames.
[0021] In at least one example, the computing device is
further configured to: divide the motion video data into frames
each including pixels; identify which of the pixels receive some
portion of the light-emitting marker using an intensity threshold;
create a logical frame in which a respective indicator for each
pixel can be set as true or false; set the indicators of the
identified pixels as true; set the indicators of all other pixels
as false; calculate a difference frame by subtracting a first
logical frame from a second logical frame such that the difference
frame includes pixels that reached the specified threshold after a
time of the first logical frame; find length in pixels of the
remaining blob in the difference frame in a direction of blood
flow; convert the length in pixels of the difference frame to
physical distance using the spatial resolution; and calculate
velocity by dividing the physical distance by a time between
frames.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The previous summary and the following detailed descriptions
are to be read in view of the drawings, which illustrate particular
exemplary embodiments and features as briefly described below. The
summary and detailed descriptions, however, are not limited to only
those embodiments and features explicitly illustrated.
[0023] FIG. 1 shows a fluorescent marker delivery time plot in a
fluid stream and a corresponding response time plot of light
intensity measured downstream in a fixed field of view according to
at least one embodiment.
[0024] FIG. 2 shows a flowchart representing a method, according to
at least one embodiment, of generating a velocity vector map using
a cross correlation technique.
[0025] FIG. 3 shows a flowchart representing a method, according to
at least one embodiment, of generating a velocity vector map using
leading and falling edge considerations.
[0026] FIG. 4 shows a flowchart representing a method, according to
at least one embodiment, of generating a velocity vector map using
subtraction.
[0027] FIG. 5 shows a flowchart representing a method, according to
at least one embodiment, of generating a velocity vector map using
thresholding.
[0028] FIG. 6 shows a system, according to at least one embodiment,
by which at least the methods of FIGS. 2-5 are implemented.
DETAILED DESCRIPTIONS
[0029] These descriptions are presented with sufficient details to
provide an understanding of one or more particular embodiments of
broader inventive subject matters. These descriptions expound upon
and exemplify particular features of those particular embodiments
without limiting the inventive subject matters to the explicitly
described embodiments and features. Considerations in view of these
descriptions will likely give rise to additional and similar
embodiments and features without departing from the scope of the
inventive subject matters. Although the term "step" may be
expressly used or implied relating to features of processes or
methods, no implication is made of any particular order or sequence
among such expressed or implied steps unless an order or sequence
is explicitly stated.
[0030] Any dimensions expressed or implied in the drawings and
these descriptions are provided for exemplary purposes. Thus, not
all embodiments within the scope of the drawings and these
descriptions are made according to such exemplary dimensions. The
drawings are not necessarily made to scale. Thus, not all
embodiments within the scope of the drawings and these descriptions
are made according to the apparent scale of the drawings with
regard to relative dimensions in the drawings. However, for each
drawing, at least one embodiment is made according to the apparent
relative scale of the drawing.
[0031] Fluorescence based tracking according to several embodiments
described herein includes the providing of a marker such as a
glowing dye into a fluid stream, such as a bloodstream, and making
measurements and generating imagery based on the arrival, movement,
and departure of the marker downstream as detected by sensor(s) to
characterize the flow of the fluid stream and vessels or structures
within which the flow travels. The marker is provided into a fluid
stream for example by direct injection or via a port as discrete
bolus deliveries separated over time. A bolus refers to the
administration of a discrete amount of a fluid substance, in this
case the marker into a bodily fluid stream such as blood, in order
to provide a concentration of the substance to gain a response. A
bolus can be delivered by active pumping or by passive gravity
based delivery such as via an intravenous drip line. In at least
one embodiment, a central line delivery arrangement is used, in
which a port is placed in fluid communication with the subclavian
vein and bolus deliveries are injected into the port. The dye
briefly fluoresces when excited by an illumination source that
emits a particular range of wavelengths. The dye is illuminated
over the Region of Interest (ROI) where imaging of the fluorescence
is also performed.
[0032] Field of view monitoring is established by placement of a
sensor, such as a high-speed camera, at a region of interest. The
field of view can be established and held generally fixed as the
marker enters and traverses the field of view of a high-speed
camera sensitive to the light emitted by the marker. Time
characteristics of the light output of the marker traversing the
field of view can be deduced from the light output intensity as
recorded by the camera. A field of view may be established for
example at the heart or other organ where flow diagnostics are
wanted.
[0033] The visual response in the field of view indicates presence
of the marker, with the intensity of the light response being
correlated with the time evolving concentration of the marker in
the stream as the marker diffuses and travels with the host fluid.
The light intensity in the field of view may typically have both
rise and fall characteristics. The rise characteristics correspond
to the arrival and increasing concentration of the marker in the
field of view. The fall characteristics correspond to the departure
or diffusion of the marker and/or the reduction of its light
output. In the case of a dye marker in a blood stream as injected
by a bolus, rise time may generally be faster than fall time, such
that response time curves typically show steeper climbs than
falls.
[0034] FIG. 1 shows a fluorescent marker delivery time plot 102 in
a fluid stream and a corresponding response time plot 112 of light
intensity measured downstream in a fixed field of view according to
at least one embodiment. The plots 102 and 112 are shown on a
common time axis 100. The delivery time plot 102 records several
bolus deliveries 104 as step functions separated in time. The
response plot 112 records the corresponding response functions 114
of light intensity in a field of view downstream from the marker
delivery location into a fluid stream. Each response function 114
is shown as a wave or envelope having a rise side 116 representing
the arrival of the marker in the field of view, and a fall side 118
indicating the departure or diffusion of the marker and/or the
reduction of its light output. In order to correlate bolus
deliveries and data acquisition, a marker delivery system in at
least one embodiment includes a controller and a delivery pump in
wired or wireless communication. A data acquisition system
including the camera and a computing device for recording,
analyzing and visualizing the data and/or field of view are also in
communication with the controller in at least one embodiment.
[0035] The time of initiation, delivered volume, and duration of
each bolus delivery can be controlled. The time interval between
consecutive bolus deliveries is also controlled. Thus, multiple
parameters for bolus delivery can be adjusted to ultimately query
and determine varied flow characteristics within a region of
interest subjected to field of view monitoring. Shorter time gaps
can be used for slower moving fluids and longer time gaps can be
used for faster moving fluids within the region of interest.
Injection times can be varied to address different anatomical
depths and tissue surface barriers.
[0036] In at least one embodiment, as the marker from a bolus
delivery enters an established field of view, the light response of
a bolus is captured by a high speed camera. The time domain
dynamics of the light response are analyzed to arrive at velocity
vectors representing movement of the host fluid in the field of
view. Several embodiments of generating velocity vectors using the
data from fluorescence based tracking are described in the
following with reference to FIGS. 2-5. In each, a video including
multiple frames (images) taken of a region of interest is analyzed
to track pixels in or across the field of view.
[0037] A method 200 of generating a velocity vector map using a
cross correlation technique, according to at least one embodiment,
is represented as a flow chart in FIG. 2. In step 202, divide the
video into spatiotemporal cubes, which are termed "kernels" in
these descriptions. In step 204, which is optional, isolate kernels
in which fluorescence appears at any point during the length of the
video using an intensity threshold. In step 206, compute a signal
for each kernel of mean intensity vs. time. The signal may be 1D,
2D, or 3D. Vessel(s) may cross under other vessel(s) and at
different angles, and multiple perspectives from different cameras
can be used. The potential use of coded apertures is within the
scope of these descriptions. In step 208, which is optional,
perform smoothing on the intensity signal for each kernel using
time window averaging, Gaussian filter, etc. In step 210, for each
kernel, calculate the lag time between the intensity signal of the
kernel and the time intensity signals of its neighboring kernels
using cross-correlation. In step 212, using the known spatial
resolution of the image, convert lag time in each direction into
velocity. In step 214, sum the resulting velocity vectors to each
neighboring kernel to create one resultant velocity vector. In step
216, generate a velocity map from the resultant velocity vectors
for all kernels.
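Steps 210 through 216 hinge on estimating the lag between the intensity signals of neighboring kernels. The following is a minimal pure-Python sketch of that idea, not the patent's implementation; the brute-force correlation search, the function names, and the kernel spacing and frame rate in the example are assumptions.

```python
def lag_by_cross_correlation(sig_a, sig_b, max_lag):
    """Return the lag (in frames) of sig_b relative to sig_a that maximizes
    the sliding dot product of the two kernel intensity signals (step 210)."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, a in enumerate(sig_a):
            j = i + lag
            if 0 <= j < len(sig_b):
                score += a * sig_b[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def lag_to_velocity(lag_frames, kernel_spacing_mm, frame_rate_hz):
    """Step 212: convert a frame lag between neighboring kernels into a
    velocity (mm/s) using the known spatial resolution."""
    if lag_frames == 0:
        return 0.0
    return kernel_spacing_mm / (lag_frames / frame_rate_hz)

# A bolus envelope reaching the downstream kernel 3 frames later:
upstream   = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0]
downstream = [0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0]
lag = lag_by_cross_correlation(upstream, downstream, max_lag=5)   # 3 frames
v = lag_to_velocity(lag, kernel_spacing_mm=0.5, frame_rate_hz=100)
```

In practice the signals would first be smoothed (step 208) and the per-direction velocities summed into one resultant vector per kernel (step 214).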
[0038] A method 300 of generating a velocity vector map using
leading and falling edge considerations, according to at least one
embodiment, is represented as a flow chart in FIG. 3. In step 302,
divide video into kernels. In step 304, which is optional, isolate
kernels in which fluorescence appears at any point during the
length of the video using an intensity threshold. In step 306,
compute a signal for each kernel of mean intensity vs. time. As
described above with reference to step 206 of FIG. 2, the signal
may be 1D, 2D, or 3D. In step 308, which is optional, perform
smoothing on the intensity signal for each kernel using time window
averaging, Gaussian filter, etc. In step 310, for each kernel, find
segments in which the slope of the intensity signal rises for a
minimum consecutive number of frames or falls for a minimum
consecutive number of frames. These segments occur when the leading
or falling edges of the ICG (fluorescein or other glow dye) bolus
pass through the kernel. In step 312, search the intensity signals
of neighboring kernels for a rising or falling segment of similar
length. In step 314, calculate the temporal offset (lag time)
between the segment in the original kernel and the segment in the
neighboring kernels. In step 316, using the known spatial
resolution of the image, convert lag time in each direction into
velocity. In step 318, sum the resulting velocity vectors to each
neighboring kernel to create one resultant velocity vector. In step
320, generate a velocity map from the resultant velocity vectors
for all kernels.
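The segment finding of steps 310 through 314 can be sketched as below. This is an illustrative pure-Python rendering under assumptions of my own (function name, return format, and the example signals); it is not the patent's code.

```python
def monotone_segments(signal, min_frames):
    """Step 310: find (start_frame, run_length, direction) runs where the
    intensity signal rises (+1) or falls (-1) for at least min_frames
    consecutive frames -- the leading or falling edge of a bolus."""
    segments = []
    i = 0
    while i < len(signal) - 1:
        step = signal[i + 1] - signal[i]
        direction = 1 if step > 0 else -1 if step < 0 else 0
        j = i
        while j < len(signal) - 1 and (signal[j + 1] - signal[j]) * direction > 0:
            j += 1
        run = j - i
        if direction != 0 and run >= min_frames:
            segments.append((i, run, direction))
        i = max(j, i + 1)
    return segments

# Steps 312-314: match a rising edge between neighboring kernels; the lag
# is the difference of the matched segments' start frames.
upstream_sig   = [0, 1, 3, 6, 6, 5, 3, 2, 2]
downstream_sig = [0, 0, 0, 0, 1, 3, 6, 6, 5]
up_rise   = monotone_segments(upstream_sig, min_frames=3)[0]
down_rise = monotone_segments(downstream_sig, min_frames=3)[0]
lag_frames = down_rise[0] - up_rise[0]   # edge arrives 3 frames later
```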
[0039] A method 400 of generating a velocity vector map using
subtraction, according to at least one embodiment, is represented
as a flow chart in FIG. 4. In step 402, subtract two consecutive
frames, resulting in a difference frame. In step 404, threshold
difference frame to eliminate pixels below a specified intensity
value. In step 406, calculate pixel size of a remaining blob in the
difference frame in the direction of blood flow. In step 408,
convert size of blob in pixels to physical distance using spatial
resolution. In step 410, calculate velocity by dividing distance
traveled by time between frames.
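The subtraction method can be rendered compactly. The sketch below illustrates steps 402 through 410 under simplifying assumptions that are mine, not the patent's: flow runs along the row axis, and the spatial resolution and frame interval in the example are invented.

```python
def subtraction_velocity(frame_a, frame_b, threshold, mm_per_pixel, frame_dt_s):
    """Method 400 sketch: subtract consecutive frames (step 402), threshold
    the difference (step 404), measure the remaining blob's length along the
    flow direction -- assumed here to be the row axis (step 406) -- and
    convert that length to velocity (steps 408-410)."""
    rows, cols = len(frame_a), len(frame_a[0])
    diff = [[frame_b[r][c] - frame_a[r][c] for c in range(cols)]
            for r in range(rows)]
    blob_rows = [r for r in range(rows) if any(v >= threshold for v in diff[r])]
    if not blob_rows:
        return 0.0
    length_px = max(blob_rows) - min(blob_rows) + 1
    # The blob in the difference frame is the ground the bolus front covered
    # between frames, so its physical length over the frame interval is speed.
    return (length_px * mm_per_pixel) / frame_dt_s

# Bolus front advances two rows between frames (0.1 mm/pixel, 100 frames/s):
frame_a = [[10], [10], [10], [0], [0], [0]]
frame_b = [[10], [10], [10], [10], [10], [0]]
v = subtraction_velocity(frame_a, frame_b, threshold=5,
                         mm_per_pixel=0.1, frame_dt_s=0.01)   # ~20.0 mm/s
```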
[0040] A method 500 of generating a velocity vector map using
thresholding, according to at least one embodiment, is represented
as a flow chart in FIG. 5. In step 502, for each frame in video,
isolate pixels with intensities above specified threshold. Create
logical frame and set all pixels at or above threshold to true. Set
all other pixels to false. In step 504, for two consecutive frames,
subtract the first logical frame from the second logical frame. The
resulting frame contains the pixels that reached the specified
threshold between frames. In step 506, find the length in pixels of
the remaining blob in the difference frame in the direction of
blood flow. In step 508, convert the pixel length of the difference
frame to physical distance using the spatial resolution. In step
510, calculate velocity by dividing physical distance by the time
between frames.
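The thresholding method differs from method 400 in that boolean (logical) frames are subtracted, so only pixels that newly reached the threshold survive and dimming pixels cannot contribute. The sketch below illustrates steps 502 through 510; as before, the row-axis flow direction and the example's resolution and frame interval are assumptions, not values from the patent.

```python
def logical_frame(frame, threshold):
    """Step 502: mark each pixel True where intensity reaches the threshold."""
    return [[v >= threshold for v in row] for row in frame]

def newly_lit_velocity(frame_a, frame_b, threshold, mm_per_pixel, frame_dt_s):
    """Method 500 sketch: subtract logical frames so only pixels that newly
    crossed the threshold remain (step 504), then convert the blob length
    along the flow direction (row axis assumed) to velocity (steps 506-510)."""
    la = logical_frame(frame_a, threshold)
    lb = logical_frame(frame_b, threshold)
    # True only where the pixel is lit in frame_b but was not in frame_a.
    new_pixels = [[b and not a for a, b in zip(ra, rb)]
                  for ra, rb in zip(la, lb)]
    blob_rows = [r for r, row in enumerate(new_pixels) if any(row)]
    if not blob_rows:
        return 0.0
    length_px = max(blob_rows) - min(blob_rows) + 1
    return (length_px * mm_per_pixel) / frame_dt_s

# Bolus front advances three rows between frames (0.05 mm/pixel, 200 frames/s):
frame_a = [[9], [9], [0], [0], [0], [0]]
frame_b = [[9], [9], [9], [9], [9], [0]]
v = newly_lit_velocity(frame_a, frame_b, threshold=5,
                       mm_per_pixel=0.05, frame_dt_s=0.005)   # ~30.0 mm/s
```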
[0041] FIG. 6 shows a system 600, according to at least one
embodiment, by which at least the methods of FIGS. 2-5 are
implemented. The system 600 includes a computing device 602, a
delivery apparatus 604 configured to provide a light-emitting
marker 606 into a bodily fluid stream, and a sensor 610 configured
to monitor a region of interest traversed by the bodily fluid
stream. The computing device 602 records data generated by the
sensor 610, determines time characteristics of the recorded data;
and calculates flow characteristics based on the time
characteristics.
[0042] The computing device 602 is illustrated as a laptop or other
personal computer. Other computing devices including local and
remote servers are within the scope of these descriptions. The
delivery apparatus 604 provides a light-emitting marker 606. The
delivery apparatus 604 is in communication with and/or under the
control of the computing device 602. The delivery apparatus 604 may
include a powered pump or a gravity based arrangement. The
light-emitting marker 606 may be delivered to the bodily fluid
stream 620 via a catheter, an intravenous drip line, a central line
delivery, or other needle or port device. The delivery apparatus
604 delivers the light-emitting marker 606 in discrete bolus
deliveries separated over time. The light-emitting marker 606 may
include indocyanine green (ICG), fluorescein, or another glow dye.
Two or more dyes, each having a different respective color, may be
used. For example, each of two or more boluses may include a
different dye, so that the presence and response of each can be
determined separately by color distinction.
[0043] The bodily fluid stream 620 in FIG. 6 is represented as
flowing along a blood vessel or other biological conduit 622.
Several downstream organs or tissue areas 624, 626 and 628 are
represented. By placement and selection of field of view of the
sensor 610, a region of interest can be selected for observation
downstream of the bodily fluid stream 620 carrying with it the
light-emitting marker 606. The sensor 610 monitors for fluorescence
or other indication of the presence of the light-emitting marker
606 in the selected field of view. The sensor 610 can be a
high-speed and high-resolution camera for example.
[0044] Several fields of view are represented. In a first exemplary
field of view 630, the sensor 610 observes an area where the bodily
fluid stream 620 is divided into several downstream flows. In a
second exemplary field of view 632, the sensor 610 observes an area
downstream of the division to isolate monitoring to a single branch
of downstream flow. In a third exemplary field of view 634, the
sensor 610 observes a particular organ or tissue area 628. These
examples represent that a user such as a physician can deliver a
light-emitting marker 606 at any selected location and then observe
the time evolving arrival and dispersion or other activity of the
marker downstream of the selected location at any selected field of
view. In at least one embodiment, a central line delivery
arrangement is used, in which a port is placed in fluid
communication with the subclavian vein and bolus deliveries of the
light-emitting marker 606 are injected into the port.
[0045] The delivery apparatus 604 and sensor 610 are shown as
connected to the computing device 602 by respective cables 612 and
614; however, wireless connections may be used as well. The
light-emitting marker 606 briefly fluoresces when excited by an
illumination source 640 that emits a particular range of
wavelengths upon the region of interest within the field of view of
the sensor 610. The illumination source 640 is also shown as
connected to the computing device 602 by a cable 642; however, a
wireless connection may be used as well. The computing device 602
correlates activations of the delivery apparatus 604, the
illumination source 640, and the sensor 610, and collects data from
the sensor 610 as the light-emitting marker 606 in the field of
view responds to the excitation from the illumination source
640.
[0046] In various embodiments, the computing device 602 is
configured to record data generated by the sensor 610; determine
time characteristics of the recorded data; and calculate flow
characteristics based on the time characteristics. Further
embodiments and examples of fluorescence based imaging and data
analysis conducted by the computing device 602 are described above
with reference to FIGS. 2-5.
[0047] Particular embodiments and features have been described with
reference to the drawings. It is to be understood that these
descriptions are not limited to any single embodiment or any
particular set of features, and that similar embodiments and
features may arise or modifications and additions may be made
without departing from the scope of these descriptions and the
spirit of the appended claims.
* * * * *