U.S. patent application number 14/040159, published by the patent office on 2014-04-03, covers systems and methods for three-dimensional interaction monitoring in an EMS environment.
The applicant listed for this patent is ZOLL MEDICAL CORPORATION. The invention is credited to Chad ASHMORE, Martin BURES, Robert H. GOTSCHALL, and C. Shane REID.
United States Patent Application 20140093135
Kind Code: A1
REID; C. Shane; et al.
April 3, 2014

Application Number: 14/040159
Publication Number: 20140093135
Family ID: 50385259
Publication Date: 2014-04-03

SYSTEMS AND METHODS FOR THREE-DIMENSIONAL INTERACTION MONITORING IN AN EMS ENVIRONMENT
Abstract
A method for tracking interactions in an emergency response
environment according to embodiments of the present invention
includes receiving color images and depth information from within a
field of view of a sensor array; maintaining an emergency encounter
record; monitoring one or both of a position of an object and
movement of the object in the emergency response environment based
on the color images and depth information received by the sensor
array; and recording an occurrence of a condition in the emergency
encounter record, wherein the condition is based on the one or both
of the position of the object and the movement of the object.
Inventors: REID; C. Shane (Denver, CO); ASHMORE; Chad (Frederick, CO); GOTSCHALL; Robert H. (Thornton, CO); BURES; Martin (Somerville, MA)

Applicant: ZOLL MEDICAL CORPORATION, Chelmsford, MA, US

Family ID: 50385259
Appl. No.: 14/040159
Filed: September 27, 2013
Related U.S. Patent Documents

Application Number | Filing Date  | Patent Number
61707671           | Sep 28, 2012 |
61707665           | Sep 28, 2012 |
Current U.S. Class: 382/107
Current CPC Class: G06Q 10/00 20130101; G06F 3/017 20130101; G06T 7/20 20130101; G16H 40/67 20180101; G16H 20/13 20180101; G16H 10/60 20180101
Class at Publication: 382/107
International Class: G06Q 50/22 20060101 G06Q050/22; G06T 7/20 20060101 G06T007/20
Claims
1. A method for tracking interactions in an emergency response
environment, the method comprising: receiving color images and
depth information from within a field of view of a sensor array;
maintaining an emergency encounter record; monitoring one or both
of a position of an object and movement of the object in the
emergency response environment based on the color images and depth
information received by the sensor array; and recording an
occurrence of a condition in the emergency encounter record,
wherein the condition is based on the one or both of the position
of the object and the movement of the object.
2. The method of claim 1, wherein the object is a human, and
wherein monitoring one or both of the position of the object and
movement of the object comprises monitoring one or both of the
position of the human and movement of an at least partial skeletal
approximation of the human.
3. The method of claim 2, wherein the human is a first object,
wherein the condition comprises the at least partial skeletal
approximation of the human coming within a certain distance of a
second object.
4. The method of claim 3, wherein the human is a first human, and
wherein the second object is a second human.
5. The method of claim 4, wherein the condition comprises the first
human touching the second human.
6. The method of claim 5, wherein the second human is a patient
being treated by the first human in the emergency response
environment.
7. The method of claim 1, wherein recording the occurrence of the
condition comprises recording a time at which the condition
occurs.
8. The method of claim 7, wherein recording the occurrence of the
condition further comprises recording a type of the condition.
9. The method of claim 7, wherein recording the occurrence of the
condition further comprises recording as video footage the color
images received during the occurrence of the condition.
10. The method of claim 1, further comprising: receiving streaming
clinical data about a patient, and correlating at least a portion
of the streaming clinical data in the emergency encounter record
with the occurrence of the condition.
11. The method of claim 10, wherein correlating the at least the
portion of the streaming clinical data comprises flagging the at
least the portion of the streaming clinical data corresponding to a
time of the occurrence of the condition.
12. A system for tracking interactions in an emergency response
environment, the system comprising: a sensor array, wherein the
sensor array is adapted to receive color images and depth
information in its field of view; a control system communicably
coupled to the sensor array, the control system configured to:
maintain an emergency encounter record; monitor one or both of
position and movement of an object in the emergency response
environment based on the color images and depth information
received from the sensor array; and record an occurrence of a
condition in the emergency encounter record, wherein the condition
is based on the one or both of position and movement of the
object.
13. A method for inventory control in an emergency response
environment, the method comprising: detecting three-dimensional
movement of a human body in the emergency response environment with
a sensor array, wherein the sensor array generates visual
information and depth information about the emergency response
environment; detecting three-dimensional movement of an object in
the emergency response environment; determining an occurrence of
contact between the human body and the object; and recording an
entry in an emergency encounter record based on the occurrence of
the contact.
14. The method of claim 13, wherein the object is a narcotic
medication stored in an enclosure in the emergency response
environment, the method further comprising: determining, based on
the detection of the three-dimensional movement of the human body
and the object, an occurrence of intersection of the human body
with the enclosure; and recording an entry in the emergency
encounter record based on the occurrence of the intersection.
15. The method of claim 13, wherein the object is a narcotic
medication stored in an enclosure in the emergency response
environment, the method further comprising: determining, based on
the detection of the three-dimensional movement of the object, an
occurrence of removal of the narcotic medication from the
enclosure; and recording an entry in the emergency encounter record
based on the occurrence of the removal.
16. The method of claim 15, further comprising updating an
inventory database, based on the occurrence of the removal, to
reflect that the narcotic medication has been used and needs
restocking.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional
Patent Application Ser. No. 61/707,671, filed on Sep. 28, 2012, and
of U.S. Provisional Patent Application Ser. No. 61/707,665, filed
on Sep. 28, 2012, both of which are incorporated herein by
reference in their entireties for all purposes.
TECHNICAL FIELD
[0002] Embodiments of the present invention relate generally to
gesture recognition and three-dimensional interaction tracking in
an emergency medical services environment.
BACKGROUND
[0003] In an emergency medical services ("EMS") or first responder
environment, caregivers must often focus more acutely on patient
care in a shorter amount of time and with a greater number of
uncertainties and variables than their counterparts in a hospital
setting. Creating a record of the EMS caregiver's encounter with a
patient, however, remains important. Manual input of information
into patient charting systems (e.g. by typing or by writing) can
sometimes take valuable time and attention away from patient care,
can be distracting, and can often be inaccurately recreated from
memory after an EMS encounter.
SUMMARY
[0004] A method for tracking interactions in an emergency response
environment according to embodiments of the present invention
includes receiving color images and depth information from within a
field of view of a sensor array; maintaining an emergency encounter
record; monitoring one or both of a position of an object and
movement of the object in the emergency response environment based
on the color images and depth information received by the sensor
array; and recording an occurrence of a condition in the emergency
encounter record, wherein the condition is based on the one or both
of the position of the object and the movement of the object.
[0005] The method of paragraph [0004], wherein the object is a
human, and wherein monitoring one or both of the position of the
object and movement of the object comprises monitoring one or both
of the position of the human and movement of an at least partial
skeletal approximation of the human.
[0006] The method of any of paragraphs [0004] and [0005], wherein
the human is a first object, wherein the condition comprises the at
least partial skeletal approximation of the human coming within a
certain distance of a second object.
[0007] The method of any of paragraphs [0004] through [0006],
wherein the human is a first human, and wherein the second object
is a second human.
[0008] The method of any of paragraphs [0004] through [0007],
wherein the condition comprises the first human touching the second
human.
[0009] The method of any of paragraphs [0004] through [0008],
wherein the second human is a patient being treated by the first
human in the emergency response environment.
[0010] The method of any of paragraphs [0004] through [0009],
wherein recording the occurrence of the condition comprises
recording a time at which the condition occurs.
[0011] The method of any of paragraphs [0004] through [0010],
wherein recording the occurrence of the condition further comprises
recording a type of the condition.
[0012] The method of any of paragraphs [0004] through [0011],
wherein recording the occurrence of the condition further comprises
recording as video footage the color images received during the
occurrence of the condition.
[0013] The method of any of paragraphs [0004] through [0012],
further comprising: receiving streaming clinical data about a
patient, and correlating at least a portion of the streaming
clinical data in the emergency encounter record with the occurrence
of the condition.
[0014] The method of any of paragraphs [0004] through [0013],
wherein correlating the at least the portion of the streaming
clinical data comprises flagging the at least the portion of the
streaming clinical data corresponding to a time of the occurrence
of the condition.
[0015] A system for tracking interactions in an emergency response
environment according to embodiments of the present invention
includes a sensor array, wherein the sensor array is adapted to
receive color images and depth information in its field of view; a
control system communicably coupled to the sensor array, the
control system configured to: maintain an emergency encounter
record; monitor one or both of position and movement of an object
in the emergency response environment based on the color images and
depth information received from the sensor array; and record an
occurrence of a condition in the emergency encounter record,
wherein the condition is based on the one or both of position and
movement of the object.
[0016] A method for inventory control in an emergency response
environment, according to embodiments of the present invention,
includes detecting three-dimensional movement of a human body in
the emergency response environment with a sensor array, wherein the
sensor array generates visual information and depth information
about the emergency response environment; detecting
three-dimensional movement of an object in the emergency response
environment; determining an occurrence of contact between the human
body and the object; and recording an entry in an emergency
encounter record based on the occurrence of the contact.
[0017] The method of paragraph [0016], wherein the object is a
narcotic medication stored in an enclosure in the emergency
response environment, the method further comprising: determining,
based on the detection of the three-dimensional movement of the
human body and the object, an occurrence of intersection of the
human body with the enclosure; and recording an entry in the
emergency encounter record based on the occurrence of the
intersection.
[0018] The method of any of paragraphs [0016] and [0017], wherein
the object is a narcotic medication stored in an enclosure in the
emergency response environment, the method further comprising:
determining, based on the detection of the three-dimensional
movement of the object, an occurrence of removal of the narcotic
medication from the enclosure; and recording an entry in the
emergency encounter record based on the occurrence of the
removal.
[0019] The method of any of paragraphs [0016] through [0018],
further comprising updating an inventory database, based on the
occurrence of the removal, to reflect that the narcotic medication
has been used and needs restocking.
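The inventory-control flow of paragraphs [0016] through [0019] can be sketched in a few lines. The following is an illustrative sketch only, not code from the application; the axis-aligned enclosure volume, the dict-based inventory database, and all names are assumptions:

```python
def in_box(point, box_min, box_max):
    """True if a 3-D point lies inside an axis-aligned enclosure volume."""
    return all(lo <= c <= hi for c, lo, hi in zip(point, box_min, box_max))

def track_removal(item_positions, enclosure, inventory, item="narcotic"):
    """Scan successive tracked 3-D positions of an item; when it moves from
    inside the enclosure to outside, count a removal and mark the item for
    restocking in the inventory database (a plain dict here)."""
    events = []
    was_inside = in_box(item_positions[0], *enclosure)
    for pos in item_positions[1:]:
        inside = in_box(pos, *enclosure)
        if was_inside and not inside:
            inventory[item]["count"] -= 1
            inventory[item]["needs_restock"] = True
            events.append(("removal", pos))
        was_inside = inside
    return events

# Item starts in the cabinet, then crosses its boundary on the third frame.
enclosure = ((0.0, 0.0, 0.0), (0.5, 0.5, 0.5))
inventory = {"narcotic": {"count": 3, "needs_restock": False}}
path = [(0.2, 0.2, 0.2), (0.4, 0.3, 0.2), (0.7, 0.3, 0.2)]
events = track_removal(path, enclosure, inventory)
print(inventory["narcotic"])  # {'count': 2, 'needs_restock': True}
```

A real implementation would also have to handle tracking dropouts and re-entries; this only shows the inside-to-outside transition that triggers the record entry and the restocking flag.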
[0020] While multiple embodiments are disclosed, still other
embodiments of the present invention will become apparent to those
skilled in the art from the following detailed description, which
shows and describes illustrative embodiments of the invention.
Accordingly, the drawings and detailed description are to be
regarded as illustrative in nature and not restrictive.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] FIG. 1 illustrates an emergency response environment with a
vehicle control system communicably coupled to other devices,
according to embodiments of the present invention.
[0022] FIG. 2 illustrates a computer system, according to
embodiments of the present invention.
[0023] FIG. 3 illustrates an emergency response environment with a
system that monitors three-dimensional interaction, according to
embodiments of the present invention.
[0024] FIG. 4 illustrates a system including a vehicle control
system and a sensor array, according to embodiments of the present
invention.
[0025] FIG. 5 illustrates a table listing various hand and finger
gestures that may be recognized by the system of FIG. 4, according
to embodiments of the present invention.
[0026] FIG. 6 illustrates a table listing various head and facial
gestures that may be recognized by the system of FIG. 4, according
to embodiments of the present invention.
[0027] FIG. 7 depicts a flow chart illustrating a method for
monitoring three-dimensional interaction in an emergency response
environment, according to embodiments of the present invention.
[0028] FIG. 8 depicts a flow chart illustrating a method for
monitoring three-dimensional interaction of a caregiver with a
patient in an emergency response environment, according to
embodiments of the present invention.
[0029] FIG. 9 depicts a flow chart illustrating a method for
monitoring three-dimensional interaction in an emergency response
environment for inventory control, according to embodiments of the
present invention.
[0030] FIG. 10 depicts a flow chart illustrating a method for
gesture recognition in an emergency response environment, according
to embodiments of the present invention.
[0031] While the invention is amenable to various modifications and
alternative forms, specific embodiments have been shown by way of
example in the drawings and are described in detail below. The
intention, however, is not to limit the invention to the particular
embodiments described. On the contrary, the invention is intended
to cover all modifications, equivalents, and alternatives falling
within the scope of the invention as defined by the appended
claims.
DETAILED DESCRIPTION
[0032] As illustrated in FIG. 1, a system 100 according to
embodiments of the present invention performs advanced data
management, integration and presentation of EMS data from multiple
different devices. System 100 includes a mobile environment 101, an
enterprise environment 102, and an administration environment 103.
Devices within the various environments 101, 102, 103 may be
communicably coupled via a network 120, such as, for example, the
Internet. System 100 is further described in Patent Cooperation
Treaty Application Publication No. WO 2011/011454, published on
Jan. 27, 2011, which is incorporated herein by reference in its
entirety for all purposes.
[0033] As used herein, the phrase "communicably coupled" is used in
its broadest sense to refer to any coupling whereby information may
be passed. Thus, for example, communicably coupled includes
electrically coupled by, for example, a wire; optically coupled by,
for example, an optical cable; and/or wirelessly coupled by, for
example, a radio frequency or other transmission media.
"Communicably coupled" also includes, for example, indirect
coupling, such as through a network, or direct coupling.
[0034] According to embodiments of the present invention, the
mobile environment 101 is an ambulance or other EMS vehicle--for
example a vehicular mobile environment (VME). The mobile
environment may also be the local network of data entry devices as
well as diagnostic and therapeutic devices established at time of
treatment of a patient or patients in the field environment--the
"At Scene Patient Mobile Environment" (ASPME). The mobile
environment may also be a combination of one or more of VMEs and/or
ASPMEs. The mobile environment may include a navigation device 110 used by the driver 112 to track the position of the mobile environment 101, locate the mobile environment 101 and/or the emergency location, and locate the transport destination, according to embodiments of the present invention. The navigation device 110 may
include a Global Positioning System ("GPS"), for example. The
navigation device 110 may also be configured to perform
calculations about vehicle speed, the travel time between
locations, and estimated times of arrival. According to embodiments
of the present invention, the navigation device 110 is located at
the front of the ambulance to assist the driver 112 in navigating
the vehicle. The navigation device 110 may be, for example, a
RescueNet® Navigator onboard electronic data communication
system available from ZOLL Data Systems of Broomfield, Colo.
[0035] As illustrated in FIG. 1, a patient monitoring device 106
and a patient charting device 108 are also often used for patient
care in the mobile environment 101, according to embodiments of the
present invention. The EMS technician 114 attaches the patient
monitoring device 106 to the patient 116 to monitor the patient
116. The patient monitoring device 106 may be, for example, a
defibrillator device with electrodes and/or sensors configured for
attachment to the patient 116 to monitor heart rate and/or to
generate electrocardiograms ("ECGs"), according to embodiments of
the present invention. The patient monitoring device 106 may also
include sensors to detect or a processor to derive or calculate
other patient conditions. For example, the patient monitoring
device 106 may monitor, detect, treat and/or derive or calculate
blood pressure, temperature, respiration rate, blood oxygen level,
end-tidal carbon dioxide level, pulmonary function, blood glucose
level, and/or weight, according to embodiments of the present
invention. The patient monitoring device 106 may be a Zoll
E-Series® or X-Series defibrillator available from Zoll Medical
Corporation of Chelmsford, Mass., according to embodiments of the
present invention. A patient monitoring device may also be a
patient treatment device, or another kind of device that includes
patient monitoring and/or patient treatment capabilities, according
to embodiments of the present invention.
[0036] The patient charting device 108 is a device used by the EMS
technician 114 to generate records and/or notes about the patient's
116 condition and/or treatments applied to the patient, according
to embodiments of the present invention. For example, the patient
charting device 108 may be used to note a dosage of medicine given
to the patient 116 at a particular time. The patient charting
device 108 and/or patient monitoring device 106 may have a clock,
which may be synchronized with an external time source such as a
network or a satellite to prevent the EMS technician from having to
manually enter a time of treatment or observation (or having to
attempt to estimate the time of treatment for charting purposes
long after the treatment was administered), according to
embodiments of the present invention. The patient charting device
108 may also be used to record biographic and/or demographic and/or
historical information about a patient, for example the patient's
name, identification number, height, weight, and/or medical
history, according to embodiments of the present invention.
According to embodiments of the present invention, the patient
charting device 108 is a tablet PC, such as for example the
TabletPCR component of the RescueNet® ePCR Suite available from
Zoll Data Systems of Broomfield, Colo. According to some
embodiments of the present invention, the patient charting device
108 is a wristband or smart-phone such as an Apple iPhone or iPad
with interactive data entry interface such as a touch screen or
voice recognition data entry that may be communicably connected to
the VCS 104 and tapped to indicate what was done with the patient
116 and when it was done.
[0037] The navigation device 110, the charting device 108, and the
monitoring device 106 are each separately very useful to the EMS
drivers 112 and technicians 114 before, during, and after the
patient transport. A vehicle control system ("VCS") 104 receives, organizes, stores, and displays data from each device 106, 108, 110 to further enhance the usefulness of each device and to make it much easier for the EMS technician 114 to perform certain tasks that would normally require the EMS technician 114 to divert visual and manual attention to each device 106, 108, 110 separately, according to embodiments of the present invention. In
other words, the VCS centralizes and organizes information that
would normally be de-centralized and disorganized, according to
embodiments of the present invention.
[0038] The VCS 104 is communicably coupled to the patient
monitoring device 106, the patient charting device 108, and the
navigation device 110, according to embodiments of the present
invention. The VCS 104 is also communicably coupled to a storage
medium 118. The VCS 104 may be a touch-screen, flat panel PC, and
the storage medium 118 may be located within or external to the VCS
104, according to embodiments of the present invention. The VCS 104
may include a display template serving as a graphical user
interface, which permits the user (e.g. EMS tech 114) to select
different subsets and/or display modes of the information gathered
from and/or sent to devices 106, 108, 110, according to embodiments
of the present invention.
[0039] Some embodiments of the present invention include various
steps, some of which may be performed by hardware components or may
be embodied in machine-executable instructions. These
machine-executable instructions may be used to cause a
general-purpose or a special-purpose processor programmed with the
instructions to perform the steps. Alternatively, the steps may be
performed by a combination of hardware, software, and/or firmware.
In addition, some embodiments of the present invention may be
performed or implemented, at least in part (e.g., one or more
modules), on one or more computer systems, mainframes (e.g., IBM
mainframes such as the IBM zSeries, Unisys ClearPath Mainframes, HP
Integrity NonStop servers, NEC Express series, and others), or
client-server type systems. In addition, specific hardware aspects
of embodiments of the present invention may incorporate one or more
of these systems, or portions thereof.
[0040] As such, FIG. 2 is an example of a computer system 200 with
which embodiments of the present invention may be utilized.
According to the present example, the computer system includes a bus 201, at least one processor 202, at least one communication port 203, a main memory 204, removable storage media 205, a read only memory 206, and a mass storage 207.
Processor(s) 202 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), or AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors. Communication port(s) 203 can be any of an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, or a Gigabit port using copper or fiber, for example. Communication port(s) 203 may be chosen depending on a network such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system 200
connects. Main memory 204 can be Random Access Memory (RAM), or any
other dynamic storage device(s) commonly known to one of ordinary
skill in the art. Read only memory 206 can be any static storage
device(s) such as Programmable Read Only Memory (PROM) chips for
storing static information such as instructions for processor 202,
for example.
Mass storage 207 can be used to store information and instructions. For example, hard disks such as the Adaptec® family of SCSI drives, an optical disc, an array of disks such as RAID (e.g. the Adaptec® family of RAID drives), or any other mass storage devices may be used. Bus 201 communicably
couples processor(s) 202 with the other memory, storage and
communication blocks. Bus 201 can be a PCI/PCI-X or SCSI based
system bus depending on the storage devices used, for example.
Removable storage media 205 can be any kind of external
hard-drives, floppy drives, flash drives, IOMEGA® Zip Drives,
Compact Disc-Read Only Memory (CD-ROM), Compact Disc-Re-Writable
(CD-RW), or Digital Video Disk-Read Only Memory (DVD-ROM), for
example. The components described above are meant to exemplify some
types of possibilities. In no way should the aforementioned
examples limit the scope of the invention, as they are only
exemplary embodiments.
[0043] FIG. 3 illustrates an emergency response environment with a
system 300 that monitors three-dimensional interaction, according
to embodiments of the present invention. System 300 includes a
sensor or sensor array 1. Sensor 1 may be a camera, video camera,
or other imaging device capable of collecting visual information.
According to some embodiments of the present invention, sensor 1 is
a sensor array that includes an image capture device, for example a
color image capture device, as well as a depth determining device,
for example an infrared emitter and infrared depth sensor. Sensor 1
may also include an audio capture device. For example, sensor 1 may
be a sensor array such as a Kinect® sensor array available from
Microsoft Corporation. Sensor 1 may also or alternatively be a
LEAP™ device available from Leap Motion, Inc. Sensor 1 may be,
or include, a wide variety of hardware that permits collection of
visual, depth, audio, and color information and the like, according
to embodiments of the present invention.
[0044] Sensor 1 may be placed within an emergency response
environment, for example in the back 152 of an ambulance 101, such
that activities of the patient 116 and/or crew members 2, 3 are at
least partially within its field of view. For example, sensor 1 may
be mounted on a wall or ceiling of the back compartment 152 of the
ambulance 101. The sensor 1 may also include, within its field of
view, a patient support 4, such as a bed, cot, or stretcher, upon
which a patient 116 is lying and/or being treated. The back 152 of
the ambulance 101 may further include a supply cabinet 5, for
example a medicine cabinet or narcotics cabinet, which may be
stocked with medicines, for example narcotic 6.
[0045] FIG. 4 illustrates a system including a vehicle control
system 104 communicably coupled with a sensor array 1, according to
embodiments of the present invention. Sensor array 1 may include an
imaging device 9, a depth sensor system 10, and/or an audio input
11, according to embodiments of the present invention. VCS 104 may
also be communicably coupled with a patient monitoring device 106,
a charting system 108, a navigation system 110, and vehicle
operations systems 8. The vehicle operation systems 8 may include
sensors and controllers installed in the vehicle relating to
vehicle safety and/or operation, including both
manufacturer-installed and aftermarket devices, for example vehicle
speed sensors, seatbelt detectors, accelerometers, and other
vehicle- and safety-related devices, including without limitation
those described in U.S. Provisional Patent Application Ser. No.
61/656,527, filed on Jun. 7, 2012, which is incorporated by
reference herein in its entirety for all purposes.
[0046] Vehicle control system 104 may be configured to create,
maintain, and/or update an encounter record 7, which may be stored
locally in an emergency response environment (for example in
database 118) and/or remotely on an enterprise database 130. The
encounter record 7 may include information obtained by the vehicle
control system 104 and each of the devices to which VCS 104 is
communicably coupled. Records in the encounter record 7 may be
specific to an encounter with a particular patient 116, and/or a
particular dispatch of the vehicle 101, for example.
[0047] The VCS 104 may be configured to track interactions in the
emergency response environment, for example interactions by and
among caregivers 2, 3, and patient 116 and/or objects in the
emergency response environment. The VCS 104 may be configured to
receive color images and depth information from within a field of
view of the sensor array 1. The VCS 104 may also be configured to
maintain an emergency encounter record 7, either locally and/or
remotely. The VCS 104 monitors a position of an object and/or
movement of the object in the emergency response environment based
on the color images and depth information received by the sensor
array 1. For example, the sensor array 1 may be a Kinect®
sensor array, and the VCS 104 may include software that receives
data from the sensor array 1 to detect or approximate movements and
locations of human bodies and their respective linkages (skeletal
joints and bones) in three-dimensional space.
[0048] As such, the VCS 104 can distinguish between different
humans in the field of view of the sensor 1, and can monitor or
observe the movements of two or more of such humans in the field of
view. According to some embodiments of the present invention, the
VCS 104 is configured to recognize which of the humans is a patient
and which is a caregiver. For example, VCS 104 may recognize a
human as a patient by observing that the particular human is lying
relatively still on the patient support 4, while another human is
an EMS technician 2 because the other human is standing up or
moving around the back of the ambulance 101. The VCS 104 may be
configured to track or monitor three-dimensional movements of one
or more humans in the emergency response environment by
approximating elements of their basic skeletal structure and, as
such, can determine when two humans are in contact or close
proximity. For example, the VCS 104 can determine when a hand or
arm of the EMS technician 2 reaches over and touches an area of the
patient's 116 body, according to embodiments of the present
invention.
[0049] Any or all of the information received by the VCS 104 from
the sensor array 1, as well as any additional data or information
derived from such sensor information, may be stored to the
encounter record 7. Such information may also be stored to the
encounter record 7 in a manner that correlates it with other data
in the encounter record 7 from other devices, for example records
in the encounter record 7 may include a time index and/or a patient
identification.
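The time-indexed, patient-correlated storage described above may be sketched as follows; the field names and entry structure are illustrative assumptions.

```python
import time

class EncounterRecord:
    """Minimal sketch of an emergency encounter record whose entries
    carry a time index and patient identification so that data from
    multiple devices can later be correlated. Field names are
    hypothetical."""

    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.entries = []

    def add(self, source, data, timestamp=None):
        """Append an entry, stamping it with the time index and
        patient identification."""
        self.entries.append({
            "time": timestamp if timestamp is not None else time.time(),
            "patient_id": self.patient_id,
            "source": source,  # e.g. "sensor_array", "defibrillator"
            "data": data,
        })

    def entries_between(self, t0, t1):
        """Entries whose time index falls in [t0, t1], from any source,
        enabling cross-device correlation."""
        return [e for e in self.entries if t0 <= e["time"] <= t1]
```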
[0050] According to embodiments of the present invention, the VCS
104 is configured to record into the emergency encounter record 7
an occurrence of a condition. Such condition may be based on the
position of the object and/or the movement of the object. For
example, the object may be a human, and the VCS 104 may monitor the
human's movement (or a skeletal approximation thereof) in
three-dimensional space, and make an entry in the encounter record
7 when the human or part of the human intersects a certain location
(e.g. within the ambulance 101), or remains in a particular
location for a certain amount of time, or intersects or nears
another object. The VCS 104 may be configured to make an entry to
the encounter record 7 when one object (e.g. a human) comes within
a certain distance of another object (e.g. another human), for
example a zero or minimal distance at which the first object is
touching the second object. As such, the VCS 104 may be configured
to mark the encounter record 7 when a caregiver 2 or 3 approached
the patient 116 and/or touched the patient 116, or when an object
approached or touched the patient 116.
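One of the conditions described above, an object remaining in a particular location for a certain amount of time, may be sketched as a dwell test over a tracked trajectory. The sample format and the axis-aligned box representation of the location are illustrative assumptions.

```python
def dwell_condition(track, region, min_seconds):
    """Return the first (enter, exit) interval during which the tracked
    point stayed inside `region` for at least `min_seconds`, or None.

    `track` is a list of (timestamp, (x, y, z)) samples; `region` is an
    axis-aligned box ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    lo, hi = region
    inside = lambda p: all(lo[i] <= p[i] <= hi[i] for i in range(3))
    entered = None
    for t, p in track:
        if inside(p):
            if entered is None:
                entered = t  # object just entered the region
            if t - entered >= min_seconds:
                return (entered, t)  # dwell condition satisfied
        else:
            entered = None  # object left; reset the dwell timer
    return None
```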
[0051] The VCS 104 may be configured to update the encounter record
7 in various ways based on the observance of a condition based on
three-dimensional visual and position data. For example, the VCS
104 may be configured to enter into the encounter record 7 a time
at which the condition occurred, and/or an identification of the
condition or type of condition that occurred, and/or other data
coinciding with the occurrence of the condition, for example video
data or color images covering the time or time range when the
condition occurred. In some cases, the VCS 104 receives streaming
clinical data about a patient 116, for example from a defibrillator
or other patient monitoring device 106 communicably coupled to the
patient, and correlates at least a portion of the streaming
clinical data in the emergency encounter record 7 with the
occurrence of the condition based on the sensor's 1 visual data.
According to embodiments of the present invention, correlating some
or all of the streaming clinical data includes flagging some or all
of the streaming clinical data that corresponds to a time of the
occurrence of the condition.
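The flagging of streaming clinical data described above may be sketched as follows; the sample structure and the width of the time window around the condition are illustrative assumptions.

```python
def flag_clinical_data(samples, condition_time, window_s=5.0):
    """Mark each streaming clinical sample that falls within `window_s`
    seconds of the condition occurrence. `samples` is a list of dicts
    carrying a "time" key; the 5-second window is a hypothetical
    default."""
    for s in samples:
        if abs(s["time"] - condition_time) <= window_s:
            s["flag"] = "condition"
    return samples
```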
[0052] FIG. 7 illustrates a flow chart 700 showing the recording of
an occurrence of a condition based on three-dimensional position
and shape visual data, according to embodiments of the present
invention. One or more distinct objects are identified (block 702),
for example by VCS 104 and sensor 1. The position and/or movement
of the one or more objects are tracked or otherwise monitored or
modeled (block 704), and based on such tracking the VCS 104
identifies the occurrence of a condition (block 706). The
occurrence of the condition, or information about the condition, is
recorded in the patient encounter record 7 (block 708).
[0053] FIG. 8 illustrates a flow chart 800 describing a similar
method in greater detail, according to embodiments of the present
invention. An individual human or distinct humans are identified in
an emergency response environment, for example the back of an
ambulance (block 802). At least one of the humans is identified as
a patient (block 804). The position and/or movement of the one or
more humans is observed or tracked or otherwise modeled (block
806), and based thereon the VCS 104 identifies the occurrence of a
condition, for example the occurrence of patient treatment (block
808). Information about the patient contact may be recorded in the
encounter record 7 (block 810), for example by recording a time or
time range at which the condition (e.g. treatment) occurred (block
812), and/or by recording a type of contact (e.g. treatment) which
occurred (block 814).
[0054] For example, if the sensor 1 data supplied to the VCS 104
was interpreted by the VCS 104 as a caregiver's 2 hand going to the
head or mouth area of the patient 116, the VCS 104 may update the
encounter record 7 to reflect that an oral medication was or may
have been administered to the patient 116, and the particular time
which this occurred. Alternatively, or in addition, the VCS 104 may
be configured to prompt the EMS technician 2 or other caregiver at
a later time, for example after the emergency encounter or at the
end of a standard shift, to confirm or validate the perceived
interactions or conditions that were entered into the patient
encounter record 7. For example, the VCS 104 might observe the
occurrence of the EMS technician's 2 hand going to the face of the
patient 116 and flag such occurrence as the possible administration
of an oral medication, but when prompting the EMS technician 2 for
later confirmation, may give the EMS technician 2 the ability to
edit the observation to reflect that the interaction was instead a
turning of the head of the patient, or some other reason for why
the caregiver 2 contacted the patient 116.
[0055] FIG. 9 depicts a flow chart 900 illustrating a method for
monitoring three-dimensional interaction in an emergency response
environment for inventory control, according to embodiments of the
present invention. The VCS 104 may identify a particular location
within the emergency response environment, for example a supply
cabinet 5, using sensor 1 and known information about the
environment (block 902). The VCS 104 may also be configured for
customization regarding the locations of certain items in the
emergency response environment. For example, during an
initialization and/or configuration protocol, the VCS 104 may
prompt the user to run the user's finger or hand around an outer
perimeter of a supply cabinet 5 and/or a door thereto, so that the
VCS 104 can log the three-dimensional position of the supply
cabinet 5. Such cabinet 5 may be, for example, a narcotics cabinet
5 to which access is often controlled for safety and security
reasons.
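The initialization protocol described above, in which the user traces a finger around the cabinet perimeter, may be sketched as fitting an axis-aligned bounding box to the traced points. The 2 cm margin and the box representation are illustrative assumptions.

```python
def bounding_volume(traced_points, margin=0.02):
    """Build an axis-aligned bounding box from the 3-D points logged as
    a user traces a finger around a cabinet perimeter. The margin is an
    illustrative tolerance for tracking noise."""
    xs, ys, zs = zip(*traced_points)
    lo = (min(xs) - margin, min(ys) - margin, min(zs) - margin)
    hi = (max(xs) + margin, max(ys) + margin, max(zs) + margin)
    return lo, hi

def point_in_volume(point, volume):
    """True when a tracked point lies inside the logged volume."""
    lo, hi = volume
    return all(lo[i] <= point[i] <= hi[i] for i in range(3))
```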
[0056] The VCS 104 may identify individual humans in the emergency
response environment, for example the back of an ambulance (block
904), and track the position and/or movement of such humans (block
906). This may be done with visual and depth information received
from the sensor array 1, according to embodiments of the present
invention. Based on such visual and depth information received from
the sensor array 1, the VCS 104 may also detect or track
three-dimensional movement of an object in the emergency response
environment, for example a non-human object. The VCS 104 may
determine an occurrence of contact between the human body and the
object (block 908), for example an occurrence of the human body or
a portion thereof approaching and/or intersecting the narcotics
cabinet 5. The VCS 104 may also record an entry in an emergency
encounter record 7 based on the occurrence of the contact, for
example a note that the cabinet 5 was accessed (block 910) along
with a time (block 912) and/or an identity of the person who
accessed the cabinet 5 (block 914). The VCS 104 may be configured
to observe the occurrence of various different types of conditions
of note. For example, the VCS 104 may be configured to detect an
intersection of a human form with the area of the door or opening
to the cabinet 5. The VCS 104 may be configured to detect that a
shape that correlates to the shape of the narcotic medication 6 has
gone from inside such area of the door or cabinet opening to
outside such area. VCS 104 may also be configured to note whether a
human has an object in the human's hand as well as the shape and/or
size of the object. The VCS 104 may further be configured to update
an inventory database, based on the occurrence of the removal, to
reflect that the narcotic medication has been used and needs
restocking. Similar processes may be used to track the use of other
objects and the inventory associated therewith, as well as to track
in general the intersection of objects with humans and use thereby,
according to embodiments of the present invention. According to
some embodiments of the present invention, the occurrence of an
access event to the particular cabinet 5 may further trigger other
information gathering, for example it may trigger a camera on the
inside of the cabinet 5 and/or another video camera elsewhere in
the vehicle 101. The identity of each crew member accessing the
cabinet 5 may be recorded in the encounter record 7, according to
embodiments of the present invention.
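The cabinet-access bookkeeping described above may be sketched as follows: log who accessed the cabinet and when, decrement the inventory count for the item observed leaving the cabinet, and flag the item for restocking when depleted. All field names are illustrative assumptions.

```python
def record_cabinet_access(record_entries, inventory, person_id,
                          item_removed, timestamp):
    """Log a cabinet access in the encounter record entries and update
    the inventory database for the removed item."""
    record_entries.append({
        "event": "cabinet_access",
        "person": person_id,
        "time": timestamp,
        "item": item_removed,
    })
    if item_removed in inventory:
        inventory[item_removed] -= 1
        if inventory[item_removed] <= 0:
            # Item is depleted; note that restocking is needed.
            record_entries.append({"event": "restock_needed",
                                   "item": item_removed,
                                   "time": timestamp})
    return record_entries, inventory
```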
[0057] FIG. 10 depicts a flow chart 1000 illustrating a method for
gesture recognition in an emergency response environment, according
to embodiments of the present invention. While system 400,
including VCS 104 and sensors 1, may be configured to track
motions, positions, and interactions of humans and objects as
described above, system 400 as well as VCS 104 and sensors 1 may
also or alternatively be configured to monitor such visual
information for the occurrence of gestures. In some cases,
three-dimensional position and visual information may be used to
monitor for gestures; in other cases, mere visual information may
be used to detect gestures (e.g. based on pattern recognition or
other visual cues or patterns). As such, sensor 1 may be one of a
number of various types of sensors or sensor arrays.
[0058] VCS 104 may be configured to track an entire human body
and/or one or more portions thereof to identify gestures being
made, for example gestures being made by one or more hands and/or
fingers or by the head and/or neck (block 1002). VCS 104 receives
visual information about at least a portion of a human body from at
least one sensor 1, and maintains the encounter record 7. The VCS
104 is configured to monitor the visual information to determine
movements of the at least the portion of the human body (for
example the hand or the head), and to recognize an occurrence of a
gesture based on the movements of the at least the portion of the
human body. For example, the VCS 104 recognizes one or more hand or
finger gestures based on visual and/or depth information received
by sensor 1, for example one or more hand or finger gestures listed
in FIG. 5. The VCS 104 may also recognize one or more head or
facial gestures based on visual and/or depth information received
by sensor 1, for example one or more head or facial gestures listed
in FIG. 6.
[0059] Examples of hand or finger gestures may include waving a
hand or finger, making a fist, raising the fist, shaking the fist,
making the "thumbs up" signal, spreading fingers apart, displaying
a count (e.g. zero, one, two, three, four, five, six, seven, eight,
nine, or ten digits extended), pointing, moving hands together,
pulling hands apart, and/or tapping on the wrist. Examples of head
or facial gestures may include nodding the head, bobbing the head,
shaking the head side-to-side as in the "no" gesture, shaking the
head up and down as in the "yes" gesture, blinking, opening or
closing the mouth, sticking the tongue out, raising or lowering
eyebrows, and/or opening or closing the eyes.
[0060] When the VCS 104 recognizes a gesture, the VCS 104 records
an entry in the emergency encounter record 7 based on the
occurrence of the gesture (block 1004). Such a gesture may be
artificial, or alternatively such a gesture may be natural. An
artificial gesture is a gesture made by a human for the primary
purpose of triggering the condition with VCS 104. As such, an
artificial gesture may be a gesture that would not normally be made
in the normal course of treating a patient 116 in an emergency
response environment. For example, making a "thumbs up" signal is
one example of an artificial gesture. The involuntary bobbing of a
patient's head is an example of a natural gesture, or a gesture which
is not performed only to trigger VCS 104.
[0061] The entry which the VCS 104 makes in the patient encounter
record 7 based on the recognition of the gesture may include
information about the type of gesture made (block 1006),
information about the time at which the gesture was made (block
1008), and/or information about other data values at the time the
gesture was made (block 1010), for example information about the
crew (block 1012), patient clinical data (block 1014), and vehicle
operation or safety conditions (block 1016). For example, VCS 104
may be configured to write the patient's 116 current blood pressure
reading to the encounter record 7 whenever VCS 104 receives visual
and/or depth information from the sensor 1 indicating that the
caregiver 2 attending to the patient 116 taps his or her left wrist
with the right hand or fingers (tapping the location where a watch
would normally be worn). Successive gestures may be used to take
the VCS 104 down various pathways and/or treatment protocols, or to
confirm previous gestures or options that become available because
of those gestures. For example, the VCS 104 may be configured to
record a blood pressure reading to the encounter record 7 when it
identifies the wrist tapping gesture followed by a chest tapping
gesture, and may be configured to record an ECG waveform signal to
the encounter record 7 when it identifies the same wrist tapping
gesture followed by a back-of-the-neck tapping gesture. The VCS 104
may also be configured to record in the encounter record 7 the
audiovisual (e.g. video and/or audio) information received during
or within a certain time range of the gesture, according to
embodiments of the present invention.
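The successive-gesture pathways described above may be sketched as a lookup over consecutive gesture pairs, mirroring the wrist-tap examples; the gesture names and mapped actions are illustrative assumptions.

```python
# Hypothetical mapping of two-gesture sequences to charting actions.
GESTURE_SEQUENCES = {
    ("wrist_tap", "chest_tap"): "record_blood_pressure",
    ("wrist_tap", "neck_tap"): "record_ecg_waveform",
}

def interpret_gestures(gesture_stream):
    """Scan a stream of recognized gestures and emit the action mapped
    to each matching consecutive pair."""
    actions = []
    for first, second in zip(gesture_stream, gesture_stream[1:]):
        action = GESTURE_SEQUENCES.get((first, second))
        if action:
            actions.append(action)
    return actions
```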
[0062] According to some embodiments of the present invention, the
VCS is configured to identify simultaneous occurrence of gestures,
for example two or more gestures selected from FIG. 5, FIG. 6, or
any other natural or artificial gestures. According to some
embodiments of the present invention, the VCS 104 is configured to
identify simultaneous occurrence of gestures along with position
and/or movement information for entire human bodies or portions
thereof, or simultaneous occurrence of other factors such as
vehicle position along the ambulance route, patient vital signs,
and/or vehicle speed. VCS 104 may also be configured to identify
simultaneous occurrence of gestures by the same person, for example
a different or similar gesture with each hand, or a hand and a
head. For example, the VCS 104 may be configured to recognize a
hand waving gesture and to make a record in the encounter record 7
and notify the ambulance driver to slow down if the hand waving
gesture is received at a time when the vehicle speed exceeds
60 miles per hour. In this way, the visually recognized gestures
may be paired or correlated or combined with other information
received by VCS 104, either in the creation of the condition which
triggers a further event (such as writing to the encounter record 7
or creating a notification or some other action), or in the
creation of the entry to the encounter record 7 itself (for example
the types of information that would be flagged or gathered or
otherwise noted upon occurrence of the condition).
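The pairing of a recognized gesture with other information, such as vehicle speed, may be sketched as a combined condition; the gesture name, speed threshold, and resulting action strings are illustrative assumptions.

```python
SPEED_LIMIT_MPH = 60  # illustrative threshold from the example above

def evaluate_combined_condition(gesture, vehicle_speed_mph):
    """Combine a recognized gesture with vehicle telemetry: a hand
    wave while the vehicle exceeds the speed threshold triggers both a
    record entry and a driver notification."""
    actions = []
    if gesture == "hand_wave" and vehicle_speed_mph > SPEED_LIMIT_MPH:
        actions.append("record_in_encounter")
        actions.append("notify_driver_slow_down")
    return actions
```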
[0063] According to some embodiments, the VCS 104 identifies
(either in the encounter record 7 or for other devices) whether a
patient is being transported by the vehicle 101, for example by
determining whether a human figure is sitting or lying on the
patient support 4. The VCS 104 may also identify the position of a
patient or a crew member, for example whether the patient or crew
member is sitting or standing. The VCS 104 may also receive from
sensor 1 information about structures beyond a normal emergency
response environment, for example larger-scale depth images of
emergency incidents such as buildings on fire, to aid in the
location of emergency workers and/or victims.
[0064] Although one sensor 1 is shown and described, multiple
sensors 1, either of the same type or of different types, may be
communicably coupled with VCS 104. Multiple sensors 1 may be used
to expand the field or depth of view, or to collect similar
information from a different viewing angle, in order to observe
more objects or humans, or gather more detailed information about
shapes and/or movements. And although sensor 1 is described as
being mounted within a vehicle, sensor 1 or multiples thereof may
alternatively be mounted on a device (for example a defibrillator
taken to an emergency response scene) and/or on a person (for
example on a crew member's helmet).
[0065] Embodiments of the present invention may also be used for
charting and/or counting functions. Often, medics must reconstruct
past events that occurred during patient treatment. Embodiments of
the present invention improve accuracy and help to accurately
document times at which various events occurred. For example, the
VCS 104 may recognize boundaries of multiple cabinets or storage
areas within an ambulance 101, and may log the times at which each
storage area was accessed by a medic, as well as the identity (e.g.
obtained from voice or body or facial recognition) of the medic who
accessed the area. Such a "bounding volume" may be preprogrammed
into VCS 104 and/or customized or initialized upon installation of
VCS 104, sensor 1, and/or a new storage area. The VCS 104 may count
a number of boxes on the floor of the ambulance to determine a
number of items used in the encounter, and reconcile that with the
medications and other durable goods charted for the patient
encounter. The VCS 104 may then prompt the medic for additional
information to help reconcile the encounter record 7.
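The reconciliation step described above may be sketched as a count comparison between items observed on the ambulance floor and items charted for the encounter; the item names and dictionary representation are illustrative assumptions.

```python
def reconcile_counts(observed_counts, charted_counts):
    """Compare observed item counts with charted quantities and return
    the discrepancies the medic should be prompted to resolve. A
    positive value means more items were observed than charted."""
    discrepancies = {}
    for item in set(observed_counts) | set(charted_counts):
        diff = observed_counts.get(item, 0) - charted_counts.get(item, 0)
        if diff != 0:
            discrepancies[item] = diff
    return discrepancies
```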
[0066] As described above, the system 400 may also determine when a
patient is being touched, either by another human or by an
implement held by another human. This information may be used
either during the patient encounter, or afterward, to determine
whether inappropriate patient contact has occurred. The system 400
may determine when an IV is being started. System 400 may also use
gesture-based charting, for example quick-logging with artificial
gestures, to save time over manual entry or typing of such
information. Embodiments of the present invention may also include
voice recognition, which may filter out siren sounds or road
sounds, and which may also provide feedback to the crew.
Embodiments of the present invention may also be configured to
identify crew members, for example through facial recognition,
pattern recognition, name badge reading, skeletal modeling, habits
or movements, or via another mechanism such as crew logins or RFID
badges which are also communicably coupled to VCS 104. According to
some embodiments of the present invention, the system 400 may be
used for security monitoring, to detect the presence of
unidentified or unwanted intruders in the vehicle 101.
[0067] According to some embodiments of the present invention, the
system 400 may be used to begin tracking a person when the person
makes a gesture or performs a certain activity, and then continue
to track the same person after the gesture or activity, for a
certain period of time or until another event occurs, for example
another visual event. In some embodiments, the system 400
identifies an operator of a medical device using visual
information; for example, a patient monitoring device 106, such as
a defibrillator, may include a camera or other type of sensor array
1, and upon use of the device 106 the device 106 may observe visual
characteristics of the person directly in front of the device 106
in order to identify the person or monitor or interpret activities
of that person. The system 400 may also be configured to recognize
or identify in its field of view equipment used by medical
personnel, either by visual cues or otherwise, and may perform
similar medical personnel identification or visual monitoring even
when the camera or sensor array 1 is not in or near the device
being used. Such multiple devices used by medical personnel may be
wirelessly or otherwise communicably coupled with each other and/or
with system 400, so that activities performed on various devices
and by the personnel are correlated for a more complete patient
record without requiring manual annotation, according to
embodiments of the present invention. The system 400 may be mounted
not only in a vehicle, such as the back of an ambulance, but system
400 and/or parts thereof may also be integrated into or mounted on
a medical device, including a portable medical device such as a
defibrillator.
[0068] The system 400 may also be configured to "remember" a person
based on that person's gestures; for example, the system 400 may
observe certain gestures performed by a person one day after the
person identifies himself or herself to the system 400, and may
then visually identify the same person the next day based on
observing similar gestures, even if the person has not specifically
identified himself or herself to the system 400 on the following
occasion. The system 400 may also be configured to count the number
of distinct individual people in a given area, according to
embodiments of the present invention.
[0069] The system 400 may also be configured to monitor certain
activities and to interpret various aspects of those activities,
and even to provide feedback to the performer of the activities
either in real time or in a later review. For example, the system
400 may monitor an EMS technician's twelve-lead placement on a
patient, and/or may provide adaptive feedback, for example adaptive
feedback to a person who is administering cardiopulmonary
resuscitation. The system 400 may also be configured to identify a
certain portion of the body, or an object held by a person, and to
track the movement of the body part or object and record the
tracked motion as writing. For example, an EMS technician could
write numbers, letters, or words in the air using a finger, and the
system 400 may be configured to record such movement as writing.
The EMS technician may initiate such "air writing" recording mode
with a gesture or other activation; in other embodiments, the
system 400 automatically recognizes such "air writing" based on the
absence of other objects with which the user's hand or finger could
be interacting, for example for a certain period of time. Such
recording capabilities may save the EMS technician time in data
entry or patient charting, and would permit the medical
professional to create charting entries and other writings even
when the medical professional's hands are dirty, or when the
medical professional does not wish to physically touch devices so
as to maintain sterility for hands or gloved hands, according to
embodiments of the present invention.
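The "air writing" capture described above may be sketched as segmenting a fingertip trajectory into strokes, starting a new stroke whenever the fingertip is idle; the sample format and the one-second idle gap are illustrative assumptions.

```python
def segment_air_writing(fingertip_samples, idle_gap_s=1.0):
    """Split a fingertip trajectory into writing strokes, beginning a
    new stroke whenever the gap between consecutive samples exceeds
    `idle_gap_s`. Samples are (timestamp, (x, y, z)) tuples."""
    strokes, current = [], []
    last_t = None
    for t, point in fingertip_samples:
        if last_t is not None and t - last_t > idle_gap_s:
            if current:
                strokes.append(current)  # close the finished stroke
            current = []
        current.append(point)
        last_t = t
    if current:
        strokes.append(current)
    return strokes
```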
[0070] Various modifications and additions can be made to the
exemplary embodiments discussed without departing from the scope of
the present invention. For example, while the embodiments described
above refer to particular features, the scope of this invention
also includes embodiments having different combinations of features
and embodiments that do not include all of the described features.
Accordingly, the scope of the present invention is intended to
embrace all such alternatives, modifications, and variations as
fall within the scope of the claims, together with all equivalents
thereof.
* * * * *