U.S. patent application Ser. No. 12/062631, for a distributed patient monitoring system, was published by the patent office on 2008-10-09 as publication number 20080249376. The application is assigned to Siemens Medical Solutions USA, Inc. The invention is credited to John R. Zaleski.

Application Number: 12/062631 (Publication 20080249376)
Family ID: 39827564
Publication Date: 2008-10-09
United States Patent Application 20080249376
Kind Code: A1
Zaleski; John R.
October 9, 2008

Distributed Patient Monitoring System
Abstract
A distributed patient monitoring system for visually monitoring
patients and patient parameters using portable processing devices
in different remote locations includes a monitoring processor. The
monitoring processor is responsive to user initiated commands from
multiple different portable processing devices in different remote
locations and includes an input processor and a data processor. The
input processor acquires vital sign parameters and associated video
data representative of multiple sequences of video images of
corresponding multiple different patients. The data processor
processes the vital sign parameters and associated video data to
provide processed first video data representing an image sequence
including a composite image including a first area showing live
video of a selected first patient and a second area presenting
vital sign parameters of the selected first patient. The data
processor also processes the vital sign parameters and associated
video data to provide processed second video data representing an
image sequence including a composite image including a first area
showing live video of a selected second patient and a second area
presenting vital sign parameters of the selected second patient. A
communication network has bandwidth sufficient to communicate the
processed first video data and second video data to first and
second portable processing devices respectively of the multiple
different portable processing devices in different remote locations
in response to commands received from the first and second portable
processing devices respectively.
Inventors: Zaleski; John R. (Elkton, MD)
Correspondence Address: SIEMENS CORPORATION; INTELLECTUAL PROPERTY DEPARTMENT, 170 WOOD AVENUE SOUTH, ISELIN, NJ 08830, US
Assignee: Siemens Medical Solutions USA, Inc., Malvern, PA
Family ID: 39827564
Appl. No.: 12/062631
Filed: April 4, 2008
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
60/910,674 | Apr 9, 2007 |
60/911,302 | Apr 12, 2007 |
Current U.S. Class: 600/301; 600/300
Current CPC Class: A61B 5/0006 20130101; A61B 5/0013 20130101; A61B 5/7232 20130101; G16H 30/20 20180101; G16H 40/67 20180101; G06Q 40/08 20130101
Class at Publication: 600/301; 600/300
International Class: A61B 5/00 20060101 A61B005/00
Claims
1. A distributed patient monitoring system for visually monitoring
patients and patient parameters using a plurality of portable
processing devices in different remote locations, comprising: a
monitoring processor responsive to user initiated commands from a
plurality of different portable processing devices in different
remote locations and including, an input processor for acquiring
vital sign parameters and associated video and audio data and for
controlling the selected viewing of said data representative of a
plurality of sequences of video images of a corresponding plurality
of different patients; and a data processor for processing said
vital sign parameters and associated video data to provide,
processed first video data representing an image sequence including
a composite image including a first area showing live video of a
selected first patient and a second area presenting vital sign
parameters of said selected first patient and processed second
video data representing an image sequence including a composite
image including a first area showing live video of a selected
second patient and a second area presenting vital sign parameters
of said selected second patient; and a communication network of
bandwidth sufficient to communicate said processed first video data
and second video data to first and second portable processing
devices respectively of said plurality of different portable
processing devices in different remote locations in response to
commands received from said first and second portable processing
devices respectively.
2. A system according to claim 1, wherein said data processor
processes said vital sign parameters and associated video data to
provide processed first and second video data by encoding said
video data with a compression function.
3. A system according to claim 2, wherein said compression function
is compatible with at least one of, (a) MPEG-4, (b) MPEG-2 and (c)
DIVX.
4. A system according to claim 1, wherein said input processor
acquires audio data of said plurality of different patients, said
data processor processes said audio data to provide processed audio
data and said communication network is of bandwidth sufficient to
communicate said processed first video data and second video data
and audio data to said first and second portable processing devices
respectively.
5. A system according to claim 4, wherein said input processor
acquires said audio data from a plurality of different microphones
in patient rooms associated with said plurality of different
patients.
6. A system according to claim 4, wherein said data processor
processes said audio data to provide processed audio data by
encoding said audio data with a compression function.
7. A system according to claim 1, wherein said input processor
acquires said vital sign parameters from patient monitoring devices
attached to said plurality of different patients.
8. A system according to claim 1, wherein said input processor
acquires said associated video data from a plurality of different
cameras in patient rooms associated with said plurality of
different patients.
9. A system according to claim 1 including a rules processor for
applying rules to said vital sign parameters to identify an alert
condition indicating a significant patient clinical condition or
change of clinical condition and wherein said data processor
processes data representing said alert condition for inclusion in
said processed first video data and said composite image includes
an image element indicating said alert condition.
10. A system according to claim 1, wherein said data processor
provides said processed first video data and processed second video
data for concurrent viewing by users of said first and second
portable processing devices respectively.
11. A system according to claim 1, wherein said data processor
provides said processed first video data and processed second video
data for concurrent viewing of the same patient by users of said
first and second portable processing devices respectively.
12. A system according to claim 1, wherein said data processor
enables a user to select and manipulate video image capture and
display.
13. A distributed patient monitoring system for monitoring patients
via visual and audio means and for viewing and monitoring patient
parameters using a plurality of portable processing devices
concurrently in different remote locations over distributed wired
or wireless networks, comprising: a monitoring processor responsive
to user initiated commands from a plurality of different portable
processing devices in different remote locations and including, an
input processor for acquiring vital sign parameters and associated
video data representative of a plurality of sequences of video images
of a corresponding plurality of different patients; and a data
processor for processing said vital sign parameters and associated
video data to provide processed video data representing an image
sequence including a composite image including a first area showing
live video of a selected patient and a second area presenting vital
sign parameters of said selected patient in response to user
selection of an image element associated with said selected patient
in a display image provided by a clinical information application;
and a communication network of bandwidth sufficient to communicate
said processed video data to first and second portable processing
devices respectively of said plurality of different portable
processing devices in different remote locations in response to
commands received from said first and second portable processing
devices respectively.
14. A system according to claim 13, wherein said image element
associated with said selected patient comprises a hyperlink.
15. A system according to claim 13, wherein said image element
associated with said selected patient is presented in a list of
different patients in said display image provided by said clinical
information application.
16. A system for use by a portable processing device operating in a
distributed patient monitoring system for visually monitoring
patients and patient parameters using a plurality of portable
processing devices in different remote locations, comprising: an
authentication processor enabling a user to obtain access
authorization to access patient data in response to entry of
identification data using any portable processing device of said
plurality of portable processing devices in different remote
locations; a user interface, in response to said access
authorization, enabling a user to, initiate execution of a clinical
information application providing a user with a clinical
application display image identifying a plurality of different
patients in a corresponding plurality of different locations and
select a particular patient of said plurality of different patients
in said clinical application display image; and a display processor
for initiating generation of data representing an image sequence
for presentation in a composite image including a first area
showing live video of a selected particular patient and a second
area presenting vital sign parameters of said particular patient in
response to user selection of an image element associated with said
particular patient of said plurality of different patients in said
clinical application display image.
17. A system according to claim 16, including a rules processor for
applying rules to said vital sign parameters to identify an alert
condition indicating a significant patient clinical condition or
change of clinical condition and wherein said composite image
includes an image element indicating said alert condition.
18. A system for use by a plurality of portable processing devices
in different remote locations operating in a distributed patient
monitoring system for visually monitoring patients and patient
parameters, comprising: an authentication processor enabling a user
to obtain access authorization to access patient data in response
to entry of identification data using any portable processing
device of said plurality of portable processing devices in
different remote locations; a first portable processing device of
said plurality of portable processing devices, having a user
interface enabling a user, in response to said access
authorization, to, initiate execution of a clinical information
application providing a user with a clinical application display
image identifying a plurality of different patients in a
corresponding plurality of different locations, select a first
patient of said plurality of different patients in said clinical
application display image and display an image sequence including a
composite image comprising a first area showing live video of said
first patient and a second area presenting vital sign parameters of
said first patient; and a second portable processing device of said
plurality of portable processing devices, having a user interface
enabling a user, concurrently with operation of said first portable
processing device and in response to said access authorization, to,
initiate execution of a clinical information application providing
a user with a clinical application display image identifying a
plurality of different patients in a corresponding plurality of
different locations, select a second patient of said plurality of
different patients in said clinical application display image and
display an image sequence including a composite image comprising a
first area showing live video of said second patient and a second
area presenting vital sign parameters of said second patient.
19. A distributed patient monitoring system for visually monitoring
patients and patient parameters using a plurality of portable
processing devices in different remote locations, comprising: a
monitoring processor responsive to user initiated commands from a
plurality of different portable processing devices in different
remote locations and including, an input processor for acquiring
vital sign parameters and associated video data representative of
a plurality of sequences of video images of a corresponding plurality
of different patients; and a data processor for processing said
vital sign parameters and associated video data to provide,
processed first video data representing an image sequence including
a first image including an area displaying user selectable image
elements individually associated with corresponding individual
patients of said plurality of different patients; and processed
second video data representing an image sequence including a
composite image including a first area showing live video of a
selected first patient and a second area presenting vital sign
parameters of said selected first patient in response to user
selection of an image element associated with said first patient in
said first image; and a communication network of bandwidth
sufficient to communicate said processed first video data and
second video data to first and second portable processing devices
respectively of said plurality of different portable processing
devices in different remote locations in response to commands
received from said first and second portable processing devices
respectively.
Description
[0001] This is a non-provisional application of provisional
application Ser. No. 60/910,674 filed Apr. 9, 2007 and of
provisional application Ser. No. 60/911,302 filed Apr. 12, 2007, by
J. R. Zaleski.
FIELD OF THE INVENTION
[0002] This invention concerns a distributed patient monitoring
system for visually monitoring patients and patient parameters
using a plurality of portable processing devices in different
remote locations.
BACKGROUND OF THE INVENTION
[0003] Monitoring of patients, particularly patients in critical
care, is a burdensome and labor-intensive task. This problem has
been addressed by use of a centralized monitoring facility enabling
a physician at the workstation of the centralized monitoring
facility to monitor patient vital signs and video and audio. One
known remote centralized patient monitoring system, described in
U.S. Pat. No. 6,804,656, provides fixed location, static
centralized monitoring of ICUs by a physician. The centralized
monitoring employs a single command center, and a workstation
provides a single display area operated by clinical personnel.
However, it is fixed in location, inflexible in performance and
architecture, and fails to accommodate high-bandwidth communication
of patient-related data. A system according to invention principles
addresses these deficiencies and related problems.
SUMMARY OF THE INVENTION
[0004] A distributed patient monitoring system enables visual
monitoring of patients and patient parameters using live motion
video and audio data presented on multiple portable processing
devices in different remote locations in response to user selection
of a specific patient related item in an image showing specific
patient electronic medical record data or a patient census list,
for example. A distributed patient monitoring system for visually
monitoring patients and patient parameters using portable
processing devices in different remote locations includes a
monitoring processor. The monitoring processor is responsive to
user initiated commands from multiple different portable processing
devices in different remote locations and includes an input
processor and a data processor. The input processor acquires vital
sign parameters and associated video data representative of
multiple sequences of video images of corresponding multiple
different patients. The data processor processes the vital sign
parameters and associated video data to provide processed first
video and audio data representing an image sequence and providing
two-way audio communication including a composite image of a first
area showing live video of a selected first patient and a second
area presenting vital sign parameters of the selected first patient
together with ancillary clinical data (e.g., laboratory results,
physician notes). The data processor also processes the vital sign
parameters and associated video data to provide processed second
video data representing an image sequence including a composite
image including a first area showing live video of a selected
second patient and a second area presenting vital sign parameters
of the selected second patient. A communication network has
bandwidth sufficient to communicate the processed first video data
and second video data to first and second portable processing
devices respectively of the multiple different portable processing
devices in different remote locations in response to commands
received from the first and second portable processing devices
respectively.
BRIEF DESCRIPTION OF THE DRAWING
[0005] FIG. 1 shows a distributed patient monitoring system for
visually monitoring patients and patient parameters using a
plurality of portable processing devices in different remote
locations, according to invention principles.
[0006] FIG. 2 shows an architecture of a distributed patient
monitoring system, according to invention principles.
[0007] FIG. 3 shows a Web-browser compatible user interface display
image window enabling a user to initiate visual and audio
monitoring of one or more patients and associated sets of patient
parameters, according to invention principles.
[0008] FIG. 4 shows a networked linkage of components of the
distributed patient monitoring system present in patient rooms
including a mobile unit which can be wheeled into patient rooms as
needed to support remote viewing of a patient anywhere within a
healthcare enterprise, according to invention principles.
[0009] FIG. 5 illustrates a pan-tilt-zoom camera control system
used for controlling a camera in a patient room, according to
invention principles.
[0010] FIG. 6 shows a Web-browser based user interface display
image window enabling a user to initiate remote visual and audio
monitoring of a particular patient, according to invention
principles.
[0011] FIG. 7 illustrates structured association of camera, room
and channel identifier data, for example, for use in mapping a
selected patient to a corresponding camera, according to invention
principles.
[0012] FIGS. 8A and 8B illustrate data flow in a distributed
patient monitoring system, according to invention principles.
[0013] FIG. 9 shows a flowchart of a process performed by a
distributed patient monitoring system, according to invention
principles.
DETAILED DESCRIPTION OF THE INVENTION
[0014] A distributed patient monitoring system enables a user to
visually monitor patients and patient parameters using live motion
video and audio data presented on multiple portable processing
devices in different remote locations comprising distributed
personal command centers. A mobile or stationary clinician in a
healthcare enterprise monitors live motion video and audio data of
a patient within a hospital room presented using a Web browser on a
wireless tablet personal computer, Palm Pilot, or other portable
device. Execution of an individual command center application is
initiated from within a patient specific display image view
presenting specific patient electronic medical record data, in
response to user selection of a specific patient related item or an
item in a patient census list, for example. Patient identifier
information is employed in acquiring video and audio data of a
particular patient using association of the patient identifier with
patient medical information and specific room and bed identifiers.
The system in one embodiment advantageously employs a mobile
hardware unit that enables viewing of any patient in the healthcare
enterprise. Mobile units are located within an enterprise and use
radio frequency identification. The radio frequency identification
tags placed on a mobile unit transmit location representative data
to a centralized processor which associates the particular
enterprise location with a patient location determined from the
health information system. This enables a mapping of a location of
the mobile unit of the viewing hardware to a particular patient
clinical record, thereby enabling a user to view video and hear
audio directly from a patient bedside when selected via the health
information system.
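The location-to-record mapping described above can be sketched minimally as follows. The unit identifiers, table contents, and function name are hypothetical illustrations, not part of the patent disclosure.

```python
# Hypothetical sketch of mapping a mobile unit's RFID-reported location
# to a patient clinical record, as described above. All identifiers and
# table contents are illustrative assumptions.

# RFID tags on mobile units transmit location representative data.
unit_locations = {"unit-07": "ICU-3/bed-2"}

# The health information system associates patient locations with records.
his_census = {"ICU-3/bed-2": {"patient_id": "P1234", "name": "Doe, J."}}

def patient_record_for_unit(unit_id):
    """Resolve the patient record viewable through a given mobile unit."""
    location = unit_locations.get(unit_id)
    if location is None:
        return None  # unit location unknown to the centralized processor
    return his_census.get(location)

record = patient_record_for_unit("unit-07")
```

A lookup failure (unknown unit or unoccupied bed) simply yields no record, leaving the viewing request unresolved rather than misdirected.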
[0015] Multiple clinicians at multiple locations are able to
concurrently view patient information as well as communicate
verbally via audio linkage with occupants of the patient room.
Similarly, multiple viewing clinicians can, in turn and based on an
on-line collaborative mechanism, alternately perform remote
pan-tilt-zoom operation of the camera in the patient room. Patient
parameters including vital sign data (heart rate, blood pressure,
blood oxygen saturation, etc., including data normally taken and
displayed from within patient flow sheets) are also visible in an
electronic medical record display image view. A health information
system application analyzes individual patient vital sign data by
comparing discrete values (values that are validated by a nurse for
inclusion in a patient record) with predetermined thresholds.
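The threshold comparison of nurse-validated discrete values described above might look like the following sketch; the parameter names and numeric limits are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: comparing nurse-validated discrete vital sign
# values with predetermined thresholds. Names and limits are assumed.

THRESHOLDS = {
    "heart_rate": (40, 120),   # beats per minute
    "systolic_bp": (90, 180),  # mmHg
    "spo2": (92, 100),         # percent oxygen saturation
}

def check_vitals(discrete_values):
    """Return (parameter, value) pairs outside their predetermined range."""
    alerts = []
    for name, value in discrete_values.items():
        low, high = THRESHOLDS[name]
        if not (low <= value <= high):
            alerts.append((name, value))
    return alerts

alerts = check_vitals({"heart_rate": 135, "systolic_bp": 110, "spo2": 95})
```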
[0016] The system enables one or more clinicians to view live
motion video and to transmit and receive audio via a Web-enabled
plug-in software component that allows the user to view a specific
patient as part of the normal patient care management process.
Video and audio data are acquired from cameras located within
patient rooms and are associated with specific patients via virtual
electronic linkages that permit associating patient demographic
information (including standard patient identifiers) with a patient
location. This information is used to launch a patient specific
data image display view via a Web-based electronic health record as
a child process that acquires patient specific information and
searches for this information to display within a Web-browser on
either a wired or wirelessly communicating computing device. Video
and audio representative data derived at the point of care is also
captured using a wireless mobile embodiment, thereby enabling
viewing of any patient at any location within a healthcare
enterprise. The patient information is viewed using a Web-based
computer application that allows one or more distributed users to
view a patient at any time from substantially any location within a
healthcare organization. Multiple clinicians can view multiple
patients concurrently or individually. In addition, patient
parameter information is displayed and is visible to multiple
clinicians concurrently. Discrete patient parameter information
(e.g., vital signs), validated by a nurse for inclusion within a
patient health record, is processed using a rules engine
to assess whether the parameter values fall within normal ranges or
meet certain thresholds.
[0017] The system provides a visual and audio link with patients
through a Web-enabled browser and displays this information through
a context-based link that enables clinicians to view specific
patients without requiring them to select the patients from a
census list, thereby facilitating the rapid review of patients and
their parameters within their care. Web-based accessibility from
within the patient record allows for remote viewing and
collaboration among healthcare professionals virtually anywhere
within a healthcare enterprise, advantageously obviating the need
for a clinician to return to, or contact, a centrally located
command center.
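A context-based link of the kind described, launching a patient-specific view directly from the record rather than from a census list, might be constructed as below; the base URL and parameter names are hypothetical illustrations.

```python
from urllib.parse import urlencode

# Hypothetical construction of a patient-specific context link that
# launches the monitoring view from within the electronic record.
# The base URL and parameter names are illustrative, not the patent's.
def build_patient_view_link(base_url, patient_id, room_id):
    """Embed patient context in a link so no census-list selection is needed."""
    params = urlencode({"patient": patient_id, "room": room_id})
    return f"{base_url}?{params}"

link = build_patient_view_link("https://monitor.example/view", "P1234", "ICU-3")
```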
[0018] A processor, as used herein, operates under the control of
an executable application to (a) receive information from an input
information device, (b) process the information by manipulating,
analyzing, modifying, converting and/or transmitting the
information, and/or (c) route the information to an output
information device. In specific embodiments a processor determines
location of a mobile video and audio unit at a patient bedside and
provides the capability for multiple viewing healthcare
professionals to concurrently view and communicate verbally with a
patient or healthcare providers present at the patient bedside. A
processor may use, or comprise the capabilities of, a controller or
microprocessor, for example. The processor may operate with a
display processor or generator. A display processor or generator is
a known element for generating signals representing display images
or portions thereof. A processor and a display processor may
comprise a combination of hardware, firmware, and/or software.
[0019] An executable application, as used herein, comprises code or
machine readable instructions for conditioning the processor to
implement predetermined functions, such as those of an operating
system, a context data acquisition system or other information
processing system, for example, in response to user command or
input. An executable procedure is a segment of code or machine
readable instruction, sub-routine, or other distinct section of
code or portion of an executable application for performing one or
more particular processes. These processes may include receiving
input data and/or parameters, performing operations on received
input data and/or performing functions in response to received
input parameters, and providing resulting output data and/or
parameters. A user interface (UI), as used herein, comprises one or
more display images, generated by a display processor and enabling
user interaction with a processor or other device and associated
data acquisition and processing functions.
[0020] The UI also includes an executable procedure or executable
application. The executable procedure or executable application
conditions the display processor to generate signals representing
the UI display images. These signals are supplied to a display
device which displays the image for viewing by the user. The
executable procedure or executable application further receives
signals from user input devices, such as a keyboard, mouse, light
pen, touch screen or any other means allowing a user to provide
data to a processor. The processor, under control of an executable
procedure or executable application, manipulates the UI display
images in response to signals received from the input devices. In
this way, the user interacts with the display image using the input
devices, enabling user interaction with the processor or other
device. The functions and process steps (e.g., of FIG. 9) herein
may be performed automatically or wholly or partially in response
to user command. An activity (including a step) performed
automatically is performed in response to executable instruction or
device operation without user direct initiation of the activity.
Workflow comprises a sequence of tasks performed by a device or
worker or both. An object or data object comprises a grouping of
data, executable instructions or a combination of both or an
executable procedure.
[0021] A workflow processor, as used herein, processes data to
determine tasks to add to a task list, remove from a task list, or
modify tasks incorporated on, or for incorporation on, a task
list. A task list is a list of tasks for performance by a worker or
device or a combination of both. A workflow processor may or may
not employ a workflow engine. A workflow engine, as used herein, is
a processor executing in response to predetermined process
definitions that implement processes responsive to events and event
associated data. The workflow engine implements processes in
sequence and/or concurrently, responsive to event associated data
to determine tasks for performance by a device and/or worker and
for updating task lists of a device and a worker to include
determined tasks. A process definition is definable by a user and
comprises a sequence of process steps including one or more of
start, wait, decision and task allocation steps for performance by
a device and/or worker, for example. An event is an occurrence
affecting operation of a process implemented using a process
definition. The workflow engine includes a process definition
function that allows users to define a process that is to be
followed and includes an Event Monitor, which captures events
occurring in a Healthcare Information System. A processor in the
workflow engine tracks which processes are running, for which
patients, and what step needs to be executed next, according to a
process definition and includes a procedure for notifying
clinicians of a task to be performed, through their worklists (task
lists) and a procedure for allocating and assigning tasks to
specific users or specific teams.
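The workflow-engine behavior in this paragraph, responding to an event by determining tasks from a process definition and updating worker task lists, can be sketched as follows; the event names and task assignments are assumptions for illustration.

```python
# Minimal sketch of a workflow engine that, per a user-defined process
# definition, responds to a captured event by allocating determined
# tasks to worker task lists. Event and task names are illustrative.

process_definition = {
    "alert_raised": [("nurse", "assess patient"),
                     ("physician", "review vitals")],
}

task_lists = {"nurse": [], "physician": []}

def handle_event(event):
    """Determine tasks for the event and update each worker's task list."""
    for worker, task in process_definition.get(event, []):
        task_lists[worker].append(task)

handle_event("alert_raised")
```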
[0022] FIG. 1 shows distributed patient monitoring system 10 for
visually monitoring patients and patient parameters in patient
rooms 41 using multiple portable processing devices 12 and 14 in
different remote locations. System 10 includes portable devices
(e.g., notebooks, Personal Digital Assistants, cell phones) 12 and
14, at least one repository 17, Clinical Information System
Application (CIS) 51 and server 20 as well as patient rooms 41
(including cameras and patient monitoring devices)
inter-communicating via network 21. Portable devices 12 and 14
individually include memory 28 and user interface 26. User
interface 26 provides data representing display images for
presentation on portable devices 12 and 14.
[0023] Server 20 includes monitoring processor 15, data processor
29, input processor 27, authentication processor 39, workflow
processor 34 and rules processor 19. Monitoring processor 15 is
responsive to user initiated commands from multiple different
portable processing devices 12 and 14 in different remote locations
and includes input processor 27 and data processor 29. Workflow
processor 34 initiates, tracks, and monitors task sequences performed
by personnel and systems in response to events. Input processor 27
acquires vital sign parameters and associated video data
representative of multiple sequences of video images of
corresponding multiple different patients. Data processor 29
processes vital sign parameters and associated video data to
provide processed first video data representing an image sequence
including a composite image including a first area showing live
video of a selected first patient and a second area presenting
vital sign parameters of the selected first patient. Data processor
29 similarly provides processed second video data representing an
image sequence including a composite image including a first area
showing live video of a selected second patient and a second area
presenting vital sign parameters of the selected second patient.
Data processor 29 processes the vital sign parameters and
associated video data to provide processed first and second video
data by encoding the video data with a compression function
compatible with MPEG-4, MPEG-2, or DIVX, for example.
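One way such MPEG-compatible encoding might be invoked in practice is through an external encoder such as FFmpeg; the command construction below is a sketch under that assumption, not the patent's implementation, and the file names are hypothetical.

```python
# Illustrative sketch: building an FFmpeg command to encode captured
# room video with an MPEG-4 compatible compression function before
# network transport. Tool choice and options are assumptions.
def build_encode_command(input_path, output_path,
                         codec="mpeg4", bitrate="800k"):
    return [
        "ffmpeg", "-i", input_path,  # source video stream
        "-c:v", codec,               # compression function, e.g. mpeg4
        "-b:v", bitrate,             # target bitrate for network bandwidth
        output_path,
    ]

cmd = build_encode_command("room41.raw", "room41.m4v")
```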
[0024] Input processor 27 acquires audio data of multiple different
patients, data processor 29 processes the audio data to provide
processed audio data by encoding the audio data with a compression
function and communication network 21 is of bandwidth sufficient to
communicate the processed first video data and second video data
and audio data to first and second portable processing devices 12
and 14 respectively. Network 21 has sufficient bandwidth to convey
video and audio data between rooms and other devices of the
network. Input processor 27 acquires the audio data from multiple
different microphones in patient rooms associated with multiple
different patients and similarly acquires the associated video data
from multiple different cameras in patient rooms associated with
the multiple different patients. Further, input processor 27
acquires the vital sign parameters from patient monitoring devices
attached to the multiple different patients.
[0025] Communication network 21 has sufficient bandwidth to
communicate the processed first video data and second video data to
first and second portable processing devices 12 and 14 respectively
of the multiple different portable processing devices in different
remote locations in response to commands received from the first
and second portable processing devices 12 and 14 respectively.
Rules processor 19 applies rules to the vital sign parameters to
identify an alert condition indicating a significant patient
clinical condition or change of clinical condition. Data processor
29 processes data representing the alert condition for inclusion in
the processed first video data and the composite image includes an
image element indicating the alert condition. Further,
authentication processor 39 enables a user to obtain access
authorization to access patient data in response to entry of
identification data using any portable processing device of the
multiple portable processing devices in different remote
locations.
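The rule application performed by rules processor 19 can be sketched as a threshold check over the acquired vital sign parameters. The parameter names and acceptable ranges below are assumptions for illustration only, not values from the application:

```python
# Illustrative threshold rules over vital sign parameters; the specific
# parameter names and acceptable ranges are assumptions for this sketch.
RULES = {
    "heart_rate": (50, 120),   # beats per minute
    "spo2":       (92, 100),   # percent oxygen saturation
    "resp_rate":  (8, 25),     # breaths per minute
}

def check_alerts(vitals, rules=RULES):
    """Return the names of parameters whose values fall outside range."""
    alerts = []
    for name, value in vitals.items():
        low, high = rules.get(name, (float("-inf"), float("inf")))
        if not low <= value <= high:
            alerts.append(name)
    return alerts

print(check_alerts({"heart_rate": 135, "spo2": 97, "resp_rate": 14}))
# -> ['heart_rate']
```

An alert identified this way would then be passed to the data processor for inclusion as an image element in the composite image.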
[0026] System 10 supports distributed patient monitoring, without
centralization, so a clinician may view the patients themselves via
video and their vital signs and listen to an individual or
multiple selected patients from anywhere within an enterprise.
Multiple clinicians may view multiple patients or single patients
concurrently through wireless or handheld portable devices 12 and
14. Rules processor 19 analyzes validated discrete patient
parameters by comparing patient vital sign parameters with
predetermined thresholds. User interface processor 26 employs a Web
browser application supporting viewing video transmitted through
existing hospital network 21 as MPEG-4 (for example) compatible
compressed images. Patient rooms 41 incorporate equipment including
cameras connected via network 21 to portable devices 12 and 14 and
one or more servers (e.g., server 20). Portable devices 12 and 14
incorporate a virtual camera controller Web compatible application
allowing a clinician to control pointing, zoom, focus, and an iris
of patient room cameras in pan, tilt and zoom operations via a Web
browser in wireless portable devices 12 and 14.
[0027] FIG. 2 shows one embodiment of an architecture of a
distributed patient monitoring system. A monitoring application
within application server 76 enables the live viewing of patient
data using video and audio hardware incorporated in portable device
12. Application server 76 links to Web-server 80 comprising an
enterprise health information system managing access to electronic
medical records. Web server 80 provides patient-specific
information to the monitoring application once launched, enabling
the viewing of specific patients from within a browser-based video
viewer. Cameras and microphones are located within each patient
room via a mobile or fixed embodiment and these rooms 41 are
associated with specific patients via mapping information in a
database. This association is made at the time of in-patient
registration. Rules information engine server 82 enables analyzing
validated discrete patient vital sign parameters (including
monitored patient parameters) from each patient and displaying this
information as notifications (subject to thresholds) to a clinician
through the browser-based health information system located on the
health information server 80. Portable device 12 communicates with
servers 76, 80 and 82 via wireless access point 73 and network 21
and incorporates Web browser compatible user interface 26 for
reproducing acquired video and audio data and patient parameters.
In other embodiments servers 76, 80 and 82 may comprise a single
server (e.g., server 20), or alternatively may be located in
portable device 12 or in servers (or other processing devices) in
patient rooms.
[0028] FIG. 3 shows Web-browser compatible user interface display
image 303 presented by user interface 26 on portable device 12 and
enabling a user to initiate visual monitoring of one or more
patients and associated sets of patient parameters. Web-based image
303 is initiated, as a child application, from a patient health
record related image provided by a Web compatible electronic medical
record application. Patient specific registration and
demographic information is passed from the parent medical record
application to the child application enabling the child application
to determine the location of the patient and acquire patient
specific information and video and audio data for reproduction via
a Web browser application to a clinician. Display image 303
displays patient identification and room location information.
Compressed (e.g., MPEG-4 compatible) image sequence data is
transmitted via Ethernet, for example, to a Web compatible
application employed by user interface 26 of portable device 12 where the
data is decoded and displayed within image 303. A virtual camera
controller 314 enables real-time control of the viewed image by
pan, tilt, zoom, focus and iris control of a camera in a patient room
using a thin client (e.g., ActiveX compatible) application. Image
control may also be initiated by dragging a cursor across the live
image screen in pan and tilt directions. Image 303 presents an iris
open and close feature, providing the capability to lighten or
darken the image when in-room lighting is on or off
(especially useful at night). A user is also able to perform live
image zooming in and out using zoom feature 305. A user information
panel 306 provides the user with icons (a) indicating whether
patient privacy is selected, whereby in-room video and audio are
disabled per clinician or patient request, and (b) enabling patient
video activation so a patient may be remotely viewed. Panel 306
also includes a patient audio icon enabling a user to select
particular audio devices on a portable device 12 (FIG. 1) (i.e.,
internal or external microphone and speakers) and a pan-tilt
control icon indicating whether a user currently controls camera
movement. Multiple viewing clinicians may alternately control pan,
tilt and zoom functionality by requesting control of the camera
using a move camera icon button. Talk button 309 provides the
capability by which a viewing clinician can communicate with
in-room clinical personnel or with a patient. Patient room, name,
and medical record identifiers 311 are provided within the
Web-based image 303. User list 315 provides the local user with the
identity of other users viewing the patient simultaneously as well
as their current functionality (e.g., audio, PTZ control).
[0029] FIG. 4 shows a networked linkage of components of
distributed patient monitoring system 10 of FIG. 1 that are present
in patient rooms 41. The embodiment of FIG. 4 includes video and
audio components (comprising camera and microphone) contained
within specific rooms 403, 405 and 407. The cameras (and
microphones) have individual camera identifiers that are associated
in a database with corresponding patient room and bed identifiers
and with specific patients. Camera and microphone data is processed
by codecs (coders and decoders) in units 420, 422 and 424, which
also include pan, tilt, zoom camera controllers. MPEG-4 encoder
units 430, 432 and 434 encode camera video for transmission via a
relatively high bandwidth (such as, optical fiber, wireless or
other) network capable of at least 100 Mbits per second per patient
in one embodiment, for example. Other embodiments may employ
networks of lower bandwidth. MPEG-4 encoder units 430, 432 and 434
translate video data into compressed image representative data that
is able to be transmitted over existing Ethernet (via TCP/IP, 1
Mbit per second per patient) networks, for example. Mobile units
442, 444 and 446 comprise mobile video and audio units housing the
camera and audio components in a self-contained housing portable
(e.g., wheeled or carried) between patient rooms within a
healthcare enterprise.
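A quick capacity check, assuming the per-patient figures cited above (roughly 100 Mbit/s for raw video and 1 Mbit/s for MPEG-4-compressed video), shows why the encoding step matters on an existing Ethernet link:

```python
# Back-of-the-envelope stream capacity on a 100 Mbit/s link, using the
# per-patient rates cited above (raw ~100 Mbit/s, compressed ~1 Mbit/s).
# The shared-link figure is an assumption for this sketch.
LINK_MBPS = 100
RAW_MBPS_PER_PATIENT = 100
COMPRESSED_MBPS_PER_PATIENT = 1

raw_streams = LINK_MBPS // RAW_MBPS_PER_PATIENT                # 1 patient
compressed_streams = LINK_MBPS // COMPRESSED_MBPS_PER_PATIENT  # 100 patients
```

Under these assumptions, compression raises the number of concurrent patient streams on one such link from 1 to roughly 100, before accounting for audio and protocol overhead.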
[0030] Data is transmitted via network switch 436 from encoder
units 430, 432 and 434 to an application server (e.g., application
server 76 FIG. 2, server 20 FIG. 1) where it is captured and
processed by a monitoring application. Data is downloaded into a
Web browser within user interface 26 of clinician portable device
12 from a monitoring application in application server 76, in
response to data in repository 17 associating a camera and
microphone identifier with a room and with a particular patient.
System 10 advantageously provides distributed processing over the
Web (e.g., part of network 21 FIG. 1) as well as processing of
discrete patient parameters using a rules engine (rules processor
19), direct notification of patient events managed by workflow
processor 34 and data compression using an MPEG-4 encoder and
decoder. A user authenticates into a portable device on network 21
and initiates execution of a clinical application provided by CIS
51. In response to a patient being selected in a display image
associated with the clinical application, a video image of the
patient (including patient vital sign parameters) is presented via
a web browser in a composite image as a pop-up window.
[0031] FIG. 5 illustrates a camera control system used for
controlling a camera in a patient room. A user is able to control a
camera in a patient room using a virtual pan-tilt controller in Web
page display image 503 provided by a viewer executable procedure of
a monitoring application. The controller enables a clinician to
move the camera in pan and tilt using either user interface control
icons 504 or on-image control 505 using a mouse. An x-y coordinate
grid 507 presented in image 503 enables a clinician to adjust
camera viewing in pan and tilt. A user moves a cursor across the
grid and the cursor snaps back to central (neutral) position when a
user releases the mouse, for example.
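The grid interaction can be sketched as a translation of cursor position into pan and tilt angles, with snap-back to the neutral centre on release. The coordinate convention and angle limits below are assumptions, not specified in the application:

```python
# Sketch of translating an x-y cursor position on the control grid into
# pan and tilt angles, with snap-back to the neutral (central) position
# on mouse release. Coordinate range and angle limits are assumptions.
NEUTRAL = (0.0, 0.0)

def cursor_to_pan_tilt(x, y, max_pan_deg=90.0, max_tilt_deg=45.0):
    """Map grid coordinates in [-1, 1] to pan/tilt angles in degrees."""
    x = max(-1.0, min(1.0, x))   # clamp to the grid
    y = max(-1.0, min(1.0, y))
    return (x * max_pan_deg, y * max_tilt_deg)

def on_mouse_release():
    """The cursor snaps back to the neutral position when released."""
    return NEUTRAL
```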
[0032] System 10 displays live motion video and audio information
of patients within a health care enterprise in a web-browser based
window such that viewing of patient information may occur
concurrently on multiple (distributed) wirelessly communicating
computers, computing tablets or other stationary or mobile
processing devices. The system enables a user to virtually control
the viewing field of cameras located within patient rooms via a
Web-browser based application that is downloaded from a remote
application server and to adjust video views of multiple patients
using the Web-browser based application. The system decodes and
displays compressed video information of patients acquired from raw
camera video feeds via a mobile computing platform. Rules processor
19 processes and analyzes patient parameters and laboratory test
results and compares parameters with pre-determined thresholds and
notifies users as to whether values collected and validated by
clinical staff (e.g., nursing) fall within or outside of acceptable
ranges.
[0033] FIG. 6 shows Web-browser based user interface display image
603 enabling a user to initiate remote visual and audio monitoring
of a particular patient. Specifically, a user selects item 607 in a
menu accessed within clinical application related image 603
specific to patient 605 (Blair Stuart). In response to this
selection, execution of a remote visual and audio monitoring
application is initiated for monitoring patient 605.
[0034] FIG. 7 illustrates structured association of camera, room
and channel identifier data in a data table, for example, for use
in mapping a selected patient to a corresponding camera.
Specifically, row 705 illustrates association of a camera (camera
identifier) with a video channel identifier (via which video data
is conveyed) and with a patient room and location identifier.
Monitoring processor 15 (FIG. 1) retrieves a camera identifier from
the data table associating the camera and room using an active
server page, for example. The camera identifier (e.g., cam1) is
associated with a video matrix switch input channel (e.g., channel
1) and processor 15 provides input channel identifier data to the
matrix switch (switch 436 FIG. 4). In response to the input channel
data, matrix switch 436 directs the associated camera video from
the corresponding camera in corresponding room location (location
050101) received on input channel 1 to portable device 12 for
display.
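The FIG. 7 association can be sketched as a small lookup table keyed the same way as row 705. Only the cam1 / channel 1 / location 050101 row reflects the example above; the second row is an illustrative assumption:

```python
# Sketch of the FIG. 7 association table: camera identifier, matrix
# switch input channel, and room/location identifier. The first row
# mirrors the cam1 / channel 1 / location 050101 example; the second
# row is an illustrative assumption.
CAMERA_TABLE = [
    {"camera": "cam1", "channel": 1, "location": "050101"},
    {"camera": "cam2", "channel": 2, "location": "050102"},
]

def channel_for_location(location, table=CAMERA_TABLE):
    """Return the switch input channel serving a given room location."""
    for row in table:
        if row["location"] == location:
            return row["channel"]
    raise KeyError(location)
```

The returned channel identifier is what the monitoring processor would supply to the matrix switch to direct the camera video to the requesting portable device.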
[0035] FIG. 8A illustrates data flow associated with locating a
mobile audio and video hardware unit (e.g., units 442, 444 and 446
FIG. 4). Data table 803 in repository 17 (FIG. 1) associates a
patient with a medical record number, room identifier and bed
identifier. In response to a user selecting a patient for viewing,
a camera selection manager in monitoring processor 15 uses data
table 803 to determine a room identifier based on a received
patient identifier and uses data table 805 associating room
identifier with bed and camera identifier to determine an
associated camera identifier. Processor 15 uses data table 807
associating camera identifier, input channel and output channel
identifier to determine input and output channel identifier (in
switch 436 FIG. 4) associated with the determined camera
identifier. Processor 15 further uses display communication manager
811 to select switch 436 input and output channels and display type
813 and uses associated Ethernet and serial communication protocols
for presenting patient video data on a device display type 816.
FIG. 8B illustrates mapping of a mobile unit (e.g. 442 FIG. 4) tag
to a room id. The tag is attached to the mobile unit and a tag to
room mapping table is updated based on updated RFID position of a
mobile unit and network connection under direction of a healthcare
information (and workflow) system. As the mobile embodiment is
located in the proximity of a patient, radio frequency transmission
to a central processor via a network associates that location with
a patient within the same location. This linkage enables display of
video data within the Web-based interface.
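The FIG. 8B tag-to-room update can be sketched as two small tables: one maintained from RFID position reports and one from in-patient registration. The identifiers below are illustrative assumptions:

```python
# Sketch of the FIG. 8B tag-to-room mapping: RFID position reports
# update the mobile unit's current room, which links the unit to the
# patient registered in that room. Identifiers are assumptions.
tag_to_room = {}
room_to_patient = {"050101": "Blair Stuart"}   # from in-patient registration

def on_rfid_report(tag, room_id):
    """Record the mobile unit's latest room from an RFID position report."""
    tag_to_room[tag] = room_id

def patient_for_tag(tag):
    """Resolve the patient currently co-located with the mobile unit."""
    return room_to_patient.get(tag_to_room.get(tag))

on_rfid_report("unit442", "050101")
```

Keeping the two tables separate mirrors the description: registration fixes the patient-to-room link, while RFID reports keep the unit-to-room link current as the unit is wheeled between rooms.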
[0036] FIG. 9 shows a flowchart of a process performed by
distributed patient monitoring system 10 (FIG. 1). In step 912
following the start at step 911, monitoring processor 15 (FIG. 1)
responds to user initiated commands from multiple different
portable processing devices in different remote locations. Input
processor 27 in monitoring processor 15 acquires vital sign
parameters and associated video data representative of multiple
sequences of video images of corresponding multiple different
patients. Data processor 29 in monitoring processor 15 processes
the vital sign parameters and associated video data to provide
processed video data representing an image sequence. In one
embodiment the processed video data comprises processed first video
data representing an image sequence including a first image
including an area displaying user selectable image elements
individually associated with corresponding individual patients of
the multiple different patients and processed second video data
representing an image sequence including a composite image
including a first area showing live video of a selected first
patient and a second area presenting vital sign parameters of the
selected first patient in response to user selection of an image
element associated with the first patient in the first image.
Communication network 21 linking portable devices 12 and 14, server
20, CIS 51 and patient rooms 41, has sufficient bandwidth to
communicate the processed video data to the first and second
portable processing devices respectively of the multiple different
portable processing devices in different remote locations in
response to commands received from the first and second portable
processing devices respectively.
[0037] In step 917, authentication processor 39 enables a user to
obtain access authorization to access patient data in response to
entry of identification data using any portable processing device
of the multiple portable processing devices in different remote
locations. In step 919, user interface 26, in response to the
access authorization, enables a user to initiate execution of a
clinical information application providing a user with a clinical
application display image identifying multiple different patients
in corresponding multiple different locations and select a
particular patient of the multiple different patients in the
clinical application display image. A display processor in user
interface 26 in step 923 initiates generation of data representing
an image sequence (processed video) for presentation in a composite
image including a first area showing live video of a selected
particular patient and a second area presenting vital sign
parameters of the particular patient in response to user selection
of an image element associated with the particular patient of the
multiple different patients in the clinical application display
image. The image element associated with the selected patient
comprises a hyperlink presented in a list of different patients in
the display image provided by the clinical information
application.
[0038] A first portable processing device 12 of multiple portable
processing devices has a user interface 26 that enables a user, in
response to access authorization, to initiate execution of a
clinical information application providing a user with a clinical
application display image identifying multiple different patients
in corresponding different locations. Device 12 user interface 26
also enables a user to select a first patient of the multiple
different patients in a clinical application display image and
display an image sequence including a composite image comprising a
first area showing live video of the first patient and a second
area presenting vital sign parameters of the first patient.
Similarly, second portable processing device 14 of the multiple
portable processing devices has a user interface 26 that enables a
user, concurrently with operation of the first portable processing
device and in response to access authorization, to initiate
execution of a clinical information application providing a user
with a clinical application display image identifying multiple
different patients in corresponding multiple different locations.
Device 14 user interface 26 also enables a user to select a second
patient of the multiple different patients in the clinical
application display image and display an image sequence including a
composite image comprising a first area showing live video of the
second patient and a second area presenting vital sign parameters
of the second patient.
[0039] In step 926, rules processor 19 applies rules to the vital
sign parameters to identify an alert condition indicating a
significant patient clinical condition or change of clinical
condition and the composite image includes an image element
indicating the alert condition. The process of FIG. 9 terminates at
step 929.
[0040] The systems and processes of FIGS. 1-9 are not exclusive.
Other systems, processes and menus may be derived in accordance
with the principles of the invention to accomplish the same
objectives. Although this invention has been described with
reference to particular embodiments, it is to be understood that
the embodiments and variations shown and described herein are for
illustration purposes only. Modifications to the current design may
be implemented by those skilled in the art, without departing from
the scope of the invention. The system has application to surgical
theaters (operating rooms), emergency departments, first responders
via ambulance, home-health care, patient-family viewing, mobile
units, casinos, schools (classrooms), or applications within the
aerospace industry, for example. The system is readily used in
conjunction with existing critical care systems, medical/surgical
ward systems, emergency department or operating room systems. The
processes and applications may, in alternative embodiments, be
located on one or more (e.g., distributed) processing devices
accessing a network linking the elements of FIG. 1. Further, any of
the functions and steps provided in FIGS. 1-9 may be implemented in
hardware, software or a combination of both and may reside on one
or more processing devices located at any location of a network
linking the elements of FIG. 1 or another linked network including
the Internet.
* * * * *