U.S. patent application number 14/291,567 was published by the patent office on 2014-12-04 for dynamic adjustment of image compression for high resolution live medical image sharing. The applicant listed for this patent is eagleyemed, Inc. The invention is credited to Harish P. HIRIYANNAIAH and Muhammad Zafar Javed SHAHID.
Application Number: 20140357993 / 14/291,567
Family ID: 51985889
Publication Date: 2014-12-04
United States Patent Application: 20140357993
Kind Code: A1
HIRIYANNAIAH, Harish P.; et al.
December 4, 2014
DYNAMIC ADJUSTMENT OF IMAGE COMPRESSION FOR HIGH RESOLUTION LIVE
MEDICAL IMAGE SHARING
Abstract
A video stream of live medical images is generated at a local
site having a medical image scanner. A live video stream is
transmitted to at least one remote site via a network, which may
include wired or wireless Internet connections. Network conditions
are monitored during a network session and predictions are made on
a predicted bit rate for transmission. The compression parameters
for the live video stream are selected based on the predicted bit
rate.
Inventors: HIRIYANNAIAH, Harish P. (San Jose, CA); SHAHID, Muhammad Zafar Javed (San Jose, CA)
Applicant: eagleyemed, Inc., Santa Clara, CA, US
Family ID: 51985889
Appl. No.: 14/291,567
Filed: May 30, 2014
Related U.S. Patent Documents
Application Number: 61/829,887; Filing Date: May 31, 2013
Current U.S. Class: 600/437; 375/240.02
Current CPC Class: H04N 19/124 20141101; H04N 19/164 20141101; H04N 19/166 20141101; H04N 19/12 20141101; H04N 19/132 20141101; A61B 6/563 20130101; H04N 19/146 20141101; A61B 8/565 20130101; H04N 19/172 20141101
Class at Publication: 600/437; 375/240.02
International Class: H04N 19/567 20060101 H04N019/567; H04N 19/625 20060101 H04N019/625; H04N 19/593 20060101 H04N019/593; H04N 19/51 20060101 H04N019/51; A61B 8/08 20060101 A61B008/08; A61B 8/00 20060101 A61B008/00
Claims
1. A method of transmitting over the Internet a live video stream
of ultrasound imaging data, comprising: monitoring a set of quality
of service metrics for a network communication session between a
local site having an ultrasound imaging scanner and a remote site;
predicting, based on the set of quality of service metrics, a
minimum expected bit rate to transmit a compressed video stream
including a live ultrasound video stream; selecting, in response to
the expected bit rate, video compression parameters to compress the
video stream including the live ultrasound video stream; and
transmitting the compressed video stream to the remote site.
2. The method of claim 1, wherein the set of quality of service metrics comprises a bandwidth metric, a packet loss metric, a packet latency metric, and a packet corruption metric.
3. The method of claim 1, wherein the predicting comprises a prediction technique selected from the group consisting of a linear prediction algorithm and a nonlinear prediction algorithm.
4. The method of claim 1, wherein the predicting comprises a
prediction technique selected from the group consisting of a linear
prediction algorithm, Kalman prediction, and hidden Markov model
prediction.
5. The method of claim 1, wherein the live medical image video
stream comprises one of an ultrasound video stream, an angiography
video stream, and an endoscopy video stream.
6. The method of claim 1, wherein the selecting comprises selecting
from at least two different video compression protocols.
7. The method of claim 6, wherein the at least two different video compression protocols include an MJPEG2000 compression protocol and at least one of MPEG-4, H.264, H.265, VP8, and VP9.
8. The method of claim 1, wherein the selecting comprises selecting
a subset of features of a particular video compression
protocol.
9. The method of claim 8, wherein the selecting a subset of features includes selecting at least one of: turning on or off MVC encoding; varying the block sizes of frames; varying the quantization tables for intra-frame discrete cosine transform (DCT) compression; varying motion vector compensation (MVC) for inter-frame compression; and reducing the stream frame rate by dropping frames ahead of the compression engine.
10. The method of claim 1, further comprising selecting a
transmission protocol based on the expected bit rate.
11. The method of claim 1, further comprising analyzing quality of
service metrics for a plurality of sessions and providing a network
service recommendation.
12. A method of tuning video compression parameters for a live
video stream of medical images in a networked telemedicine
environment having variable network quality, comprising: monitoring
at least one quality of service metric for a network session
between a local site having a medical imaging scanner and a remote
site; predicting, based on the at least one quality of service
metric, a minimum expected bit rate to transmit a compressed live
video stream of high resolution medical images from the medical
imaging scanner; selecting, in response to the expected bit rate,
video compression parameters for the live video stream of medical
images; compressing the live video stream of high resolution
medical image, using the selected video compression parameters; and
transmitting the compressed live video stream to the remote
site.
13. The method of claim 12, wherein the at least one quality of
service metric is monitored in at least a local site and remote
site.
14. The method of claim 12, wherein the at least one quality of service metric comprises a bandwidth metric, a packet loss metric, a packet latency metric, and a packet corruption metric.
15. The method of claim 12, wherein the predicting comprises a prediction technique selected from the group consisting of a linear prediction algorithm and a nonlinear prediction algorithm.
16. The method of claim 12, wherein the predicting comprises a
prediction technique selected from the group consisting of a set of
linear prediction algorithms, Kalman prediction, and hidden Markov
model prediction.
17. The method of claim 12, wherein the live medical image video
stream comprises one of an ultrasound video stream, an angiography
video stream, and an endoscopy video stream.
18. The method of claim 12, wherein the selecting comprises
selecting from at least two different video compression
protocols.
19. The method of claim 18, wherein the at least two different video compression protocols include an MJPEG2000 compression protocol and at least one of MPEG-4, H.264, H.265, VP8, and VP9.
20. The method of claim 12, wherein the selecting comprises
selecting a subset of features of a particular video compression
protocol.
21. The method of claim 20, wherein the selecting a subset of features includes selecting at least one of: turning on or off MVC encoding; varying the block sizes of frames; varying the quantization tables for intra-frame discrete cosine transform (DCT) compression; varying motion vector compensation (MVC) for inter-frame compression; and reducing the stream frame rate by dropping frames ahead of the compression engine.
22. The method of claim 21, further comprising selecting a
transmission protocol based on the expected bit rate.
23. The method of claim 12, further comprising analyzing quality of
service metrics for a plurality of sessions and providing a network
service recommendation.
24. A non-transitory computer readable medium including
instructions which when executed on a processor implement any one
of the methods of claim 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22,
or 23.
25. A system, comprising: an ultrasound image scanner to generate a
live video stream of ultrasonic medical images; and a computer
system to monitor network conditions in a network session to a
remote site, predict a minimum bit rate, and select compression
parameters to compress the live video stream prior to transmission
to the remote site.
Description
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit of U.S.
Provisional Application No. 61/829,887, filed on May 31, 2013, the
contents of which are hereby incorporated by reference.
FIELD OF THE INVENTION
[0002] The present invention is generally related to techniques for
live sharing over the Internet of a video stream composed of high
resolution medical images, such as ultrasound images. More
particularly, the present invention is directed to dynamically
adapting compression techniques for a video stream of medical
images to adjust to variable network conditions.
BACKGROUND OF THE INVENTION
[0003] In telemedicine applications there is a need to share
medical data between different sites. However, one problem in the
industry is that some medical imaging applications require a large
bandwidth to transmit as a live video stream of medical images.
Illustrative examples include live video streaming of ultrasound
imaging, angiography, and endoscopy. Many types of medical images
are difficult to efficiently compress. For example, ultrasound
images have a high entropy content and have compression ratios that
are dramatically lower than the compression ratios that can be
achieved for streaming television and movies. An additional
complication is that many small clinics have poor IT infrastructure
with marginal and time-varying connections to the Internet, due to
cost considerations, remote locations, or other reasons.
[0004] As an illustrative example of some of these problems, in
ultrasound (u/s) imaging, image frames typically have a resolution
of 512×512 pixels that are acquired at frame rates of 10 to
60 frames per second (fps). The frame rate may vary, depending on
the u/s frequency of operation. The frame size is usually fixed,
and the pixel resolution is again determined by the u/s frequency,
and can vary from 400 microns at 2 MHz frequency to 80 microns at
10 MHz frequency. It is this pixel resolution that is relevant to
the end-user, not the frame size itself. The frames at the frame
rate are usually concatenated to form a standard video stream.
[0005] The stated frame size and rates mentioned above imply a raw
data rate of 63 Mbps at 30 fps and 8 bits per pixel, gray scale.
Without any compression, the data storage required to store one
minute of the u/s stream at 30 fps and 8 bits per pixel is 450 MB.
For color Doppler u/s imaging 12 bits per pixel is required,
implying a raw data rate of about 95 Mbps at 30 fps and a data
storage requirement to store one minute of the Doppler u/s stream of
675 MB. Consequently, an image compression scheme has to be
employed to reduce the bit rate for real-time network transport of
the u/s stream and to reduce the data storage requirements of the
u/s stream. However, the compression requirements for these two use
cases need not be identical.
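The arithmetic behind these figures can be checked with a short sketch (the function names are illustrative; the sketch assumes 1 MB = 2^20 bytes, which reproduces the 450 MB and 675 MB figures exactly):

```python
def raw_bit_rate(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) bit rate in bits per second."""
    return width * height * bits_per_pixel * fps

def storage_per_minute_mb(width, height, bits_per_pixel, fps):
    """Storage for one minute of raw video, in MB (2**20 bytes)."""
    bits_per_minute = raw_bit_rate(width, height, bits_per_pixel, fps) * 60
    return bits_per_minute / 8 / 2**20

print(raw_bit_rate(512, 512, 8, 30))            # 62914560 bits/s ~ 63 Mbps
print(storage_per_minute_mb(512, 512, 8, 30))   # 450.0 MB
print(raw_bit_rate(512, 512, 12, 30))           # 94371840 bits/s ~ 95 Mbps
print(storage_per_minute_mb(512, 512, 12, 30))  # 675.0 MB
```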
[0006] In medical imaging, lossless and lossy compression schemes
are employed, depending on the imaging modality. Lossy compression
may be employed as long as the quality of compressed images does
not violate the Just Noticeable Difference (JND) threshold. This
threshold is very subjective, but many standards bodies such as
the American College of Radiology (ACR) have established guidelines
for compression rates for various medical imaging modalities.
[0007] Ultrasound images have a high entropy content and cannot be
compressed with as high a compression ratio as conventional video
streams for movies. For ultrasound imaging, many different
compression standards are allowed. Examples of permissible
compression standards for ultrasound imaging include Motion
JPEG2000 (MJPEG2000), MPEG-4 and H.264.
[0008] Motion JPEG is a video codec in which each frame is
separately compressed into a JPEG image. As a result the quality of
the video compression is independent of the motion of the image. At
low bandwidth availability priority is given to image resolution.
In contrast, MPEG-4 is a standard that sends a reference frame and
difference data for following frames (I frames, B frames, and P
frames).
[0009] In the case of MJPEG2000, each image frame is compressed
(either lossy or lossless) and every frame in the 30 fps stream is
sent separately. Typical lossy compression rates (compression
ratios) are of the order of 1:10 to 1:15. Higher effective
compression rates are not possible without losing image quality,
since inter-frame data redundancy is not captured in MJPEG2000.
Thus, while the compression rates of 1:10 to 1:15 achieved by
MJPEG2000 are good, there are problems in using MJPEG2000 for data
storage or for network transport.
[0010] In the case of MPEG-4 and H.264, larger compression rates,
from 1:20 through 1:80, are possible, since they utilize
frame-to-frame redundancies and motion vector compensation schemes.
For high entropy content images, such as ultrasound,
compression rates on the order of 1:20 to 1:40 are possible.
These compression standards also allow for segmenting the images
into multiple slices and applying different compression rates to
different slices. An ultrasound image typically includes a main
ultrasound image 105 (the active sub-image taken by the ultrasound
probe) and border regions 110, 115 which may include labeling or
text describing aspects of the image. Thus an image can often be
segmented into strips. For the image example in FIG. 1, the image
can be segmented into three strips: a top stripe having some
textual data, a left stripe having textual data, and the remaining
sub-image which corresponds to the ultrasound image data.
[0011] The top and left stripes are highly compressible, to a few
bytes, since inter-frame redundancies are very high. The active
sub-image will have a lower compression rate, based on the mobility
of the organ under exam.
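A minimal sketch of the strip segmentation described above (the strip dimensions and dictionary keys are illustrative assumptions; the specification fixes only the three-region layout of FIG. 1):

```python
def segment_frame(width, height, top_h, left_w):
    """Split a frame into the three regions of FIG. 1, each as an
    (x, y, w, h) rectangle: a top text stripe, a left text stripe,
    and the remaining active ultrasound sub-image."""
    return {
        "top_stripe":  (0, 0, width, top_h),
        "left_stripe": (0, top_h, left_w, height - top_h),
        "active":      (left_w, top_h, width - left_w, height - top_h),
    }

# The text stripes compress to a few bytes (very high inter-frame
# redundancy); only the active sub-image needs a tuned compression rate.
strips = segment_frame(512, 512, 32, 48)
# strips["active"] == (48, 32, 464, 480)
```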
[0012] Image data may be sent via a wired or wireless network. In a
networked environment such as the Internet, where these u/s streams
are transported in real-time, there are dynamic conditions of the
network that can momentarily constrict or disrupt the available
bandwidth for u/s stream transport. As a result there can be a
severe loss in the quality of the real time transaction and/or a
loss in the connection, which is unacceptable in many
applications.
SUMMARY OF THE INVENTION
[0013] An apparatus, system, method and computer readable medium are
disclosed to dynamically adjust compression parameters for a live
video stream of medical imaging data in a network having variable
network conditions. Quality of Service (QOS) metrics are monitored
for a network session between at least two sites. In one embodiment
the QOS metrics are used to predict a minimum expected bit rate.
The minimum expected bit rate is used to select compression
parameters for compressing the live video stream of medical imaging
data. The compression parameters may include different video
compression protocols and selectable parameters within an
individual compression protocol. In one embodiment the quality of
service metrics are also used to select a network transmission
protocol for transmitting the video stream.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] FIG. 1 illustrates a conventional ultrasound image with
dashed lines superimposed to illustrate aspects of image
compression.
[0015] FIG. 2 illustrates a system for dynamically adjusting
compression parameters of a video stream of medical imaging data in
accordance with an embodiment of the present invention.
[0016] FIG. 3 illustrates a tunable compression engine in
accordance with an embodiment of the present invention.
[0017] FIG. 4 illustrates aspects of a network QOS transaction in
accordance with an embodiment of the present invention.
[0018] FIG. 5 illustrates prediction of compression rate in
accordance with an embodiment of the present invention.
[0019] FIG. 6 illustrates selection of network transmission
protocol based on network conditions in accordance with an
embodiment of the present invention.
DETAILED DESCRIPTION
[0020] FIG. 2 illustrates an exemplary system and network
environment for sharing a video stream of compressed medical
images. At a local clinic site 205 a patient 207 is examined by a
doctor 209 or a medical technician. The patient may be a human
patient. Alternatively, many medical imaging procedures have been
adapted for veterinary medicine such that the patient may be a cat,
dog, horse, etc. A medical imaging scanning device 210 generates a
live stream of video images that are transmitted over a network to
another site 260, such as local area network 265 of a medical
center. As an illustrative example, the live stream may be
transmitted to a site of a specialist doctor or a doctor from whom
a second opinion is desired. It is also understood that the live
stream may also be transmitted simultaneously to other sites.
[0021] An exemplary medical imaging scanning device 210 is an
ultrasound imaging device, although more generally other types of
live imaging devices could be used, such as angiography or
endoscopy devices. For the case of ultrasound there is high entropy
content in the images of the video stream, which in turn involves
many tradeoffs regarding the compression parameters used to compress
the images. Exemplary imaging technologies may require frame rates
of 10-60 fps, with 8 bits per pixel for gray scale and 12 bits for
color images, such as color Doppler ultrasound images. In the case of
ultrasound imaging, image frames may have a resolution of
512×512 pixels; at frame rates of 30 fps and 8 bits per pixel,
the raw data rate is 63 Mbps. Other medical imaging techniques,
such as angiography, have similar data requirements.
[0022] The present invention is generally directed to
dynamically adapting a compression scheme used for transporting a
medical image video stream, such as an ultrasound stream, across a
network with time-varying network session connection quality, using
feedback from at least one network quality of service agent. A live
stream of medical images carries a large amount of data.
Additionally, certain types of medical data, such as ultrasound
images, are difficult to efficiently compress. Moreover, the
network connection quality between sites may vary dramatically. For
example, a doctor at a small clinic may have a poor quality
connection to the Internet. As a result of these factors,
compressing the live video stream of the medical imaging data is
performed to attempt to maintain live connection with minimal
degradation of the quality of the live images.
[0023] The network path to a remote viewer at site 260 includes the
Internet network cloud 250 and any local networks, such as local
network 265. Reporting (R) tools are network agents that provide
network metrics at different parts of the network communication
path. Typically there would be reporting tools configured in at
least both ends of the network path. These network metrics may
include attributes such as bandwidth, packet loss, and packet
corruption. The reporting tools may comprise commercial or
proprietary reporting tools. The frequency with which reports are
received may be configured. For example, many commercial network
reporting tools permit periodic generation of reports on network
conditions such as once every 100 ms, once every second, once every
five seconds, etc.
[0024] The network quality of service (QOS) metrics are monitored
and used to predict network conditions (in the near future) to
determine optimum compression parameters for transmitting a live
video stream of medical images to the remote viewer. That is, the
QOS metrics provide metrics on past and recent network conditions,
which are then used to predict network conditions when a frame of
the live video stream is transmitted. For example, suppose the
reporting tools provide a report every 5 seconds. In this example,
if the last report was received 3 seconds ago, data on the past and
most recent report (3 seconds ago) on network conditions may be
used to predict network conditions to transmit a video frame. In
particular, an expected minimum bit rate may be calculated based on
quality of service inputs such as predicted bandwidth, predicted
packet loss, and predicted packet latency. This minimum bit rate of
the connection session, in turn, implies a compression rate, or
compression ratio, for the live stream of medical images.
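As a sketch of that last step, the compression ratio implied by a predicted bit rate can be computed directly from the raw stream parameters (the 20% protocol-overhead headroom and the function name are illustrative assumptions, not values from the specification):

```python
def required_compression_ratio(width, height, bits_per_pixel, fps,
                               predicted_bps, headroom=0.8):
    """Compression ratio needed to fit the raw stream into the predicted
    minimum bit rate, reserving part of the link for protocol overhead."""
    raw_bps = width * height * bits_per_pixel * fps
    return raw_bps / (predicted_bps * headroom)

# 512x512 gray-scale ultrasound at 30 fps over a predicted 2 Mbps link:
ratio = required_compression_ratio(512, 512, 8, 30, 2_000_000)
# ratio == 39.3216, i.e. roughly 1:40 -- at the edge of the 1:20 to 1:40
# range achievable for high entropy ultrasound content
```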
[0025] Block 240 illustrates an example of modules to dynamically
adjust compression parameters as network conditions vary. In one
embodiment a local computer 249 includes software modules to
perform network QOS monitoring 242, compression rate prediction
244, a tunable compression engine 246, and a call management
selection module 248. In one embodiment the call management
selection module 248 also receives the QOS metrics from network QOS
monitoring module 242 and selects the network transmission protocol
(e.g., TCP or UDP) based on the network conditions.
[0026] The tunable compression engine 246 has compression
parameters that are selectable. The selectable parameters may
include different compression protocols and/or selectable features
within one protocol. The tunable compression engine selects the
compression technique, based on the predicted compression rate, to
optimize the image quality for the medical images of the live video
stream. In particular, the tunable compression engine may select
the compression technique to minimize distortion of the video
images given the constraint of the predicted bit rate and that
certain types of medical images, such as ultrasound images, have a
high entropy content. The compressed live medical image stream is
transmitted using a network protocol, which may also be adjusted
based on network conditions.
[0027] FIG. 3 is a functional block diagram illustrating additional
aspects of the compression engine 246. A rule-based system may
select an optimum compression technique to transmit a live video
stream of medical images with minimal distortion. That is, certain
ranges of parameters, such as bandwidth, loss, and packet
corruption, are mapped to specific compression parameters. The
compression engine in one embodiment selects a compression protocol
from a choice of at least two different compression protocols used
in transmitting live video streams of medical images. The
compression engine may also make dynamic adjustments to individual
selectable features of an individual compression protocol. The
compression engine is a flexible engine. Depending on the available
bandwidth (predicted bit rate), it can switch dynamically between
different video compression protocols. In one embodiment it can
dynamically switch between MJPEG2000 compression and standard
MPEG-4/H.264 schemes. More generally, it is contemplated that other
currently implemented and proposed compression standards may be
selected. Other examples of video compression protocol standards
that may be selected include H.265, VP8, and VP9. Additionally, in
one embodiment at least one compression protocol has selectable
features that can be turned on or off. In the case of MPEG-4 and
H.264, compression rates are adjusted by one or more of:
[0028] 1) turning on or off MVC encoding,
[0029] 2) varying the block sizes of frames,
[0030] 3) varying the quantization tables for intra-frame discrete
cosine transform (DCT) compression,
[0031] 4) varying motion vector compensation (MVC) for inter-frame
compression, and
[0032] 5) reducing the stream frame rate by dropping frames ahead
of the compression engine.
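The adjustments listed above lend themselves to a rule-based mapping from the predicted bit rate to a codec and feature set, roughly as follows (all thresholds and parameter names here are illustrative assumptions, not values from the specification):

```python
def select_compression(predicted_bps):
    """Map a predicted bit rate to a codec plus selectable features."""
    if predicted_bps >= 50_000_000:
        # Ample bandwidth: per-frame JPEG 2000 keeps frames independent.
        return {"codec": "MJPEG2000"}
    params = {"codec": "H.264", "mvc": True, "block_size": 16,
              "quant_scale": 1.0, "frame_drop_divisor": 1}
    if predicted_bps < 4_000_000:
        params["quant_scale"] = 2.0        # coarser quantization tables
    if predicted_bps < 1_000_000:
        params["frame_drop_divisor"] = 2   # drop frames ahead of the encoder
        params["block_size"] = 32          # larger blocks, fewer mode bits
    return params
```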
[0033] As also illustrated in FIG. 3, in one embodiment an
additional storage compression engine 305 may be provided to
optimize compression for storage. In some applications it may be
desirable to also store the video stream. Consequently, as
illustrated in FIG. 3, a parallel compression engine for storage of
the compressed medical image video stream may be provided that is
not subject to such dynamic compression schemes. The compression
engine for storage may use a constant conservative compression rate
applied to the stream that will not compromise the JND threshold
guidelines of ACR.
[0034] Note that the compression engine can adapt rapidly to
changing network conditions. In many parts of the world doctors'
offices and small clinics may use wireless local networks and/or
wireless Internet connections. However, the quality of these
connections may depend on the time of day and other factors. For
example, a clinic close to a train track may experience large
changes in the quality of a wireless Internet connection whenever a
train travels close to the clinic, due to the reflection from the
metal surface of the train. As another example, for the case of a
small clinic using a WiFi connection, the quality of WiFi service
may vary based on time of day and number of users. Wireless
Internet connections based on the 802.11 standard may suffer
interference and packet drops at times of the day when there are
many users on individual wireless networks and on neighboring
wireless networks. As an example, many wireless Internet systems in
individual portions of a city slow down between the hours of 5 PM
to 8 PM as multiple users in the same geographic region attempt to
simultaneously access the Internet.
[0035] The compression engine 246 compresses the video stream prior
to transmission. In practice the network has time-varying quality
of service (QOS) characteristics, such as packet loss, bandwidth,
and packet corruption. If fixed (non-varying) compression
techniques are used, then in the worst case the changing network
conditions can result in a severe loss in the quality of the real
time transaction and a possible loss of connection, which is
unacceptable. In accordance with an embodiment of the present
invention, the compression engine reacts to such changing
conditions by monitoring network conditions and adapting the
compression technique used to compress the video stream. The
compression scheme is adapted to the dynamic conditions of the
network by utilizing a variable rate compression scheme. For the
data archival use-case, a fixed rate compression scheme is adequate
and the rate is based on the JND metric for the specimen being
imaged.
[0036] FIG. 4 illustrates aspects related to the use of Active
Network Quality Agents (ANQA) for an ultrasound (u/s) embodiment.
The ANQA elements 405 serve as the reporting (R) agents. In a
network session the network can be pictured as a network pipe 402
between local and remote sites. The ANQA elements may be based on
proprietary software or on commercial solutions. As an example,
commercial products, such as the Airwave software of Aruba
Networks, Inc., provide network Quality of Service (QOS) metrics,
and can be configured to report these statistics to desired
locations. The ANQA elements may be local and remote reporting
elements, although more generally there may be additional ANQA
elements at different points in the network. Examples of QOS
metrics include current bandwidth (BW) for TCP connections and
packet loss rate (PLR) for UDP connections.
[0037] In one embodiment, the Active Network Quality Agent (ANQA)
continually monitors the available QOS metrics for a network
session. In one embodiment a prediction scheme for determining the
available bandwidth is utilized. That is, the current and recent
reporting results from the ANQA agents are used to predict the
expected bit rate along the transmission path when the video is
transmitted.
[0038] The ANQA reporting agent(s) 405 report the QoS metrics to
the Compression Rate Predictor (CRP) 410, which outputs a
compression rate to the compression engine 420. The CRP may rely on
available metrics for current and immediate past of network Quality
of Service (QoS) to predict a compression rate which would be
compatible with the network conditions.
[0039] The CRP may, for example, utilize linear or non-linear
prediction techniques for compression rate prediction. That is, a
variety of different well-known prediction algorithms may be
modified to take the QOS metrics (e.g., bandwidth, packet loss, and
packet corruption) and generate an expected bit rate for the
compression engine.
[0040] A variety of linear prediction algorithms may, for example,
predict network quality (e.g., packet loss, bandwidth) using
regression techniques and curve fitting applied to the measured QOS
metrics. Referring to FIG. 5, in
one embodiment, a filter is provided in combination with a forward
predictor. The current compression rate is an input and the output
is a predicted compression rate. In one embodiment a Linear
Prediction algorithm utilizes Auto-Regressive Moving-Average (ARMA)
models that utilize an error correction scheme such that the filter
would be an ARMA filter. Kalman Prediction (KP) may also be used by
the compression rate predictor. As examples, any KP scheme (Linear
or Non-Linear) that is modeled in a higher order state space scheme
may be used. Additionally, Hidden Markov Model (HMM) Prediction may
be used by the compression rate predictor.
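As a minimal stand-in for the ARMA, Kalman, or HMM predictors named above, a one-step linear predictor over recent QoS reports might look like the following (the window size, smoothing factor, and class name are assumptions for illustration):

```python
from collections import deque

class BandwidthPredictor:
    """One-step linear predictor: an exponentially weighted level plus a
    trend term fitted over a sliding window of recent QoS reports."""
    def __init__(self, window=5, alpha=0.5):
        self.samples = deque(maxlen=window)
        self.alpha = alpha
        self.level = None

    def update(self, bw_bps):
        self.samples.append(bw_bps)
        self.level = bw_bps if self.level is None else (
            self.alpha * bw_bps + (1 - self.alpha) * self.level)

    def predict(self):
        """Predicted bandwidth for the next interval, in bits/second."""
        trend = 0.0
        if len(self.samples) >= 2:
            s = list(self.samples)
            trend = (s[-1] - s[0]) / (len(s) - 1)  # mean slope per report
        return self.level + trend

p = BandwidthPredictor()
for bw in (3_000_000, 2_000_000, 1_000_000):  # a degrading link
    p.update(bw)
# p.predict() extrapolates the decline below the last measured sample
```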
[0041] In one embodiment the monitoring rate of ANQA agents is
adjustable. For example, this monitoring can be done from about once
a second to a few times a minute. This monitoring rate can also be
subject to a prediction scheme, based on the rate of change of QoS,
in a manner similar to the compression rate prediction.
[0042] FIG. 6 illustrates an example for the medical image scanner
being an ultrasound imager 610. Referring to FIG. 6, in one
embodiment the ANQA are used to support a call management system
dynamically switching between different network transmission
protocols (e.g., Transmission Control Protocol (TCP) and User
Datagram Protocol (UDP) protocols) based on QoS metrics. To monitor
an individual session, at least one ANQA agent is monitored.
Preferably at least two ANQA agents (one at each end of the
network connection) are monitored. Additional ANQA agents in the
cloud may also be optionally monitored. There are tradeoffs, in
terms of bandwidth and packet loss for different network protocols.
TCP, for example, provides for retransmission of dropped packets
and thus requires higher bandwidth than UDP. UDP has lower
bandwidth and lower latency requirements but does not guarantee
delivery of packets.
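A call-management rule embodying these tradeoffs could be as simple as the following (the headroom factor and loss threshold are illustrative assumptions, not values from the specification):

```python
def select_transport(packet_loss_rate, bandwidth_bps, required_bps):
    """Pick TCP when the link can absorb retransmissions, else UDP."""
    # TCP retransmits dropped packets and so needs bandwidth headroom;
    # UDP has lower latency but does not guarantee delivery.
    if bandwidth_bps >= 1.5 * required_bps and packet_loss_rate < 0.02:
        return "TCP"
    return "UDP"
```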
[0043] Additionally, as previously discussed, the QOS metrics may
differ slightly for TCP vs. UDP and thus have to be factored
into the compression rate predictor. Thus, the network transmission
protocol may be selected to optimize the quality of the live video
stream in view of the QOS metrics. While TCP and UDP are examples
of network transmission protocols, it will be noted that other
network transmission protocols are contemplated, such as Stream
Control Transmission Protocol (SCTP).
[0044] In one embodiment the QOS statistics during use of the
medical image scanner may also be monitored by a service entity to
provide recommendations. The QOS statistics provide information on
the network connection quality. The QOS statistics may also be
collected over many different sessions. As a result, temporal
patterns may be detected. A large hospital or clinic may have a
dedicated in-house information technology capability. However,
small clinics or individual doctors often lack such capabilities
and under-invest in information technology services. In one
embodiment a rule based system may provide recommendations on how
an upgrade in network capabilities (e.g., connection type and/or
connection bandwidth) may improve performance.
[0045] It will be understood that the network metrics monitoring,
compression rate predictor, tunable compression engine, and call
management may be implemented in different ways. In one embodiment
they are implemented as software modules operating on a local
computer. Alternatively, it will be understood that some or all of
the modules may be implemented within a medical image scanner. It
will be understood that the software modules and methodologies may
be stored as computer readable instructions stored on a
non-transitory computer readable medium.
[0046] While the invention has been described in conjunction with
specific embodiments, it will be understood that it is not intended
to limit the invention to the described embodiments. On the
contrary, it is intended to cover alternatives, modifications, and
equivalents as may be included within the spirit and scope of the
invention as defined by the appended claims. The present invention
may be practiced without some or all of these specific details. In
addition, well known features may not have been described in detail
to avoid unnecessarily obscuring the invention. In accordance with
the present invention, the components, process steps, and/or data
structures may be implemented using various types of operating
systems, programming languages, computing platforms, computer
programs, and/or general purpose machines. In addition, those of
ordinary skill in the art will recognize that devices of a less
general purpose nature, such as hardwired devices, field
programmable gate arrays (FPGAs), application specific integrated
circuits (ASICs), or the like, may also be used without departing
from the scope and spirit of the inventive concepts disclosed
herein. The present invention may also be tangibly embodied as a
set of computer instructions stored on a computer readable medium,
such as a memory device.
* * * * *