U.S. patent application number 14/173612, titled "Ultrasound System and Method of Forming Ultrasound Image," was filed with the patent office on 2014-02-05 and published on 2014-06-05.
This patent application is currently assigned to SAMSUNG MEDISON CO., LTD., which is also the listed applicant. The invention is credited to Jong Sik KIM and Sung Yun KIM.
United States Patent Application 20140155750
Kind Code: A1
Inventors: KIM; Sung Yun; et al.
Publication Date: June 5, 2014
Application Number: 14/173612
Document ID: /
Family ID: 39512491
Filed: February 5, 2014
ULTRASOUND SYSTEM AND METHOD OF FORMING ULTRASOUND IMAGE
Abstract
The present invention is directed to an ultrasound system
capable of providing a plurality of M-mode images corresponding to
M-mode lines without moving a probe. A volume data forming and
reconstructing unit forms volume data based on ultrasound echo
signals received from a target object containing a periodically
moving object, determines a beat period of the moving object, and
reconstructs the volume data based on the beat period. A processor
forms at least one reference image based on the reconstructed volume
data and one or more M-mode images corresponding to one or more
M-mode lines set on the reference image. A display unit displays the
reference image, the M-mode lines and the M-mode images.
Inventors: KIM; Sung Yun (Seoul, KR); KIM; Jong Sik (Seoul, KR)
Applicant: SAMSUNG MEDISON CO., LTD. (Hongcheon-gun, KR)
Assignee: SAMSUNG MEDISON CO., LTD. (Hongcheon-gun, KR)
Family ID: 39512491
Appl. No.: 14/173612
Filed: February 5, 2014
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number | Child Application
13692811 | Dec 3, 2012 | | 14173612
12044292 | Mar 7, 2008 | | 13692811
Current U.S. Class: 600/443
Current CPC Class: A61B 8/483; A61B 8/467; A61B 8/4444; A61B 8/0883; G01S 7/52066; A61B 8/14; A61B 8/466; G01S 7/52074; G01S 15/8993; G01S 7/52088; A61B 8/463; A61B 8/0866; A61B 8/469; A61B 8/5207; A61B 8/08 (all 20130101)
Class at Publication: 600/443
International Class: A61B 8/08 (20060101); A61B 8/00 (20060101); A61B 8/14 (20060101)

Foreign Application Data

Date | Code | Application Number
Mar 8, 2007 | KR | 10-2007-0022982
Claims
1-10. (canceled)
11. An ultrasound system comprising: a probe operable to transmit
ultrasound signals to a target object containing a periodically
moving object and to receive ultrasound echo signals reflected from
the target object; a processor operable to generate volume data
based on the ultrasound echo signals; a display unit to display a
3-dimensional ultrasound image constructed based on the volume
data; and a user input unit to receive user inputs regarding
reference image setup information and M-mode line setup information
associated with the 3-dimensional ultrasound image, wherein the
processor is configured to generate a reference image representing
a plane cut through the 3-dimensional ultrasound image using the
volume data, the cut plane designated based on the reference image
setup information and to generate an M-mode line on the reference
image which is designated based on the M-mode line setup
information, and M-mode images are generated and displayed based on
the M-mode line on the reference image which is designated using
the 3-dimensional ultrasound image.
12. The ultrasound system of claim 11, wherein the user input unit
is further configured to receive rotation information upon the
3-dimensional ultrasound image, and the processor is configured to
control a rotation of the 3-dimensional ultrasound image displayed
on the display unit based on the rotation information so that the
reference image and the M-mode line can be designated by a user
using the 3-dimensional image rotated to any one of a plurality of
viewing angles.
13. The ultrasound system of claim 12, wherein the processor is
configured to determine a moving period of the moving object and
interpolate the volume data to contain a same number of frames in
each moving period.
14. A method comprising: using a probe to transmit ultrasound
signals to a target object containing a periodically moving object
and receiving ultrasound echo signals reflected from the target
object; generating volume data based on the ultrasound echo
signals; displaying a 3-dimensional ultrasound image constructed
based on the volume data; receiving user inputs regarding reference
image setup information and M-mode line setup information
associated with the 3-dimensional ultrasound image; generating a
reference image representing a plane cut through the 3-dimensional
ultrasound image using the volume data, the cut plane being
designated based on the reference image setup information;
generating an M-mode line on the reference image which is designated
based on the M-mode line setup information; and displaying M-mode
images based on the M-mode line on the reference image which is
designated using the 3-dimensional ultrasound image.
15. The method of claim 14, further comprising: receiving a user
input regarding rotation information upon the displayed
3-dimensional ultrasound image; and rotating the displayed
3-dimensional ultrasound image based on the rotation information so
that the reference image and the M-mode line can be designated by a
user using the 3-dimensional image rotated to any one of a
plurality of viewing angles.
16. The method of claim 15, further comprising: determining a
moving period of the moving object; and interpolating the volume
data to contain a same number of frames in each moving period.
17. An ultrasound system comprising: a probe operable to transmit
ultrasound signals to a target object containing a periodically
moving object and to receive ultrasound echo signals reflected from
the target object; a processor operable to generate volume data
based on the ultrasound echo signals and a period of the moving
object; a display unit to display a 3-dimensional ultrasound image
constructed based on the volume data; and a user input unit to
receive a user input as to M-mode line setup information associated
with the 3-dimensional ultrasound image, wherein the processor is
configured to generate a reference image representing a plane cut
through the 3-dimensional ultrasound image using the volume data,
and to generate an M-mode line on the reference image based on the
M-mode line setup information, and M-mode images corresponding to
the M-mode line on the reference image are generated and
displayed.
18. A method comprising: transmitting ultrasound signals, by a
probe, to a target object containing a periodically moving object
and receiving ultrasound echo signals reflected from the target
object; generating volume data based on the ultrasound echo signals
and a period of the moving object; displaying a 3-dimensional
ultrasound image constructed based on the volume data; receiving a
user input as to M-mode line setup information associated with the
3-dimensional ultrasound image; generating a reference image
representing a plane cut through the 3-dimensional ultrasound image
using the volume data; generating an M-mode line on the reference
image based on the M-mode line setup information; and displaying
M-mode images corresponding to the M-mode line on the reference
image.
Description
[0001] The present application claims priority from Korean Patent
Application No. 10-2007-0022982 filed on Mar. 8, 2007, the entire
subject matter of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] 1. Technical Field
[0003] The present invention generally relates to ultrasound
systems, and more particularly to an ultrasound system and a method
of forming ultrasound images.
[0004] 2. Background Art
[0005] An ultrasound system has become an important and popular
diagnostic tool due to its non-invasive and non-destructive nature.
Modern high-performance ultrasound imaging diagnostic systems and
techniques are commonly used to produce two- or three-dimensional
images of internal features of patients.
[0006] The ultrasound system generally uses a probe comprising an
array of transducer elements to transmit and receive ultrasound
signals. The ultrasound system forms an image of human internal
tissues by electrically exciting transducer elements to generate
ultrasound signals that travel into the body. Echoes reflected from
tissues and organs return to the transducer elements and are
converted into electrical signals, which are amplified and
processed to produce ultrasound image data.
[0007] The ultrasound system provides an M-mode (motion mode) image
showing the periodic motion of a target object. The ultrasound
system first displays a B-mode image of the target object and then
displays, over time, bio-information of the target object
corresponding to an M-mode line set on the B-mode image.
[0008] In the conventional ultrasound system, however, the probe
must be moved to observe different portions of a target object.
Thus, B-mode and M-mode images of different portions of the target
object cannot be provided at the same time. Further, forming the
B-mode and M-mode images for the different portions of the target
object takes a long time.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1 is a block diagram showing an ultrasound system in
accordance with one embodiment of the present invention.
[0010] FIG. 2 is a block diagram showing a volume data forming and
reconstructing unit in accordance with one embodiment of the
present invention.
[0011] FIG. 3 is a schematic diagram showing cut plane images in
volume data.
[0012] FIG. 4 is a photo showing a horizontal cut plane image in
volume data.
[0013] FIG. 5 is a photo showing an image obtained by performing
soft-thresholding upon a horizontal cut plane image.
[0014] FIG. 6 is a graph showing projected values obtained through
the horizontal projection of a horizontal cut plane image.
[0015] FIG. 7 is a photo showing an image with a horizontal cut
plane image masked by a mask defining ROI boundaries.
[0016] FIG. 8 is a photo showing the ROI set in a frame.
[0017] FIG. 9 is a schematic diagram showing an example of the VOI
set in ultrasound volume data.
[0018] FIG. 10 is a graph showing three correlation coefficient
curves.
[0019] FIG. 11 is a block diagram showing a procedure of detecting
a global period of heartbeat.
[0020] FIG. 12 is a diagram showing an example of detecting a
global period of heartbeat.
[0021] FIG. 13 is a diagram showing a procedure of reconstructing
ultrasound volume data in accordance with one embodiment of the
present invention.
[0022] FIG. 14 is a schematic diagram showing a reference frame
image and M-mode images in accordance with one embodiment of the
present invention.
[0023] FIG. 15 is a schematic diagram showing a 3-dimensional
ultrasound image and M-mode images in accordance with another
embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0024] FIG. 1 is a block diagram showing an ultrasound system
constructed in accordance with the present invention. As shown in
FIG. 1, the ultrasound system 100 includes a probe 110, a beam
forming and signal processing unit 120, a volume data forming and
reconstructing unit 130, an input unit 140, a processor 150 and a
display unit 160. The ultrasound system 100 may further include a
storage unit (not shown) for storing various data.
[0025] The probe 110 may include an array transducer containing a
plurality of transducer elements for transmitting ultrasound
signals to a target object and receiving ultrasound echo signals
from the target object. The target object may contain a
periodically moving object. The transducer elements may convert the
ultrasound echo signals into electrical reception signals.
[0026] The beam forming and signal processing unit 120 may be
operable to focus the electrical reception signals, and amplify and
process the focused signals. The volume data forming and
reconstructing unit 130 may form volume data for a 3-dimensional
ultrasound image of the target object based on the focused signals
received from the beam forming and signal processing unit 120. The
volume data forming and reconstructing unit 130 may be operable to
calculate a beat period of the moving object contained in the
target object and reconstruct the volume data based on the
calculated beat period. When the object is a heart, the beat period
may be a heartbeat period.
[0027] As shown in FIG. 2, the volume data forming and
reconstructing unit 130 may include a volume data forming unit 131,
a pre-processing unit 132, a region of interest (ROI) setting unit
133, a volume of interest (VOI) setting unit 134, a correlation
coefficient calculating unit 135, a period setting unit 136 and a
volume data reconstructing unit 137.
[0028] The volume data forming unit 131 may form volume data for
forming a 3-dimensional ultrasound image of the target object based
on the focused signals, which are consecutively received from the
beam forming and signal processing unit 120.
[0029] The pre-processing unit 132 may be operable to reduce or
remove noises from an ultrasound image. The noise reduction may
preferably be performed upon at least one of horizontal and
vertical cut plane images 210 and 220, which are obtained by
horizontally or vertically cutting the ultrasound volume data, as
shown in FIG. 3. Generally, since data corresponding to an object
desired for observation are positioned at a center portion of the
volume data, the volume data may be cut either in the horizontal or
vertical direction along its center planes. In the discussion below,
the noise reduction is assumed to be performed upon the horizontal
cut plane image 210.
[0030] The pre-processing unit 132 may decompose the horizontal cut
plane image in a wavelet domain into sub-band images HH, HL, LH and
LL (wavelet transform). The pre-processing unit 132 may calculate
wavelet coefficients in each sub-band image. The pre-processing
unit 132 may perform soft-thresholding upon the calculated wavelet
coefficients. That is, if the calculated wavelet coefficient is
smaller than the threshold, then the pre-processing unit 132 may
reset the wavelet coefficient to zero. If the wavelet coefficient
is greater than the threshold, then the pre-processing unit 132 may
subtract the threshold from the wavelet coefficient to thereby
reset the wavelet coefficient. The soft-thresholding may not be
carried out for the sub-band image LL, wherein the sub-band image
LL represents the low-frequency (vertical), low-frequency
(horizontal) image. The soft-thresholding can be carried out by the following
equation in accordance with the present invention.
$$\widehat{W_j f}(t) = \operatorname{sign}\bigl(W_j f(t)\bigr)\,\bigl[\,\lvert W_j f(t)\rvert - Th\,\bigr]_{+},\qquad [x]_{+} = \begin{cases} x, & x > 0 \\ 0, & \text{otherwise} \end{cases} \tag{1}$$
[0031] wherein $W_j f(t)$ represents a high-frequency coefficient at
the j-th level of the wavelet decomposition, sign( ) represents the
sign of the coefficient, Th represents a threshold having a constant
value, and $\widehat{W_j f}(t)$ represents the wavelet coefficient
resulting from the soft-thresholding. Thereafter, the pre-processing
unit 132 may reconstruct the decomposed images through the inverse
wavelet transform to thereby obtain the horizontal cut plane image
with the noises reduced.
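By way of illustration, a minimal Python sketch of the soft-thresholding in equation (1) could look as follows, assuming NumPy and PyWavelets (pywt) are available; the function names soft_threshold and denoise_cut_plane, the threshold value and the choice of the Haar wavelet are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the soft-thresholding in equation (1).
import numpy as np
import pywt

def soft_threshold(coeffs, th):
    """sign(w) * max(|w| - th, 0), applied element-wise."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - th, 0.0)

def denoise_cut_plane(image, th=10.0, wavelet="haar"):
    """One-level wavelet denoising of a horizontal cut plane image.
    The LL sub-band is left untouched; the detail sub-bands are soft-thresholded."""
    ll, (lh, hl, hh) = pywt.dwt2(image, wavelet)
    lh, hl, hh = (soft_threshold(b, th) for b in (lh, hl, hh))
    return pywt.idwt2((ll, (lh, hl, hh)), wavelet)
```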
[0032] FIG. 4 shows an original horizontal cut plane image obtained
from the ultrasound volume data, whereas FIG. 5 shows a horizontal
cut plane image obtained by reconstructing the sub-band images with
the soft-thresholding performed in the pre-processing unit 132.
[0033] The ROI setting unit 133 may set a region of interest (ROI)
on the pre-processed horizontal cut plane image. In order to set
the ROI, the ROI setting unit 133 may perform horizontal projection
upon the pre-processed horizontal cut plane image to thereby obtain
projected values, each being the sum of the brightness of all pixels
along a horizontal projection line, as shown in FIG. 6. The ROI
setting unit 133 may calculate a mean value of the projected values
and then calculate the positions $n_T$ and $n_B$, which represent,
respectively, the smallest vertical position in the upper half of
the image and the largest vertical position in the lower half of the
image whose projected values are smaller than the mean value. The
positions $n_T$ and $n_B$ may be calculated by using equation (2)
shown below.
$$n_T = \min_n \{\, n \mid f_n < \mathrm{Mean} \,\},\quad 0 \le n < \tfrac{N}{2};\qquad n_B = \max_n \{\, n \mid f_n < \mathrm{Mean} \,\},\quad \tfrac{N}{2} \le n < N \tag{2}$$
[0034] wherein $f_n$ represents the horizontally projected signal,
and Mean represents the mean of the projected values. The positions
$n_T$ and $n_B$ may be used as the boundaries of the ROI.
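A minimal sketch of equation (2), assuming the cut plane image is held as a 2-D NumPy array with rows indexed by vertical position; the helper name roi_boundaries is illustrative only.

```python
# Illustrative sketch of the ROI boundary computation of equation (2).
import numpy as np

def roi_boundaries(cut_plane):
    """Return (n_T, n_B): first row in the upper half and last row in the
    lower half whose horizontal projection falls below the mean projection."""
    f = cut_plane.sum(axis=1)            # horizontal projection, one value per row
    mean = f.mean()
    n = len(f)
    upper = np.nonzero(f[: n // 2] < mean)[0]
    lower = np.nonzero(f[n // 2:] < mean)[0]
    n_t = int(upper[0]) if upper.size else 0
    n_b = int(lower[-1]) + n // 2 if lower.size else n - 1
    return n_t, n_b
```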
[0035] The ROI setting unit 133 may use the boundaries n.sub.T and
n.sub.B of ROI to mask the horizontal cut plane image, thereby
removing regions that are located outside the boundaries n.sub.T
and n.sub.B, as shown in FIG. 7. An example of the ROI set on an
arbitrary frame in the ultrasound volume data is shown in FIG.
8.
[0036] The VOI setting unit 134 may be operable to set a volume of
interest (VOI) in the volume data by using the ROI. The VOI setting
unit 134 may select frames including the moving object within the
target object. A plurality of vertical lines may be set in the
horizontal cut plane image and standard deviation of brightness of
the vertical lines may be used to select the frames. For example,
the moving object may be the heart of a fetus. Since the heart
includes valves (displayed in a bright portion) as well as atria
and ventricles (displayed in a dark portion), the image of the
heart has a relatively high contrast. Thus, the vertical lines
including a heart region may be found by using the standard
deviation of the brightness of the vertical lines. Also, since the
contrast of neighboring vertical lines within the heart region
rapidly changes, the vertical lines included in the heart region
may be more accurately selected by considering standard deviation
of the neighboring vertical lines. This excludes vertical lines that
have a high contrast but do not belong to the heart region.
[0037] The VOI setting unit 134 may select three vertical lines
having a maximum standard deviation difference between the
neighboring vertical lines in order to detect reference frames for
setting VOIs. Pseudo code of the algorithm for selecting the three
vertical lines is as follows:

DO i = 0, 1, 2
  Step 1. $\hat{k}_i = \arg\max_{k_i} (\sigma_{k_i} - \sigma_{k_i - 1})$, $(0 \le k_i < K)$
  Step 2. reject the range $[\hat{k}_i - C,\ \hat{k}_i + C]$ from the search range
END DO
[0038] wherein $\sigma_{k_i}$ represents the standard deviation of a
vertical line on the horizontal cut plane image, $k_i$ represents
the order of the vertical lines (identical to that of the frames in
the volume), K represents the total number of frames (identical to
the total number of vertical lines), and C is a constant. The three
frames containing the three vertical lines obtained by the above
algorithm are used as the reference frames for setting three
VOIs.
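A Python sketch of the reference-frame selection above, assuming sigma is a 1-D array of per-frame (per-vertical-line) brightness standard deviations; the exclusion constant C and the function name are illustrative assumptions.

```python
# Illustrative sketch of the reference-frame selection pseudo code.
import numpy as np

def select_reference_frames(sigma, c=5, count=3):
    """Pick `count` frame indices with maximum difference in standard
    deviation from their left neighbor, rejecting +/-C around each pick."""
    diff = np.empty(len(sigma), dtype=float)
    diff[0] = -np.inf                       # no left neighbor for the first line
    diff[1:] = sigma[1:] - sigma[:-1]
    picks = []
    for _ in range(count):
        k_hat = int(np.argmax(diff))
        picks.append(k_hat)
        lo, hi = max(0, k_hat - c), min(len(diff), k_hat + c + 1)
        diff[lo:hi] = -np.inf               # reject [k_hat - C, k_hat + C]
    return picks
```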
[0039] The VOI setting unit 134 may be operable to collect the
neighboring frames of each of the three reference frames and set
the VOIs with the ROIs set on the collected frames. FIG. 9 shows an
example of the VOI set in the volume data. In FIG. 9, the length of
the VOI on a time axis may be determined according to frames
positioned at the right and left sides of the reference frame. The
width of the VOI is defined according to the width of the ROI set
in the ROI setting unit 133. The VOI set in the VOI setting unit
134 can be expressed as the following equation.
$$V_{\hat{k}_i} = \{\, f_{ROI}(k) \mid \hat{k}_i - 1 \le k \le \hat{k}_i + 1 \,\}\quad \text{for each } \hat{k}_i \tag{3}$$
[0040] wherein $\hat{k}_i$ represents the positions of the three
vertical lines having a maximum standard deviation difference in the
horizontal cut plane image (i.e., frame positions), $f_{ROI}(k)$
represents the ROI in the k-th frame, and $V_{\hat{k}_i}$ represents
the VOI formed by combining the
[0041] ROI within the reference frame with those of the neighboring
frames. The VOI setting unit 134 may set three VOIs for the three
reference frames.
[0042] The correlation coefficient curve calculating unit 135 may
calculate the correlation coefficient curves for a constant time by
using the VOIs set in the VOI setting unit 134. The correlation
coefficient curve is calculated through the following equation.
$$\rho_i(V_k, V_{\hat{k}_i}) = \frac{E[V_k V_{\hat{k}_i}] - E[V_k]\,E[V_{\hat{k}_i}]}{\sigma_{V_k}\,\sigma_{V_{\hat{k}_i}}},\qquad \hat{k}_i - 200 \le k < \hat{k}_i + 200,\ \text{for each } \hat{k}_i \tag{4}$$
[0043] wherein $E[V_k]$ and $E[V_{\hat{k}_i}]$ represent the average
brightness within the VOIs at positions $k$ and $\hat{k}_i$,
$\sigma_{V_k}$ and $\sigma_{V_{\hat{k}_i}}$ represent the standard
deviation of the brightness within the VOIs at positions $k$ and
$\hat{k}_i$, and $\rho_i(V_k, V_{\hat{k}_i})$ represents the
correlation coefficient between the VOI at position $k$ and the VOI
at position $\hat{k}_i$. The correlation coefficient curve
calculating unit 135 may calculate three correlation coefficient
curves for the three VOIs set in the VOI setting unit 134. The
reason for calculating the three correlation curves is to utilize
the local characteristics of the volume data. FIG. 10 is a graph
showing the three correlation coefficient curves obtained in the
correlation coefficient curve calculating unit 135.
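A minimal sketch of equation (4), assuming each frame's VOI is available as an equally shaped NumPy array; the +/-200 frame window follows the equation, while the function name correlation_curve is an illustrative assumption.

```python
# Illustrative sketch of the correlation coefficient curve of equation (4).
import numpy as np

def correlation_curve(vois, k_hat, window=200):
    """Return (frame indices, correlation coefficients) around frame k_hat."""
    ref = vois[k_hat].astype(float).ravel()
    ks = range(max(0, k_hat - window), min(len(vois), k_hat + window))
    rho = []
    for k in ks:
        cur = vois[k].astype(float).ravel()
        num = (cur * ref).mean() - cur.mean() * ref.mean()
        rho.append(num / (cur.std() * ref.std()))
    return np.array(list(ks)), np.array(rho)
```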
[0044] The period setting unit 136 may detect a heartbeat period.
FIG. 11 is a block diagram showing a procedure for detecting the
global period in the period setting unit 136. The period setting
unit 136 may include a filtering unit 310, a gradient calculating
unit 320 and a zero cross point detecting unit 330. The filtering
unit 310 may be operable to filter the correlation coefficient
curves to reduce noises included therein. A low pass filter may be
used in the filtering unit 310. The gradient calculating unit 320
may calculate the gradients in the filtered correlation coefficient
curves. The zero cross point detecting unit 330 may detect the zero
cross points at which the gradient changes from positive to
negative.
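A short sketch of these three steps, assuming NumPy; the moving-average kernel width used as the low pass filter is an arbitrary assumption, not a value given in the disclosure.

```python
# Illustrative sketch: smooth the correlation curve, take its gradient,
# and find zero crossings where the gradient turns from positive to negative.
import numpy as np

def find_peak_positions(rho, kernel=5):
    smoothed = np.convolve(rho, np.ones(kernel) / kernel, mode="same")  # low pass filter
    grad = np.gradient(smoothed)
    # indices where the gradient changes sign from positive to negative
    return np.nonzero((grad[:-1] > 0) & (grad[1:] <= 0))[0]
```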
[0045] Subsequently, the period setting unit 136 may set candidate
periods by using right and left zero crossing points of a central
zero crossing point at each of the three filtered correlation
coefficient curves. That is, the period setting unit 136 may
calculate six candidate periods of the heartbeat ($P_n$). FIG. 12
shows an example of detecting the heartbeat period in a filtered
correlation coefficient curve. The period setting unit 136 may
select the period ($P_3$) having the highest frequency of occurrence
among the candidate periods ($P_1$, $P_2$, $P_3$) and then set the
selected period as the global period of the heartbeat. The period
selection can be expressed as the following equation.
$$p_{FHB} = \operatorname{mode}(p_n) \tag{5}$$
[0046] wherein $p_n$ represents the six candidate periods detected
from the three correlation coefficient curves formed from the VOIs,
and $p_{FHB}$ represents the period having the highest frequency of
occurrence among the six candidate periods.
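A sketch of the candidate-period collection and the mode selection of equation (5), assuming the zero-crossing positions of each filtered curve were found as above; rounding candidate periods to integer frame counts for the mode computation is an assumption made for simplicity.

```python
# Illustrative sketch: two candidate periods per curve (left and right of the
# central zero crossing), then the mode across all candidates, p_FHB = mode(p_n).
import numpy as np

def global_period(peak_lists):
    """peak_lists: one array of peak frame indices per correlation curve."""
    candidates = []
    for peaks in peak_lists:
        center = peaks[len(peaks) // 2]              # central zero crossing point
        left = peaks[peaks < center]
        right = peaks[peaks > center]
        if left.size:
            candidates.append(int(center - left[-1]))    # left candidate period
        if right.size:
            candidates.append(int(right[0] - center))    # right candidate period
    values, counts = np.unique(candidates, return_counts=True)
    return int(values[np.argmax(counts)])                # mode of the candidates
```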
[0047] Subsequently, the period setting unit 136 may set a new
reference frame, which is away from the reference frame by the
global period, and then set a search region including a
predetermined number of frames adjacent to the new reference frame.
The period setting unit 136 may be operable to calculate the
correlation coefficients between the VOI in the reference frame and
each VOI in each frame included in the search region. If a
correlation coefficient of an arbitrary frame is maximal among the
correlation coefficients of the frames included in the search
region and the correlation coefficient of the arbitrary frame is greater than a
value obtained by multiplying a predetermined weight by an average
of the correlation coefficients, then an interval between the
arbitrary frame and the reference frame is determined as the local
period. Thereafter, the period setting unit 136 may set a new
reference frame, which is away from the arbitrary frame by the
global period. Then, the above process for calculating the local
period is repeatedly carried out to the end of the volume data,
thereby obtaining all local periods.
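A sketch of this local-period search, assuming NumPy and a Pearson correlation helper such as the one above; the search window size, the weight, and the fallback when no frame exceeds the weighted average are illustrative assumptions, since the disclosure does not fix these values.

```python
# Illustrative sketch: step forward by the global period, search a small
# window for the frame whose VOI best correlates with the reference VOI,
# and accept it if its correlation exceeds a weighted average of the window.
import numpy as np

def pearson(a, b):
    a, b = a.astype(float).ravel(), b.astype(float).ravel()
    return ((a * b).mean() - a.mean() * b.mean()) / (a.std() * b.std())

def local_periods(vois, global_period, window=3, weight=1.1):
    periods = []
    ref = 0                                            # first reference frame
    while ref + global_period + window < len(vois):
        center = ref + global_period
        ks = range(center - window, center + window + 1)
        rho = np.array([pearson(vois[ref], vois[k]) for k in ks])
        best = int(np.argmax(rho))
        if rho[best] > weight * rho.mean():
            periods.append(ks[best] - ref)             # local period
            ref = ks[best]                             # next reference frame
        else:
            periods.append(global_period)              # fallback (assumption)
            ref = center
    return periods
```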
[0048] The ultrasound data reconstructing unit 137 may perform a
linear interpolation for the frames included within each local
period by using the global period set in the period setting unit
136. The ultrasound data reconstructing unit 137 may be operable to
calculate a ratio (r) of the local period to the global period as
the following equation.
$$r = \frac{\text{Local Period}}{\text{Global Period}} \tag{6}$$
[0049] Thereafter, an interpolation frame (I') is calculated as the
following equation using the ratio (r) of the local period to the
global period.
$$I' = \Delta_2 \times I_n + \Delta_1 \times I_{n+1} \tag{7}$$
[0050] wherein $I_n$ and $I_{n+1}$ represent the frames adjacent to
the interpolation frame $I'$, and $\Delta_1$ and $\Delta_2$
represent the distances between the adjacent frames and the
interpolation frame, $\Delta_1$ and $\Delta_2$ being determined
according to the ratio (r) of the local period to the global period.
This interpolation process is carried out for the frames included in
all local periods such that the same number of frames is contained
in each local period.
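A sketch of equations (6) and (7), assuming NumPy and that the frames of one local period are available as a list of arrays; the function name and the exact mapping of interpolation positions onto the global-period grid are illustrative assumptions.

```python
# Illustrative sketch: resample one local period onto the frame grid of the
# global period by linear interpolation between the two nearest frames.
import numpy as np

def resample_local_period(frames, global_period):
    """Return `global_period` interpolated frames spanning one local period."""
    local_period = len(frames)
    r = local_period / global_period                     # equation (6)
    out = []
    for m in range(global_period):
        pos = m * r                                      # position in local-period units
        n = min(int(pos), local_period - 2)
        d1 = pos - n                                     # distance from frame I_n
        d2 = 1.0 - d1                                    # distance to frame I_{n+1}
        out.append(d2 * frames[n] + d1 * frames[n + 1])  # equation (7)
    return out
```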
[0051] The ultrasound data reconstructing unit 137 may reconstruct
the interpolated volume data to provide a 3-dimensional ultrasound
image showing a figure of the heartbeat. FIG. 13 shows a procedure
for reconstructing the interpolated volume data. As shown in FIG.
13, twenty-six local periods A to Z may exist in one volume data.
Assuming that six frames are contained in one local period in the
volume data as shown in FIG. 13, the reconstructed volume data
includes six sub-volumes. Each of the sub-volumes consists of 26
frames $A_i$ to $Z_i$.
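A small sketch of the regrouping shown in FIG. 13, assuming NumPy: once every local period holds the same number of frames, frame j of every local period is gathered into sub-volume j (so 26 local periods of 6 frames yield 6 sub-volumes of 26 frames). The function name is illustrative.

```python
# Illustrative sketch of the sub-volume reconstruction of FIG. 13.
import numpy as np

def build_sub_volumes(local_period_frames):
    """local_period_frames: list of local periods, each a list of frames of
    equal length; returns one sub-volume per within-period frame index."""
    frames_per_period = len(local_period_frames[0])
    return [
        np.stack([period[j] for period in local_period_frames])
        for j in range(frames_per_period)
    ]
```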
[0052] Further, when the 3-dimensional volume data are acquired by
scanning the target object, the object (e.g., expectant mother or
fetus) may be moved. This makes it difficult to accurately detect
the heartbeat period of the fetus. Accordingly, the ultrasound
system may further include a motion compensating unit. The motion
compensating unit compensates for the motion of the expectant
mother or the fetus by matching the brightness of pixels between a
previously set VOI and a currently set VOI. The motion compensating
unit may be operable to calculate the motion vectors by summing the
absolute differences of brightness of pixels between the previously
set VOI and the currently set VOI. For example, assuming that the
VOI at the n-th frame is expressed as $V^n(m)$, the VOI at the next
frame can be expressed as $V^n(m+1)$, where the variable m runs over
the combination n-1, n and n+1. The motion compensating unit may
shift $V^n(m)$ up, down, right and left by (i, j), and then
calculate the absolute differences of brightness of pixels between
$V^n(m)$ and $V^n(m+1)$ at each position. A
motion vector may be estimated at a position where the absolute
difference is minimal. The sum of the absolute difference is
calculated as the following equation.
$$SAD_n(i,j) = \sum_{m=-1}^{1} \sum_{l=0}^{M-1} \sum_{k=n_T}^{n_B} \bigl\lvert\, V^n(m,k,l) - V^n_{i,j}(m+1,k,l) \,\bigr\rvert,\qquad -W \le i,j < W,\quad 1 \le n < K-1 \tag{8}$$
[0053] wherein W represents a predefined motion estimation range, K
represents the total number of frames, i and j represent motion
displacements, k and l represent the position of a pixel in a frame
included in the VOI, and m represents the frame index within the
VOI.
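A sketch of the SAD-based motion estimation of equation (8), assuming NumPy and VOIs stored as 3-D arrays shaped (frames, rows, cols); the search range W and the use of a circular shift for the displacement are illustrative simplifications.

```python
# Illustrative sketch of equation (8): shift the current VOI over a +/-W
# search range and keep the displacement with the minimal sum of absolute
# brightness differences against the next VOI.
import numpy as np

def estimate_motion_vector(voi_cur, voi_next, w=4):
    """Return the (i, j) displacement with the minimal SAD."""
    best, best_sad = (0, 0), np.inf
    for i in range(-w, w):
        for j in range(-w, w):
            shifted = np.roll(voi_cur, shift=(i, j), axis=(1, 2))  # shift rows/cols
            sad = np.abs(shifted.astype(float) - voi_next).sum()
            if sad < best_sad:
                best, best_sad = (i, j), sad
    return best
```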
[0054] The input unit 140 may receive setup information from a
user. The setup information may include reference frame setup
information, M-mode line setup information and rotation setup
information of a 3-dimensional ultrasound image. The processor 150
may be operable to form an M-mode image signal. The M-mode image
may be formed by using the reconstructed volume data based on the
setup information inputted from a user through the input unit
140.
[0055] The processor 150 may form a reference frame image 410 by
using the reconstructed volume data based on the reference frame
setup information as illustrated in FIG. 14. Then, the processor
150 may extract data corresponding to M-mode lines 421, 422 and
423, which are set on the reference frame, based on the M-mode line
setup information inputted through the input unit 140. The
processor 150 may form M-mode images 431, 432 and 433 corresponding
to the M-mode lines based on the extracted data. Although it is
described that three M-mode lines are set on the reference frame
image for ease of explanation in accordance with one embodiment of
the present invention, the M-mode lines may be arbitrarily set on
the reference frame image in accordance with another
embodiment.
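A minimal sketch of this M-mode extraction, assuming NumPy and that the reconstructed data for a fixed reference plane are stored as an array shaped (time, depth, width); restricting the sketch to vertical M-mode lines given by column indices is an assumption made for simplicity, and the function name is illustrative.

```python
# Illustrative sketch: sample the reconstructed data along each M-mode line
# in every temporal frame and stack the samples over time.
import numpy as np

def m_mode_images(volume_seq, line_columns):
    """Return one (depth x time) M-mode image per M-mode line."""
    images = []
    for col in line_columns:
        samples = volume_seq[:, :, col]      # (time, depth) samples along the line
        images.append(samples.T)             # rows = depth, columns = elapsed time
    return images
```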
[0056] Further, a different reference frame image may be formed
based on reference frame setup information while the previously
formed M-mode images 431, 432 and 433 are displayed. In such a
case, at least one M-mode line may be set on each of the different
reference frame images. Also, at least one M-mode image
corresponding to the M-mode line may be formed by extracting data
from the reconstructed volume data.
[0057] In accordance with another embodiment of the present
invention, reference frame setup information for forming a
plurality of reference frame images may be inputted through the
input unit 140. The processor 150 may form reference frame images
by using the reconstructed volume data based on the reference frame
setup information. At least one M-mode line may be set on each
reference frame image. The processor 150 may extract data
corresponding to the M-mode line set on each reference frame image
from the reconstructed volume data and forms M-mode image signals
for M-mode images based on the extracted data.
[0058] In accordance with yet another embodiment of the present
invention, the processor 150 may form a 3-dimensional ultrasound
image 510 as illustrated in FIG. 15 by using the reconstructed
volume data. M-mode lines 521, 522 and 523 may be set on the
3-dimensional ultrasound image 510 based on M-mode line setup
information inputted from the user through the input unit 140. The
processor 150 may extract data corresponding to the M-mode lines
521, 522 and 523 set on the 3-dimensional ultrasound image and form
M-mode images based on the extracted data. Although the three
M-mode lines are set, the number of the M-mode lines is not limited
thereto. If the rotation setup information is inputted through the
input unit 140, then the processor 150 may be operable to rotate
the 3-dimensional ultrasound image. Thereafter, M-mode lines may be
set on the rotated 3-dimensional ultrasound image and the processor
150 may form M-mode images corresponding to the M-mode lines set on
the rotated 3-dimensional ultrasound image.
[0059] The display unit 160 may receive the reference frame image
signals, the 3-dimensional ultrasound image signals and the M-mode
image signals to display the reference frame image, the
3-dimensional ultrasound image and the M-mode image.
[0060] As mentioned above, the present invention may display M-mode
images for different portions of the target object without moving a
probe.
[0061] In accordance with one embodiment of the present invention,
there is provided an ultrasound system, including: a probe operable
to transmit ultrasound signals to a target object containing a
periodically moving object and receive ultrasound echo signals
reflected from the target object; a volume data forming and
reconstructing unit operable to form volume data based on the
ultrasound echo signals, the volume data forming and reconstructing
unit being configured to determine a beat period of the moving
object and reconstruct the volume data based on the beat period; a
processor operable to form at least one reference image based on the
reconstructed volume data and set a plurality of M-mode lines on
the reference image, the processor being configured to form a
plurality of M-mode images by extracting data corresponding to the
M-mode lines from the reconstructed volume data; and a display unit
to display the reference image, the M-mode lines and the M-mode
images, one at a time, simultaneously or sequentially.
[0062] In accordance with another embodiment of the present
invention, there is provided a method of forming an ultrasound
image, including: a) transmitting ultrasound signals to a target
object containing a periodically moving object and receiving
ultrasound echo signals reflected from the target object; b)
forming volume data based on the ultrasound echo signals; c)
determining a beat period of the moving object and reconstructing
the volume data based on the beat period; d) forming at least one
reference image based on the reconstructed volume data; e) setting
a plurality of M-mode lines on the reference image; f) forming a
plurality of M-mode images by extracting data corresponding to the
plurality of M-mode lines from the reconstructed volume data; and
g) displaying the reference image, the M-mode lines and the M-mode
images, one at a time, simultaneously or sequentially.
[0063] Any reference in this specification to "one embodiment,"
"can embodiment," "example embodiment," etc. means that a
particular feature, structure or characteristic described in
connection with the embodiment is included in at least one
embodiment of the present invention. The appearances of such
phrases in various places in the specification are not necessarily
all referring to the same embodiment. Further, when a particular
feature, structure or characteristic is described in connection
with any embodiment, it is submitted that it is within the purview
of one skilled in the art to effect such feature, structure or
characteristic in connection with other ones of the
embodiments.
[0064] Although embodiments have been described with reference to a
number of illustrative embodiments thereof, it should be understood
that numerous other modifications and embodiments can be devised by
those skilled in the art that will fall within the spirit and scope
of the principles of this disclosure. More particularly, numerous
variations and modifications are possible in the component parts
and/or arrangements of the subject combination arrangement within
the scope of the disclosure, the drawings and the appended claims.
In addition to variations and modifications in the component parts
and/or arrangements, alternative uses will also be apparent to
those skilled in the art.
* * * * *