U.S. patent application number 15/395612 was filed with the patent office on 2016-12-30 and published on 2017-06-15 for incubated state evaluating device, incubated state evaluating method, incubator, and program.
This patent application is currently assigned to NATIONAL UNIVERSITY CORPORATION NAGOYA UNIVERSITY. The applicant listed for this patent is NATIONAL UNIVERSITY CORPORATION NAGOYA UNIVERSITY, NIKON CORPORATION. Invention is credited to Hiroyuki HONDA, Ryuji KATO, Wakana YAMAMOTO.
Application Number | 15/395612 |
Publication Number | 20170166858 |
Document ID | / |
Family ID | 42665320 |
Filed Date | 2016-12-30 |
Publication Date | 2017-06-15 |
United States Patent Application 20170166858
Kind Code | A1 |
HONDA; Hiroyuki; et al.
June 15, 2017 |
INCUBATED STATE EVALUATING DEVICE, INCUBATED STATE EVALUATING
METHOD, INCUBATOR, AND PROGRAM
Abstract
An incubated state evaluating device includes an image reading
unit, a feature value calculating unit, a frequency distribution
calculating unit, and an evaluation information generating unit.
The image reading unit reads a plurality of images in which a
plurality of cells incubated in an incubation container are
image-captured in time series. The feature value calculating unit
obtains each of feature values representing morphological features
of cells from the images for each of the cells included in the
images. The frequency distribution calculating unit obtains each of
frequency distributions of the feature values corresponding to the
respective images. The evaluation information generating unit
generates evaluation information evaluating an incubated state of
cells in the incubation container based on a variation of the
frequency distributions between images.
Inventors: HONDA; Hiroyuki; (Nagoya-shi, JP); KATO; Ryuji; (Nagoya-shi, JP); YAMAMOTO; Wakana; (Kakamigahara-shi, JP)

Applicant:
Name | City | State | Country | Type
NATIONAL UNIVERSITY CORPORATION NAGOYA UNIVERSITY | Nagoya-shi | | JP |
NIKON CORPORATION | Tokyo | | JP |
Assignee: NATIONAL UNIVERSITY CORPORATION NAGOYA UNIVERSITY (Nagoya-shi, JP); NIKON CORPORATION (Tokyo, JP)
Family ID: |
42665320 |
Appl. No.: |
15/395612 |
Filed: |
December 30, 2016 |
Related U.S. Patent Documents

Application Number | Filing Date | Patent Number
13203310 | Dec 2, 2011 | 9567560
PCT/JP2010/001277 | Feb 25, 2010 |
15395612 | |
Current U.S. Class: 1/1
Current CPC Class: C12M 41/46 20130101; C12M 41/14 20130101; C12M 41/36 20130101; C12M 41/48 20130101
International Class: C12M 1/34 20060101 C12M001/34
Foreign Application Data

Date | Code | Application Number
Feb 26, 2009 | JP | 2009-044375
Claims
1. An incubated state evaluating device, comprising: an image
reading unit reading a plurality of images in which a plurality of
cells incubated in an incubation container are image-captured in
time series; a feature value calculating unit obtaining, for each
of the cells included in the images, each of feature values
representing morphological features of the cells from the images; a
frequency distribution calculating unit obtaining each of frequency
distributions of the feature values corresponding to the respective
images.
2. The incubated state evaluating device according to claim 1, further comprising an evaluation information generating unit generating evaluation information evaluating an incubated state of the cells in the incubation container based on a variation of the frequency distributions between the images.
Description
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application is a continuation application of U.S. patent application Ser. No. 13/203,310, filed Dec. 2, 2011, which in turn is a U.S. National Stage application claiming the benefit of prior filed International Application No. PCT/JP2010/001277, filed Feb. 25, 2010, which in turn claims a priority date of Feb. 26, 2009 based on prior filed Japanese Application No. 2009-044375, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
[0002] The present application relates to an incubated state
evaluating device, an incubated state evaluating method, an
incubator, and a program performing evaluation of an incubated
state of cells.
BACKGROUND ART
[0003] A technique for evaluating an incubated state of cells is a basic technology in a wide range of fields, including sophisticated medical fields such as regenerative medicine and the screening of medical products. For example, the regenerative medicine field includes a process of proliferating and differentiating cells in vitro. In the above-stated process, it is essential to accurately evaluate the incubated state of cells to manage results of the differentiation of cells and the presence/absence of canceration and infection of cells. As an example, an evaluation method of cancer cells using a transcription factor as a marker is disclosed in Patent Document 1.
Patent Document 1: Japanese Unexamined Patent Application
Publication No. 2007-195533
DISCLOSURE
Problems to be Solved
[0004] However, the above-stated conventional art requires a pre-process of applying the marker to each cell being an evaluation object in advance, and is therefore very complicated.
[0005] Accordingly, there is still a demand for evaluating the incubated state of cells from an image with high accuracy by a comparatively easy method.
[0006] A proposition of the present application is to provide a means for evaluating an incubated state of cells from an image with high accuracy by a comparatively easy method.
Means for Solving the Problems
[0007] An incubated state evaluating device according to an aspect
includes an image reading unit, a feature value calculating unit, a
frequency distribution calculating unit, and an evaluation
information generating unit. The image reading unit reads a
plurality of images in which a plurality of cells incubated in an
incubation container are image-captured in time series. The feature
value calculating unit obtains each of feature values representing
morphological features of the cells from the images for each of the
cells included in the images. The frequency distribution
calculating unit obtains each of frequency distributions of the
feature values corresponding to the respective images. The
evaluation information generating unit generates evaluation
information evaluating an incubated state of the cells in the
incubation container based on a variation of the frequency
distributions between the images.
[0008] An incubated state evaluating device according to another
aspect includes an image reading unit, a feature value calculating
unit, and an evaluation information generating unit. The image
reading unit reads a plurality of images in which a plurality of
cells incubated in an incubation container are image-captured in
time series. The feature value calculating unit obtains each of
feature values representing morphological features of the cells
from the images for each of the cells included in the images. The
evaluation information generating unit generates evaluation
information predicting a future incubated state of the cells in the
incubation container by using a variation of the feature values
between the plurality of images.
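The units named in the two aspects above share one processing flow: read time-series images, compute per-cell feature values, build frequency distributions, and evaluate their variation. The following is a minimal sketch of that flow, not the claimed implementation; it assumes per-cell feature values stand in for real image processing, and all function names are illustrative.

```python
# Minimal sketch of the three processing units described above.
# Assumption: each "image" is represented by a list of per-cell feature
# values already extracted from it, since segmentation is out of scope here.

def calc_feature_values(images):
    """Feature value calculating unit: one feature value per detected cell."""
    return [list(image) for image in images]

def calc_frequency_distribution(values, bins):
    """Frequency distribution calculating unit: % of cells per division."""
    counts = [0] * (len(bins) - 1)
    for v in values:
        for i in range(len(bins) - 1):
            if bins[i] <= v < bins[i + 1]:
                counts[i] += 1
                break
    total = len(values) or 1
    return [100.0 * c / total for c in counts]

def generate_evaluation(distributions):
    """Evaluation information generating unit: summarize the variation of the
    distributions between images as the sum of absolute frequency changes
    between consecutive time points."""
    variation = 0.0
    for prev, cur in zip(distributions, distributions[1:]):
        variation += sum(abs(a - b) for a, b in zip(prev, cur))
    return variation

# Example: two time-series "images" whose feature histograms shift over time.
images = [[0.2, 0.3, 0.8], [0.7, 0.8, 0.9]]
feats = calc_feature_values(images)
bins = [0.0, 0.5, 1.01]
dists = [calc_frequency_distribution(f, bins) for f in feats]
score = generate_evaluation(dists)
```

A large `score` indicates a large morphological shift between time points, which is the quantity the evaluation information is built from.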
[0009] Note that an incubator incorporating the incubated state
evaluating device, a program configured to cause a computer to
function as the incubated state evaluating device, a program
storage medium, and the one representing operations of the
incubated state evaluating device by a category of method are also
effective as concrete aspects of the present application.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a block diagram illustrating an outline of an
incubator in one embodiment.
[0011] FIG. 2 is a front view of the incubator in the one
embodiment.
[0012] FIG. 3 is a plan view of the incubator in the one
embodiment.
[0013] FIG. 4 is a flowchart illustrating an example of an
observation operation at the incubator.
[0014] FIG. 5 is a flowchart illustrating an example of a
generation process of a computation model.
[0015] FIG. 6 is a view illustrating an outline of each feature
value.
[0016] FIGS. 7(a) to 7(c) are histograms each illustrating an
example of variation over a time lapse of a "Shape Factor", and
FIGS. 7(d) to 7(f) are histograms each illustrating an example of
variation over a time lapse of a "Fiber Length".
[0017] FIGS. 8(a) to 8(c) are views each illustrating a microscopic image obtained by performing a time lapse observation of a normal cell group.
[0018] FIGS. 9(a) to 9(c) are views each illustrating a microscopic image obtained by performing a time lapse observation of a cell group in which cancer cells are mixed in normal cells.
[0019] FIG. 10 is a view illustrating an outline of an ANN.
[0020] FIG. 11 is a view illustrating an outline of an FNN.
[0021] FIG. 12 is a view illustrating a sigmoid function in the
FNN.
[0022] FIG. 13 is a flowchart illustrating an operation example of
an incubated state evaluating process.
[0023] FIGS. 14(a) and 14(b) are views illustrating incubated state
examples of myoblasts.
[0024] FIG. 15 is a histogram illustrating a variation over a time
lapse of a "Shape Factor" at an incubation time of the
myoblasts.
[0025] FIG. 16 is a histogram illustrating a variation over a time
lapse of the "Shape Factor" at the incubation time of the
myoblasts.
[0026] FIG. 17 is a graphic chart illustrating prediction results
of each sample in a first prediction model of an example.
[0027] FIG. 18 is a graphic chart illustrating prediction results
of each sample in a second prediction model of an example.
DETAILED DESCRIPTION OF THE EMBODIMENT
Configuration Example of One Embodiment
[0028] Hereinafter, a configuration example of an incubator
including an incubated state evaluating device according to one
embodiment is described with reference to the drawings. FIG. 1 is a
block diagram illustrating an outline of the incubator according to
the one embodiment. Besides, FIG. 2 and FIG. 3 are a front view and
a plan view of the incubator in the one embodiment.
[0029] An incubator 11 according to the one embodiment includes an
upper casing 12 and a lower casing 13. The upper casing 12 is
placed on the lower casing 13 under an assembled state of the
incubator 11. Note that an inner space between the upper casing 12
and the lower casing 13 is divided into upper and lower parts by a
base plate 14.
[0030] At first, an outline of a configuration of the upper casing
12 is described. A temperature-controlled room 15 incubating cells
is formed inside the upper casing 12. The temperature-controlled
room 15 includes a temperature regulator 15a and a humidity
regulator 15b, and an inside of the temperature-controlled room 15
is maintained to be an environment suitable for the incubation of
cells (for example, an atmosphere at a temperature of 37° C.
and with a humidity of 90%) (Note that the temperature regulator
15a and the humidity regulator 15b are not illustrated in FIG. 2
and FIG. 3).
[0031] A large door 16, a middle door 17, and a small door 18 are
provided at a front surface of the temperature-controlled room 15.
The large door 16 covers front surfaces of the upper casing 12 and
the lower casing 13. The middle door 17 covers the front surface of
the upper casing 12, and isolates environments between the
temperature-controlled room 15 and outside when the large door 16
is opened. The small door 18 is a door to carry in/out an
incubation container 19 incubating cells, and attached to the
middle door 17. It becomes possible to suppress an environmental
change of the temperature-controlled room 15 by performing the
carrying in/out of the incubation container 19 from the small door
18. Note that airtightnesses of the large door 16, the middle door
17, and the small door 18 are respectively maintained by packings
P1, P2 and P3.
[0032] Besides, a stocker 21, an observation unit 22, a container conveyor 23, and a conveyor table 24 are disposed at the temperature-controlled room 15. Here, the conveyor table 24 is disposed at a near side of the small door 18, to carry the incubation container 19 in/out through the small door 18.
[0033] The stocker 21 is disposed at a left side of the
temperature-controlled room 15 when it is seen from the front
surface of the upper casing 12 (a lower side in FIG. 3). The
stocker 21 includes plural shelves, and plural incubation
containers 19 are able to be housed in respective shelves of the
stocker 21. Note that cells being the incubation objects are housed
in each of the incubation containers 19 together with a culture
medium.
[0034] The observation unit 22 is disposed at a right side of the
temperature-controlled room 15 when it is seen from the front
surface of the upper casing 12. This observation unit 22 is able to
execute a time lapse observation of cells inside the incubation
container 19.
[0035] Here, the observation unit 22 is disposed by being fitted into an opening of the base plate 14 of the upper casing 12. The observation unit 22 includes a sample stage 31, a stand arm 32 projected upward of the sample stage 31, and a main body part 33 housing a microscopic optical system for phase difference observation and an imaging device 34. The sample stage 31 and the stand arm 32 are disposed at the temperature-controlled room 15, while the main body part 33 is housed inside the lower
[0036] The sample stage 31 is made up of a light transmissive material, and the incubation container 19 is able to be placed thereon. The sample stage 31 is made up to be able to move in a horizontal direction, and a position of the incubation container 19 placed at its upper surface can be adjusted. Besides, an LED light source 38 is housed in the stand arm 32. The imaging device 34 is able to acquire a microscopic image of the cells by capturing images of the cells in the incubation container 19 transilluminated from an upper side of the sample stage 31 by the stand arm 32, via the microscopic optical system.
[0037] The container conveyor 23 is disposed at a center of the
temperature-controlled room 15 when it is seen from the front
surface of the upper casing 12. The container conveyor 23 performs
a transfer of the incubation container 19 among the stocker 21, the
sample stage 31 of the observation unit 22, and the conveyor table
24.
[0038] As illustrated in FIG. 3, the container conveyor 23 includes
a vertical robot 34 having an articulated arm, a rotation stage 35,
a mini stage 36, and an arm part 37. The rotation stage 35 is attached to a tip portion of the vertical robot 34 via a rotation shaft 35a to be able to rotate by 180 degrees in a horizontal direction. It is therefore possible for the rotation stage 35 to direct the arm part 37 toward each of the stocker 21, the sample stage 31, and the conveyor table 24.
[0039] Besides, the mini stage 36 is attached to be able to slide
in the horizontal direction relative to the rotation stage 35. The
arm part 37 gripping the incubation container 19 is attached to the
mini stage 36.
[0040] Next, an outline of a configuration of the lower casing 13
is described. The main body part 33 of the observation unit 22 and
a control device 41 of the incubator 11 are housed inside the lower
casing 13.
[0041] The control device 41 is coupled to each of the temperature
regulator 15a, the humidity regulator 15b, the observation unit 22,
and the container conveyor 23. The control device 41 totally
controls each part of the incubator 11 in accordance with a
predetermined program.
[0042] As an example, the control device 41 maintains the inside of the temperature-controlled room 15 at a predetermined environmental condition by controlling each of the temperature regulator 15a and the humidity regulator 15b. Besides, the control device 41 controls
the observation unit 22 and the container conveyor 23 based on a
predetermined observation schedule, and automatically executes an
observation sequence of the incubation container 19. Further, the
control device 41 executes an incubated state evaluating process
performing evaluation of the incubated state of cells based on the
images acquired by the observation sequence.
[0043] Here, the control device 41 includes a CPU 42 and a storage
unit 43. The CPU 42 is a processor executing various calculation
processes of the control device 41. Besides, the CPU 42 functions
as each of a feature value calculating unit 44, a frequency
distribution calculating unit 45, and an evaluation information
generating unit 46 by the execution of the program (note that
operations of the feature value calculating unit 44, the frequency
distribution calculating unit 45, and the evaluation information
generating unit 46 are described later).
[0044] The storage unit 43 is made up of nonvolatile storage media such as a hard disk, a flash memory, and so on. Management data relating to each incubation container 19 housed at the stocker 21 and data of the microscopic images captured by the imaging device 34 are stored at the storage unit 43. Further, the programs executed by the CPU 42 are stored at the storage unit 43.
[0045] Note that (a) index data representing individual incubation
containers 19, (b) housed positions of the incubation containers 19
at the stocker 21, (c) kinds and shapes (well plate, dish, flask,
and so on) of the incubation containers 19, (d) kinds of cells
incubated at the incubation container 19, (e) the observation
schedule of the incubation container 19, (f) imaging conditions at
the time lapse observation time (a magnification of an objective
lens, observation points inside the container, and so on), or the
like are included in the above-stated management data. Besides, the
management data are generated by each small container as for the
incubation container 19 capable of simultaneously incubating cells
in plural small containers such as the well plate.
Example of Observation Operation in One Embodiment
[0046] Next, an example of the observation operations at the incubator 11 in the one embodiment is described with reference to a flowchart in FIG. 4. FIG. 4 illustrates an operation example in
which the time lapse observation of the incubation container 19
carried into the temperature-controlled room 15 is performed in
accordance with a registered observation schedule.
[0047] Step S101: The CPU 42 judges whether or not an observation
start time of the incubation container 19 has arrived by comparing the
observation schedule of the management data of the storage unit 43
and a current date and time. When it is the observation start time
(YES side), the CPU 42 transfers the process to S102. On the other
hand, when it is not the observation time of the incubation
container 19 (NO side), the CPU 42 waits until the next observation
schedule time.
[0048] Step S102: The CPU 42 instructs the container conveyor 23 to
convey the incubation container 19 corresponding to the observation
schedule. The container conveyor 23 carries out the indicated
incubation container 19 from the stocker 21 and places on the
sample stage 31 of the observation unit 22. Note that an entire
observation image of the incubation container 19 is captured by a
bird view camera (not-illustrated) housed in the stand arm 32 at a
phase When the incubation container 19 is placed on the sample
stage 31.
[0049] Step S103: The CPU 42 instructs the observation unit 22 to capture a microscopic image of the cells. The observation unit 22 illuminates the incubation container 19 by lighting the LED light source 38, and captures a microscopic image of the cells inside the incubation container 19 by driving the imaging device 34.
[0050] At this time, the imaging device 34 captures the microscopic image based on the imaging conditions (the magnification of the objective lens, the observation points inside the container) specified by a user based on the management data stored at the storage unit 43. For example, when plural points inside the incubation container 19 are observed, the observation unit 22 sequentially adjusts the position of the incubation container 19 by the drive of the sample stage 31, to capture a microscopic image at each point. Note that the data of the microscopic images acquired at the S103 are read into the control device 41, and stored to the storage unit 43 by the control of the CPU 42.
[0051] Step S104: The CPU 42 instructs the container conveyor 23 to
convey the incubation container 19 after a completion of the
observation schedule. The container conveyor 23 conveys the
indicated incubation container 19 from the sample stage 31 of the
observation unit 22 to a predetermined housing position of the
stocker 21. After that, the CPU 42 finishes the observation
sequence to return the process to the S101. The description of the
flowchart in FIG. 4 is finished.
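The schedule check in step S101 (observe when the start time has arrived, otherwise wait until the next scheduled time) can be sketched as follows. This is only an illustrative stand-in for the control program, with hypothetical container IDs and a hypothetical `schedule` structure.

```python
from datetime import datetime, timedelta

def next_action(schedule, now):
    """Sketch of step S101: pick the earliest-scheduled container.

    Returns ("observe", container_id) when its start time has arrived,
    otherwise ("wait", remaining_time) until the next schedule time.
    `schedule` maps container IDs to their next observation datetimes.
    """
    container, start = min(schedule.items(), key=lambda kv: kv[1])
    if now >= start:
        return ("observe", container)   # proceed to S102-S104
    return ("wait", start - now)        # wait for the next schedule time

# Hypothetical observation schedule for two containers.
schedule = {"dish-07": datetime(2010, 2, 25, 8, 0),
            "flask-02": datetime(2010, 2, 25, 12, 0)}
action = next_action(schedule, datetime(2010, 2, 25, 9, 0))
```

At 09:00 the 08:00 observation of "dish-07" is due, so the sketch returns an observe action for it.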
Incubated State Evaluating Process in One Embodiment
[0052] Next, an example of the incubated state evaluating process in the one embodiment is described. In the one embodiment, an example in which the control device 41 estimates a mixed ratio of cancer cells in incubated cells of the incubation container 19 by using plural microscopic images acquired by performing the time lapse observation of the incubation container 19 is described.
[0053] The control device 41 in the incubated state evaluating process finds frequency distributions of feature values representing morphological features of cells from the above-stated microscopic images. The control device 41 generates evaluation information in which the mixed ratio of cancer cells is estimated based on a variation over a time lapse of the frequency distributions. Note that the control device 41 generates the evaluation information by applying the acquired frequency distributions to a computation model generated in advance by supervised learning.
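Applying frequency-distribution features to a pre-trained model can be sketched as below. This is a drastic simplification of the ANN/FNN computation models of FIGS. 10 to 12: a single sigmoid neuron with hypothetical weights, meant only to show the "apply learned model to distribution features" step, not the patented model.

```python
import math

def sigmoid(x):
    # The sigmoid activation referenced for the FNN (FIG. 12).
    return 1.0 / (1.0 + math.exp(-x))

def estimate_mixed_ratio(features, weights, bias):
    """Apply a pre-trained single-neuron model to frequency-distribution
    features; returns an estimated mixed ratio of cancer cells in [0, 1].
    A stand-in for the full ANN/FNN, with hypothetical weights."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return sigmoid(z)

# Hypothetical weights assumed to have been learned by supervised learning.
ratio = estimate_mixed_ratio([0.3, -1.2, 0.8], [0.5, 0.1, -0.4], 0.0)
```

In the actual device the weights and network structure come from the supervised learning described in the generation process below; only the inference step is shown here.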
Example of Generation Process of Computation Model
[0054] Hereinafter, an example of a generation process of the
computation model being a pre-process of the incubated state
evaluating process is described with reference to a flowchart in
FIG. 5. In the generation process of the computation model, the control device 41 determines a combination of the frequency distributions used for the generation of the evaluation information from among plural combinations of frequency distributions which differ from one another in the photographing time of the image and in the kind of the feature value.
[0055] In the example in FIG. 5, a microscopic image group of samples is prepared in advance by performing, with the incubator 11, the time lapse observation of the incubation container 19 in which a cell group mixed with cancer cells is incubated, at the same view field with the same photographing condition. Note that for the microscopic images of the samples, the total number of the cells and the number of cancer cells included in each image are each known not from the image but by being experimentally counted.
[0056] In the example in FIG. 5, the time lapse observation of the incubation container 19 is performed every eight hours until 72 hours have elapsed, with the time at which eight hours have elapsed from the incubation start set as the first time. Accordingly, in the example in FIG. 5, nine microscopic images (8 h, 16 h, 24 h, 32 h, 40 h, 48 h, 56 h, 64 h, 72 h) of the sample, which share a common incubation container 19 and observation point, are acquired as one set. Note that in the example in FIG. 5, the microscopic images of the samples are prepared for plural sets by performing the time lapse observation of the plural incubation containers 19 respectively. Besides, plural microscopic images photographing plural points (for example, a five-point observation or the whole of the incubation container 19) of the same incubation container 19 at the same observation time zone may be treated as an image for one time of the time lapse observation.
[0057] Step S201: The CPU 42 reads the data of the microscopic images of the samples prepared in advance from the storage unit 43. Note that the CPU 42 in the S201 also acquires the information representing the total number of cells and the number of cancer cells corresponding to each image at this time.
[0058] Step S202: The CPU 42 specifies the image to be a process object from among the microscopic images of the samples (S201). Here, the CPU 42 at the S202 sequentially specifies all of the microscopic images of the samples prepared in advance as the process objects.
[0059] Step S203: The feature value calculating unit 44 extracts the cells included in the image as for the microscopic image being the process object (S202). For example, when the cells are captured by a phase contrast microscope, a halo appears at a periphery of a portion of which change of the phase difference is large, such as a cell wall. Accordingly, the feature value calculating unit 44 extracts the halo corresponding to the cell wall by a publicly known edge extracting method, and estimates that a closed space surrounded by an edge found by a contour tracing process is a cell. It is thereby possible to extract individual cells from the microscopic image.
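The edge extraction and contour tracing are described only as publicly known methods. As an illustrative stand-in, the sketch below extracts cell regions from a binary mask by connected-component labeling, assuming that thresholding the halo has already produced the mask; it is not the patented segmentation.

```python
from collections import deque

def extract_cells(mask):
    """Label 4-connected regions of a binary mask (1 = cell pixel).

    Stand-in for the edge extraction + contour tracing described above;
    returns a list of pixel-coordinate lists, one per detected cell.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    cells = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Breadth-first flood fill collecting one cell region.
                region, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    region.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cells.append(region)
    return cells

# Toy mask with two separate cell regions.
mask = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
]
cells = extract_cells(mask)
```

Each returned region's pixel list is what the per-cell feature value computations of step S204 would operate on.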
[0060] Step S204: The feature value calculating unit 44 finds each
of feature values representing morphological features of the cell
as for each cell extracted from the image at the S203. The feature
value calculating unit 44 at the S204 finds the following 16 kinds
of feature values respectively as for each cell.
[0061] Total Area (Refer to FIG. 6(a))
[0062] A "total area" is a value representing an area of a focused
cell. For example, the feature value calculating unit 44 is able to
find the value of the "total area" based on the number of pixels of
a region of the focused cell.
[0063] Hole Area (Refer to FIG. 6(b))
[0064] A "hole area" is a value representing an area of a "hole" in
the focused cell. Here, the "hole" means a part in which intensity
of image in the cell is a threshold value or more by a contrast (a
place to be a near white state in the phase difference
observation). For example, a lysosome of a cell organelle (the
lysosome was confirmed by staining later but not in the actual
image) and so on are detected as the "hole". Besides, a cell
nucleus and the other cell organelle may be detected as the "hole"
depending on the image. Note that the feature value calculating
unit 44 detects a group of pixels of which luminance value in the
cell is the threshold value or more as the "hole", and may find the
value of the "hole area" based on the number of pixels of the
"hole".
[0065] Relative Hole Area (Refer to FIG. 6(c))
[0066] A "relative hole area" is a value in which the value of the
"hole area" is divided by the value of the "total area" (relative
hole area = hole area/total area). The "relative hole area" is a
parameter representing a percentage of the cell organelle in a size
of the cell, and the value varies in accordance with, for example,
a hypertrophy of the cell organelle, deterioration of a shape of a
nucleus, and so on.
[0067] Perimeter (Refer to FIG. 6(d))
[0068] A "perimeter" is a value representing a length of an outer
periphery of the focused cell. For example, the feature value
calculating unit 44 is able to acquire the value of the "perimeter"
by the contour tracing process when the cell is extracted.
[0069] Width (Refer to FIG. 6(e))
[0070] A "width" is a value representing a length in an image
lateral direction (X direction) of the focused cell.
[0071] Height (Refer to FIG. 6(f))
[0072] A "height" is a value representing a length in an image
vertical direction (Y direction) of the focused cell.
[0073] Length (Refer to FIG. 6(g))
[0074] A "length" is a value representing a maximum value among
lines getting across the focused cell (an entire length of the
cell).
[0075] Breadth (Refer to FIG. 6(h))
[0076] A "breadth" is a value representing a maximum value among
lines orthogonal to the "length" (a width of the cell).
[0077] Fiber Length (Refer to FIG. 6(i))
[0078] A "fiber length" is a value representing a length when the
focused cell is artificially assumed to be liner. The feature value
calculating unit 44 finds the value of the "fiber length" by the
following expression (1).
[Expression 1]
Fiber Length=1/4(P+ {square root over (P.sup.2-16A)}) (1)
[0079] Note that in the expression in the present specification, a
character "P" represents the value of the "perimeter". Similarly, a
character "A" represents the value of the "total area".
[0080] Fiber Breadth (Refer to FIG. 6(j))

[0081] A "fiber breadth" is a value representing a width (a length in a direction orthogonal to the "fiber length") when the focused cell is artificially assumed to be linear. The feature value calculating unit 44 finds the value of the "fiber breadth" by the following expression (2).

[Expression 2]

Fiber Breadth = (1/4)(P - √(P^2 - 16A))  (2)
[0082] Shape Factor (Refer to FIG. 6(k))
[0083] A "Shape factor" is a value representing a circular degree
(roundness of the cell) of the focused cell. The feature value
calculating unit 44 finds the value of the "shape factor" by the
following expression (3).
[Expression 3]

Shape Factor = 4πA / P^2  (3)
[0084] Elliptical form Factor (Refer to FIG. 6(l))
[0085] An "elliptical form factor" is a value in which the value of
the "length" is divided by the value of the "breadth" (Elliptical
form factor=length/breadth), and is a parameter representing a
degree of slenderness of the focused cell.
[0086] Inner Radius (Refer to FIG. 6(m))
[0087] An "inner radius" is a value representing a radius of an
incircle of the focused cell.
[0088] Outer Radius (Refer to FIG. 6(n))
[0089] An "outer radius" is a value representing a radius of a
circumcircle of the focused cell.
[0090] Mean Radius (Refer to FIG. 6(o))
[0091] A "mean radius" is a value representing an average distance
between all points making up a contour of the focused cell and a
gravity center point thereof.
[0092] Equivalent Radius (Refer to FIG. 6(p))
[0093] An "equivalent radius" is a value representing a radius of a
circle having the same area as the focused cell. A parameter of
the "equivalent radius" represents a size when the focused cell is
virtually approximated to a circle.
[0094] Here, the feature value calculating unit 44 may find each of the above-stated feature values by adding error amounts to the number of pixels corresponding to the cell. At this time, the feature value calculating unit 44 may find the feature values in consideration of the photographing conditions (a photographing magnification, an aberration of the microscopic optical system, and so on) of the microscopic image. Note that the feature value calculating unit 44 may find the gravity center point of each cell based on a publicly known gravity calculation method, and may find each parameter based on the gravity center point when the "inner radius", the "outer radius", the "mean radius", and the "equivalent radius" are found.
[0095] Step S205: The feature value calculating unit 44 records each of the 16 kinds of feature values of each cell (S204) to the storage unit 43 as for the microscopic image being the process object (S202).
[0096] Step S206: The CPU 42 judges whether or not all of the
features of cell morphologies obtained from microscopic images are
already processed (a state in which feature values of each cell are
already acquired in the features of cell morphologies obtained from
microscopic images of all of the samples). When the above-stated
requirement is satisfied (YES side), the CPU 42 transfers the
process to S207. On the other hand, when the above-stated
requirement is not satisfied (NO side), the CPU 42 returns to the
S202, and repeats the above-stated operations while setting the
other features of cell morphologies obtained from microscopic
images which are not processed as the process objects.
[0097] Step S207: The frequency distribution calculating unit 45
finds a frequency distribution of the feature value, by each kind
of feature value, as for each of the features of cell morphology
obtained from a microscopic image. Accordingly, the frequency
distribution calculating unit 45 at the S207 finds the frequency
distributions of the 16 kinds of feature values for the features of
cell morphologies obtained from the microscopic images acquired by
one time observation. Besides, the number of cells corresponding to
each division of the feature values is found as a frequency (%) in
each frequency distribution.
[0098] Besides, the frequency distribution calculating unit 45 at
the S207 normalizes the division of the frequency in the
above-stated frequency distribution by using a standard deviation
by each kind of the feature value. Here, a case when the division
at the frequency distribution of the "shape factor" is determined
is described as an example.
[0099] At first, the frequency distribution calculating unit 45
calculates the standard deviation of all of the values of the
"shape factor" found from each of the features of cell morphologies
obtained from microscopic images. Next, the frequency distribution
calculating unit 45 substitutes the value of the standard deviation
into an expression of Fisher, to find a reference value of the
division of the frequency in the frequency distribution of the
"shape factor". At this time, the frequency distribution
calculating unit 45 divides the standard deviation (S) of all the
values of the "shape factor" by four, and rounds the result off at
the third decimal place to set it as the reference value. Note that the
frequency distribution calculating unit 45 of the one embodiment
plots divisions for 20 series on a monitor and so on, when the
frequency distribution is illustrated as a histogram.
[0100] As an example, when the standard deviation S of the "shape
factor" is 259, "64.75" becomes the reference value because
259/4=64.750. When the frequency distribution of the "shape factor"
of the focused image is found, the frequency distribution
calculating unit 45 classifies the cells into each class set by
every 64.75 from "0" (zero) value in accordance with the value of
the "shape factor", and the number of cells in each class is
counted.
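The binning rule described above can be sketched as follows. This is an illustrative reading of the text (divide S by four, round off at the third decimal place, then count cells into classes of that width starting from zero), not code from the specification:

```python
def reference_value(std_dev):
    # S / 4, rounded off at the third decimal place (i.e., to two decimals)
    return round(std_dev / 4.0, 2)

def frequency_distribution(feature_values, class_width, num_classes=20):
    # Classify cells into classes of width `class_width` starting from 0
    # and count the number of cells falling in each class.
    counts = [0] * num_classes
    for v in feature_values:
        idx = min(int(v // class_width), num_classes - 1)  # clip overflow
        counts[idx] += 1
    return counts
```

With the worked example from the text, `reference_value(259)` yields the class width 64.75.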
[0101] As stated above, the frequency distribution calculating unit
45 normalizes the division of the frequency distribution with the
standard deviation by each kind of the feature value, and
therefore, it is possible to roughly align the tendencies of the
frequency distributions between the different feature values.
Accordingly, it is comparatively easy in the one embodiment to find
a correlation between the incubated state of cells and the
variation of the frequency distribution for the different feature
values.
[0102] Here, FIG. 7(a) is a histogram illustrating a variation over
a time lapse of the "shape factor" when an initial mixture ratio of
the cancer cell is "0" (zero) %. FIG. 7(b) is a histogram
illustrating a variation over a time lapse of the "shape factor"
when the initial mixture ratio of the cancer cell is 6.7%. FIG.
7(c) is a histogram illustrating a variation over a time lapse of
the "shape factor" when the initial mixture ratio of the cancer
cell is 25%.
[0103] Besides, FIG. 7(d) is a histogram illustrating a variation
over a time lapse of the "fiber length" when the initial mixture
ratio of the cancer cell is "0" (zero) %. Note that the histogram
is illustrated only up to a value of "fiber length=323" for ease of
understanding in the drawing. FIG. 7(e) is a histogram illustrating
a variation over a time lapse of the "fiber length" when the
initial mixture ratio of the cancer cell is 6.7%. FIG. 7(f) is a
histogram illustrating a variation over a time lapse of the "fiber
length" when the initial mixture ratio of the cancer cell is
25%.
[0104] Step S208: The evaluation information generating unit 46
finds the variation over the time lapse of the frequency
distribution by each kind of feature value.
[0105] The evaluation information generating unit 46 at the S208
combines, for one set, two frequency distributions of which kinds
of feature values are the same and photographing times are
different among the frequency distributions (9 × 16) acquired from
the features of cell morphologies obtained from the microscopic
images.
an example, the evaluation information generating unit 46
respectively combines the frequency distributions at eight hours
elapsed and 16 hours elapsed, the frequency distributions at eight
hours elapsed and 24 hours elapsed, the frequency distributions at
eight hours elapsed and 32 hours elapsed, the frequency
distributions at eight hours elapsed and 40 hours elapsed, the
frequency distributions at eight hours elapsed and 48 hours
elapsed, the frequency distributions at eight hours elapsed and 56
hours elapsed, the frequency distributions at eight hours elapsed
and 64 hours elapsed, the frequency distributions at eight hours
elapsed and 72 hours elapsed of which kinds of feature values are
the same and photographing times are different as for nine
frequency distributions. Namely, when the feature value of one kind
in one set is focused on, eight combinations in total are generated
for the frequency distributions of the feature value.
[0106] The evaluation information generating unit 46 finds a
variation of the frequency distribution (an absolute value of the
difference of the frequency distributions between images is
integrated) by the following expression (4) as for each of the
eight kinds of combinations.
[Expression 4]

variation of frequency distribution = Σ_i |Control_i − Sample_i| (4)
[0107] Note that in the expression (4), the "control" represents a
frequency (the number of cells) for one division at the frequency
distribution in an initial state (at eight hours elapsed). Besides,
the "sample" represents the frequency for one division at a
frequency distribution being a comparison object. Further, "i" is a
variable representing a division of the :frequency
distribution,
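Expression (4) reduces to a sum of absolute frequency differences over the divisions. A direct sketch, assuming the two distributions are given as equal-length lists of per-division frequencies:

```python
def distribution_variation(control, sample):
    # Expression (4): integrate |Control_i - Sample_i| over every
    # division i of the two frequency distributions being compared.
    return sum(abs(c - s) for c, s in zip(control, sample))
```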
[0108] The evaluation information generating unit 46 is able to
acquire eight ways of variations of the frequency distributions as
for each of all of the feature values by performing the
above-stated calculation by each kind of the feature value. Namely,
it is possible to obtain 16 kinds × 8 ways = 128 combinations of
the frequency distributions as for the features of cell
morphologies obtained from the microscopic images for one set.
Hereinafter, one
combination of the frequency distributions is represented only as
an "index" in the present specification. Note that it goes without
saying that the evaluation information generating unit 46 at the
S208 finds each of the variations of the 128 frequency
distributions corresponding to respective indexes in the plural
sets of the features of cell morphologies obtained from microscopic
images.
[0109] Here, a reason focusing on the variation over the time lapse
of the frequency distribution in the one embodiment is described.
FIGS. 8 each illustrate the features of cell morphology obtained
from a microscopic image when a normal cell group (the initial
mixture ratio of cancer cell is "0" (zero) %) is incubated in the
incubator 11, and the time lapse observation is performed. Note
that the histograms illustrating the frequency distributions of the
"shape factor" found from the respective features of cell
morphologies obtained from the microscopic images in FIG. 8 are
illustrated in FIG. 7(a).
[0110] In this case, the number of cells increases in accordance
with the time lapse in each of the images in FIG. 8, but the
frequency distributions corresponding to each of the images
maintain almost the same shape in the histograms of the "shape
factor" illustrated in FIG. 7(a).
[0111] On the other hand, FIGS. 9 each illustrate the features of
cell morphology obtained from a microscopic image when the normal
cell group in which the cancer cells are mixed in advance at 25% is
incubated in the incubator 11, and the time lapse observation is
performed. Note that the histograms representing the frequency
distributions of the "shape factor" found from the respective
features of cell morphologies obtained from the microscopic images
in FIG. 9 are illustrated in FIG. 7(c).
[0112] In this case, a ratio of the cancer cells (rounded cells) of
which shapes are different from the normal cell increases in
accordance with the time lapse in each of the images in FIG. 9.
Accordingly, a large variation appears in the shapes of the
frequency distributions corresponding to the respective images in
accordance with the time lapse in the histograms of the "shape
factor" illustrated in FIG. 7(c). It turns out that the variation
over the time lapse of the frequency distribution is strongly
correlated with the mixture of the cancer cells. Based on the
above-stated knowledge, the inventors evaluate the incubated state
of cells from the information of the variation over the time lapse
of the frequency distribution.
[0113] Step S209: The evaluation information generating unit 46
specifies one or more of indexes properly reflecting the incubated
state of cells by a multivariate analysis from among the 128
indexes. In the selection of the combination of the indexes, a
usage of a linear model, in addition to a nonlinear model such as a
later-described fuzzy neural network, is effective depending on the
cells and the complexity of their morphology. The
evaluation information generating unit 46 finds a computation model
deriving the number of cancer cells from the features of cell
morphologies obtained from microscopic images by using the
above-stated specified one or more indexes together with the
selection of the combination of indexes as stated above.
[0114] Here, the evaluation information generating unit 46 at the
S209 finds the computation model by a Fuzzy Neural Network (FNN)
analysis.
[0115] The FNN is a method combining an Artificial Neural Network
(ANN) and a fuzzy inference. In the FNN, the decision of a
membership function is performed automatically by incorporating the
ANN into the fuzzy inference, so as to avoid the portion in which
the decision depends on a human being, which is a defect of the
fuzzy inference. The ANN, being one of the learning machines (refer
to FIG. 10), is a mathematical model of the neural network in the
brain of a living human body, and has the characteristics described
below.
The learning in the ANN is a process in which a model is built such
that an output value approximates to a supervisor value by changing
a coupling load in a circuit coupling between nodes (represented by
circles in FIG. 10), by a back propagation (BP) method using
learning data (input value: X) having an objective output value
(supervisor value), so that the error between the supervisor value
and the output value (Y) becomes small. According to this BP
method, it is possible for the ANN to automatically acquire
knowledge by the learning. It is possible to evaluate a general
versatility of the model by finally inputting data which is not
used for the learning. Conventionally, the decision of the
membership function is dependent on a human sense, but it becomes
possible to identify the membership function automatically by
incorporating the ANN as stated above into the fuzzy inference.
This is the FNN. In the FNN, the BP method is used, and thereby it
is possible to automatically identify and model an input/output
relationship given to a network by changing the coupling loads, the
same as in the ANN. The FNN has a characteristic in which knowledge
can be acquired, by analyzing the model after the learning, as a
linguistic rule which is easy for a human being to understand, as
in the fuzzy inference (refer to a dialog balloon at a lower right
in FIG. 11 as an example). Namely, the FNN automatically determines
an optimum combination of the fuzzy inference among combinations of
variables, such as numeric values representing the morphological
features of the cells, from its structure and features, and it is
possible to simultaneously perform an estimation relating to a
prediction target and a generation of rules representing the
combinations of the indexes effective for the prediction.
[0116] The structure of the FNN is made up of four layers of an
"input layer", a "membership function part (antecedent part)"
determining parameters We, Wg included in a sigmoid function, a
"fuzzy rule part (consequent part)" capable of determining Wf and
extracting a relationship between an input and an output as a rule,
and an "output layer" (refer to FIG. 11). The coupling loads
determining the model structure of the FNN are We, Wg, and Wf. The
coupling load We determines a center position of the sigmoid
function used for the membership function, and the Wg determines a
gradient at the center position (refer to FIG. 12). Within the
model, an input value is expressed with flexibility near the human
sense by a fuzzy function (refer to a dialog balloon at a lower
left in FIG. 11 as an example). The coupling load Wf represents a
contribution of each fuzzy area for an estimation result, and a
fuzzy rule can be derived from the Wf. Namely, the structure inside
the model can be decoded afterward, and it can be written as the
rule (refer to a dialog balloon at a lower right in FIG. 11 as an
example).
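The roles of We and Wg can be illustrated with a plain sigmoid. This is a sketch of the membership function alone, not of the full four-layer network of FIG. 11:

```python
import math

def membership(x, we, wg):
    # Sigmoid membership function: We sets the center position,
    # Wg sets the gradient (steepness) at that center.
    return 1.0 / (1.0 + math.exp(-wg * (x - we)))
```

At x = We the membership value is 0.5, and a larger Wg makes the transition around the center sharper.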
[0117] The Wf value, being one of the coupling loads, is used to
create the fuzzy rule in the FNN analysis. When the Wf value is a
positive and large value, the unit makes a large contribution to
being judged as "efficient for the prediction", and the index fit
to the rule is judged to be "effective". When the Wf value is a
negative and small value, the unit makes a large contribution to
being judged as "not efficient for the prediction", and the index
fit to the rule is judged to be "not effective".
[0118] As a more concrete example, the evaluation information
generating unit 46 at the S209 finds the above-stated computation
model by the processes of the following (A) to (H).
[0119] (A) The evaluation information generating unit 46 selects
one index from among the 128 indexes.
[0120] (B) The evaluation information generating unit 46 finds the
number of cancer cells (prediction value) in a set of respective
features of cell morphologies obtained from the microscopic images
by a calculation expression in which the variation of the frequency
distribution by the index selected at the (A) is set as a
variable.
[0121] A calculation expression to find the number of the cancer
cells from one index is assumed to be "Y=αX1" (note that the "Y" is
a calculated value of the cancer cells (for example, a value
representing an increased number of the cancer cells), the "X1" is
the variation of the frequency distribution corresponding to the
selected index (the one found at the S208), and the "α" is a
coefficient value corresponding to the "X1", respectively). At this
time, the evaluation information generating unit 46 substitutes an
arbitrary value into the "α", and substitutes each variation of the
frequency distribution in each set into the "X1". The calculated
value (Y) of the cancer cells in each set is thereby found.
[0122] (C) The evaluation information generating unit 46 finds each
error between the calculated value Y found at the (B) and the
actual number of cancer cells (supervisor value) as for each set of
the features of cell morphologies obtained from microscopic images.
Note that the supervisor value is found by the evaluation
information generating unit 46 based on the information of the
number of cancer cells read at the S201.
[0123] The evaluation information generating unit 46 corrects the
coefficient "α" of the calculation expression by supervised
learning such that the error of the calculated value at each set of
the features of cell morphologies obtained from the microscopic
images becomes smaller.
[0124] (D) The evaluation information generating unit 46 repeats
the processes of the (B) and the (C), and acquires a model of the
calculation expression in which an average error of the calculated
values becomes the smallest as for the index of the (A).
[0125] (E) The evaluation information generating unit 46 repeats
respective processes of the (A) to the (D) as for each of the 128
indexes. The evaluation information generating unit 46 compares the
average errors of the calculated values in each of the 128 indexes,
and sets the index of which average error becomes the lowest to be
a first index used for the generation of the evaluation
information.
[0126] (F) The evaluation information generating unit 46 finds a
second index to be combined with the first index found at the (E).
At this time, the evaluation information generating unit 46 pairs
the first index with the remaining 127 indexes one by one. Next,
the evaluation information generating unit 46 finds a prediction
error of the cancer cells by a calculation expression in each
pair.
[0127] A calculation expression to find the number of the cancer
cells from two indexes is assumed to be "Y=αX1+βX2" (note that the
"Y" represents the calculated value of the cancer cells, the "X1"
represents the variation of the frequency distribution
corresponding to the first index, the "α" represents a coefficient
value corresponding to the "X1", the "X2" is a variation of a
frequency distribution corresponding to a selected index, and the
"β" is a coefficient value corresponding to the "X2",
respectively). The evaluation information generating unit 46 finds
the values of the coefficients "α", "β" such that the average error
of the calculated values is the smallest by processes similar to
the (B) and the (C).
[0128] After that, the evaluation information generating unit 46
compares the average errors of the calculated values found for the
respective pairs, and finds the pair of which the average error is
the lowest. The evaluation information generating unit 46 sets the
indexes of the pair of which the average error is the lowest to be
the first and the second indexes used for the generation of the
evaluation information.
[0129] (G) The evaluation information generating unit 46 terminates
a calculation process at a stage when a predetermined termination
condition is satisfied. For example, the evaluation information
generating unit 46 compares the average errors by the respective
calculation expressions before and after the index is added. The
evaluation information generating unit 46 terminates the
calculation process of the S209 when the average error of the
calculation expression after the index is added is higher than the
average error of the calculation expression before the index is
added (or when the difference of both is within a tolerance
range).
[0130] (H) On the other hand, when the termination condition is not
satisfied at the (G), the evaluation information generating unit 46
further increases the number of indexes and repeats processes
similar to the (F) and the (G). Accordingly, a narrowing down of
the indexes is performed by a stepwise variable selection when the
computation model is found.
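Steps (A) to (H) amount to a forward stepwise variable selection under a linear model. The sketch below is illustrative only: ordinary least squares stands in for the iterative coefficient correction of steps (B) and (C), and all names are hypothetical:

```python
def fit_and_error(x_cols, y):
    # Ordinary least squares without intercept (matching Y = aX1 + bX2 + ...):
    # solve the normal equations by Gaussian elimination, then return the
    # mean absolute error of the fitted predictions and the coefficients.
    n, k = len(y), len(x_cols)
    a = [[sum(x_cols[i][r] * x_cols[j][r] for r in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(x_cols[i][r] * y[r] for r in range(n)) for i in range(k)]
    for i in range(k):                      # forward elimination
        for j in range(i + 1, k):
            f = a[j][i] / a[i][i]
            for c in range(k):
                a[j][c] -= f * a[i][c]
            b[j] -= f * b[i]
    coefs = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        coefs[i] = (b[i] - sum(a[i][j] * coefs[j]
                               for j in range(i + 1, k))) / a[i][i]
    preds = [sum(coefs[j] * x_cols[j][r] for j in range(k)) for r in range(n)]
    return sum(abs(p - t) for p, t in zip(preds, y)) / n, coefs

def stepwise_select(indexes, y, tolerance=0.0):
    # Forward stepwise selection, steps (A)-(H): greedily add the index whose
    # inclusion gives the lowest average error; stop when adding no longer helps.
    selected, best_err = [], float("inf")
    while True:
        candidates = [name for name in indexes if name not in selected]
        if not candidates:
            return selected, best_err
        trials = [(fit_and_error([indexes[m] for m in selected + [name]], y)[0],
                   name) for name in candidates]
        err, name = min(trials)
        if selected and err >= best_err - tolerance:  # termination condition (G)
            return selected, best_err
        selected.append(name)
        best_err = err
```

Here `indexes` maps an index name to its column of per-set distribution variations, and `y` is the supervisor value (the actual number of cancer cells) per set.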
[0131] Step S210: The evaluation information generating unit 46
records information of the computation model found at the S209
(information representing each index used for the calculation
expression, information of the coefficient values corresponding to
each index in the calculation expression, and so on) to the storage
unit 43. Hereinabove, the description of FIG. 5 is finished.
[0132] Here, a computation model of which prediction accuracy of
the contamination rate of cancer cells is 93.2% can be acquired
when three of the "combination of the frequency distributions of
the `shape factor` at eight hours elapsed and 72 hours elapsed",
the "combination of the frequency distributions of the `perimeter`
at eight hours elapsed and 24 hours elapsed", and the "combination
of the frequency distributions of the `length` at eight hours
elapsed and 72 hours elapsed" are used as the indexes of the
computation model.
[0133] Besides, a computation model of which prediction accuracy of
the contamination rate of cancer cell is 95.5% can be acquired when
six of the "combination of the frequency distributions of the
`shape factor` at eight hours elapsed and 72 hours elapsed", the
"combination of the frequency distributions of the `fiber breadth`
at eight hours elapsed and 56 hours elapsed", the "combination of
the frequency distributions of the `relative hole area` at eight
hours elapsed and 72 hours elapsed", the "combination of the
frequency distributions of the `shape factor` at eight hours
elapsed and 24 hours elapsed", the "combination of the frequency
distributions of the `breadth` at eight hours elapsed and 72 hours
elapsed", and the "combination of the frequency distributions of
the `breadth` at eight hours elapsed and 64 hours elapsed" are used
as the indexes of the computation model.
Example of Incubated State Evaluating Process
[0134] Next, an operation example of the incubated state evaluating
process is described with reference to a flowchart in FIG. 13.
[0135] Step S301: The CPU 42 reads the data of the plural features
of cell morphologies obtained from microscopic images to be the
evaluation objects from the storage unit 43. Here, the features of
cell morphologies obtained from microscopic images being the
evaluation objects are the ones acquired by performing the time
lapse observation of the incubation containers 19 incubating the
cell groups at the same visual field with the same photographing
conditions by the incubator 11. Besides, the time lapse observation
in this case is performed every eight hours until 72 hours elapse,
while setting the time when eight hours elapsed after the
incubation start time as the first time, to align the conditions
with the example in FIG. 5.
[0136] Step S302: The CPU 42 reads the information of the
computation model of the storage unit 43 (the one recorded at the
S210 in FIG. 5).
[0137] Step S303: The feature value calculating unit 44, the
frequency distribution calculating unit 45, and the evaluation
information generating unit 46 each find the variation of the
frequency distribution as for each index corresponding to the
variables of the above-stated computation model. The process in the
S303 corresponds to the S203, S204, S207, and S208 in FIG. 5, and
therefore, the redundant description is not given.
[0138] Step S304: The evaluation information generating unit 46
substitutes the variation of the frequency distribution of each
index found at the S303 into the computation model read at the S302
to perform the calculation. The evaluation information generating
unit 46 generates the evaluation information representing the
mixture ratio of the cancer cells in the features of cell
morphologies obtained from microscopic images being the evaluation
object based on the calculation result. After that, the evaluation
information generating unit 46 displays the evaluation information
on a not-illustrated monitor or the like. Hereinabove, the
description of FIG. 13 is finished.
[0139] According to the one embodiment, it is possible for the
control device 41 to accurately predict the mixture ratio of the
cancer cells from the variation over the time lapse of the
frequency distribution of the feature value by using the features
of cell morphologies obtained from microscopic images acquired by
the time lapse observation. Besides, it is possible for the control
device 41 of the one embodiment to set the cells as they are to be
the evaluation object, and therefore, it is extremely effective
when, for example, the cells incubated for a screening of medical
products or a regenerative medicine are evaluated.
[0140] Note that in the one embodiment, the example evaluating the
mixture ratio of the cancer cells from the features of cell
morphologies obtained from microscopic images is described, but for
example, it is possible to use the control device 41 for evaluation
of a degree of an induction of differentiation of an embryonic stem
cell (ES cell) and an induced pluripotent stem cell (iPS cell).
Besides, the evaluation information found in the one embodiment is
able to be used as an abnormality detection means for a
differentiation, a dedifferentiation, a canceration, a
deterioration of activity, a contamination of cells, and so on in
the incubation cell group being the evaluation object, and as a
means to manage, in an engineering manner, a quality of the
incubation cell group being the evaluation object.
EXAMPLE
[0141] Hereinafter, an example of a differentiation prediction of
myoblasts is described as an example of the one embodiment. An
application of this differentiation prediction of the myoblasts is
expected in, for example, a quality control in a myoblast sheet
transplantation performed as one of treatments for a heart disease,
a quality control in a regenerative therapy of muscular tissues,
and so on.
[0142] When a component of a culture medium is changed, caused by
lowering of a serum concentration at the incubation time of the
myoblasts, a differentiation from the myoblast to a myotube cell
occurs, and it is possible to create intramuscular tissues. FIG.
14(a) illustrates an example of the incubated state of the
myoblasts, and FIG. 14(b) illustrates an example in which the
myoblasts are differentiated.
[0143] Besides, FIG. 15, FIG. 16 are histograms each illustrating a
variation over a time lapse of the "shape factor" at the incubation
time of the myoblasts. FIG. 15 illustrates the frequency
distributions of the "shape factor" at "0" (zero) hour elapsed and
112 hours elapsed when the differentiation is recognized in the
myoblasts (serum is 4%). FIG. 16 illustrates the frequency
distributions of the "shape factor" at "0" (zero) hour elapsed and
112 hours elapsed when the differentiation is not recognized in the
myoblasts (high serum condition). Note that the frequency
distribution at "0" (zero) hour elapsed is represented by a dotted
line and the frequency distribution at 112 hours elapsed is
represented by a solid line in each of FIG. 15 and FIG. 16.
[0144] When the two histograms in FIG. 15 are compared, there is a
large variation in the shapes of the two. On the other hand, when
the two histograms in FIG. 16 are compared, there is not such a
large variation. Accordingly, it turns out that there is the
variation in the histogram in accordance with a variation in the
differentiation of the myoblasts (a mixture ratio of the
differentiated myoblasts).
[0145] Here, in the example, the time lapse observation is
performed as for 72 pieces of samples of the myoblasts by the
incubator according to the one embodiment, at eight-hour intervals
up to the fifth day, respectively. A control device (an incubated
state evaluating device) performs a differentiation prediction of
the myoblasts by the following two stages of processes.
[0146] At the first stage, a two-group discrimination model
discriminating between the presence and absence of the
differentiation of the myoblasts is generated by the control device
based on the generation process of the above-stated computation
model. Specifically, the control device selects an index from among
all of the indexes acquired from the features of cell morphologies
obtained from microscopic images from eight hours elapsed to 32
hours elapsed after the observation is started, and develops a
first discrimination model finding a degree of differentiation of
the myoblasts, and a second discrimination model discriminating
presence/absence of the differentiation by a threshold value from
the degree of differentiation found by the first discrimination
model. The control device separates each of the 72 pieces of
samples into two groups in accordance with the presence/absence of
the differentiation by a discriminant analysis according to the
first discrimination model and the second discrimination model. As
a result, the control device is able to discriminate the
presence/absence of the differentiation correctly in all of the 72
pieces of samples.
[0147] At a second stage, a prediction model predicting the degree
of differentiation of the myoblasts at the fifth day (120 hours
elapsed) is generated by the control device based on the generation
process of the computation model. In the example, two kinds of
prediction models (a first prediction model, a second prediction
model) are developed by the control device by using only 42 pieces
of samples which are discriminated to be "differentiated" at the
process of the first stage from among the 72 pieces of samples.
[0148] The first prediction model is a prediction model using five
indexes of the "combination of the frequency distributions of the
`breadth` at eight hours elapsed and 48 hours elapsed", the
"combination of the frequency distributions of the `breadth` at
eight hours elapsed and 32 hours elapsed", the "combination of the
frequency distributions of the `inner radius` at eight hours
elapsed and 24 hours elapsed", the "combination of the frequency
distributions of the `length` at eight hours elapsed and 104 hours
elapsed", the "combination of the frequency distributions of the
`hole area` at eight hours elapsed and 96 hours elapsed".
[0149] FIG. 17 is a graphic chart illustrating a prediction result
of each sample according to the first prediction model in the
example. A vertical axis in FIG. 17 represents the degree of
differentiation of the sample at the fifth day predicted by the
first prediction model. A horizontal axis in FIG. 17 represents a
value in which the degree of differentiation of the sample is
evaluated by a person of skill (an apparent degree of
differentiation) at the time of the fifth day. In FIG. 17, one
point is plotted on the graph by each sample. Note that it can be
judged that an accuracy of prediction is higher as the point is
plotted near a line extending from an upper right to a lower left
of the graph in FIG. 17. In the first prediction model, the
prediction accuracy (a right answer rate) of the differentiation
rate is 90.5% (error ±5%).
[0150] Besides, the second prediction model is a prediction model
using five indexes of the "combination of the frequency
distributions of the `breadth` at eight hours elapsed and 48 hours
elapsed", the "combination of the frequency distributions of the
`breadth` at eight hours elapsed and 32 hours elapsed", the
"combination of the frequency distributions of the `inner radius`
at eight hours elapsed and 24 hours elapsed", the "combination of
the frequency distributions of an `orientation (cell orientation)`
at eight hours elapsed and 16 hours elapsed", the "combination of
the frequency distributions of a `modified orientation (variation
degree of cell orientation)` at eight hours elapsed and 24 hours
elapsed".
[0151] Note that the "orientation" is a feature value representing
the angle between the major axis of each cell and the horizontal
direction (X axis) of the image. When the values of the
"orientation" are the same, the cells are oriented in the same
direction. Besides, the "modified orientation" is a feature value
obtained by digitizing the angle of each cell in a state in which
the cells in the image are deformed by filtering, and then
calculating the variation of those angles. The value of the
"modified orientation" becomes larger as the angles of the cells
are more diverse.
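The "orientation" and "modified orientation" feature values described above can be sketched as follows (a minimal illustration, not the patented implementation; the function names are hypothetical, and the use of circular statistics for the 180°-periodic axial angles is an assumption):

```python
import math

def orientation(major_axis_vector):
    """Angle (degrees, in [0, 180)) between a cell's major axis
    and the horizontal (X axis) direction of the image."""
    dx, dy = major_axis_vector
    return math.degrees(math.atan2(dy, dx)) % 180.0

def orientation_variation(angles_deg):
    """Variation of cell orientations; larger when angles are more diverse.
    Axial data repeats every 180 degrees, so angles are doubled before
    averaging (standard circular-statistics treatment)."""
    doubled = [math.radians(2.0 * a) for a in angles_deg]
    c = sum(math.cos(t) for t in doubled) / len(doubled)
    s = sum(math.sin(t) for t in doubled) / len(doubled)
    # Circular variance in [0, 1]: 0 = all cells aligned, 1 = maximally diverse.
    return 1.0 - math.hypot(c, s)
```

With this sketch, a population of identically oriented cells yields a variation near 0, while cells oriented at right angles to one another yield a variation near 1, matching the characteristic stated above.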
[0152] FIG. 18 is a graphic chart representing the prediction
result of each sample according to the second prediction model in
the example. FIG. 18 is read in the same manner as FIG. 17, and the
redundant explanation is therefore omitted. In the second
prediction model, the prediction of differentiation is performed
based on observation results up to the second day (48 hours
elapsed), and the prediction accuracy is 85.7% (error ±5%). Thus,
the second prediction model in the example makes it possible to
perform, with high accuracy and from observation results up to only
the second day, a qualitative prediction of the differentiation of
the myoblasts, which is normally very difficult.
Supplementary Items of Embodiments
[0153] (1) In the one embodiment, the example in which the
incubated state evaluating device is incorporated in the control
device 41 of the incubator 11 is described. However, the incubated
state evaluating device of the present invention may instead be an
external, independent computer that acquires from the incubator 11
the features of cell morphologies obtained from microscopic images
and performs the analysis thereof (this case is not
illustrated).
[0154] (2) In the one embodiment, the example in which the
respective functions of the feature value calculating unit, the
frequency distribution calculating unit, and the evaluation
information generating unit are enabled by a program by way of
software is described, but it goes without saying that these
processes may be enabled by an ASIC by way of hardware.
[0155] (3) In the one embodiment, the example in which the control
device 41 finds the computation model of the evaluation information
and the indexes thereof by the FNN is described. However, the
incubated state evaluating device of the present invention may find
the computation model of the evaluation information and the indexes
thereof by, for example, another multivariate analysis such as a
multiple regression analysis (MRA).
[0156] Further, the incubated state evaluating device of the
present invention may combine multiple variates or parameters to
generate the final evaluation information by a majority decision
(or a weighted average) of the calculation results of these
computation models. In this case, a state for which one computation
model has low accuracy can be covered by another model, enhancing
the accuracy of the evaluation information in a case where, for
example, the MRA is effective for data of which the mixture ratio
is low and the FNN is effective for data of which the mixture ratio
is high.
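The majority decision and weighted average described in this paragraph can be sketched as follows (a minimal illustration with hypothetical function names; each computation model, e.g., the MRA or the FNN, is assumed to have already produced its own prediction for a sample):

```python
from collections import Counter

def combine_by_weighting(predictions, weights):
    """Weighted average of per-model numeric predictions for one sample.
    A model known to be accurate for the current data can be given a
    larger weight."""
    total = sum(weights)
    return sum(p * w for p, w in zip(predictions, weights)) / total

def combine_by_majority(labels):
    """Majority decision over per-model class labels for one sample."""
    return Counter(labels).most_common(1)[0][0]
```

For example, if the MRA predicts a differentiation degree of 0.8 and the FNN predicts 0.6, and the FNN is weighted three times as heavily, the combined estimate is `combine_by_weighting([0.8, 0.6], [1, 3])`.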
[0157] (4) Besides, when the computation model of the evaluation
information is found, the incubated state evaluating device may
first calculate results by combining plural indexes and then adjust
the indexes by a stepwise method. Alternatively, all of the indexes
may be used if the accuracy of the data is high.
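The stepwise adjustment of indexes mentioned above can be sketched as a greedy forward selection, one common form of stepwise method (a minimal illustration; the `score` callback, which would fit the computation model on a candidate index set and return its prediction accuracy, is an assumed placeholder):

```python
def forward_stepwise(indexes, score):
    """Greedy forward selection: repeatedly add the index that most
    improves the model's score, stopping when no remaining index helps."""
    selected = []
    best = float("-inf")
    remaining = list(indexes)
    while remaining:
        # Score every candidate set formed by adding one more index.
        gains = [(score(selected + [i]), i) for i in remaining]
        top_score, top_idx = max(gains)
        if top_score <= best:
            break  # no candidate improves the model; stop adjusting
        best = top_score
        selected.append(top_idx)
        remaining.remove(top_idx)
    return selected
```

A backward variant (starting from all indexes and removing the least useful one) follows the same pattern; either way, the search stops once adding or removing an index no longer improves the model.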
[0158] (5) In the one embodiment, the example in which the
variation of the frequency distribution is found by using the
absolute value sum of the differences between two frequency
distributions is described, but the variation of the frequency
distribution may instead be found from the square sum of the
differences between the two frequency distributions. Besides, the
calculation expressions illustrated in the one embodiment are just
examples, and, for example, an expression of n-th order (second
order or higher) may be used.
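The two variation measures mentioned here, the absolute value sum and the square sum of bin-wise differences, can be sketched as follows (a minimal illustration assuming the two frequency distributions have already been computed over the same bins; the function names are hypothetical):

```python
def variation_abs(hist_a, hist_b):
    """Absolute value sum (L1-style) of the bin-wise differences
    between two frequency distributions over identical bins."""
    return sum(abs(a - b) for a, b in zip(hist_a, hist_b))

def variation_sq(hist_a, hist_b):
    """Square sum (L2-style) of the bin-wise differences; larger bin
    differences are emphasized more strongly than in the L1 form."""
    return sum((a - b) ** 2 for a, b in zip(hist_a, hist_b))
```

The square-sum form weights a few large bin differences more heavily, while the absolute-value form treats many small differences and one large difference of the same total magnitude equally; which behaves better would depend on the feature value and cell type being evaluated.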
[0159] (6) The feature values illustrated in the embodiment and the
example are only examples, and it goes without saying that
parameters of other feature values may be used in accordance with
the kind of cells being the evaluation object.
[0160] The many features and advantages of the embodiments are
apparent from the detailed specification and, thus, it is intended
by the appended claims to cover all such features and advantages of
the embodiments that fall within the true spirit and scope thereof.
Further, since numerous modifications and changes will readily
occur to those skilled in the art, it is not desired to limit the
inventive embodiments to the exact construction and operation
illustrated and described, and accordingly all suitable
modifications and equivalents may be resorted to, falling within
the scope thereof.
* * * * *