U.S. patent application number 11/543,813 was filed with the patent office on October 6, 2006, and was published on April 19, 2007, as publication number 2007/0086620, for "Probe observing device and surface texture measuring device."
This patent application is currently assigned to MITUTOYO CORPORATION. The invention is credited to Masanori Arai.
United States Patent Application 20070086620
Kind Code: A1
Inventor: Arai; Masanori
Publication Date: April 19, 2007
Family ID: 37441065
Probe observing device and surface texture measuring device
Abstract
Provided is a probe observing device, including: a camera (230)
for taking an image of a probe (210); an image processing unit
(330) for processing data on the image taken by the camera (230); a
monitor (400) for displaying the image data processed by the image
processing unit (330); and a mouse (500) for inputting an
instruction for image processing through manual operation. The
image processing unit (330) includes an image data processing unit
(350) for processing the image data in accordance with the
instruction inputted by the mouse. The camera (230) includes a
low-magnification lens system, and is provided to be fixed in
position with respect to the probe (210) such that the probe (210)
enters the view field of the camera (230).
Inventors: Arai; Masanori (Kawasaki-shi, JP)
Correspondence Address: OLIFF & BERRIDGE, PLC, P.O. Box 19928, Alexandria, VA 22320, US
Assignee: MITUTOYO CORPORATION (Kawasaki-shi, JP)
Family ID: 37441065
Appl. No.: 11/543,813
Filed: October 6, 2006
Current U.S. Class: 382/100
Current CPC Class: G01B 11/007 (2013.01); G01B 5/008 (2013.01)
Class at Publication: 382/100
International Class: G06K 9/00 (2006.01)
Foreign Application Data: October 19, 2005 (JP) 2005-304338
Claims
1. A probe observing device for taking an image of a probe provided
to a measuring device main body and for displaying the image of the
probe, the measuring device main body detecting a surface of the
object to be measured by using the probe to measure a surface
texture of the object surface to be measured, the probe observing
device comprising: an image taking device for taking an image of
the probe; an image processing unit for processing image data on
the image taken by the image taking device; a display for
displaying the image data processed by the image processing unit;
and an operating device for inputting, through manual operation, an
instruction for image processing to be performed by the image
processing unit, wherein: the image processing unit comprises an
image data processing unit for processing image data in accordance
with one of the instruction inputted by the operating device and an
instruction inputted from a measuring part program; and the image
of the probe is capable of being operated through one of the
instruction inputted by the operating device and the instruction
inputted from the measuring part program.
2. A probe observing device according to claim 1, wherein the image
taking device comprises a lens system for magnifying the image of
the probe at a magnification of 100 times or less.
3. A probe observing device according to claim 1, further
comprising a three-axis-orthogonal driving mechanism for driving
the image taking device in the directions of three axes of an
X-axis, a Y-axis, and a Z-axis, the three axes being orthogonal to
one another.
4. A probe observing device according to claim 1, wherein the image
taking device is provided to be fixed in position with respect to
the probe such that the probe enters a view field of the image
taking device.
5. A probe observing device according to claim 1, wherein: the
operating device is capable of inputting: a region selection
instruction for selecting, as a selected region, a predetermined
region on a display screen of the display; and a movement
instruction for moving the selected region to a predetermined
position on the display screen of the display; and the image data
processing unit comprises: a selected region recognition unit for
recognizing the selected region selected based on the region
selection instruction and extracting data corresponding to the
selected region from the image data; and a region movement
processing unit for moving the selected region to a position as
instructed by the movement instruction on the display screen of the
display.
6. A probe observing device according to claim 1, wherein: the
operating device is capable of inputting: a region selection
instruction for selecting, as a selected region, a predetermined
region on a display screen of the display; and a magnification
instruction for magnifying the selected region to a predetermined
size on the display screen of the display; and the image data
processing unit comprises: a selected region recognition unit for
recognizing the selected region selected by the operating device
and extracting data corresponding to the selected region from the
image data; and a region enlargement processing unit for magnifying
the selected region to a size instructed by the magnification
instruction on the display screen of the display.
7. A probe observing device according to claim 1, wherein the image
data processing unit further comprises: a selected region
recognition unit for recognizing a region including a measuring
point set in advance and the probe, and extracting data
corresponding to the region thus recognized from the image data;
and a region enlargement processing unit for magnifying the region
extracted by the selected region recognition unit at a
predetermined magnification on the display screen of the display in
accordance with a distance between the measuring point and the
probe.
8. A surface texture measuring device, comprising: a probe
observing device; and a measuring device main body having a probe,
wherein the probe observing device is a probe observing device for
taking an image of a probe provided to a measuring device main body
and for displaying the image of the probe, the measuring device
main body detecting a surface of the object to be measured by using
the probe to measure a surface texture of the object surface to be
measured, the probe observing device comprising: an image taking
device for taking an image of the probe; an image processing unit
for processing image data on the image taken by the image taking
device; a display for displaying the image data processed by the
image processing unit; and an operating device for inputting,
through manual operation, an instruction for image processing to be
performed by the image processing unit, wherein: the image
processing unit comprises an image data processing unit for
processing image data in accordance with one of the instruction
inputted by the operating device and an instruction inputted from a
measuring part program; and the image of the probe is capable of
being operated through one of the instruction inputted by the
operating device and the instruction inputted from the measuring
part program.
9. A surface texture measuring device according to claim 8, wherein
the image taking device comprises a lens system for magnifying the
image of the probe at a magnification of 100 times or less.
10. A surface texture measuring device according to claim 8,
further comprising a three-axis-orthogonal driving mechanism for
driving the image taking device in the directions of three axes of
an X-axis, a Y-axis, and a Z-axis, the three axes being orthogonal
to one another.
11. A surface texture measuring device according to claim 8,
wherein the image taking device is provided to be fixed in position
with respect to the probe such that the probe enters a view field
of the image taking device.
12. A surface texture measuring device according to claim 8,
wherein: the operating device is capable of inputting: a region
selection instruction for selecting, as a selected region, a
predetermined region on a display screen of the display; and a
movement instruction for moving the selected region to a
predetermined position on the display screen of the display; and
the image data processing unit comprises: a selected region
recognition unit for recognizing the selected region selected based
on the region selection instruction and extracting data
corresponding to the selected region from the image data; and a
region movement processing unit for moving the selected region to a
position as instructed by the movement instruction on the display
screen of the display.
13. A surface texture measuring device according to claim 8,
wherein: the operating device is capable of inputting: a region
selection instruction for selecting, as a selected region, a
predetermined region on a display screen of the display; and a
magnification instruction for magnifying the selected region to a
predetermined size on the display screen of the display; and the
image data processing unit comprises: a selected region recognition
unit for recognizing the selected region selected by the operating
device and extracting data corresponding to the selected region
from the image data; and a region enlargement processing unit for
magnifying the selected region to a size instructed by the
magnification instruction on the display screen of the display.
14. A surface texture measuring device according to claim 8,
wherein the image data processing unit further comprises: a
selected region recognition unit for recognizing a region including
a measuring point set in advance and the probe, and extracting data
corresponding to the region thus recognized from the image data;
and a region enlargement processing unit for magnifying the region
extracted by the selected region recognition unit at a
predetermined magnification on the display screen of the display in
accordance with a distance between the measuring point and the
probe.
Description
BACKGROUND OF THE INVENTION
[0001] 1. Field of the Invention
[0002] The present invention relates to a probe observing device
and a surface texture measuring device. The present invention
relates to, for example, a probe observing device for observing a
probe used in measuring a surface texture of an object to be
measured by using a monitor or the like.
[0003] 2. Description of Related Art
[0004] There has been known a surface texture measuring device for
measuring a surface texture or a three-dimensional profile of an
object to be measured by scanning the object surface to be
measured. Examples of such a surface texture measuring device
include a roughness measuring machine, a contour measuring machine,
a roundness measuring machine, and a coordinate measuring machine.
The surface texture measuring device employs a probe for detecting
the object surface to be measured. In recent years, in response to
an increasing demand for an extremely high measuring accuracy along
with the use of a more precise object to be measured, a precision
structure has been adopted for the probe. Therefore, it has become
difficult to observe the probe with the naked eye. Under these
circumstances, there has been proposed a method of observing a probe
by magnifying it with a magnifier or the like so as to position the
probe on a measuring point (refer to, for example, JP 07-505958 A).
[0005] JP 07-505958 A discloses, for example, a measuring device
including: a probe for detecting a surface of the object to be
measured; a video camera for taking an image of the probe; and a
monitor for displaying the image taken by the video camera. At
least one of the probe and the video camera is provided so as to be
capable of rotating or pivoting, thereby making it possible to
adjust the probe and the video camera such that the probe enters an
image-taking range of the video camera. With this configuration,
the probe is adjusted such that the measuring point and the probe
both enter the view field of the video camera simultaneously, and a
user brings the probe to the measuring point with reference to the
image displayed on the monitor.
[0006] In this method, in a case where a precision probe is
adopted, the video camera must be equipped with a
high-magnification magnifier in order to observe the probe at a
sufficient magnification.
[0007] In this case, however, the view field of the video camera
narrows as the magnification of the magnifier increases, making it
difficult to adjust the probe to enter the view field of the video
camera. Accordingly, there arises a problem in that it is also
time-consuming to adjust the probe and the camera such that the
probe enters the view field of the camera, which reduces measuring
efficiency. On the other hand, adoption of a low-magnification
magnifier naturally makes it easy to adjust the probe to enter the
view field of the video camera, whereas the close observation of
the precision probe cannot be attained with the low-magnification
magnifier. Therefore, there arises another problem in that it is
impossible to accurately bring the probe to the measuring point,
which reduces measuring accuracy.
SUMMARY OF THE INVENTION
[0008] The present invention has an object to provide a probe
observing device and a surface texture measuring device which are
capable of observing a probe with high measuring accuracy through
simple operation.
[0009] The present invention provides a probe observing device for
taking an image of a probe provided to a measuring device main body
and for displaying the image of the probe, the measuring device
detecting a surface of the object to be measured by using the probe
to measure a surface texture of the object surface to be measured,
the probe observing device including: an image taking device for
taking an image of the probe; an image processing unit for
processing image data on the image taken by the image taking
device; a display for displaying the image data processed by the
image processing unit; and an operating device for inputting,
through manual operation, an instruction for image processing to be
performed by the image processing unit, in which: the image
processing unit includes an image data processing unit for
processing image data in accordance with one of the instruction
inputted by the operating device and an instruction inputted from a
measuring part program; and the image of the probe is capable of
being operated through one of the instruction inputted by the
operating device and the instruction inputted from the measuring
part program.
[0010] With the above configuration, the image taking device takes
an image of the probe. Then, the image processing unit processes
data on the image taken by the image taking device. The image
processing unit processes the image data based on the instruction
inputted by the operating device or the instruction inputted from
the measuring part program. The image processing includes, for
example, digital zoom processing for magnifying only a part of the
image. The image data thus processed is displayed on the display.
The probe is observed based on the image displayed on the display.
Even a precision probe, for example, may be displayed after being
enlarged through processing such as digital zoom processing, which
enables a user to observe the probe with clarity.
[0011] With the above configuration, data on the taken image can be
processed by the image data processing unit. Accordingly, the image
of the probe is clearly displayed on the display through the
processing such as enlargement processing, with the result that the
probe can be observed with clarity, making it possible, for
example, to bring the probe to a measuring point with accuracy.
[0012] Also, the image data processing unit performs image
processing so that, for example, an enlarged image of the probe can
be displayed, which makes it possible to reduce the magnification of
the lens system provided to the image taking device itself. The
image taking device provided with such a low-magnification lens
system has a wider view field as compared
with a case of adopting a high-magnification lens system. With the
wider view field, it is possible to easily bring the probe into the
view field by merely directing the orientation of the lens toward
the probe. Accordingly, there is no need to make a fine adjustment
to the positional relationship between the probe and the image
taking device such that the probe enters the view field of the
image taking device. Therefore, the operation is simplified to save
work in the measurement, leading to an increase in measuring
efficiency.
[0013] Here, the measuring part program is configured to execute a
measuring operation following a predetermined measuring procedure.
Based on the program, operations are executed to, for
example, move the probe to a measuring point, magnify and reduce
the probe, and sample the measurement data.
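The role of the measuring part program described above can be illustrated with a minimal sketch: a fixed sequence of commands executed in order. The command names (`move_probe`, `zoom_image`, `sample_point`) and the dispatcher structure are assumptions for illustration, not part of the application.

```python
# A minimal, hypothetical sketch of a measuring part program as described
# in paragraph [0013]: a fixed sequence of commands executed in order.
# All command names are illustrative assumptions, not from the application.

def run_part_program(program, handlers):
    """Execute each (command, arguments) step with its registered handler."""
    log = []
    for command, args in program:
        handlers[command](*args)
        log.append(command)
    return log

# Illustrative handlers that would drive the measuring device main body.
def move_probe(x, y, z):
    pass  # move the probe to measuring point (x, y, z)

def zoom_image(factor):
    pass  # magnify (or reduce) the displayed probe image

def sample_point():
    pass  # sample measurement data at the current position

program = [
    ("move_probe", (10.0, 5.0, 2.0)),
    ("zoom_image", (20,)),
    ("sample_point", ()),
]
handlers = {"move_probe": move_probe, "zoom_image": zoom_image,
            "sample_point": sample_point}
executed = run_part_program(program, handlers)
```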
[0014] According to the present invention, the image taking device
preferably includes a lens system for magnifying the image of the
probe at a magnification of 100 times or less.
[0015] With the above configuration, the image taking device has a
wider view field when provided with the lens system of a
magnification of 100 times or less, which makes it easy to bring
the probe into the view field by merely directing the orientation
of the lens toward the probe. Accordingly, there is no need to make
a fine adjustment to the positional relationship between the probe
and the image taking device such that the probe enters the view
field of the image taking device. Therefore, the operation is
simplified to save work in the measurement, leading to an increase
in measuring efficiency.
[0016] To be specific, the lens may have a magnification of 2 to
4 times, and the image processing such as digital zoom processing
may be performed at a magnification of the order of 20 times, to
thereby obtain a magnification of the order of 40 to 80 times in
total.
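The arithmetic in the paragraph above can be checked directly: the total displayed magnification is the product of the optical lens magnification and the digital zoom factor.

```python
def total_magnification(lens_mag, digital_zoom):
    """Total displayed magnification = optical lens x digital zoom."""
    return lens_mag * digital_zoom

# A 2x to 4x lens combined with digital zoom of the order of 20x
# yields the 40x to 80x total stated in paragraph [0016].
low = total_magnification(2, 20)   # 40 times in total
high = total_magnification(4, 20)  # 80 times in total
```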
[0017] According to the present invention, the probe observing
device preferably includes a three-axis-orthogonal driving
mechanism for driving the image taking device in the directions of
three axes of an X-axis, a Y-axis, and a Z-axis, the three axes
being orthogonal to one another.
[0018] With the above configuration, it is possible to move the
image taking device in the directions of the X-axis, the Y-axis,
and the Z-axis, by the three-axis-orthogonal driving mechanism to
thereby bring the probe into the view field of the image taking
device.
[0019] Here, the image taking device has a wider view field when
provided with a low-magnification lens system, which makes it
possible to bring the probe into the view field by roughly moving
the image taking device through the three-axis-orthogonal driving
mechanism. Further, the image data can be processed by the image
data processing unit, and accordingly, it is not always necessary
to make an adjustment to bring the probe into the center of the view
field. Therefore, it is easy to make a positional adjustment
through the three-axis-orthogonal driving mechanism, which
simplifies the operation to save work in the measurement, leading
to an increase in measuring efficiency.
[0020] According to the present invention, the image taking device
is preferably provided to be fixed in position with respect to the
probe such that the probe enters a view field of the image taking
device.
[0021] Here, the image taking device is provided in such a manner
that the positional relationship between the image taking device
and the probe is fixed. This includes a case where the measuring
device main body has a probe mount for mounting and fixing the
probe, and the image taking device is also mounted and fixed to
that probe mount.
[0022] Also, the lens system may have a relatively low
magnification of, for example, 15 times, 10 times, or 5 times.
[0023] Due to the above configuration, there is no need to make a
fine adjustment to the positional relationship between the probe
and the image taking device such that the probe enters the view
field of the image taking device. Therefore, the operation is
simplified to save work in the measurement, leading to an increase
in measuring efficiency.
[0024] According to the present invention, it is preferable that
the operating device be capable of inputting: a region selection
instruction for selecting, as a selected region, a predetermined
region on a display screen of the display; and a movement
instruction for moving the selected region to a predetermined
position on the display screen of the display, and the image data
processing unit include: a selected region recognition unit for
recognizing the selected region selected based on the region
selection instruction and extracting data corresponding to the
selected region from the image data; and a region movement
processing unit for moving the selected region to a position as
instructed by the movement instruction on the display screen of the
display.
[0025] With the above configuration, the image data processing unit
performs the image processing based on an instruction inputted from
the operating device. For example, in a case where the probe is
located in the corner of the view field of the image taking device,
a region selection instruction is inputted from the operating device
so as to select a predetermined region including the probe as the
selected region on the display screen. Then, the selected
region recognition unit recognizes the region selected through the
operating device, and extracts a part of the data corresponding to
the selected region from the image data obtained by the image
taking device. Further, a movement instruction is inputted from the
operating device so as to move the selected region to a
predetermined position, for example, to the center of the display
screen. Then, the region movement processing unit moves the
selected region to the position as instructed by the movement
instruction. Due to the movement of the selected region, the probe
is displayed in the center of the screen on the display screen of
the display. The user can clearly observe the probe, which is
displayed in the center of the display screen, to thereby, for
example, bring the probe to a measuring point with accuracy. It is
possible to select the region on the display screen to move the
selected region to, for example, the center of the display screen,
which eliminates the need to make an adjustment to the positional
relationship between the probe and the image taking device such
that the probe actually comes to the center of the view field of
the image taking device. Therefore, the operation is simplified to
save work in the measurement, leading to an increase in measuring
efficiency.
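As a rough sketch of the region movement described above, the image can be modeled as a nested list of pixels: the selected region is extracted from the image data and redrawn centered on a blank display canvas. This is an illustrative reconstruction, not the application's implementation; the function names are assumptions.

```python
def extract_region(image, top, left, height, width):
    """Selected region recognition: copy the pixels of the selected region."""
    return [row[left:left + width] for row in image[top:top + height]]

def move_to_center(region, screen_h, screen_w, background=0):
    """Region movement processing: paste the region centered on the screen."""
    canvas = [[background] * screen_w for _ in range(screen_h)]
    top = (screen_h - len(region)) // 2
    left = (screen_w - len(region[0])) // 2
    for r, row in enumerate(region):
        canvas[top + r][left:left + len(row)] = row
    return canvas

# Example: a probe pixel (value 9) in the corner of an 8x8 view field
# is selected as a 2x2 region and moved to the center of the screen.
image = [[0] * 8 for _ in range(8)]
image[0][0] = 9
region = extract_region(image, 0, 0, 2, 2)
screen = move_to_center(region, 8, 8)
```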
[0026] According to the present invention, it is preferable that
the operating device be capable of inputting: a region selection
instruction for selecting, as a selected region, a predetermined
region on a display screen of the display; and an enlargement
instruction for magnifying the selected region at a predetermined
magnification on the display screen of the display, and the image
data processing unit include: a selected region recognition unit
for recognizing the selected region selected by the operating
device and extracting data corresponding to the selected region
from the image data; and a region enlargement processing unit for
magnifying the selected region at a magnification as instructed by
the enlargement instruction on the display screen of the
display.
[0027] With the above configuration, the image data processing unit
performs the image processing based on an instruction inputted from
the operating device. For example, a region selection instruction is
inputted from the operating device so as to select a predetermined
region including the probe as the selected region on the display
screen. Then, the selected region recognition unit recognizes the
region selected through the operating device, and extracts a part
of the data corresponding to the selected region from the image
data obtained by the image taking device. Further, an enlargement
instruction is inputted from the operating device so as to magnify
the selected region at a predetermined magnification. Then, the
region enlargement processing unit magnifies the selected region at
the magnification as instructed by the enlargement instruction.
[0028] Due to the enlargement of the selected region, the enlarged
probe is displayed on the display screen of the display. The user
can clearly observe the enlarged probe displayed on the display
screen, to thereby, for example, bring the probe to a measuring
point with accuracy. It is possible to select the region to magnify
the selected region on the display screen, and therefore the lens
system of the image taking device itself may have a low
magnification. With the low-magnification lens system, the image
taking device obtains a wider view field, which makes it possible
to bring the probe into the view field.
[0029] Further, with the wider view field, it is possible to bring
the probe into the view field by merely fixing the orientation of
the image taking device toward the probe. Therefore, there is no
need to make an adjustment to the positional relationship between
the probe and the image taking device, saving work in the
measurement, thereby making it possible to increase measuring
efficiency. It is also possible to display the entire probe and the
entire object to be measured on the screen at a low magnification at
first, and then to cut out and enlarge the periphery of the
measuring point in stages, which makes it easier to bring the probe
to the measuring point.
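The enlargement described above amounts to a digital zoom: crop the selected region and scale it up on the display. A minimal nearest-neighbour (pixel-replication) sketch follows; this is one assumed way to realize the region enlargement processing unit, not the application's actual algorithm.

```python
def digital_zoom(region, factor):
    """Magnify a region by an integer factor using pixel replication
    (nearest-neighbour scaling): each pixel becomes a factor x factor block."""
    zoomed = []
    for row in region:
        wide = [pixel for pixel in row for _ in range(factor)]
        zoomed.extend([wide[:] for _ in range(factor)])
    return zoomed

# A 2x2 selected region enlarged at a magnification of 3 becomes 6x6.
region = [[1, 2],
          [3, 4]]
enlarged = digital_zoom(region, 3)
```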
[0030] According to the present invention, it is preferable that
the image data processing unit further include: a selected region
recognition unit for recognizing a region including a measuring
point set in advance and the probe, and extracting data
corresponding to the region thus recognized from the image data;
and a region enlargement processing unit for magnifying the region
extracted by the selected region recognition unit at a
predetermined magnification on the display screen of the display in
accordance with a distance between the measuring point and the
probe.
[0031] It is also possible, for example, that the coordinate of the
measuring point is inputted in advance, and the measuring point is
automatically enlarged when the probe comes close to the measuring
point.
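One way to realize the automatic enlargement described above is to make the display magnification a decreasing function of the distance between the probe and the measuring point. The linear ramp, the 10 mm threshold, and the 20x ceiling below are illustrative assumptions only.

```python
import math

def auto_magnification(probe_xy, point_xy, max_zoom=20, threshold=10.0):
    """Return a digital zoom factor that grows as the probe nears the point.

    Beyond `threshold` (assumed here to be in millimetres) the view stays
    at 1x; inside it, the zoom ramps linearly up to `max_zoom` at zero
    distance. All values are illustrative, not from the application.
    """
    distance = math.dist(probe_xy, point_xy)
    if distance >= threshold:
        return 1
    return 1 + (max_zoom - 1) * (threshold - distance) / threshold

far = auto_magnification((30.0, 0.0), (0.0, 0.0))   # probe still far: no zoom
near = auto_magnification((1.0, 0.0), (0.0, 0.0))   # probe close: strong zoom
```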
[0032] With such a configuration, the operation to be performed
by the user is made significantly easier.
[0033] The present invention also provides a surface texture
measuring device including the probe observing device and a
measuring device main body having a probe.
[0034] With the above configuration, it is possible to clearly
observe the probe with reference to the image of the probe
displayed on the display, to thereby, for example, make it possible
to bring the probe to the measuring site with accuracy. As a
result, the surface texture measuring device performs measurement
with a high degree of accuracy.
[0035] Also, there is no need to make an adjustment to the
positional relationship between the probe and the image taking
device, which simplifies the operation to save work in the
measurement, thereby making it possible to increase measuring
efficiency of the surface texture measuring device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0036] In the accompanying drawings:
[0037] FIG. 1 is a diagram showing an entire configuration of a
surface texture measuring device;
[0038] FIG. 2 is a diagram showing a state where an image of a
probe is taken by a camera;
[0039] FIG. 3 is a diagram showing a configuration of a central
control unit;
[0040] FIG. 4 is a diagram showing a state where an image of a
contact portion of the probe taken by the camera is displayed on a
monitor;
[0041] FIG. 5 is a diagram showing a state where an image of the
probe is moved to the center of a monitor screen;
[0042] FIG. 6 is a diagram showing how the image of the probe is
enlarged on the monitor screen; and
[0043] FIG. 7 is a diagram showing an enlarged image of the probe
displayed on the monitor screen.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
[0044] Hereinafter, an embodiment of the present invention is
explained with reference to the drawings and reference numerals
denoting elements in the drawings.
[0045] FIG. 1 is a diagram showing an entire configuration of a
surface texture measuring device 100.
[0046] FIG. 2 is a diagram showing a state where an image of a
probe 210 is taken by a camera 230.
[0047] The surface texture measuring device 100 includes: a
measuring device main body 200; a central control unit 300; a
monitor (i.e., display) 400; and a mouse (i.e., operating device)
500. The measuring device main body 200 includes the probe 210
capable of moving three-dimensionally and a camera (i.e., image
taking device) 230 for observing the probe 210. The central control
unit 300 controls operation of the measuring device main body 200
and processes image data on an image taken by the camera 230. The
monitor 400 displays measurement data and image data. The mouse 500
is used by a user to input an operation instruction through manual
operation.
[0048] The measuring device main body 200 includes: the probe 210;
a coordinate driving mechanism 220 as a moving mechanism for
three-dimensionally moving the probe 210 along a surface of
the object to be measured; and the camera 230 for observing the
probe 210.
[0049] Examples of the probe 210 include a vibration touch probe
210 that steadily vibrates a stylus 211, which has a contact portion
212 on its tip, in the axial direction, and that detects a change in
the vibrations of the stylus 211 when the contact portion 212 comes
into contact with the object surface to be measured. The object
surface to be measured may be scanned by the vibration touch probe
210 with the contact portion 212 staying in contact with the object
surface to be measured under a constant measuring force.
Alternatively, the contact portion 212 of the vibration touch probe
210 may be brought into contact with the object surface to be
measured and then taken away from the surface to thereby perform a
touch measurement.
[0050] The coordinate driving mechanism 220 adopts a slide
mechanism in X, Y, and Z directions, which is used in a
conventional coordinate measuring device. To be specific, the
coordinate driving mechanism 220 includes: a bridge-type frame 221
provided to be capable of slide-moving back and forth (in the Y
direction) with respect to a stage 240; a Z-column 223 provided to
be capable of slide-moving along a beam 222 of the bridge-type frame
221 from side to side (in the X direction); and a Z-spindle 224
provided to be capable of sliding vertically (in the Z direction)
within the Z-column 223. Incidentally, there is also an alternative
configuration in which the bridge-type frame 221 is fixed, and the
upper surface of the stage 240 can be slide-moved back and forth
(in the Y direction).
[0051] The probe 210 and the camera 230 are provided underneath
the Z-spindle 224. In other words, the base of the Z-spindle 224
serves as a probe mount. Each axis of the coordinate driving
mechanism 220 has an encoder 225 (refer to FIG. 3) provided for
detecting a driving amount.
[0052] The camera (i.e., image taking device) 230 is provided with
an optical system (not shown).
[0053] The optical system includes an optical lens system such as
an objective lens and an image taking system such as a CCD.
[0054] The camera 230 is provided underneath the Z-spindle 224.
The camera 230 is adjustable so as to be oriented in a direction in
which the probe 210 enters the view field thereof, but the
orientation of the camera is normally adjusted and fixed in a
direction in which the contact portion 212 of the probe 210 enters
the view field thereof. The camera 230 employs an objective lens of
a magnification which is low enough to easily capture the contact
portion 212 of the probe 210 in the view field of the camera 230
without adjusting the orientation of the camera 230 or the position
of the probe 210. The objective lens may have a magnification of,
for example, 3 times to 20 times.
[0055] FIG. 3 is a diagram showing a configuration of a central
control unit 300.
[0056] The central control unit 300 includes: a measuring operation
control unit 310; a profile analysis unit 320; an image processing
unit 330; a data output unit 360; and a central processing unit
(CPU) 370. The measuring operation control unit 310 controls
measuring operation through controlling the operation of the
coordinate driving mechanism 220. The profile analysis unit 320
obtains by calculation a surface texture of an object to be
measured, based on the result of detection made by the encoder 225.
The image processing unit 330 performs image processing on the data
on an image taken by the camera 230. The data output unit 360
performs output processing on the measured data and the image data.
The central processing unit (CPU) 370 controls operation of the
central control unit as a whole.
[0057] The measuring operation control unit 310 controls a driving
amount of each of the bridge-type frame 221, the Z-column 223, and
the Z-spindle 224 of the coordinate driving mechanism 220. To be
specific, the measuring operation control unit 310 controls the
operation of the coordinate driving mechanism 220 such that the
object surface to be measured is scanned by the probe 210 with the
probe 210 staying in contact with the object surface to be measured.
The measuring operation control unit 310 causes the encoder 225 to
sample a displacement of the probe 210 at a predetermined sampling
pitch. The result of the sampling is outputted to the profile
analysis unit 320.
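The sampling described above can be sketched as a simple loop; this is a hypothetical illustration only, and the function names (`sample_displacements`, `read_encoder`) are not from the application.

```python
# Illustrative sketch of the sampling performed by the measuring
# operation control unit 310: the probe displacement reported by the
# encoder 225 is read at a predetermined sampling pitch and collected
# for the profile analysis unit 320. All names are assumptions.

def sample_displacements(read_encoder, num_points, pitch_mm=0.1):
    """Collect probe displacements at a constant sampling pitch.

    read_encoder(position) stands in for reading the encoder 225
    at the given scan position along the surface.
    """
    samples = []
    for i in range(num_points):
        position = i * pitch_mm              # scan position along the surface
        displacement = read_encoder(position)
        samples.append((position, displacement))
    return samples

# Example with a stand-in encoder that reports a flat surface:
profile = sample_displacements(lambda pos: 0.0, num_points=5)
```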
[0058] The profile analysis unit 320 obtains the surface profile of
the object to be measured based on the displacement of the probe
210 detected by the encoder 225. The profile data thus obtained as
to the object surface to be measured is displayed on the monitor
400 through the data output unit 360.
[0059] The image processing unit 330 includes: an image entry unit
340 for accepting entries of the data on the image taken by the
camera 230 and buffering the data; and an image data processing
unit 350 for processing the image data accepted by the image entry
unit 340, in accordance with an instruction from the user.
[0060] The image entry unit 340 stores an image signal sent from
the camera 230. When the signal is a digital signal, the image
entry unit 340 directly stores the signal as digital data. When the
signal is an analog signal, the image entry unit 340 converts the
signal to a digital signal and stores the digital signal as digital
data. The image data processing unit 350 includes a selected region
recognition unit 351, a region movement processing unit 352, and a
region enlargement processing unit 353. Operation of those units
will be explained later with reference to examples of the monitor
screen of FIGS. 4 to 7.
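The two entry paths of the image entry unit 340 (store a digital signal directly, digitize an analog signal first) can be sketched as follows; the quantize step is a stand-in for an A/D converter, and all names are illustrative, not from the application.

```python
# Minimal sketch of the image entry unit 340. A digital signal is
# buffered as-is; an analog signal (here, values in 0.0..1.0) is
# first converted to 8-bit digital data. Purely illustrative.

def enter_image(signal, is_digital):
    if is_digital:
        return list(signal)                  # store directly as digital data
    # stand-in A/D conversion: quantize to 8-bit levels
    return [max(0, min(255, round(v * 255))) for v in signal]

digital = enter_image([10, 20, 30], is_digital=True)
analog = enter_image([0.0, 0.5, 1.0], is_digital=False)
```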
[0061] Here, the camera 230, the image processing unit 330, the
monitor 400, and the mouse 500 constitute the probe observing
device.
[0062] An operation of measuring an object to be measured by using
the surface texture measuring device 100 having the above structure
will be explained as follows.
[0063] First, an object to be measured is placed on the stage 240,
and the contact portion 212 of the probe 210 is brought into
contact with a measuring site of the object to be measured; the
user adjusts the position of the probe 210 with reference to the
display on the monitor 400 in bringing the contact portion 212 of
the probe 210 to the measuring site.
[0064] The camera 230, which is fixed such that the contact portion
212 of the probe 210 enters the view field thereof, takes an image
of the contact portion 212 of the probe 210.
[0065] The image data thus taken is entered to the image entry unit
340 and temporarily buffered as a digital image signal. Then, the
image entry unit 340 outputs the image data through the data output
unit 360 to the monitor 400, on which the contact portion 212 of
the probe 210 is displayed.
[0066] At this time, the orientation of the camera 230 is fixed
such that the contact portion 212 of the probe 210 enters the view
field thereof, which does not necessarily mean that the contact
portion 212 is in the center of the view field of the camera
230.
[0067] FIG. 4 is a diagram showing a state where an image of the
contact portion 212 of the probe 210 taken by the camera 230 is
displayed on the monitor 400.
As shown in FIG. 4, for example, there may be a case where
the contact portion 212 of the probe 210 is located in a corner
of the view field of the camera. In such a case, the user
instructs data processing to be made such that the contact portion
212 of the probe 210 is displayed in substantially the center of
the monitor screen.
[0069] In processing the image data, the user first selects the
region the user desires to move to substantially the center of the
screen. For example, as shown in FIG. 4, the user selects the
region by using a mouse pointer on the display screen through
controlling the mouse 500.
[0070] Once the region selecting operation is performed as
described above, the selected region recognition unit 351
recognizes the selected region. The selected region recognition
unit 351 extracts data corresponding to the selected region from
the image data to put the extracted data into another frame, while
drawing a box 402 enclosing the selected region 401 on the
monitor.
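What the selected region recognition unit 351 does here, extracting the selected pixels into another frame while remembering the enclosing box 402, can be sketched as below; the image representation (a list of rows) and all names are assumptions for illustration.

```python
# Hedged sketch of the selected region recognition unit 351: copy
# the pixels inside the user's selection into a separate frame and
# keep the enclosing box (drawn as box 402 on the monitor).

def recognize_selection(image, top, left, height, width):
    box = (top, left, height, width)         # box 402 enclosing region 401
    frame = [row[left:left + width] for row in image[top:top + height]]
    return box, frame

# Example on a small synthetic image (pixel value = x + 10 * y):
image = [[x + 10 * y for x in range(6)] for y in range(4)]
box, frame = recognize_selection(image, top=1, left=2, height=2, width=3)
```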
[0071] Next, the user pulls the selected region 401 on the screen,
so as to move the region to substantially the center of the monitor
display screen.
[0072] For example, the user drags the selected region 401 by using
the mouse 500.
[0073] When the user performs operation of dragging the selected
region 401 so as to move the selected region as described above,
the region movement processing unit 352 moves the selected region
401 along with the dragging operation. The selected region 401 thus
moved is displayed in a different window 403 in the center of the
screen, as shown in FIG. 5. The contact portion of the probe 210 is
displayed in the window 403, and the user brings the contact
portion 212 to the measuring site with reference to the display
screen in which the contact portion 212 of the probe 210 is
displayed.
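The destination arithmetic implied by the region movement processing unit 352, placing the moved region in a window 403 at the center of the screen, can be sketched as follows; the function and parameter names are illustrative assumptions.

```python
# Sketch of where the region movement processing unit 352 would
# place the window 403: the top-left corner that centers a frame of
# the given size on the display. Names are illustrative.

def center_window(screen_w, screen_h, frame_w, frame_h):
    """Return the (left, top) corner placing the frame at screen center."""
    left = (screen_w - frame_w) // 2
    top = (screen_h - frame_h) // 2
    return left, top

pos = center_window(screen_w=640, screen_h=480, frame_w=100, frame_h=80)
```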
[0074] The camera 230 is equipped with the objective lens of a
magnification which is low enough that the camera 230 can capture
the contact portion 212 of the probe 210 in the view field without
positional adjustment. Under the low magnification of the objective
lens, the contact portion 212 of the probe 210 may not be enlarged
sufficiently for a visual check on the display screen.
For example, as shown in FIG. 6, there may be a case where the
contact portion 212 of the probe 210 is not enlarged
sufficiently.
[0075] In such a case, the user instructs the image data
processing so that the contact portion 212 of the probe 210 is
enlarged on the monitor screen. That is, the user selects a region
to be magnified by enclosing the region.
[0076] The selected region 401 is recognized by the selected region
recognition unit 351, and the user pulls (or drags) the corner of
the selected region 401 so as to magnify the selected region 401 on
the display screen. Once the operation of dragging the corner of
the selected region 401 is performed as described above, the region
enlargement processing unit 353 magnifies the selected region 401
along with the dragging operation. Then, the selected region 401 is
enlarged and displayed in a different window 404, as shown in FIG.
7.
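The enlargement performed by the region enlargement processing unit 353 can be sketched with simple nearest-neighbour scaling; the application does not specify the interpolation method, so this choice and all names are assumptions.

```python
# Sketch of the region enlargement processing unit 353, assuming
# nearest-neighbour scaling: each pixel is repeated `factor` times
# horizontally and each row `factor` times vertically.

def enlarge(frame, factor):
    out = []
    for row in frame:
        scaled_row = [pix for pix in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(scaled_row))
    return out

big = enlarge([[1, 2], [3, 4]], factor=2)
```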
[0077] The user brings the contact portion 212 to the measuring
site with reference to the window 404 in which the contact portion
212 of the probe 210 is enlarged.
[0078] When the contact portion 212 of the probe 210 is brought to
the measuring site through the above-described operation, the
measuring operation control unit 310 starts measuring operation. To
be specific, the measuring operation control unit 310 controls the
driving of the coordinate driving mechanism 220 such that the probe
210 scans the object surface to be measured while staying in contact
with the object surface to be measured with a constant measuring
force. A probe displacement at this time is detected by the encoder
225 provided to each axis of the coordinate driving mechanism 220.
The detected value is analyzed by the profile analysis unit 320,
and the result of the analysis is displayed on the monitor 400.
[0079] According to the present invention having the above
configuration, the following effects are produced.
[0080] (1) The image data processing unit 350 is capable of
processing the image data taken by the camera 230, and an image of
the probe 210, for example, is processed to be enlarged, which
makes it possible to clearly display an image of the probe 210 on
the monitor 400. Therefore, the user can clearly observe the probe
210 with reference to the display on the monitor 400, and can bring
the probe 210 to the measuring point with accuracy.
[0081] (2) The display on the monitor 400 can be adjusted through
image processing performed by the image data processing unit 350 so
as to, for example, magnify the probe 210 to be displayed.
Therefore, the camera itself only needs to be provided with a lens
system of a low magnification. The wider view field of the camera
is obtained with the low-magnification lens system, which makes it
easy to bring the probe 210 into a view field of the camera.
[0082] Therefore, there is no need to make a fine adjustment to the
positional relationship between the probe 210 and the camera 230
such that the probe 210 enters the view field of the camera 230.
Accordingly, the operation is simplified to save work in the
measurement, leading to an increase in measuring efficiency.
[0083] (3) It is possible to display the probe 210 in the center of
the display screen through input operation of the mouse 500 for
inputting the movement instruction. At this time, a mere operation
through the mouse 500 is necessary, and there is no need to make an
adjustment to the positional relationship between the probe 210 and
the camera 230 such that the probe 210 actually comes into the
center of the view field of the camera 230. Accordingly, it is
possible to save work in the measurement to thereby increase
measuring efficiency.
[0084] (4) It is possible to magnify the probe 210 to be displayed
on the monitor 400 by inputting an enlargement instruction through
the mouse 500. Accordingly, the lens system provided to the camera
itself may be of a low magnification. The wider view field of the
camera 230 is obtained with the low magnification of the lens
system, which makes it easy to bring the probe 210 into the view
field. Further, with the wider view field, it is possible to bring
the probe 210 into the view field by merely fixing the camera
toward the probe 210, and there is no need to make a fine
adjustment to the positional relationship between the probe 210 and
the camera 230. Therefore, work in the measurement can be saved to
thereby increase measuring efficiency.
[0085] It should be noted that the present invention is not limited
to the foregoing embodiment, and any modification or improvement
made within a scope of achieving the object of the present
invention is encompassed by the present invention.
[0086] The probe 210 only needs to have a configuration capable of
detecting the object surface to be measured through contacting with
or in the proximity of the measuring point of the object to be
measured. For example, the probe 210 is not limited to a vibration
touch probe, and may also be a scanning probe that performs
scanning measurement by holding the stylus 211 in such a manner
that the stylus 211 can be slightly displaced three-dimensionally,
while keeping the slight displacement of the stylus 211 constant to
thereby maintain a constant measuring force. Alternatively, the
probe 210 may be a non-contact probe for detecting the object
surface to be measured without contacting therewith. Examples of
the non-contact probe include a cantilever for detecting a tunnel
current used in an atomic force microscope (AFM), and an
electrostatic capacitance probe.
[0087] The image data processing includes geometrical processing
such as moving, scaling, rotation, and distortion correction.
Further, it is also possible to provide such functions as a contrast
adjustment function, a sharpening function, and a filtering function
such as edge extraction.
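One of the filtering functions mentioned above, edge extraction, can be sketched with a simple difference of neighbouring pixels in one image row; contrast adjustment and sharpening would follow the same per-pixel pattern. This is purely illustrative and not the application's method.

```python
# Illustrative one-dimensional edge extraction: the absolute
# difference between adjacent pixels in an image row is large at
# edges and zero in flat regions.

def edge_extract_row(row):
    """Absolute difference between adjacent pixels in one image row."""
    return [abs(b - a) for a, b in zip(row, row[1:])]

# A row with two edges (10 -> 50 and 50 -> 10):
edges = edge_extract_row([10, 10, 50, 50, 10])
```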
[0088] There has been explained the case of displaying the contact
portion 212 of the probe 210 in the center of the screen, by taking
an example where the region is first selected through operation of
the mouse and the selected region is then dragged to a
predetermined position. Alternatively, for example, the user may
bring a mouse pointer to the contact portion 212 of the probe 210
displayed on the screen so as to indicate the contact portion 212
of the probe 210 and click the mouse, so that the image processing
unit 330 may automatically recognize the position of the probe
contact portion 212 on the display screen to move the region of a
predetermined area to substantially the center of the monitor
screen.
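The click-to-center alternative described above can be sketched as follows: from the clicked point, take a region of a predetermined size around the probe contact portion and compute where it should be moved so that it lands at the screen center. All names and the coordinate convention are illustrative assumptions.

```python
# Sketch of the alternative operation: the image processing unit 330
# recognizes the clicked position of the contact portion 212 and
# moves a region of a predetermined area to the screen center.

def region_around_click(click_x, click_y, size, screen_w, screen_h):
    # region of predetermined size centered on the clicked point;
    # clamp so the region stays on screen
    left = max(0, min(click_x - size // 2, screen_w - size))
    top = max(0, min(click_y - size // 2, screen_h - size))
    # destination placing the region at the screen center
    dest_left = (screen_w - size) // 2
    dest_top = (screen_h - size) // 2
    return (left, top), (dest_left, dest_top)

# Click near the top-right corner of a 640 x 480 display:
src, dest = region_around_click(600, 50, size=100, screen_w=640, screen_h=480)
```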
[0089] In the above embodiment, there has been explained a case
where the user selects the region and gives an enlargement
instruction to thereby magnify the measuring point for display. It
is also possible, for example, that the coordinate of the measuring
point is inputted in advance, and the measuring point is
automatically enlarged when the probe comes close to the measuring
point.
[0090] The mouse 500 has been explained as an example of an
operating device. Such an operating device
may be a keyboard, or anything that is capable of inputting an
image processing instruction to the image data processing unit.
With the keyboard, the selected region or the movement destination
of the selected region may be designated through input of the
coordinate values, and a magnification of the selected region may
be instructed by numeric values through input of numbers.
[0091] The magnification of the lens system of the camera (i.e.,
the image taking device) may be variable.
[0092] The camera may be provided to the bridge-type frame or on
the stage.
[0093] Further, in the above embodiment, there has been explained a
case where the camera is fixed. Alternatively, the camera may be
capable of being moved by a three-axis-orthogonal driving mechanism
in the directions of three orthogonal axes, namely, an X-axis, a
Y-axis, and a Z-axis.
[0094] Also, a cursor (for example, a cross cursor) may be
displayed on the monitor.
[0095] Further, the cursor may be used as a reference to adjust a
posture (such as a vertical posture or a horizontal posture) of the
probe.
[0096] Further, the monitor may be provided to a position apart
from the measuring device main body, and the measuring device main
body may be made remotely controllable to perform remote
measurement.
[0097] The priority application, number JP2005-304338, upon which
this patent application is based, is hereby incorporated by
reference.
* * * * *