A micrometer (/maɪˈkrɒmɨtər/ US dict: mī·krŏm′·ĭ·tər), sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw widely used for precise measurement of components[1] in mechanical engineering and machining as well as most mechanical trades, along with other metrological instruments such as dial, vernier, and digital calipers. Micrometers are usually, but not always, in the form of calipers (opposing ends joined by a frame), which is why micrometer caliper is another common name. The spindle is a very accurately machined screw and the object to be measured is placed between the spindle and the anvil. The spindle is moved by turning the ratchet knob or thimble until the object to be measured is lightly touched by both the spindle and the anvil.
Micrometers are also used in telescopes or microscopes to measure the apparent diameter of celestial bodies or microscopic objects. The micrometer used with a telescope was invented about 1638 by William Gascoigne, an English astronomer.
Colloquially the word micrometer is often shortened to mike or mic (/ˈmaɪk/) (US dict: mīk′).
The word micrometer is a neoclassical coinage from Greek micros, meaning "small", and metron, meaning "measure". The Merriam-Webster Collegiate Dictionary[2] says that English got it from French and that its first known appearance in English writing was in 1670. Neither the metre nor the micrometre nor the micrometer (device) as we know them today existed at that time. However, the people of that time did have much need for, and interest in, the ability to measure small things and small differences. The word was no doubt coined in reference to this endeavor, even if it did not refer specifically to its present-day senses.
The first ever micrometric screw was invented by William Gascoigne in the 17th century, as an enhancement of the vernier; it was used in a telescope to measure angular distances between stars and the relative sizes of celestial objects.
Henry Maudslay built a bench micrometer in the early 19th century that was jocularly nicknamed "the Lord Chancellor" among his staff because it was the final judge on measurement accuracy and precision in the firm's work.
The first documented development of handheld micrometer-screw calipers was by Jean Laurent Palmer of Paris in 1848;[3] the device is therefore often called palmer in French, and tornillo de Palmer ("Palmer screw") in Spanish. (Those languages also use the micrometer cognates: micromètre, micrómetro.) The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867,[4] allowing the penetration of the instrument's use into the average machine shop. Brown & Sharpe were inspired by several earlier devices, one of them being Palmer's design. In 1888 Edward Williams Morley added to the precision of micrometric measurements and proved their accuracy in a complex series of experiments.
The culture of toolroom accuracy and precision, which started with interchangeability pioneers including Gribeauval, Tousard, North, Hall, Whitney, and Colt, and continued through leaders such as Maudslay, Palmer, Whitworth, Brown, Sharpe, Pratt, Whitney, Leland, and others, grew during the Machine Age to become an important part of combining applied science with technology. Beginning in the early 20th century, one could no longer truly master tool and die making, machine tool building, or engineering without some knowledge of the science of metrology, as well as the sciences of chemistry and physics (for metallurgy, kinematics/dynamics, and quality).
The three most common types of micrometer are named for their application: the outside micrometer (used to measure wires, spheres, shafts, and blocks), the inside micrometer (used to measure the diameter of holes), and the depth micrometer (used to measure the depths of slots and steps).
Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular measuring tasks. For example, the anvil may be shaped in the form of a segment of screw thread, in the form of a v-block, or in the form of a large disc.
Micrometers use the principle of a screw to amplify small distances (that are too small to measure directly) into large rotations of the screw that are big enough to read from a scale. The accuracy of a micrometer derives from the accuracy of the thread-forms that are at its heart. In some cases it is a differential screw. The basic operating principle is that the rotation of an accurately made screw can be directly and precisely correlated to axial movement through the screw's lead, the distance the screw advances axially in one complete turn; with a suitable lead and major diameter, a small axial movement becomes a much larger, easily read circumferential movement.
For example, if the lead of a screw is 1 mm, but the major diameter (here, outer diameter) is 10 mm, then the circumference of the screw is 10π, or about 31.4 mm. Therefore, an axial movement of 1 mm is amplified (magnified) to a circumferential movement of 31.4 mm. This amplification allows a small difference in the sizes of two similar measured objects to correlate to a larger difference in the position of a micrometer's thimble. In some micrometers, even greater accuracy is obtained by using a differential screw adjuster to move the thimble in much smaller increments than a single thread would allow.[5][6][7]
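As a concrete check of this arithmetic, the following Python sketch computes the amplification for the example values above; the variable names are ours, chosen for illustration.

```python
import math

# Screw-amplification arithmetic from the example above:
# a 1 mm lead on a 10 mm major-diameter screw.
lead_mm = 1.0             # axial travel per full turn of the screw
major_diameter_mm = 10.0  # outer diameter of the screw

circumference_mm = math.pi * major_diameter_mm  # ~31.4 mm
amplification = circumference_mm / lead_mm      # ~31.4x

print(f"1 mm of axial travel maps to {circumference_mm:.1f} mm around the thimble")
print(f"amplification factor: {amplification:.1f}x")
```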
In classic-style analog micrometers, the position of the thimble is read directly from scale markings on the thimble and shaft. A vernier scale is often included, which allows the position to be read to a fraction of the smallest scale mark. In digital micrometers, an electronic readout displays the length digitally on an LCD display on the instrument. There also exist mechanical-digit versions, like the style of car odometers where the numbers "roll over".
A micrometer is composed of a frame (the C-shaped body holding the anvil and barrel), an anvil (the stationary measuring face), a spindle (the moving measuring face, the end of the screw), a sleeve or barrel (the stationary cylinder carrying the linear scale), a lock nut or lever (which holds the spindle stationary), a thimble (the rotating cylinder, carrying the circumferential scale, that turns the screw), and often a ratchet stop (which limits the applied torque).
The spindle of an imperial micrometer has 40 threads per inch, so that one turn moves the spindle axially 0.025 inch (1 ÷ 40 = 0.025), equal to the distance between two graduations on the frame. The 25 graduations on the thimble allow the 0.025 inch to be further divided, so that turning the thimble through one division moves the spindle axially 0.001 inch (0.025 ÷ 25 = 0.001). Thus, the reading is given by the number of whole divisions that are visible on the scale of the frame, multiplied by 25 (the number of thousandths of an inch that each division represents), plus the number of that division on the thimble which coincides with the axial zero line on the frame. The result will be the diameter expressed in thousandths of an inch. As the numbers 1, 2, 3, etc., appear below every fourth sub-division on the frame, indicating hundreds of thousandths, the reading can easily be taken.
Suppose the thimble were screwed out so that graduation 2, and three additional sub-divisions, were visible, and that graduation 1 on the thimble coincided with the axial line on the frame. The reading would then be 0.200 + 0.075 + 0.001 = 0.276 inch.
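The reading rule can be expressed as a short calculation. The following Python sketch simply encodes the 0.100 inch numbered graduations, 0.025 inch subdivisions, and 0.001 inch thimble divisions described above; the function and parameter names are ours.

```python
def imperial_reading(numbered_graduation, extra_subdivisions, thimble_division):
    """Reading in inches for a standard imperial micrometer.

    numbered_graduation: last visible numbered mark (each worth 0.100 inch)
    extra_subdivisions:  additional visible marks past it (each 0.025 inch)
    thimble_division:    thimble mark on the axial line (each 0.001 inch)
    """
    return (numbered_graduation * 0.100
            + extra_subdivisions * 0.025
            + thimble_division * 0.001)

# The worked example: graduation 2, three subdivisions, thimble at 1.
print(f"{imperial_reading(2, 3, 1):.3f} inch")  # 0.276 inch
```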
The spindle of an ordinary metric micrometer has 2 threads per millimetre, and thus one complete revolution moves the spindle through a distance of 0.5 millimetre. The longitudinal line on the frame is graduated with 1 millimetre divisions and 0.5 millimetre subdivisions. The thimble has 50 graduations, each being 0.01 millimetre (one-hundredth of a millimetre). Thus, the reading is given by the number of millimetre divisions visible on the scale of the sleeve, plus any visible 0.5 millimetre subdivision, plus the particular division on the thimble which coincides with the axial line on the sleeve.
Suppose that the thimble were screwed out so that graduation 5, and one additional 0.5 subdivision, were visible, and that graduation 28 on the thimble coincided with the axial line on the sleeve. The reading then would be 5.00 + 0.5 + 0.28 = 5.78 mm.
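The metric rule is analogous; a minimal Python sketch (again with names of our choosing) reproduces the worked example.

```python
def metric_reading(whole_mm, half_mm_marks, thimble_division):
    """Reading in millimetres for a standard metric micrometer.

    whole_mm:         whole-millimetre graduations visible on the sleeve
    half_mm_marks:    additional 0.5 mm subdivisions visible (0 or 1)
    thimble_division: thimble mark on the axial line (each 0.01 mm)
    """
    return whole_mm + half_mm_marks * 0.5 + thimble_division * 0.01

# The worked example: graduation 5, one 0.5 mm subdivision, thimble at 28.
print(f"{metric_reading(5, 1, 28):.2f} mm")  # 5.78 mm
```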
Some micrometers are provided with a vernier scale on the sleeve in addition to the regular graduations. These permit measurements within 0.001 millimetre to be made on metric micrometers, or 0.0001 inches on inch-system micrometers.
The additional digit of these micrometers is obtained by finding the line on the sleeve vernier scale which exactly coincides with one on the thimble. The number of this coinciding vernier line represents the additional digit.
Thus, the reading for metric micrometers of this type is the number of whole millimetres (if any) and the number of hundredths of a millimetre, as with an ordinary micrometer, and the number of thousandths of a millimetre given by the coinciding vernier line on the sleeve vernier scale.
For example, a measurement of 5.783 millimetres would be obtained by reading 5.5 millimetres on the sleeve, adding 0.28 millimetre as determined by the thimble, and then using the vernier to read the final 0.003 millimetre.
Inch micrometers are read in a similar fashion.
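Assuming the metric scheme above, the coinciding vernier line simply contributes thousandths of a millimetre on top of the ordinary reading; a minimal sketch reproducing the 5.783 mm example (function and parameter names are ours):

```python
def metric_vernier_reading(whole_mm, half_mm_marks, thimble_division, vernier_line):
    """Metric vernier micrometer: the coinciding vernier line adds 0.001 mm steps."""
    return (whole_mm
            + half_mm_marks * 0.5
            + thimble_division * 0.01
            + vernier_line * 0.001)

# 5.5 mm on the sleeve, 0.28 mm on the thimble, vernier line 3 coinciding.
print(f"{metric_vernier_reading(5, 1, 28, 3):.3f} mm")  # 5.783 mm
```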
Note: 0.01 millimetre = 0.000393 inch, and 0.002 millimetre = 0.000078 inch (78 millionths) or alternatively, 0.0001 inch = 0.00254 millimetres. Therefore, metric micrometers provide smaller measuring increments than comparable inch unit micrometers—the smallest graduation of an ordinary inch reading micrometer is 0.001 inch; the vernier type has graduations down to 0.0001 inch (0.00254 mm). When using either a metric or inch micrometer, without a vernier, smaller readings than those graduated may of course be obtained by visual interpolation between graduations.
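These equivalences are plain unit arithmetic and can be checked in a few lines of Python; the helper name is ours.

```python
MM_PER_INCH = 25.4  # exact, by definition

def mm_to_inch(mm):
    return mm / MM_PER_INCH

print(f"0.01 mm   = {mm_to_inch(0.01):.7f} in")        # ~0.0003937 in
print(f"0.002 mm  = {mm_to_inch(0.002):.7f} in")       # ~0.0000787 in (about 78 millionths)
print(f"0.0001 in = {0.0001 * MM_PER_INCH:.5f} mm")    # 0.00254 mm
```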
A micrometer reading is not accurate if the thimble is overtorqued. A useful feature of many micrometers is the inclusion of a torque-limiting device on the thimble—either a spring-loaded ratchet or a friction sleeve. Without this device, workers may overtighten the micrometer on the work, causing the mechanical advantage of the screw to squeeze the material or tighten the screw threads, giving an inaccurate measurement. However, with a thimble that will ratchet or friction slip at a certain torque, the micrometer will not continue to advance once sufficient resistance is encountered. This results in greater accuracy and repeatability of measurements—most especially for low-skilled or semi-skilled workers, who may not have developed the light, consistent touch of a skilled user.
A standard one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of ±0.0001 inch[8] ("one tenth", in machinist parlance). Both the measuring instrument and the object being measured should be at room temperature for an accurate measurement; dirt, abuse, and low operator skill are the main sources of error.[9]
The accuracy of micrometers is checked by using them to measure gauge blocks,[10] rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.7500" ± 0.00005" ("seven-fifty plus or minus fifty millionths", that is, "seven hundred fifty thou plus or minus half a tenth"), then the micrometer should measure it as 0.7500". If the micrometer measures 0.7503", then it is out of calibration. Cleanliness and low torque are especially important when calibrating, because each tenth (that is, ten-thousandth of an inch), or hundredth of a millimetre, counts. A mere speck of dirt, or a bit too much squeeze, obscures whether the instrument is able to read correctly. The solution is simply conscientiousness: cleaning, patience, due care and attention, and repeated measurements (good repeatability assures the calibrator that the technique is working correctly).
Calibration typically checks the error at several points (often about five) along the range; only one of them can be adjusted exactly to zero. If the micrometer is in good condition, then they are all so near to zero that the instrument seems to read essentially "dead-on" all along its range; no noticeable error is seen at any locale. In contrast, on a worn-out micrometer (or one that was poorly made to begin with), one can "chase the error up and down the range", that is, move it up or down to any of various locales along the range, by adjusting the barrel, but one cannot eliminate it from all locales at once.
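To illustrate why the error can be moved but not eliminated, consider this sketch; the test points and error values are invented for illustration.

```python
# Hypothetical calibration check at five points along the range.
# Nominal gauge-block lengths (mm) and invented micrometer readings:
nominals = [5.0, 10.0, 15.0, 20.0, 25.0]
readings = [5.002, 10.001, 15.003, 20.005, 25.004]

errors = [r - n for r, n in zip(readings, nominals)]

# Adjusting the barrel subtracts one constant offset from the whole range,
# so only one point can be brought exactly to zero. Zero the first point:
offset = errors[0]
residuals = [e - offset for e in errors]

print("errors before adjustment:   ", [f"{e:+.3f}" for e in errors])
print("errors after zeroing point 1:", [f"{e:+.3f}" for e in residuals])
# On a worn screw the residuals remain nonzero: the error can be moved
# up and down the range, but not eliminated everywhere at once.
```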
Calibration can also include the condition of the tips (flat and parallel), any ratchet, and linearity of the scale.[11] Flatness and parallelism are typically measured with a gauge called an optical flat, a disc of quartz or glass ground with extreme accuracy to have flat, parallel faces, which allows light bands to be counted when the micrometer's anvil and spindle are against it, revealing their amount of geometric inaccuracy.
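Each interference band seen against the flat corresponds to roughly half a wavelength of the light used, so the band count converts directly into an estimate of the geometric error; a minimal sketch, assuming monochromatic light of a stated wavelength:

```python
# Estimating flatness from interference bands seen against an optical flat.
# Assumes monochromatic light; 0.59 micrometre is roughly a helium-lamp
# wavelength, used here purely as an example value.
WAVELENGTH_UM = 0.59

def flatness_um(band_count):
    # each full band corresponds to about half a wavelength of deviation
    return band_count * WAVELENGTH_UM / 2

print(f"2 bands ~ {flatness_um(2):.2f} micrometre of deviation")
```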
Commercial machine shops, especially those that do certain categories of work (military or commercial aerospace, nuclear power industry, and others), are required by various standards organizations (such as ISO, ANSI, ASME, ASTM, SAE, AIA, the U.S. military, and others) to calibrate micrometers and other gauges on a schedule (often annually), to affix a label to each gauge that gives it an ID number and a calibration expiration date, to keep a record of all the gauges by ID number, and to specify in inspection reports which gauge was used for a particular measurement.
Not all calibration is an affair for metrology labs. A micrometer can be calibrated on-site anytime, at least in the most basic and important way (if not comprehensively), by measuring a high-grade gauge block and adjusting to match. Even gauges that are calibrated annually and are within their expiration timeframe should be checked this way every month or two if they are used daily; they will usually check out OK, needing no adjustment.
The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard such as the international prototype metre. This bar of metal, like the international prototype kilogram, is maintained under controlled conditions at the International Bureau of Weights and Measures headquarters in France, which is one of the principal measurement standards laboratories of the world. These master standards have extreme-accuracy regional copies (kept in the national laboratories of various countries, such as NIST), and metrological equipment makes the chain of comparisons. Because the metre is now defined in terms of the speed of light, the international prototype metre is not quite as indispensable as it once was. But such master gauges are still important for calibrating and certifying metrological equipment. Equipment described as "NIST traceable" means that its comparison against master gauges, and their comparison against others, can be traced back through a chain of documentation to equipment in the NIST labs. Maintaining this degree of traceability requires some expense, which is why NIST-traceable equipment is more expensive than non-NIST-traceable equipment. But applications needing the highest degree of quality control mandate the cost.
A micrometer that has been tested and found to be off might be restored to accuracy by recalibration. On most micrometers, a small pin spanner is used to turn the barrel relative to the frame, so that its zero line is repositioned relative to the screw and thimble. (There is usually a small hole on the barrel to accept the spanner's pin.)
This calibration procedure will cancel a zero error: the problem that the micrometer reads nonzero when its jaws are closed.
However, if the error originates from the parts of the micrometer being worn out of shape and size, then restoration of accuracy by this means is not possible; rather, repair (grinding, lapping, or replacing of parts) is required.
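When the zero error is small and known, readings can also simply be corrected arithmetically rather than mechanically; a minimal sketch with an invented error value:

```python
# Correcting raw readings for a known zero error (the value shown when
# the measuring faces are closed). The 0.02 mm figure is invented.
ZERO_ERROR_MM = 0.02

def corrected(reading_mm):
    # subtract the zero error from every raw reading
    return reading_mm - ZERO_ERROR_MM

print(f"{corrected(5.80):.2f} mm")  # prints 5.78 mm
```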