JOHT 18: Measurement
Jun. 29th, 2020 09:16 am
The Joy of High Tech
by
Rodford Edmiston
Being the occasionally interesting ramblings of a major-league technophile.
This has been updated, to correct the materials used for gauge blocks.
Please note that while I am an engineer (BSCE) and do my research, I am not a professional in this field. Do not take anything here as gospel; check the facts I give. And if you find a mistake, please let me know about it.
Measurement
One of the greatest accomplishments of human technology is our skill with measurement. This is actually not a new development. For all of recorded history humans have been pushing the limits of the available tools and techniques in order to achieve better measurements. As just one example - though a well known, sensational one - the foundation for the Great Pyramid was leveled and the cornerstones laid with astounding accuracy and precision.
That last sentence, by the way, brings up an important point. Accuracy is not the same thing as precision. Put one way, accuracy is how close to right a value is, while precision is how many decimal places it has. Put another, if the local time is 12:02 PM, and someone tells you "It's about Noon," that statement has good accuracy but poor precision. If they tell you "It's 7:53:02" then they are providing great precision but poor accuracy. Consistency is also a part of accuracy; that is, how consistent is a series of measurements of the same thing. A measuring tool which gives widely different answers to the same question about the same object under the same conditions is generally useless.
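The distinction is easy to make concrete in code. Here is a short Python sketch, using invented readings of a hypothetical 100 mm reference bar, that treats bias (closeness to the true value) as accuracy and standard deviation (repeatability) as precision:

```python
import statistics

true_value = 100.0  # mm, the known length of a hypothetical reference bar

accurate_but_imprecise = [99.2, 101.1, 98.8, 100.9, 100.0]          # scattered, centered on truth
precise_but_inaccurate = [102.31, 102.30, 102.32, 102.31, 102.30]   # tight, but consistently offset

for name, readings in [("accurate/imprecise", accurate_but_imprecise),
                       ("precise/inaccurate", precise_but_inaccurate)]:
    bias = statistics.mean(readings) - true_value   # accuracy: how close to right
    spread = statistics.stdev(readings)             # precision: how repeatable
    print(f"{name}: bias = {bias:+.2f} mm, stdev = {spread:.2f} mm")
```

The first set averages out to the true value but scatters widely; the second repeats beautifully but is consistently wrong, like the "7:53:02" answer above.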
Accuracy and precision both can be evaluated as fractions of the whole, such as parts in a thousand. Which means the scale of the measurement is important. Over a distance of several kilometers, being within a few centimeters (that is, to within a few parts in a hundred thousand) is pretty good accuracy. Being half a millimeter off in placing a transistor on a tiny integrated circuit chip isn't. For lengths of multiple kilometers, rounding the measurement to a whole meter is probably precise enough. On the other hand, if you are referring to a diffraction grating, even if the measurements are in centimeters you better have a lot of decimal places.
Achieving high accuracy and precision in measuring long distances is surprisingly easy, though labor intensive. With a basic knowledge of geometry, a few simple tools and a reasonable amount of repetition, surveyors can be accurate to within one part in 20,000 or so. Professional surveyors with theodolites and calibrated measuring tapes can place benchmarks to an accuracy of about one part in 60,000. This sort of accuracy astounds people not familiar with surveying. They often protest that (for example) the ancient Egyptians couldn't have surveyed their monuments so accurately with the tools they had. (Heinlein plays on this in Farnham's Freehold, when the people of a technologically advanced future society express skepticism over the titular character's information on how accurately USGS benchmarks are placed.)
The secret - which anyone who has taken a surveying course knows - is repetition. You survey a point between known benchmarks and tie into each of them, or come back to your starting point, find your accumulated error and calculate how much you are off. If the error is too great, you re-survey, going the other direction. Using these methods, even college students in a hurry to complete a surveying camp project with beat-up old tools over rough, wooded terrain can easily exceed an accuracy of one part in 10,000.
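The arithmetic of a closure check is simple enough to sketch in a few lines of Python. The traverse legs below are invented numbers, not real survey data; a closed traverse should sum to zero, and whatever is left over is the closure error, quoted as "one part in N" of the distance traversed:

```python
import math

# Hypothetical closed traverse: (easting, northing) offsets of each leg, in meters.
# A perfect survey would sum back to (0, 0) at the starting point.
legs = [(120.01, 0.02), (-0.03, 85.00), (-119.99, 0.01), (0.02, -85.01)]

perimeter = sum(math.hypot(de, dn) for de, dn in legs)
closure_e = sum(de for de, dn in legs)
closure_n = sum(dn for de, dn in legs)
closure_error = math.hypot(closure_e, closure_n)

# Relative accuracy expressed the way surveyors quote it: one part in N.
one_part_in = perimeter / closure_error
print(f"closure error: {closure_error * 1000:.0f} mm over {perimeter:.0f} m "
      f"(1 part in {one_part_in:,.0f})")
```

A result like "1 part in 18,000" would pass the college-student standard mentioned above; if the ratio came out worse than the required accuracy, the crew would re-survey in the other direction.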
Above I mentioned calibration. To calibrate, you must have a standard to calibrate against. Any competent engineer will tell you that if a standard exists, even an arbitrary one, you should use it, unless you have a very good reason not to. Historically, standards of length were often based on the length of some body part of a monarch. (Now stop that...) The English foot, for instance, is based on the length of the foot of a monarch who happened to have very large pedal extremities. The French foot was even longer (which causes a persistent misunderstanding when people use the French units for Napoleon's height). Weight standards were often such things as a large number of a specific type of small seed. (Using a large number of items this way means having a good chance of the numbers of unusually large and small ones evening out, which makes the result more accurate and reproducible.) Volume measures were generally based on one or both of the above.
Ancient Egyptian monuments often feature a portrayal of linear measurement standards. A relief carving of a pharaoh or god may be shown with arms spread, and lines crossing the body at various points to indicate the measurement standards. (Often the crossing lines are omitted, since the meaning was presumably understood.) The cubit, the hand, the finger and other Egyptian measures were surprisingly constant down through the ages, at least within the same region. That is, once later humans understood there were often more than one of each type in use at the same time in the same area! (The ancient Egyptian royal and common cubits, for instance, were different, and used for different purposes.)
Volume standards are a little trickier, but have been found. Weight standards (artificial ones, as opposed to the natural ones mentioned above) are perhaps the most durable, and the easiest to produce with reasonable accuracy and precision. Scale weights are found in many archeological digs, and those from within a particular region and era are surprisingly consistent.
Simple balance beam scales can be surprisingly accurate and precise. I reload much of my own ammunition. A few years ago I decided to have my powder scale calibrated at the local office of weights and measures. This scale was inexpensive, around $40, and made mostly of plastic and stamped sheet steel. Yet it was so accurate and precise that the metrologist (not meteorologist) who performed the calibration was impressed. When you consider that a difference of a tenth of a grain of powder (a grain being 1/7000th of a pound, or 0.0648 gram) can cause a change in bullet impact of well over a centimeter at 100 meters, you can see that these scales must be both accurate and precise.
Standards for other types of measurement also exist in history, but are generally more subjective. A touchstone is a rock on which a piece of gold or silver is rubbed. The color of the mark left gives an accurate estimate of the actual precious metal content of the object, but a trained eye is needed to judge this. Today, of course, there are several accurate, repeatable methods of analyzing alloys, many of them non-destructive.
Measurement techniques stayed pretty much the same for thousands of years. Oh, there were some attempts to refine measuring methods, or to create standards based on nature. Mechanical clocks, for instance, simply divide time into smaller increments than the sundial or hourglass can, and parcel it out more uniformly. Twelve-hour days date back to ancient Sumeria. With the invention of the thermometer came the first tool specifically intended to measure temperature, and the first new concept of what could be measured in millennia. (Galileo had a simple thermometer around 1593, but it took until 1714 for Daniel Fahrenheit to make the first practical device.)
Once the idea of measuring temperature escaped into the scientific community, several people began developing standards of measurement at the same time. This is why we have the Centigrade/Celsius and Fahrenheit scales. Only later, when molecular theory was developed, was the concept of an absolute standard - absolute zero, the lowest possible temperature - for temperature developed. Today most physical scientists use the Kelvin scale for temperature measurement. This starts at absolute zero and goes up, using degrees the same size as those in the Centigrade system. The standard freezing point of water - zero on the Centigrade scale - is 273.15 kelvins.
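Since the degree sizes match, the conversion is just a fixed offset; a minimal Python sketch:

```python
def c_to_k(celsius: float) -> float:
    """Celsius to kelvin: same degree size, shifted so zero is absolute zero."""
    return celsius + 273.15

print(c_to_k(0.0))      # freezing point of water
print(c_to_k(-273.15))  # absolute zero
```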
The French Revolution also brought a revolution in the science of measurement. The new French government wanted to make as clean a break from the past as possible. Part of this included getting rid of multiple, usually regional, values for things like the toise measure of length, so everyone would be measuring the same way. The findings of the scientists who accompanied Napoleon to Egypt also stimulated science as a whole. New ways of doing things - scientific ways - were developed, and old ways were reinvestigated and revised. Wherever possible, the resulting systems of measurement were based on natural phenomena. They did a pretty good job, even if they did get the size of the Earth wrong. About the only system the French developed which is not still widely used is their calendar. (Why the United States and Britain don't exclusively use the metric system is beyond me. It's especially useful in science. Space probes have been lost because of conversion errors between metric and Imperial measurements.)
With modern equipment, measurements can be made quickly, precisely and accurately. Electronics aren't necessarily required, either. Gauge blocks - developed over a century ago - are still used today, with the Johansson gauge perhaps being the best known of these. The basic form and accuracy of gauge blocks have not changed notably in many decades. (Note that Cadillac and Ford both used Johansson gauges early on to make standardization of parts easier. With an accurate, precise and consistent means of measuring, making each bore hole or wheel rim the same size was much easier. This is where Eli Whitney failed in his attempts at doing the same thing with firearms decades earlier. He couldn't measure accurately and consistently enough.)
To measure with a gauge block you simply select a combination of blocks (using the fewest in combination you can, in a process called stacking) which adds up to the length you want, "wring" the blocks by rubbing the appropriate ends against each other to squeeze the air out from between them, and compare the resulting reference length to the target. One interesting feature of gauge blocks is that the surfaces are so smooth and flat that once they are mated in this way they stay together unless considerable effort is used to part them. The combination of the adhesive action of the ultra-thin film of preservative oil or atmospheric moisture on the blocks and the molecular attraction, or bonding, between the very flat and parallel mating surfaces, will actually hold them together.
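Choosing a stack can be treated as a small search problem: find the fewest blocks that sum to the target length. The Python sketch below brute-forces this over a tiny, hypothetical subset of an inch block set (a real 81-piece set has many more sizes, and machinists use a faster digit-by-digit selection method rather than exhaustive search):

```python
from itertools import combinations

# A tiny, hypothetical subset of an inch gauge-block set; values in inches.
blocks = [0.1001, 0.1007, 0.107, 0.117, 0.100, 0.300, 1.000, 2.000]

def stack(target, blocks, max_blocks=4):
    """Find the fewest blocks (each used at most once) that wring up to `target`."""
    for n in range(1, max_blocks + 1):          # try small stacks first
        for combo in combinations(blocks, n):
            if abs(sum(combo) - target) < 1e-9:  # tolerate floating-point error
                return combo
    return None

print(stack(3.2171, blocks))
```

Run against this toy set, a target of 3.2171 inches needs four blocks; the ten-thousandths block (0.1001) handles the last digit, just as in a real stacking procedure.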
There are several classifications of gauge blocks, depending on how precise and accurate they need to be. A laboratory or master set is typically accurate to within 0.000002 inch (one part in half a million for a one-inch block). These are normally used only in temperature-controlled labs as references to compare or check the accuracy of other gauges. Next come the inspection sets, accurate to within +0.000004/-0.000002 inch, and used to inspect the accuracy of working sets. The working sets are accurate to +0.000006/-0.000002 inch. These are used in shops for machine tool setups, layout work and measurement, and to calibrate adjustable gauges, such as micrometers and calipers.
Most gauge blocks are made of tool steel, chromium carbide (Croblox) or tungsten carbide. These days ceramic gauge blocks are becoming popular.* Both temperature stability and wear resistance are very important.
Note my comment above about temperature-controlled labs. The dimensions of an object change with its temperature. Normally, temperature variations don't significantly alter measurements, but when measuring to high accuracy and precision in a true metrology laboratory, the temperature of the room, and all its contents, is set at 20 degrees C and held to within 0.25 degrees C. Ted Doiron, a physicist in the precision engineering division of the National Institute of Standards and Technology in Gaithersburg, MD, often referred to something he called Doiron's Law of Dimensional Metrology: "The guy with the best thermostat wins!" It's simple, but very true: when measuring in millionths, temperature is everything.
Gauge blocks are still the industry-standard length masters. They are used daily in a broad spectrum of applications, from measuring parts to relatively loose tolerances on the factory floor to measuring to a few parts in a million in an environmentally controlled metrology laboratory.
Once you're familiar with stacking and wringing, there are three gauge-block preparation steps you should take each time you are going to make a measurement.
Keep the blocks clean and demagnetized. A gauge block that has grease and grime on it will be inaccurate, and a magnetic charge will make a block more likely to pick up contaminants. Most block cleaning jobs can be accomplished by wiping each block with a soft, lint-free cloth moistened with mineral spirits. You must also demagnetize all blocks which can retain a magnetic field. Good electronic demagnetizers and gauss gauges are common catalog items today.
Eliminate nicks and burrs. Gauge blocks require good overall geometry to measure accurately. Nicked, burred, or scratched measuring faces will not wring together well and will most likely provide anomalous readings.
The recommended method for deburring the measuring faces of steel gauge blocks is to lightly "swipe" them along the face of a clean, flat, serrated, Arkansas or granite stone, while using clean mineral spirits as a carrier. This procedure, if performed correctly, will not harm the quality or integrity of the gauge blocks. Note that different block materials will require other types of deburring stones.
Maintain temperature. Varying environmental temperatures affect material size, especially when using two different materials in your application. For example, if the gauge blocks are steel and the part is aluminum, thermal expansion/contraction causes the two materials to change size at different rates. Measuring at different temperatures will therefore give different values. If the temperature is fluctuating as well, the problem is compounded.
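The size of the effect is easy to estimate from the linear expansion formula, dL = alpha x L x dT. A Python sketch using typical handbook coefficients (exact values vary by alloy, so treat these as illustrative):

```python
# Approximate linear thermal expansion coefficients, per degree C.
ALPHA_STEEL = 11.5e-6
ALPHA_ALUMINUM = 23.0e-6   # roughly double that of steel

length = 100.0   # mm, nominal length of block and part at the 20 C reference
delta_t = 5.0    # degrees C of warming above the reference

growth_steel = ALPHA_STEEL * length * delta_t        # mm
growth_aluminum = ALPHA_ALUMINUM * length * delta_t  # mm

# The apparent measurement error is the differential growth between materials.
print(f"steel grows {growth_steel * 1000:.2f} um, aluminum {growth_aluminum * 1000:.2f} um")
print(f"apparent error: {(growth_aluminum - growth_steel) * 1000:.2f} um")
```

A mere 5 degrees of warming opens up a difference of several micrometers over a 100 mm part, which is enormous by gauge-block standards.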
Use of gauge blocks in the temperature-controlled atmosphere of the metrology laboratory yields the most reliable measurements. However, proper handling can lead to accurate use on the plant floor as well.
Measurements, as mentioned above, are interrelated. Mass depends on volume and density. Measuring volume depends on measuring length. Measuring temperature depends on measuring changes in length or volume or conductivity. And so on. At some point, there have to be standards for one or more types of measurement. Throughout most of history, measurement standards were arbitrary and unrepeatable, also as mentioned above. In some cases a master unit was created, such as the platinum alloy bar which for over a century was the meter, or another one which was the kilogram. However, in our modern era some standard besides an arbitrary lump of metal is desired. Today, the meter is defined as the distance light travels in a vacuum in a specific fraction of a second. The time standard is a specific number of oscillations of the radiation from a particular atomic transition in cesium-133. The kilogram is determined in terms of the Planck constant. (Remember my comment above about how using a large number of small things in a standard makes the average more accurate and reproducible?)
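Two of those definitions boil down to fixed integer counts, which a few lines of Python can make concrete. (The constants below are exact by definition in the SI: the second is 9,192,631,770 periods of the cesium-133 hyperfine transition radiation, and the meter is the distance light travels in 1/299,792,458 of a second.)

```python
# Exact defining constants of the SI second and meter.
CS133_HZ = 9_192_631_770    # cesium-133 hyperfine transition frequency, Hz (exact)
C_M_PER_S = 299_792_458     # speed of light in vacuum, m/s (exact)

# One second is exactly this many cesium oscillation periods:
periods_per_second = CS133_HZ

# The meter follows from the second: light crosses one meter in 1/c seconds.
meter_travel_time = 1 / C_M_PER_S   # seconds

print(f"one second = {periods_per_second:,} cesium periods")
print(f"light crosses one meter in {meter_travel_time * 1e9:.3f} ns")
```

Counting billions of identical oscillations is the modern version of counting seeds: the sheer number of events averages away the noise in any single one.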
Using these standards has revolutionized physics, both large scale and small. You may not think that a researcher being able to verify results to one part in a few trillion could possibly have any impact on everyday life, but it does, if indirectly. Building a gigahertz processor for a computer requires an astoundingly high level of accuracy and precision in both the design and the manufacturing of the chip. Yet we not only achieve this goal, we do it repeatedly, for millions upon millions of identical chips.
Today we can make instruments of fantastic accuracy and precision. A project I read about recently (and describe in more detail in another JOHT column) involves launching a satellite to orbit the Earth and measure frame dragging. That is, the infinitesimal effect our planet's rotating mass produces on the structure of space around it. For a long time most scientists thought this subtle effect would have to be measured indirectly, by making observations of matter falling into a black hole. However, a few individuals thought otherwise, and after decades of work have built a space-rated clock and other instruments sensitive enough to perform the task with the relatively paltry mass of the Earth.
One of the most valuable skills an engineer can have is knowing when to stop. With measurement, that means knowing how many decimal places is enough. For large structures - an office building or a bridge - half a centimeter may be close enough. For drinking water, a few parts per million or billion, depending on the contaminant. For nuclear power plants, a few parts per billion. Not enough decimal places, and the building may fall apart, someone get sick, or the reactor melt down. Too many and the bid will be too high. Either way, the engineer is out of work.
However, once the number of decimal places required is determined, measurements must still be made to at least that standard. (Remember: measure twice, cut once.) So let's keep in mind the masters of measurement, the metrologists, and how important they are not just to us, but to civilization throughout history.
*My thanks to Tom Lipton for his corrections on the materials used for gauge blocks.
no subject
Date: 2020-06-29 06:37 pm (UTC)
They arrived, carefully packed in a vacuum sealed case. Whereupon someone in receiving, noting how expensive they were, opened said case and proceeded to drill holes in them to attach metal asset tags. *megacringe*
Needless to say procedures were changed.
Ah well, as the ancient joke goes: measure with micrometer, mark with chalk, cut with axe