An old, well-worn adage goes, “Get it right the first time,” and in an ideal world, we always would. However, to bandy another cliché, nobody is perfect. So what happens when things go awry? How can we eliminate, or at least attenuate, undesirable outcomes? This is a common dilemma in science, and it applies to other aspects of life as well. At the risk of appearing fanatical with yet another Richard Feynman quote, I’d like to present the following passage:
“We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.
Why didn’t they discover the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number close to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that.”
Feynman makes an interesting point about “how we fool ourselves.” While researching, it’s imperative that we eliminate this possibility. For a few months before I came to UCSB, I performed quality control testing for a pharmaceutical company outside of Sacramento. The plant would send samples to be tested, and we would analyze, for example, the percentage of residual water in a dried pharmaceutical ingredient. If the samples were 0.1% above a specified water content, they would be deemed unfit for consumption. It’s not inconceivable that if I breathed heavily or spat while I spoke, I could raise the water content by 0.1%. Therefore, it was extremely important to be meticulous.
It almost seemed irrational how careful we had to be. Solvents purchased at 99.99% purity were still filtered before use. New pipette tips had to be used, even when the same sample was being tested. Syringe needles were replaced if they accidentally brushed against the side of a clean flask. If you touched a beaker but didn’t use it, the beaker needed washing. Such caution was not initially a habit, but it was necessary and quickly became ingrained in me. This nearly obsessive level of mindfulness eliminates the possibility of contaminated data; the results obtained should be unquestionable. It also makes life easier when unexpected results do appear, since careful protocol leaves little room to suspect human error.
This is not to say that researchers of the past did not take this approach while attempting to resolve the charge of the electron. Perhaps all outside variables were eliminated, but the researchers lacked the confidence to interpret unfamiliar data; fundamental discoveries are difficult when a technique is so novel that one doubts one’s own results. More likely, Millikan’s data simply seemed unquestionably reproducible, and any researcher who did not get a close value assumed his results were wrong. Still, if those researchers had taken a careful approach, they could have argued for (instead of attempting to hide) values higher than Millikan’s.
Feynman’s quote alludes to researchers hiding data, a perversion even Millikan was later found to be guilty of. This is arguably even worse than poor data, because it indicates that scientists are not upholding the high standards objectivity requires. At least poorly generated data is not a deliberate attempt to swindle colleagues and the general public. Poor data can be eradicated through proper training; falsified data is a testament to a person’s character and likely cannot be rectified. Millikan’s transgressions were committed to eliminate numbers far from his average, which didn’t change his overall results. Still, it is inexcusable; other researchers likely fooled themselves by questioning their own values when Millikan’s seemed so precise.
It’s not likely that people often intentionally falsify data. Amidst the cold fusion debacle of 1989, when University of Utah researchers claimed a groundbreaking discovery of highly efficient energy production, data fabrication became a hot topic. Researchers at Texas A&M were accused of intentionally contaminating samples to obtain positive results. The Utah researchers were also suspected of providing false evidence and ended up leaving the country after enduring derision and ridicule. But this is not a call to arms against researchers who lie about data. Scientific catastrophes like these are few and far between, and data omission, similar to what occurred in Millikan’s experiments, is likely far more pervasive. As argued, though, data omission can be just as hindering as deliberate fabrication.
“Getting things right” means much more than just avoiding bad data. Occasionally, life presents opportunities to define your character by choosing what is right. It’s sometimes easier to justify a wrong choice than it is to make the right one. If you throw a snack wrapper at a trash can and it lands on the floor, it’s easy to say, “What’s one more piece of trash?” It only takes a little extra effort to walk over and do the right thing.
The consequences of inaction can feel more urgent than a small, unaesthetic piece of garbage on the ground. When I worked at the pharmaceutical company, I would prepare samples, start the analysis (which could sometimes take a couple of hours), and then work on something else. Sometimes I would make a mistake and realize it only halfway through the analysis. There were a few options at that point: I could hide my mistake and forge the data, I could feign ignorance and report poor data, or I could spend the extra time and correct my mistake. When I first started, I frequently had to exercise this last option. There were times I stayed 3–4 hours overtime, but it was better to sacrifice a little time then than to pay dearly later.
There are a lot of mantras that can motivate “right” behavior. Treat others as you’d want to be treated. Integrity is doing the right thing when nobody’s looking. Wisdom is knowing what to do next; virtue is doing it. Be the change you wish to see in the world. These quotes can inspire admirable behavior, but when given a choice, I try to live by this: the path I choose should not be defined by my ability to justify it. The moment I need justification, I have probably made the wrong choice. This way, hopefully, I can find the “right” way to live, even if it doesn’t happen the first time and takes a little extra effort.