One of the most common errors in applied mathematical analysis is failing to notice when a mathematical argument proves too much. This occurs when the same argument applies more generally than to the particular case being considered, and in those other cases it leads to conclusions that are clearly absurd.
I can relate to this from personal experience. I was once managing a Superfund project. We were testing a high-temperature process that would incinerate organic toxins (reduce them to CO2 and H2O) and react inorganics into chemically inert compounds. To verify that this process worked, we had to create a statistically representative sampling plan and then test every sample using a US EPA toxicity test. These tests use very sophisticated instruments: a Gas Chromatograph/Mass Spectrometer (GC/MS), which can measure organics in the parts-per-billion range, and an Inductively Coupled Argon Plasma Spectrometer (ICP), which can measure inorganics/metals in the parts-per-billion range. This US EPA toxicity test screens for 43 common industrial toxins (arsenic, mercury, lead, 1,1,1-trichloroethylene, benzene, etc.). We tested on the order of 20 samples per our sampling program. All 20 samples came back with all 43 toxins measuring below instrument detection limits, which pretty much means those 43 toxins were not present in any of the 20-some samples.
When I presented that information to the general contractor representing the US Air Force, their representative asked me, "How do I know this information is significant?" To which I replied, "Because all the data complies with the data quality objectives of our sampling plan, and the sampling plan is statistically representative based on the 'XYZ' formula we used." The rep then said to me, "No, that's not what I mean. How do I know that these toxins couldn't end up in your product at some time?" To which I replied, using common sense, "'Cause if it ain't there, it ain't there!" To which he said, "How so?" I said, "Look at the data. Nothing was detected!" And he replied, "But what's the probability that something could be detected?"
By this time I was getting upset, so I said to him, "Look, if a data population, or a sample of that population, has no standard deviation (that is, there is no variation in the data), that is proof that the data is statistically significant."
To which he said "How do I know that?"
So I then took my data set, which was essentially all zeros, or nothing, and calculated a standard deviation, a 99% confidence interval, and a Student's t-test of statistical significance for nothing.
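For the curious, here is a minimal sketch of the kind of calculation described, assuming the roughly 20 non-detect results are simply recorded as zeros (the actual data set and detection limits are not reproduced here). With no variation, the sample standard deviation is zero, the 99% confidence interval collapses to a single point, and the one-sample t-statistic works out to 0/0, i.e., undefined:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-in for the ~20 non-detect results, recorded as 0
data = np.zeros(20)
n = data.size

mean = data.mean()
std = data.std(ddof=1)          # sample standard deviation -> 0.0
sem = std / np.sqrt(n)          # standard error of the mean -> 0.0

# 99% confidence interval using the t-distribution (n - 1 degrees of freedom)
t_crit = stats.t.ppf(0.995, df=n - 1)
ci = (mean - t_crit * sem, mean + t_crit * sem)   # collapses to (0.0, 0.0)

# One-sample t-statistic against a population mean of 0:
# t = (mean - 0) / sem = 0 / 0, which is undefined (NaN)
with np.errstate(invalid="ignore"):
    t_stat = np.divide(mean - 0.0, sem)

print(f"mean = {mean}, std = {std}")
print(f"99% CI = {ci}")
print(f"t-statistic = {t_stat}")   # nan: the test has nothing to say about nothing
```

However the non-detects were actually handled at the time, the point stands: the statistical machinery will happily grind through numbers that carry no information at all.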
When I was finished, I had successfully demonstrated that, yes indeed, nothing is significant!
Which fully meets the author's criterion of absurdity stated above. How could nothing be significant?
A perfect case of mathematics making you dumb.
The good news was that I spent an entire day calculating it and charged the general contractor 8 billable hours for the work. He wasn't too happy about that either.