Quotations I like
18 Jan 2015
Last update: 2015-02-12
This document lists some of the quotations I like (quotations that are relevant to the work I am doing, or related to science). This page is updated from time to time.
About Ronald Fisher.
To call in the statistician after the experiment is done may be no more than asking him to perform a post-mortem examination: he may be able to say what the experiment died of.
See the above-mentioned page on "Cross Validated".
For many more quotes on software engineering, see the page Quotes for Software Engineers (by D. C. Rajapakse).
About Maurice Wilkes.
In 1949 as soon as we started programming we found to our surprise that it wasn’t as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realised that a large part of my life from then on was going to be spent in finding mistakes in my own programs.
Wilkes, M. V. (1985). Memoirs of a Computer Pioneer. Cambridge, Mass: MIT Press. See this article in the Daily Telegraph, which cites this quotation.
About Solomon W. Golomb.
Don't apply a model until you understand the simplifying assumptions on which it is based and can test their applicability.
Distinguish at all times between the model and the real world.
You will never strike oil by drilling through the map!
Don’t expect that by having named a demon you have destroyed him.
The purpose of notation and terminology should be to enhance insight and facilitate computation – not to impress or confuse the uninitiated.
I found this citation in a presentation given by Karl Johan Åström at a seminar at the Collège de France (seminar organised by Gérard Berry, available online). The original paper is: Golomb, S. W. "Mathematical models: Uses and limitations." IEEE Transactions on Reliability 20, no. 3 (1971): 130-131.
Note: I haven't been able to check the quotation myself, and I have found numerous variations of Solomon W. Golomb's quotations (some might be more accurate than others).
About Gérard Berry.
Informatics, in its present state, is a titanic fight between two absolutely opposite components: [...] the man [here Kepler] who is clever, rigorous [...] and slow, [...] and the computer [...] which is superfast, superexact and superstupid.
Lecture "Getting Rid of Bugs" (2010), from the seminar series "Seven Keys to the Digital Future", Edinburgh, UK, September 23rd-October 6th 2010. The video is available online (See also Gérard Berry's homepage).
The same citation, as a transcript of the same lecture:
The problem with computers [...] is the gulf between computers and the people who program them. While people are clever, only semi-rigorous and slow at performing computations, computers are superfast, superexact and superstupid – and this leads to a "titanic fight" between machines and humans.
About Douglas Bates.
Maybe I’ll become a theoretician. Nobody expects you to maintain a theorem.
September 2013, R fortunes #356
About Fred P. Brooks.
Adding manpower to a late software project makes it later.
This quotation is known as Brooks's law.
In: Brooks, F. P., Jr. The Mythical Man-Month. Addison-Wesley, 1975.
See also the other quotes from Fred Brooks on Quotes for Software Engineers.
About Ralph Johnson.
Before software can be reusable it first has to be usable.
Source: Quotes for Software Engineers. Notice that I have not been able to find the exact reference for this quote.
About Alfred Korzybski.
The map is not the territory.
1931, meeting of the American Association for the Advancement of Science in New Orleans, Louisiana. See this Wikipedia page.
See also the similar quotation ("drilling through the map") by Solomon W. Golomb above.
About Sven Ove Hansson.
In issues of risk, like all other social issues, the role of the expert is to investigate facts and options, not to make decisions or to misrepresent facts in a unidimensional way that leaves the decision-maker no choice.
Science without precaution means acting too little and too late against environmental hazards. Precaution without science means acting with the wrong priorities. What we need is science-based precaution.
In: Hansson, Sven Ove. "Seven myths of risk." Risk Management (2005): 7-17.