# Absolute certainty


## Revision as of 16:01, 5 November 2009

**Absolute certainty** is belief beyond any possible doubt (not just reasonable doubt, as in criminal trials in the U.S.). The only propositions that someone could be absolutely certain about are those proven within a rigorous logical system — and even those would typically have to be conditional statements, since they would necessarily rest on unproven assumptions (axioms or postulates), which one may not be absolutely certain of.

The 17th-century philosopher René Descartes asserted that the only thing he could be absolutely certain of was his own existence (summed up in his famous epigram, "*Cogito ergo sum*" — "I think, therefore I am"). He then tried to use this as the basis for all his beliefs.

## An example from arithmetic

Consider the statement:

- 1 + 1 = 2

Given the usual definitions of the symbols `1`, `+`, `=`, and `2`, one can be absolutely certain that 1 + 1 = 2.

But is it really so simple? What *are* the "usual definitions" of these symbols? What are the "real" definitions (i.e., the ones mathematicians use)? Does anything in the statement rest on unproven assumptions? What are they? The answers to these questions are actually extremely complicated. In fact, when Alfred North Whitehead and Bertrand Russell tried to place all of mathematics on a rigorous logical foundation in the early 20th century, eventually producing the three-volume work *Principia Mathematica*, it took over 700 pages of dense logical argumentation to get to the point where they could prove that 1 + 1 = 2. (Granted, if they were *only* trying to prove that statement, they probably could have done it more quickly.)
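The flavor of such a foundational construction can be sketched informally. The following is a toy Peano-style encoding in Python (an illustrative assumption for this article, not Whitehead and Russell's actual formalism): numbers are built from nothing but a zero object and a successor function, addition is defined by recursion on those constructors, and "1 + 1 = 2" then follows from the definitions alone.

```python
# A toy Peano-style construction: every number is ZERO wrapped in some
# number of successor layers, and addition is defined by recursion.

ZERO = ()  # an arbitrary object standing for zero


def succ(n):
    """Successor: wrap n in one more layer."""
    return (n,)


def plus(m, n):
    """Addition by the Peano recursion:
    m + 0 = m;  m + succ(n) = succ(m + n)."""
    if n == ZERO:
        return m
    return succ(plus(m, n[0]))


one = succ(ZERO)
two = succ(one)

# "1 + 1 = 2", derived purely from the definitions above.
assert plus(one, one) == two
```

Even this five-line toy hides assumptions (that equality of nested tuples behaves as expected, that the recursion terminates), which is precisely the article's point: every "obvious" statement rests on machinery somewhere.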

Incidentally, in the base-2 number system, 1 + 1 = 10, not 2, because there is no symbol `2` defined in such a system. Thus, one "hidden assumption" underlying the original statement is that we are working in a base-3 or larger number system. Although this might seem like some kind of irrelevant "semantic trick", it is important to realize that all statements are made within a particular context and that altering the context can change the truth value of a statement (or, more to the point in the present discussion, the degree of belief associated with it).
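The base-2 point is easy to check mechanically: the *quantity* is unchanged, only its written representation depends on the base. A minimal demonstration using Python's built-in base conversion:

```python
# The sum of one and one, written in base 2, is the numeral "10":
# the symbol "2" simply does not exist in that system.
assert format(1 + 1, 'b') == '10'

# Reading the numeral "10" back as a base-2 string recovers the same
# quantity we would write as "2" in base ten.
assert int('10', 2) == 2
```

In other words, the hidden assumption is about notation, not about arithmetic itself.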

## Absolute certainty and apologetics

A theist who claims absolute certainty about an element of their religious beliefs is likely revealing more about the method by which they came to the belief (namely, uncritical acceptance or simple assumption) than about the actual strength of the belief itself.

The argument that strong atheism is an untenable position because one cannot know for sure that God does not exist rests in part on the idea that an atheist must have absolute certainty in order to believe that no gods exist. However, belief is not the same thing as certainty. In fact, many people will say they "believe" something precisely when they don't feel certain enough to say they "know" it. In any case, one who claims certainty about the nonexistence of any gods would more accurately be called a gnostic atheist (a much stronger position than what is usually meant by strong atheism).

As Matt Dillahunty of the Austin television show *The Atheist Experience* points out, "absolute certainty is a red herring" because there are so few things (perhaps none) that can rise to that degree of belief.