The following are some topics in probability drawn from the history of mathematics.

Pascal's Wager

Introduction

Are you a betting person? Would you bet on God or against God? Would you bet that God exists or does not? Blaise Pascal (1623-1662) did exactly that: he made a wager on God.

Background

Blaise Pascal was an accomplished French mathematician, scientist, philosopher, and religious writer. His achievements and inventions include: (1) a published essay on conic sections that contributed to the later development of the calculus; (2) at age 18, one of the world's first mechanical calculators (the computer language Pascal is named in his honor); (3) investigations of the vacuum, followed by inventions including the syringe, the barometer, and the hydraulic press; (4) the world's first public transportation system; (5) the foundations of probability theory, developed in correspondence with Fermat; and (6) a religious polemic considered the first great masterpiece of French prose.

The Pensees

Pascal's last work, left unfinished, is the Pensees, a collection of approximately 1000 fragments dealing largely with philosophical and religious topics. It is clear that the writing is a defense of the Christian religion. The French word pensee means "thought," and the title of Pascal's work is accordingly translated into English as "Thoughts."

The Wager

What is Pascal's Wager anyway? Pascal's Wager is usually presented as an argument that one ought to bet on God's existence. Pascal puts it this way: "Let us examine this point and declare: 'Either God exists, or He does not.' To which view shall we incline? Reason cannot decide for us one way or the other: we are separated by an infinite gulf. At the extremity of this infinite distance a game is in progress, where either heads or tails may turn up. What will you wager? According to reason you cannot bet either way; according to reason you can defend neither proposition. ... 'Both are wrong. The right thing is not to wager at all.' Yes, but a bet must be laid. There is no option: you have joined the game."
At its core, Pascal's Wager is a mathematical argument that it is wiser to believe in God. Mathematically speaking, it goes as follows. Suppose there is a nonzero number e (perhaps extremely small) that is the probability that God exists. If God exists and you believe in Him, suppose your gain is at least 2/e "felz" of happiness; since e may be tiny, this gain may be enormous. If there is no God, and you believe in God anyway, then, although you will be deluded, and although you may suffer the ridicule of atheists (for nothing), your loss will not be enormous; say it will not exceed 1 felz. Hence the mathematical expectation of believing in God is at least:

e x (2/e) - (1 - e) x 1 = 1 + e

On the other hand, if there is no God and you do not believe in God, you may gain a little, but not more than, say, 1 felz of happiness. And if there is a God and you do not choose to believe in Him, then you gain nothing good from your unbelief. The mathematical expectation of not believing in God is thus at most:

(1 - e) x 1 + e x 0 = 1 - e

Since 1 + e > 1 - e, and since, other things being equal, a wise person acts so as to maximize the expectation of his or her happiness, it follows that, other things being equal, a wise person will choose to believe in God.

Conclusion

Pascal's Wager is an argument appealing to the mind. If it succeeds, it does so by setting the individual on a path that makes it possible for him or her to recognize God's grace when it is offered.

Contributed by Chuck Hammond

References:
Anglin, W.S. (1994). Mathematics: A Concise History and Philosophy. New York: Springer-Verlag.
Armour, L. (1993). "Infini Rien": Pascal's Wager and the Human Paradox. Carbondale and Edwardsville: Southern Illinois University Press.
Hajek, A. (2000). "Pascal's Wager." Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/pascal-wager/
Rescher, N. (1985). Pascal's Wager: A Study of Practical Reasoning in Philosophical Theology. Notre Dame, Indiana: University of Notre Dame Press.
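As a quick check of the arithmetic, the two expectations can be computed exactly. The sketch below (a minimal illustration in Python; the function name and the use of exact fractions are my own choices, while the payoffs 2/e, 1, and 0 come from the argument above) confirms that believing always has the larger expectation for any e > 0:

```python
from fractions import Fraction

def wager_expectations(e):
    """Expected happiness (in 'felz') of believing vs. not believing,
    using the payoffs of the argument: gain at least 2/e if God exists
    and you believe, lose at most 1 if He does not; gain at most 1 if
    He does not exist and you do not believe, gain 0 if He does."""
    believe = e * (Fraction(2) / e) - (1 - e) * 1   # = 1 + e
    not_believe = (1 - e) * 1 + e * 0               # = 1 - e
    return believe, not_believe

# Even for a tiny probability e = 1/1000, believing wins.
b, nb = wager_expectations(Fraction(1, 1000))
print(b, nb, b > nb)  # → 1001/1000 999/1000 True
```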
Bernoulli Trials

Boy? Girl? Heads? Tails? Win? Lose? Do any of these sound familiar? When only two outcomes are possible on any single run of an experiment, the run is called a Bernoulli trial. Jakob Bernoulli, a prominent mathematician of the late 1600s from a family of mathematicians, spent 20 years of his life studying probability. During this study he arrived at an equation for calculating probabilities in Bernoulli trials. His proofs were published in his 1713 book Ars Conjectandi (The Art of Conjecturing).

To be considered a Bernoulli trial, an experiment must meet each of three criteria:

(1) There must be only 2 possible outcomes, such as black or red, or sweet or sour. One of these outcomes is called a success and the other a failure. Successes and failures are denoted S and F, though the terms do not mean that one outcome is more desirable than the other.

(2) Each outcome has a fixed probability of occurring: a success has probability p, and a failure has probability 1 - p.

(3) Each trial is completely independent of all the others.

A simple example of a Bernoulli trial is the flip of a balanced coin. Each of the three criteria is met: there are only 2 possible outcomes, heads or tails; each outcome has a fixed probability of 1/2; and the outcome of each flip is independent of the previous flips.

A general equation can be used to calculate the probability of a given number of successes and failures in a sequence of Bernoulli trials. It is written as:

P = [(m + n)! / (m! n!)] x p^m x (1 - p)^n

where P is the probability, m is the number of successes, n is the number of failures, p is the probability of a success, and 1 - p is the probability of a failure. To determine the probability of an event occurring, one must know either the number of trials that will be completed or the numbers of successes and failures, which together give the total number of trials. For the example below, we will use 4 trials.
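The equation above translates directly into a short Python function (a sketch; the function name is my own, and math.comb computes the binomial coefficient (m + n)!/(m! n!)):

```python
from math import comb

def bernoulli_prob(m, n, p):
    """Probability of exactly m successes and n failures in m + n
    independent Bernoulli trials with success probability p."""
    return comb(m + n, m) * p**m * (1 - p)**n

# 3 heads and 1 tail with a balanced coin:
print(bernoulli_prob(3, 1, 0.5))  # → 0.25
```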
Say we want to know the probability of a coin landing heads-up 3 times and tails-up only once, in no particular order, with heads counted as successes and tails as failures. The probability of 3 successes and 1 failure, abbreviated P(3S,1F), is:

P(3S,1F) = [(3 + 1)! / (3! 1!)] x (1/2)^3 x (1/2)^1 = 4 x 1/16 = 0.25

Therefore, the chance of flipping 3 heads and 1 tail is 0.25, or 25%.

Because the probability of certain events is not always known, Bernoulli arrived at a way of estimating the probability of a success when p is not known. His Law of Large Numbers states that "the more observations one made of a given situation, the better one would be able to predict future occurrences" (Katz, 1998). From it, Bernoulli devised an approximation to p: X/N, where X is the number of successes and N is a large number of trials. In theory, the greater the number of trials performed, the closer the ratio X/N comes to the true probability of the event, though, due to the nature of chance, any particular run of trials may stray from it. Notice also that the closer the ratio of successes (or failures) to the number of trials is to the fixed probability p (or 1 - p), the more probable that count of successes and failures is. For example, with a balanced coin flipped 4 times, the outcome of 2 heads (successes) and 2 tails (failures) has probability

P(2S,2F) = [(2 + 2)! / (2! 2!)] x (1/2)^2 x (1/2)^2 = 6 x 1/16 = 0.375

which is larger than the 0.25 found above for 3 heads and 1 tail, whose success ratio of 3/4 is farther from p = 1/2.

Contributed by Lindsay Eastridge

References:
www.math.uan.edu/stat/bernouliu/bernoulli1.html
Dunham, W. (1994). The Mathematical Universe.
Katz, V.J. (1998). A History of Mathematics.
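Bernoulli's X/N estimate can be illustrated with a short simulation (a sketch; the function name, the use of Python's random module, and the fixed seed for reproducibility are my own choices). As N grows, the ratio of simulated successes to trials tends to settle near the true p:

```python
import random

def estimate_p(true_p, trials, seed=0):
    """Estimate an unknown success probability by the ratio X/N,
    where X counts successes in N simulated Bernoulli trials."""
    rng = random.Random(seed)  # fixed seed so runs are repeatable
    successes = sum(rng.random() < true_p for _ in range(trials))
    return successes / trials

# The estimate for a fair coin drifts toward 0.5 as N increases.
for n in (10, 100, 10_000):
    print(n, estimate_p(0.5, n))
```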