Coin Tossing

This is an example that everyone has seen at some point in their studies. Let us take a formal view of it. If we toss a coin $k$ times, we represent an outcome by a string $s \in \{0,1\}^k$, where $s_i = 1$ means that the $i$-th coin toss turns up heads.

Define $\Omega_n = \{0,1\}^n$ (so we have $|\Omega_n| = 2^n$). For $s \in \Omega_n$, define $k(s) = \sum_i s_i$ (so this counts the number of heads occurring in $s$). For $p \in (0,1)$, define the function $f_p : \Omega_n \rightarrow \mathbb{R}$ by

$$f_p(s) = p^{k(s)} (1-p)^{n - k(s)}$$

Theorem 37

$(\Omega_n, f_p)$ defines a probability space.

Let us show this kind of proof once 🙂 (after this, it is your own exercise).

Proof:

We just need to show that $\sum_s f_p(s) = 1$. This can be done by induction on $n$. The base case $n = 1$ is obvious (i.e., $(1-p) + p = 1$). Now assume that $(\Omega_n, f_p)$ is a probability space. Consider $(\Omega_{n+1}, f_p)$; we want to prove that $\sum_{s \in \Omega_{n+1}} f_p(s) = 1$. We break the strings in $\Omega_{n+1}$ into two sets $Z$ and $O$, containing those strings whose last bit is zero and one, respectively. Therefore,

$$\sum_{s \in \Omega_{n+1}} f_p(s) = \sum_{s \in Z} f_p(s) + \sum_{s \in O} f_p(s)$$

We will see that the first sum is $(1-p)$ and the second is $p$, which would imply our claim. For $s \in Z$, if we write $s = t0$ (the last bit is zero), we have $f_p(s) = p^{k(s)} (1-p)^{n+1-k(s)} = f_p(t)(1-p)$, since $k(s) = k(t)$. Therefore, $\sum_{s \in Z} f_p(s) = (1-p) \cdot \left(\sum_{t \in \Omega_n} f_p(t)\right) = (1-p)$, using the induction hypothesis. The second sum is left as an exercise.
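Since $\Omega_n$ is finite, Theorem 37 can also be checked numerically by summing $f_p$ over all $2^n$ strings. A minimal Python sketch (the function and variable names are my own, not from the notes):

```python
from itertools import product

def f_p(s, p):
    """Probability mass of an outcome string s under head-probability p."""
    k = sum(s)                                  # k(s): number of heads in s
    return p**k * (1 - p)**(len(s) - k)

# Brute-force check of Theorem 37 for a few small n and biases p:
# the masses f_p(s) over all of Omega_n should sum to 1.
for n in (1, 2, 5, 8):
    for p in (0.3, 0.5, 0.9):
        total = sum(f_p(s, p) for s in product((0, 1), repeat=n))
        assert abs(total - 1) < 1e-12, (n, p, total)
print("sum of f_p over Omega_n is 1 in all tested cases")
```

This is of course only a sanity check for small $n$, not a substitute for the induction proof.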

Exercise 85

Finish the previous proof, that is, prove that $\sum_{s \in O} f_p(s) = p$ in the induction step.

Notice that when $p = 1/2$, this is the uniform distribution (every outcome has probability $2^{-n}$). Let us see some examples of events in this probability space.

  • Consider the event $A$ of getting the same result in all $n$ coin tosses. So $A = \{0^n, 1^n\}$ and the probability is $f_p(A) = p^n + (1-p)^n$ (obtained by summing the probabilities of the atomic events).

  • Now the event $B$ consists of the outcomes that have exactly two heads, i.e., $B = \{s : k(s) = 2\}$. Each such $s$ occurs with probability $p^2 (1-p)^{n-2}$, so we have $f_p(B) = {n \choose 2} p^2 (1-p)^{n-2}$.

  • We can further define $C_r$ to consist of the outcomes with exactly $r$ heads. These events partition $\Omega_n$, i.e., $\Omega_n = \bigcup_{r=0}^{n} C_r$, and the probability of the event $C_r$ is ${n \choose r} p^r (1-p)^{n-r}$.
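The three events above can be verified by direct enumeration. A small Python sketch (names like `f_p` and the chosen values of `n` and `p` are my own, for illustration only):

```python
from itertools import product
from math import comb

def f_p(s, p):
    """Probability mass of an outcome string s under head-probability p."""
    k = sum(s)
    return p**k * (1 - p)**(len(s) - k)

n, p = 6, 0.3
outcomes = list(product((0, 1), repeat=n))

# Event A: all tosses give the same result -> p^n + (1-p)^n.
prob_A = sum(f_p(s, p) for s in outcomes if len(set(s)) == 1)
assert abs(prob_A - (p**n + (1 - p)**n)) < 1e-12

# Event B: exactly two heads -> C(n,2) p^2 (1-p)^(n-2).
prob_B = sum(f_p(s, p) for s in outcomes if sum(s) == 2)
assert abs(prob_B - comb(n, 2) * p**2 * (1 - p)**(n - 2)) < 1e-12

# The events C_r partition Omega_n, so their probabilities sum to 1.
total = sum(comb(n, r) * p**r * (1 - p)**(n - r) for r in range(n + 1))
assert abs(total - 1) < 1e-12
```

The last assertion is just the binomial theorem applied to $(p + (1-p))^n = 1$.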

Now let us discuss a bit how our intuition about probabilities fits into the picture here. For instance, if we want to calculate the probability that the first and second coin tosses are the same, we would immediately focus on just those two tosses and say that this is $p \cdot p + (1-p) \cdot (1-p)$ (here we use the fact that the first and second tosses are independent, which we haven't defined yet).
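This intuition can be sanity-checked against the definition by brute force: sum $f_p$ over all atomic events in which the first two bits agree. The following Python sketch (names and parameter values are my own) does exactly that; note this is a numerical check, not a proof:

```python
from itertools import product

def f_p(s, p):
    """Probability mass of an outcome string s under head-probability p."""
    k = sum(s)
    return p**k * (1 - p)**(len(s) - k)

n, p = 5, 0.7
# Sum the masses of all atomic events where the first two tosses agree.
prob = sum(f_p(s, p)
           for s in product((0, 1), repeat=n)
           if s[0] == s[1])
# Matches the intuitive answer p*p + (1-p)*(1-p).
assert abs(prob - (p * p + (1 - p) * (1 - p))) < 1e-12
```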

In the exercises below, you will be asked to do these calculations using only the definitions we have seen so far, i.e., you can only use the fact that the probability of an event is the sum of the probabilities of its atomic events.

Exercise 86

Calculate the following probabilities:

  • View a string as $s = s_1 s_2 \cdots s_n \in \Omega_n$. Prove that $\mathbb{P}[s_i = 1] = p$.

  • Let $i \neq j$.

    • Prove that $\mathbb{P}[s_i = 1 \land s_j = 1] = p^2$.
    • Prove that $\mathbb{P}[s_i = s_j] = p^2 + (1-p)^2$.

The following is a bit more difficult.

Exercise 87

Consider the case when $p = 1/2$. What is the probability that the number of heads is even?

Hint: You may benefit from the binomial theorem.