
Section 3.2 Families of Discrete Distributions: Uniform and Binomial

Families of Probability Distributions.

Section 3.1 gave us a general introduction to probability distributions. In the next several sections we will introduce some specific families of probability distributions. What makes these specific families of probability distributions interesting is that:

  1. they are very common in everyday situations.

  2. we can develop general rules for finding the mean and standard deviation for probability distributions that are members of a specific family.

This means that we can avoid the work of computing expected value and standard deviation from the probability distribution table, and instead create a formula for finding these parameters for common random processes.

In this section we will look at two discrete probability distribution families. The first is a fairly straightforward, but nonetheless common, distribution seen whenever you roll a fair die or flip a fair coin. The second is more involved, but is very common when dealing with yes/no sorts of questions such as “do you support candidate \(X\)” or “did you draw a red ball?”

Subsection 3.2.1 Discrete Uniform Distribution

When you roll a fair die or flip a fair coin, each of the outcomes is equally likely. If you assign the values of those outcomes to a random variable, then each value of the variable is equally likely. One could say that the probabilities are uniform. These are members of the following family of distributions.

Definition 3.2.1.

A random variable \(X\) has a discrete uniform probability distribution if \(X\) can take on one of \(k\) possible values, \(x_1\text{,}\) \(x_2\text{,}\) \(\ldots\text{,}\) \(x_k\text{,}\) each with probability \(P(X=x_i) = \frac{1}{k}\text{.}\)

Consider the following, very common, member of this family.

A fair die is rolled and a random variable \(X\) is defined to be the number that appears. Show that \(X\) has a discrete uniform probability distribution.

Solution

The random variable has six values, \(X = 1, 2, 3, 4, 5, \text{ and } 6\text{.}\) So, for each of these values, \(P(X=x) = \frac{1}{6}\text{.}\) Therefore, this is a discrete uniform distribution with \(k=6\text{.}\) To help visualize this, we construct the following probability histogram.

A histogram with one bar for each outcome, all having exactly the same heights.
Figure 3.2.3. Probability Histogram for \(X\)

The second reason to study families of distributions is to develop general rules for computation. For example, a discrete uniform distribution has the following property.
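In particular, if \(X\) is a discrete uniform random variable taking the values \(x_1\text{,}\) \(x_2\text{,}\) \(\ldots\text{,}\) \(x_k\text{,}\) then its mean and standard deviation are simply the mean and standard deviation of that list of values:

\begin{align*} \mu \amp = \frac{x_1 + x_2 + \cdots + x_k}{k}\\ \sigma \amp = \sqrt{\frac{(x_1-\mu)^2 + (x_2-\mu)^2 + \cdots + (x_k-\mu)^2}{k}}\text{.} \end{align*}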

Let's apply this to the die example above.

A fair die is rolled and a random variable \(X\) is defined to be the number that appears. Find the expected value (mean) and standard deviation of \(X\text{.}\)

Solution

To show that the mean and standard deviation really are as stated above, we will do the computation two ways.

  1. First, we use the probability distribution method of Section 3.1:

    \(x\) \(P(X=x)\) \(x\times P(X=x)\) \((x-\mu)^2\times P(X=x)\)
    \(1\) \(\frac{1}{6} \approx 0.1667\) \(1(0.1667) = 0.1667\) \((1-3.5)^2(0.1667) = 1.0417\)
    \(2\) \(\frac{1}{6} \approx 0.1667\) \(2(0.1667) = 0.3333\) \((2-3.5)^2(0.1667) = 0.3750\)
    \(3\) \(\frac{1}{6} \approx 0.1667\) \(3(0.1667) = 0.5000\) \((3-3.5)^2(0.1667) = 0.0417\)
    \(4\) \(\frac{1}{6} \approx 0.1667\) \(4(0.1667) = 0.6667\) \((4-3.5)^2(0.1667) = 0.0417\)
    \(5\) \(\frac{1}{6} \approx 0.1667\) \(5(0.1667) = 0.8333\) \((5-3.5)^2(0.1667) = 0.3750\)
    \(6\) \(\frac{1}{6} \approx 0.1667\) \(6(0.1667) = 1.0000\) \((6-3.5)^2(0.1667) = 1.0417\)
    Totals: \(\mu = 3.5\) \(\sigma^2 = 2.9168\)
    Table 3.2.6. Distribution for \(X\)

    So from the table above, we can read off the mean (expected value) and variance, which gives:

    \begin{align*} \mu \amp = 3.5\\ \sigma \amp = \sqrt{2.9168} = 1.71 \end{align*}
  2. Next, we use the methods from Subsection 1.3.4.

    We first compute the mean:

    \begin{equation*} \mu = \frac{1+2+3+4+5+6}{6} = \frac{21}{6} = 3.5 \end{equation*}

    Then we can construct the table below to find the standard deviation.

    \(x\) \((x-\mu)\) \((x-\mu)^2\)
    \(1\) \(-2.5\) \(6.25\)
    \(2\) \(-1.5\) \(2.25\)
    \(3\) \(-0.5\) \(0.25\)
    \(4\) \(0.5\) \(0.25\)
    \(5\) \(1.5\) \(2.25\)
    \(6\) \(2.5\) \(6.25\)
    Total: \(17.5\)
    Table 3.2.7. Finding the Variance

    Based on that table,

    \begin{align*} \sigma^2 \amp = \frac{17.5}{6} \approx 2.9167\\ \Rightarrow \sigma \amp = \sqrt{2.9167} \approx 1.71\text{.} \end{align*}

Note that both methods produce the same answers.

The result of the last example is useful because the method from Subsection 1.3.4 is more familiar and involves fewer computations. This allows us to cut down on the amount of work necessary to find the expected value and standard deviation of a random variable with a discrete uniform distribution.

Figure 3.2.8. Discrete Uniform Distributions I
Figure 3.2.9. Discrete Uniform Distributions II

A discrete random variable \(X\) has possible values 1, 3, 5, and 7.

Question: if \(X\) is to be a discrete uniform random variable, what is \(P(X=3)\text{?}\)

Answer

\(\frac{1}{4}\)

A discrete uniform random variable \(X\) has several possible values including the value \(4\text{.}\) Suppose that \(P(X=4) = \frac{1}{9}\text{.}\)

Question: how many possible values does \(X\) have?
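Answer

Since every value of a discrete uniform random variable has probability \(\frac{1}{k}\text{,}\) we must have \(\frac{1}{k} = \frac{1}{9}\text{,}\) so \(X\) has \(k = 9\) possible values.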

A discrete uniform random variable \(X\) has possible values 1, 4, 7, and 12.

Question: What is the expected value of \(X\text{?}\)
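Answer

Each of the four values has probability \(\frac{1}{4}\text{,}\) so

\begin{equation*} E(X) = \frac{1 + 4 + 7 + 12}{4} = \frac{24}{4} = 6\text{.} \end{equation*}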

Subsection 3.2.2 Bernoulli Trials

The family of binomial probability distributions is based on one of the simplest random processes around. This basic process is called a Bernoulli trial.

Bernoulli Trial.

Definition 3.2.13.

A Bernoulli trial is a random process in which there are only two possible outcomes: success and failure. The probability of a success is denoted by \(p\text{,}\) and the probability of a failure by \(q = 1 - p\text{.}\)

The key property of a Bernoulli trial is that there are only two possible outcomes. We call them “success” and “failure,” but there is nothing inherently good or bad about these outcomes. We simply refer to the particular outcome that we wish to observe as a success, and the other outcome as a failure.

A coin is weighted so that a heads is twice as likely as a tails. The coin is flipped and the result noted. Is this a Bernoulli Trial?

Solution

There are only two outcomes, so this is a Bernoulli trial. We could choose to call flipping a heads a success and a tails a failure. In that case, \(p = \frac{2}{3}\) and \(q = 1 - \frac{2}{3} = \frac{1}{3}\text{.}\)

It is possible to have a Bernoulli trial even when the underlying experiment has more than two outcomes. To do this, you designate a certain set of outcomes as the desired or successful outcomes, and lump all of the others together as failures. Consider the following example.

An urn contains 15 marbles: 6 red, 4 green, 3 blue, and 2 white. A single marble is drawn at random and we note either:

  1. the color of the marble drawn, or

  2. whether we got a red or white marble, or something else.

Which of these, if either of them, is a Bernoulli trial?

Solution

While the underlying experiment is the same, the result variable is different in these two cases.

  1. If we note the color drawn, then the results are either \(R\text{,}\) \(G\text{,}\) \(B\text{,}\) or \(W\text{.}\) Since there are more than two possible outcomes, this is not a Bernoulli trial.

  2. In this case the outcomes are categorized as either a success (\(R\) or \(W\)) or failure (\(G\) or \(B\)). Therefore this is a Bernoulli trial with \(p = \frac{8}{15}\) and \(q = 1 - \frac{8}{15} = \frac{7}{15}\text{.}\)

Figure 3.2.16. Bernoulli Processes I
Figure 3.2.17. Bernoulli Processes II

Consider the following random processes.

  1. A weighted coin is flipped and the outcome of heads or tails is recorded.

  2. A fair die is rolled and the outcome 1-6 is recorded.

  3. A marble is drawn from an urn containing 10 marbles: 4 red, 3 blue, and 3 green and the color of the marble is noted.

  4. Two cards are drawn from a deck of 52 cards and we note whether the cards are of the same suit.

Question: identify each process as a Bernoulli trial or not a Bernoulli trial.

Answer
  1. Bernoulli trial

  2. Not a Bernoulli trial

  3. Not a Bernoulli trial

  4. Bernoulli trial

Refer to the definition of a Bernoulli Trial and compare it with the following statements.

  1. The random variable is discrete.

  2. The random variable is continuous.

  3. The random variable has at least three values.

  4. The random variable has only two outcomes.

Question: which of these is the defining characteristic of a Bernoulli Trial?

Answer

Statement 4 (the random variable has only two outcomes)

Suppose that you are told that in a Bernoulli trial, the probability of a success is \(0.17\text{.}\)

Question: what is \(q\text{?}\)

Answer

\(0.83\)

Subsection 3.2.3 Binomial Process

There are many situations in which we wish to repeat a string of identical Bernoulli trials and determine the number of times that we were successful.

  • You take a 20 question multiple choice exam in which each question has 4 possible answers, only one of which is correct.

  • You survey 100 individuals from a large population asking them if they own a truck.

  • You examine a sample of 50 widgets taken from a production batch and determine how many of them are defective.

What do each of these examples have in common? They are all examples of a binomial process resulting in a binomial random variable.

Definition 3.2.21.

A binomial process is a random process with the following four characteristics.

  1. The process consists of a fixed number of Bernoulli trials, referred to as \(n\text{.}\)

  2. Each trial has the same probability of a success, called \(p\text{.}\)

  3. The Bernoulli trials are all independent of each other.

  4. The result variable is the number of trials which result in successes.

If a random variable is the result variable for a binomial process, then it too has a special name.

Definition 3.2.22.

A binomial random variable is a random variable representing the number of successes in a binomial process. Such a variable is said to have a binomial distribution.

Let's take a look at each of the examples above to verify that they are indeed binomial processes and identify the binomial random variable that goes with each process.

You take a 20 question multiple choice exam in which each question has 4 possible answers, only one of which is correct. You randomly answer each question and define a random variable \(X\) to be the number of correctly answered questions. Is \(X\) a binomial random variable?

Solution

To verify that this is a binomial process, we go through the four characteristics mentioned above.

  1. There are 20 questions and answering each question is a Bernoulli trial since we are either right (success) or wrong (failure).

  2. Since we are guessing, the probability of getting a question right is \(p=\frac{1}{4}\) and the probability of getting a question wrong is \(q = \frac{3}{4}\text{.}\)

  3. How we answer one question does not affect how we answer the next question (remember, we are guessing) so the trials are independent.

  4. Finally, the variable \(X\) represents the number of questions we get right—that is the number of successes in our 20 trials.

This is a binomial process and \(X\) is a binomial random variable.

You survey 100 individuals from a large population asking them if they own a truck. You define a random variable \(Y\) to be the number of individuals who own a truck. Is \(Y\) a binomial random variable?

Solution

We again verify the four characteristics of a binomial process.

  1. There are 100 individuals who are asked if they own a truck. They will either answer “yes” or “no”, so this is a binomial process with \(n = 100\) trials.

  2. We do not know what the probability of a success is, but since the population is large, it is safe to assume that \(p\) will not change as we ask each person (see below).

  3. How one person answers is not affected by how another person answers the question. Therefore the trials are independent.

  4. Again, the random variable \(Y\) is the number of successes (people who own a truck).

Therefore this is a binomial process and \(Y\) is a binomial random variable.

Before we look at our next example, consider item 2 in the solution above. Why did we say that it is safe to assume that the probability will not change? How can the size of the population affect how much a probability changes? Consider the following.

An urn contains 20 marbles: 10 blue and 10 red. You draw 5 marbles without replacement noting how many red marbles are drawn. Is this a binomial process?

Solution

Let's review the four questions again.

  1. We are repeating the act of drawing a marble a fixed number of times (\(n = 5\)).

  2. There are only two options, red or blue, and since we are counting the number of red marbles drawn, we will count drawing a red marble as a success. On the first draw, the probability of a success is \(\frac{10}{20}\) since there are 10 red marbles.

  3. However, the trials are not independent and the probabilities change. To see this, note that on the second draw the probability of a success is either \(\frac{9}{19}\text{,}\) if we drew a red on the first draw, or \(\frac{10}{19}\text{,}\) if we drew a blue marble on the first draw. So the probabilities change and the new probabilities depend on what happened on the previous draw.

  4. Our random variable is the number of successes (red marbles).

Because the probabilities change between trials and depend on what happened on previous trials, this process does not meet characteristics 2 or 3. It is therefore not a binomial process.

The above example was not a binomial process because of the small population from which we are drawing our sample. In Example 3.2.24 the population was assumed to be large. Suppose that there were 10,000 people in the population and that 5,000 of them own trucks. When we pick our first individual for the sample, the probability they will own a truck is \(\frac{5000}{10000} = 0.5\text{.}\) If our first pick owned a truck, the probability the second individual will also own a truck is now \(\frac{4999}{9999} \approx 0.49995\text{.}\) While this is not \(0.5\text{,}\) it is extremely close. Because the population is so large, the probabilities did not change by very much at all. In fact, as we pick our 100 individuals, the probability of getting a person with a truck will stay very close to \(0.5\) no matter which people we select.

How do we tell how large a population needs to be in order to assume that as we draw individuals for our sample we do not change the probabilities? The following rule is a good way to determine this.
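The rule of thumb we will use is the 5% guideline: a sample of size \(n\) drawn without replacement from a population of size \(N\) may be treated as a sequence of independent Bernoulli trials provided the sample is less than 5% of the population; that is, provided

\begin{equation*} \frac{n}{N} \lt 0.05\text{.} \end{equation*}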

In Example 3.2.25 we selected 5 marbles from 20. This gives \(\frac{5}{20} = 0.25\text{,}\) which means we are sampling 25% of the population. This is far too big to assume this is a binomial process.

You examine a sample of 50 widgets taken from a production batch and determine how many of them are defective. How many widgets must be in the batch in order to claim that this is a binomial process?

Solution

The batch must contain at least \(N\) widgets so that 50 is less than 5% of \(N\text{.}\) We find \(N\) by solving the inequality

\begin{equation*} \frac{50}{N} \lt 0.05 \quad \Rightarrow \quad 50 \lt 0.05N \quad \Rightarrow \quad N \gt 1000\text{.} \end{equation*}

If we do not have more than 1000 widgets to select from, then the trials of selecting individual widgets without replacement will not be independent and this will not be a binomial process.

Figure 3.2.28. Binomial Processes I
Figure 3.2.29. Binomial Processes II

An urn contains 12 marbles: 6 red, 4 white, and 2 blue. You draw 4 marbles without replacement and note the number of blue marbles drawn.

Question: which, if any, characteristics of a binomial process does this experiment have?

Answer

It has a fixed number of Bernoulli trials and the random variable is the number of successes.

A 60% free-throw shooter decides to practice by shooting free-throws until he has made 3 in a row. A random variable \(Y\) is defined to be the number of free throws he attempts.

Question: which, if any, characteristics of a binomial process does this experiment have?

Answer

The probability of a success is fixed and the trials are independent of each other.

A statistics student wishes to conduct a survey of fellow students at her university. She decides to contact 65 randomly selected students, ask them if they have taken a mathematics class and let a random variable \(X\) represent the number who have taken a mathematics class.

Question: how large must the student body be for this study to be a binomial process?

Answer

1300
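Following the 5% guideline, the sample of 65 students must be less than 5% of a student body of size \(N\text{:}\)

\begin{equation*} \frac{65}{N} \lt 0.05 \quad \Rightarrow \quad N \gt \frac{65}{0.05} = 1300\text{.} \end{equation*}

So, as in the widget example, the student body must contain more than 1300 students.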

Subsection 3.2.4 Binomial Probability Formula

Once we have identified a binomial random variable, we want to be able to quickly compute probabilities for that variable. The key to doing that is the general multiplication rule. Because one of the assumptions is that the Bernoulli trials are independent, we can use this multiplication rule to compute the probability of a given string of outcomes. Consider the following example.

A coin is weighted so that the probability of getting a heads is twice that of getting a tails. The coin is tossed four times, and the random variable \(X\) is defined to be the number of heads. Find \(P(X=1)\) and \(P(X=2)\text{.}\)

Solution

Notice that each flip is a Bernoulli trial with \(p = \frac{2}{3}\) and \(q = \frac{1}{3}\) when we consider getting a heads to be a success. We repeat this trial four times, and each time \(p\) and \(q\) remain unchanged. Because each flip is independent and \(X\) is the number of successes (heads), \(X\) is a binomial random variable.

  • Finding \(P(X=1)\).

    Since we want \(X=1\) head, we compute the probability to be

    \begin{equation*} \frac{2}{3}\times\frac{1}{3}\times\frac{1}{3}\times\frac{1}{3} = \left(\frac{2}{3}\right)^1\left(\frac{1}{3}\right)^3\text{.} \end{equation*}

    The problem with this is that it only counts the probability of HTTT. We could also get THTT, TTHT, and TTTH. In fact, a little counting shows that there are \(C(4,1)\) ways to pick the one toss out of 4 that will be heads. So the total probability is:

    \begin{equation*} C(4,1)\left(\frac{2}{3}\right)^1\left(\frac{1}{3}\right)^3 = \frac{8}{81} \approx 0.0988\text{.} \end{equation*}
  • Finding \(P(X=2)\).

    This analysis is very similar except that we get a heads twice and a tails twice, and we have to choose the 2 of the 4 flips that will be our tails. This results in the probability:

    \begin{equation*} C(4,2)\left(\frac{2}{3}\right)^2\left(\frac{1}{3}\right)^2 = \frac{24}{81} \approx 0.2963\text{.} \end{equation*}

Hopefully you see that there is a pattern to determining the probability of a certain value of \(X\text{.}\) We first choose which of the \(n\) trials are going to be our successes. Then we multiply that by \(p\) for each success and by \(q\) for each failure. This results in the following formula.
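In general, if \(X\) is a binomial random variable counting the number of successes in \(n\) Bernoulli trials, each with probability of success \(p\) and probability of failure \(q = 1 - p\text{,}\) then

\begin{equation*} P(X = x) = C(n,x)\,p^x q^{n-x}, \qquad x = 0, 1, 2, \ldots, n\text{.} \end{equation*}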

Let's put this formula into use with a few more examples.

A scientist has determined that 40% of plants in a certain field are infected with a disease. You randomly select 10 plants from this large field. Find the probability that:

  1. none of the plants are infected

  2. half of the plants are infected

  3. all of the plants are infected.

Solution

First note that \(p = 0.40\text{,}\) \(q = 0.60\text{,}\) and \(n = 10\) in this binomial process. Let the random variable \(Y\) equal the number of plants that are infected in the 10 that we select. Then, the probabilities (rounded to four decimal places) are:

  1. \begin{equation*} P(Y=0) = C(10,0)(0.40)^0(0.60)^{10} = 1(1)(0.0060) = 0.0060 \end{equation*}
  2. \begin{equation*} P(Y=5) = C(10,5)(0.40)^5(0.60)^5 = 252(0.0102)(0.0778) \approx 0.2007 \end{equation*}
  3. \begin{equation*} P(Y=10) = C(10,10)(0.40)^{10}(0.60)^0 = 1(0.0001)(1) = 0.0001 \end{equation*}

Note that there is a pattern in this formula. \(C(n,x)\) is always a combination where we choose which \(x\) of the \(n\) trials are to be successes. Next we raise \(p\) to the \(x\) power and \(q\) to the \(n-x\) power. But note that \(p + q\) is always one. In the above example, \(0.40 + 0.60 = 1\text{.}\) Also, note that the exponents of \(p\) and \(q\) add up to \(n\text{,}\) the total number of trials. These patterns can be good ways to check your work and make sure you didn't miss anything.

Figure 3.2.36. Binomial Probability Formula I
Figure 3.2.37. Binomial Probability Formula II

A binomial random variable \(X\) measures the number of successes in \(n=10\) trials in which the probability of a success is \(p=0.40\text{.}\)

Question: what is \(P(X=3)\text{?}\) Round your answer to four decimal places.

Answer

0.2150
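One way to obtain this value is to apply the binomial probability formula directly:

\begin{equation*} P(X=3) = C(10,3)(0.40)^3(0.60)^7 = 120(0.0640)(0.0280) \approx 0.2150\text{.} \end{equation*}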

A McDonald's worker has determined that when asked the question “would you like fries with that?” 72% of customers will say yes. In a given day, the worker serves 20 customers, asking each one of them “would you like fries with that?” Let the random variable \(Y\) be the number who answered yes.

Question: what is \(P(Y=15)\text{?}\) Round your answer to four decimal places.

Answer

0.1933
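Here \(n = 20\text{,}\) \(p = 0.72\text{,}\) and \(q = 0.28\text{,}\) so the binomial probability formula gives

\begin{equation*} P(Y=15) = C(20,15)(0.72)^{15}(0.28)^5 = 15504(0.007245)(0.001721) \approx 0.1933\text{.} \end{equation*}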

According to a recent study, 32.2% of adult Americans are obese. You randomly select 16 Americans and determine if they are obese.

Question: what is the probability exactly 2 of the 16 selected Americans are obese?

Answer

0.0540
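Treating obesity as a success, we have \(n = 16\text{,}\) \(p = 0.322\text{,}\) and \(q = 0.678\text{.}\) If \(X\) counts the number of obese individuals in the sample, then

\begin{equation*} P(X=2) = C(16,2)(0.322)^2(0.678)^{14} = 120(0.103684)(0.004337) \approx 0.0540\text{.} \end{equation*}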

Subsection 3.2.5 At Least and At Most

The binomial probability formula from the last page tells us the probability that \(X\) will be exactly equal to \(x\text{.}\) When we say \(P(X=3)\text{,}\) for example, this gives the probability that \(X\) will equal 3—not more than 3, not less than 3, but exactly 3. It is important to keep that in mind because there are many examples in which we really want at least or at most some number of successes. Consider the following.

Only 4% of widgets produced at a certain manufacturing plant are defective. In a quality control sample of 20 widgets, what is the probability that at most 2 of them will be defective?

Solution

Note that this is a binomial process with \(n=20\) trials, assuming that the number of widgets produced is large. If we let a success mean getting a defective widget, then \(p = 0.04\) and \(q = 0.96\text{.}\) Let the random variable \(X\) be the number of defective widgets in our sample of 20 widgets.

We wish to find the probability that \(X\) is less than or equal to 2. This means that \(X\) can be 0, 1, or 2. We use the binomial probability formula to compute \(P(X=0)\text{,}\) \(P(X=1)\text{,}\) and \(P(X=2)\text{.}\) These are mutually exclusive events (we can't have exactly one defective widget and exactly two defective widgets at the same time), so we then use the addition rule to get our final probability. The computation is shown below.

\begin{align*} P(X\leq 2) \amp = P(X=0) + P(X=1) + P(X=2)\\ \amp = C(20,0)(0.04)^0(0.96)^{20} + C(20,1)(0.04)^1(0.96)^{19} + C(20,2)(0.04)^2 (0.96)^{18}\\ \amp \approx 1(1)(0.4420) + 20(0.04)(0.4604) + 190(0.0016)(0.4796)\\ \amp \approx 0.4420 + 0.3683 + 0.1458\\ \amp = 0.9561\text{.} \end{align*}

Since “at most” includes the 2, we had to add together the probabilities of 0, 1, or 2 defective widgets. Be sure that you don't forget the possibility that there are 0 successes, or in this case 0 defective widgets! What if the question above had asked for the probability of at least 2 defective widgets? We could follow the same procedure, but the values of \(X\) that are at least 2 include \(2\text{,}\) \(3\text{,}\) \(4\text{,}\) \(\dots\text{,}\) \(20\text{.}\) That is 19 different probabilities to compute! Luckily, there is a shortcut. You can always use the complement rule to simplify a computation such as this. The following example shows how this works.

A biologist claims that at least 30% of fish in a certain large lake have been contaminated. To test his theory, he randomly samples 10 fish and determines the number that are contaminated. If only 20% of fish are actually contaminated, what is the probability that the biologist will mistakenly confirm his claim?

Solution

First note that this is a binomial process with \(n=10\) fish. The true proportion of contaminated fish is \(p = 0.20\text{,}\) so \(q = 0.80\text{.}\) We wish to know the probability that the biologist finds at least 30% of his 10 fish contaminated. That means, we want the probability that \(X\) is at least 3.

We could compute this directly by finding \(P(X=3)\text{,}\) \(P(X=4)\text{,}\) \(P(X=5)\text{,}\) \(P(X=6)\text{,}\) \(P(X=7)\text{,}\) \(P(X=8)\text{,}\) \(P(X=9)\text{,}\) and \(P(X=10)\text{.}\) However, this is a lot of work! The complement of “at least 3” is “less than 3”. That would mean \(X\) could be 0, 1, or 2. This includes only three probabilities to compute, which is a lot less work. So we will use the complement rule as follows.

\begin{align*} P(X\geq 3) \amp = 1 - P(X \lt 3)\\ \amp = 1 - ( P(X=0) + P(X=1) + P(X=2) )\\ \amp = 1 - ( C(10,0)(0.2)^0(0.8)^{10}+ C(10,1)(0.2)^1(0.8)^9+ C(10,2) (0.2)^2(0.8)^8)\\ \amp\approx 1 - ( 1(1)(0.1074) + 10(0.2)(0.1342) + 45(0.04)(0.1678) )\\ \amp\approx 1 - ( 0.1074 + 0.2684 + 0.3020 )\\ \amp= 1 - 0.6778\\ \amp= 0.3222\text{.} \end{align*}
Figure 3.2.43. Binomial Probability Ranges I
Figure 3.2.44. Binomial Probability Ranges II

A coin is weighted so that the probability of a heads is \(\frac{5}{7}\text{.}\) You flip the coin 10 times.

Question: what is the probability that you get at least one tails? Round your answer to four decimal places.

Answer

0.9654
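Counting a tails as a success, the complement rule used in the example above gives

\begin{equation*} P(\text{at least one tails}) = 1 - P(\text{no tails}) = 1 - \left(\frac{5}{7}\right)^{10} \approx 1 - 0.0346 = 0.9654\text{.} \end{equation*}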

A binomial random variable \(X\) counts the number of successes in \(n=7\) trials in which the probability of a success is \(p=0.63\text{.}\)

Question: what is \(P(X > 5)\text{?}\) Round your answer to four decimal places.

Answer

0.2013
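Since \(X\) can be at most 7, the event \(X \gt 5\) includes only \(X = 6\) and \(X = 7\text{:}\)

\begin{align*} P(X \gt 5) \amp = P(X=6) + P(X=7)\\ \amp = C(7,6)(0.63)^6(0.37)^1 + C(7,7)(0.63)^7(0.37)^0\\ \amp \approx 0.1619 + 0.0394 = 0.2013\text{.} \end{align*}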

A fair die is rolled 8 times and the number of times that it shows a number greater than 4 is counted.

Question: find the probability that it shows a number greater than 4 at most once.

Answer

0.1951
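A success here is rolling a 5 or a 6, so \(p = \frac{2}{6} = \frac{1}{3}\text{,}\) \(q = \frac{2}{3}\text{,}\) and \(n = 8\text{.}\) Letting \(X\) count the successes,

\begin{align*} P(X \leq 1) \amp = P(X=0) + P(X=1)\\ \amp = C(8,0)\left(\frac{1}{3}\right)^0\left(\frac{2}{3}\right)^8 + C(8,1)\left(\frac{1}{3}\right)^1\left(\frac{2}{3}\right)^7\\ \amp \approx 0.0390 + 0.1561 = 0.1951\text{.} \end{align*}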

Subsection 3.2.6 Mean and Standard Deviation

Our last task in this lesson is to determine the mean and standard deviation of a binomial random variable. We could certainly use the general technique of finding these by constructing a probability distribution. This method is shown in the following example.

A basketball player is a 70% free-throw shooter, meaning that the probability he will make a given shot is 0.70. Suppose that this player takes 10 free-throws in a certain game. If we assume that these free-throws are independent Bernoulli trials, how many of the 10 shots should the basketball player expect to make?

Solution

The number of shots made, which we will call \(X\text{,}\) is a binomial random variable with \(n = 10\) and \(p = 0.70\text{.}\) To find the expected value of \(X\text{,}\) the expected number of shots the player will make, we use the probability distribution method shown below.

\(x\) \(P(X=x)\) \(x\times P(X=x)\)
\(0\) \(C(10,0)(0.7)^0(0.3)^{10} \approx 0.0000\) \(0.0000\)
\(1\) \(C(10,1)(0.7)^1(0.3)^9 \approx 0.0001\) \(0.0001\)
\(2\) \(C(10,2)(0.7)^2(0.3)^8 \approx 0.0014\) \(0.0029\)
\(3\) \(C(10,3)(0.7)^3(0.3)^7 \approx 0.0090\) \(0.0270\)
\(4\) \(C(10,4)(0.7)^4(0.3)^6 \approx 0.0368\) \(0.1470\)
\(5\) \(C(10,5)(0.7)^5(0.3)^5 \approx 0.1029\) \(0.5146\)
\(6\) \(C(10,6)(0.7)^6(0.3)^4 \approx 0.2001\) \(1.2007\)
\(7\) \(C(10,7)(0.7)^7(0.3)^3 \approx 0.2668\) \(1.8678\)
\(8\) \(C(10,8)(0.7)^8(0.3)^2 \approx 0.2335\) \(1.8678\)
\(9\) \(C(10,9)(0.7)^9(0.3)^1 \approx 0.1211\) \(1.0895\)
\(10\) \(C(10,10)(0.7)^{10}(0.3)^0 \approx 0.0282\) \(0.2825\)
Table 3.2.49. Computing the Mean

Summing the final column, we get \(E(X) = 7\text{.}\) So the mean of this binomial random variable is 7 and we expect the player to make 7 out of 10 shots.

But wait a minute. Couldn't you have guessed that using intuition? A 70% free-throw shooter should make 70% of the 10 shots, which is 7 shots. It turns out that there are extremely simple formulas for finding both the mean and the standard deviation of a binomial random variable. They are stated below.
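If \(X\) is a binomial random variable based on \(n\) trials with probability of success \(p\) and probability of failure \(q = 1 - p\text{,}\) then

\begin{align*} \mu \amp = np\\ \sigma \amp = \sqrt{npq}\text{.} \end{align*}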

We now apply these formulas to another example, similar to the one above.

The same basketball player from Example 3.2.48 takes 350 free-throws during the course of the season.

  1. How many of those free-throws do you expect him to make?

  2. Would it be unusual for him to make 300 or more shots?

Solution
  1. To determine the number of shots we expect, we compute the mean of the random variable.

    \begin{equation*} \mu = n\times p = (350)(0.70) = 245\text{.} \end{equation*}
  2. Recall that in Subsection 1.4.3 we used z-scores to decide if a particular value of a variable was unusual. If a value has a z-score bigger than 3 or less than \(-3\text{,}\) then it is more than 3 standard deviations away from the mean and should be considered unusual. The z-score of 300, using the formula above for the standard deviation, is:

    \begin{equation*} z = \frac{x - \mu}{\sigma} = \frac{300 - 245}{\sqrt{350(0.7)(0.3)}} \approx \frac{55}{8.5732} \approx 6.42\text{.} \end{equation*}

    So yes, making 300 or more shots would definitely be unusual. If this happens, we might suspect that he has raised his free-throw percentage above the 70% we were given.

Figure 3.2.52. Mean and Standard Deviation I
Figure 3.2.53. Mean and Standard Deviation II

Forty-eight percent of registered voters in a certain county support a new property tax. You decide to randomly select 250 registered voters and ask them if they support a new property tax.

Question: how many would you expect to say that yes, they support the tax?

Answer

120
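This is the mean of a binomial random variable with \(n = 250\) and \(p = 0.48\text{:}\)

\begin{equation*} \mu = np = (250)(0.48) = 120\text{.} \end{equation*}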

A coin is weighted so that the probability of a heads is \(\frac{1}{4}\text{.}\) You repeatedly flip the coin 200 times and let \(X\) be the number of heads observed. Assume that \(X\) has a mound-shaped distribution.

Question: sixty-eight percent of the time the number of heads will be in a range from some minimum to some maximum number. What is the minimum number in this range? Round your answer to one decimal place.

Answer

43.9
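Since the distribution is mound-shaped, roughly 68% of outcomes fall within one standard deviation of the mean, so the minimum of the range is

\begin{equation*} \mu - \sigma = (200)\left(\frac{1}{4}\right) - \sqrt{(200)\left(\frac{1}{4}\right)\left(\frac{3}{4}\right)} \approx 50 - 6.1 = 43.9\text{.} \end{equation*}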

An 80 question multiple choice test has five possible answers on each question, only one of which is correct. You decide to take the test by randomly selecting an answer on each question. A random variable \(Y\) is defined to be your final score on the test—that is, the number you answered correctly.

Question: what is the standard deviation of \(Y\text{?}\) Round your answer to two decimal places.

Answer

3.58
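Here a success is a correct answer, so \(n = 80\) and \(p = \frac{1}{5} = 0.20\text{,}\) giving

\begin{equation*} \sigma = \sqrt{npq} = \sqrt{(80)(0.20)(0.80)} = \sqrt{12.8} \approx 3.58\text{.} \end{equation*}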