Saturday, April 19, 2014

LINEAR PROGRAMMING: WORD PROBLEMS ----FORM FOUR BY. MWL. JAPHET MASATU.







INTRODUCTION:

  • A calculator company produces a scientific calculator and a graphing calculator. Long-term projections indicate an expected demand of at least 100 scientific and 80 graphing calculators each day. Because of limitations on production capacity, no more than 200 scientific and 170 graphing calculators can be made daily. To satisfy a shipping contract, a total of at least 200 calculators must be shipped each day.
  • If each scientific calculator sold results in a $2 loss, but each graphing calculator produces a $5 profit, how many of each type should be made daily to maximize net profits?
    The question asks for the optimal number of calculators, so my variables will stand for that:
      x: number of scientific calculators produced
      y: number of graphing calculators produced




    Since they can't produce negative numbers of calculators, I have the two constraints x ≥ 0 and y ≥ 0. But in this case, I can ignore these constraints, because I already have that x ≥ 100 and y ≥ 80. The exercise also gives maximums: x ≤ 200 and y ≤ 170. The minimum shipping requirement gives me x + y ≥ 200; in other words, y ≥ –x + 200. Since each scientific calculator sold is a $2 loss and each graphing calculator a $5 profit, the net-profit relation will be my optimization equation: R = –2x + 5y. So the entire system is:
      R = –2x + 5y, subject to:
      100 ≤ x ≤ 200
      80 ≤ y ≤ 170
      y ≥ –x + 200
       
    The feasibility region graphs as:
    (Copyright © Elizabeth Stapel 2006-2011 All Rights Reserved)
      feasibility region
When you test the corner points at (100, 170), (200, 170), (200, 80), (120, 80), and (100, 100), you should obtain the maximum value of R = 650 at (x, y) = (100, 170). That is, the solution is "100 scientific calculators and 170 graphing calculators".
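The corner test above can be double-checked with a short Python sketch (the corner coordinates are taken from the worked solution, not derived here):

```python
# Net profit: each scientific calculator (x) loses $2,
# each graphing calculator (y) earns $5.
profit = lambda x, y: -2 * x + 5 * y

corners = [(100, 170), (200, 170), (200, 80), (120, 80), (100, 100)]
best = max(corners, key=lambda p: profit(*p))
print(best, profit(*best))  # (100, 170) 650
```

Evaluating all five corners confirms the maximum net profit of $650 at (100, 170).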



  • You need to buy some filing cabinets. You know that Cabinet X costs $10 per unit, requires six square feet of floor space, and holds eight cubic feet of files. Cabinet Y costs $20 per unit, requires eight square feet of floor space, and holds twelve cubic feet of files. You have been given $140 for this purchase, though you don't have to spend that much. The office has room for no more than 72 square feet of cabinets. How many of which model should you buy, in order to maximize storage volume?
  • The question asks for the number of cabinets I need to buy, so my variables will stand for that:
      x: number of model X cabinets purchased
      y: number of model Y cabinets purchased
    Naturally, x ≥ 0 and y ≥ 0. I have to consider costs and floor space (the "footprint" of each unit), while maximizing the storage volume, so costs and floor space will be my constraints, while volume will be my optimization equation.
      cost: 10x + 20y ≤ 140, or y ≤ –(1/2)x + 7
      space: 6x + 8y ≤ 72, or y ≤ –(3/4)x + 9
      volume: V = 8x + 12y
    This system (along with the first two constraints) graphs as:
      feasibility region
When you test the corner points at (8, 3), (0, 7), and (12, 0), you should obtain a maximal volume of 100 cubic feet by buying eight of model X and three of model Y.
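Since cabinets come in whole units, this answer can also be verified by brute force over every whole-number combination that satisfies the constraints (a sketch, not part of the original exercise):

```python
# Brute-force search over whole-number cabinet counts satisfying the
# budget (10x + 20y <= 140) and floor-space (6x + 8y <= 72) constraints.
volume = lambda x, y: 8 * x + 12 * y

feasible = [(x, y) for x in range(15) for y in range(8)
            if 10 * x + 20 * y <= 140 and 6 * x + 8 * y <= 72]
best = max(feasible, key=lambda p: volume(*p))
print(best, volume(*best))  # (8, 3) 100
```

The search agrees with the corner-point method: eight of model X and three of model Y for 100 cubic feet of storage.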



  • In order to ensure optimal health (and thus accurate test results), a lab technician needs to feed the rabbits a daily diet containing a minimum of 24 grams (g) of fat, 36 g of carbohydrates, and 4 g of protein. But the rabbits should be fed no more than five ounces of food a day.
  • Rather than order rabbit food that is custom-blended, it is cheaper to order Food X and Food Y, and blend them for an optimal mix. Food X contains 8 g of fat, 12 g of carbohydrates, and 2 g of protein per ounce, and costs $0.20 per ounce. Food Y contains 12 g of fat, 12 g of carbohydrates, and 1 g of protein per ounce, at a cost of $0.30 per ounce.
    What is the optimal blend?
    Since the exercise is asking for the number of ounces of each food required for the optimal daily blend, my variables will stand for the number of ounces of each:




      x: number of ounces of Food X
      y: number of ounces of Food Y
    Since I can't use negative amounts of either food, the first two constraints are the usual ones: x ≥ 0 and y ≥ 0. The other constraints come from the grams of fat, carbohydrates, and protein per ounce:
      fat:       8x + 12y ≥ 24
      carbs:     12x + 12y ≥ 36
      protein:   2x + 1y ≥ 4
    Also, the maximum weight of the food is five ounces, so:
      x + y ≤ 5
    The optimization equation will be the cost relation C = 0.2x + 0.3y, but this time I'll be finding the minimum value, not the maximum.
    After rearranging the inequalities, the system graphs as:
      feasibility region
    (Note: One of the lines above is irrelevant to the system. Can you tell which one?)
When you test the corners at (0, 4), (0, 5), (3, 0), (5, 0), and (1, 2), you should get a minimum cost of sixty cents per daily serving, using three ounces of Food X only.
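As a quick arithmetic check, here is a Python sketch evaluating the cost relation at the corners listed above (the coordinates come from the worked solution):

```python
# Cost per ounce: Food X $0.20, Food Y $0.30; corners from the graph above.
cost = lambda x, y: 0.2 * x + 0.3 * y

corners = [(0, 4), (0, 5), (3, 0), (5, 0), (1, 2)]
best = min(corners, key=lambda p: cost(*p))
print(best, round(cost(*best), 2))  # (3, 0) 0.6
```

Note that this time we take the minimum over the corners, since we are minimizing cost rather than maximizing profit.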


Sometimes you'll have more than just two things to deal with. The next example has three things to juggle; the next page provides an example of juggling four things.

  • You have $12,000 to invest, and three different funds from which to choose. The municipal bond fund has a 7% return, the local bank's CDs have an 8% return, and the high-risk account has an expected (hoped-for) 12% return. To minimize risk, you decide not to invest any more than $2,000 in the high-risk account. For tax reasons, you need to invest at least three times as much in the municipal bonds as in the bank CDs. Assuming the year-end yields are as expected, what are the optimal investment amounts?
  • Since the question is asking me to find the amount of money for each account, my variables will need to stand for those amounts. Since I'd like to deal with smaller numbers, I'll count by thousands, so:
      x: amount (in thousands) invested in bonds
      y: amount (in thousands) invested in CDs
    Um... now what? I only have two variables, but I have three accounts. To handle this, I need the "how much is left" construction:
      12 – x – y: amount (in thousands) invested in the high-risk account
    I can't invest negative amounts of money, so the first two constraints are the usual ones: x ≥ 0 and y ≥ 0. The amount in the high-risk account can't be negative either, so 12 – x – y ≥ 0, which simplifies as:
      y ≤ –x + 12
    Also, the upper limit on the high-risk account gives me the inequality 12 – x – y ≤ 2. This simplifies as:
      y ≥ –x + 10
    And the tax requirement (at least three times as much in bonds as in CDs, so x ≥ 3y) gives me y ≤ (1/3)x. The optimization equation will be the total investment yield, Y = 0.07x + 0.08y + 0.12(12 – x – y) = 1.44 – 0.05x – 0.04y. The entire system is then as follows:
      Maximize Y = 1.44 – 0.05x – 0.04y, subject to:
        x ≥ 0
        y ≥ 0
        y ≥ –x + 10
        y ≤ –x + 12
        y ≤ (1/3)x
    The feasibility region graphs as:

feasibility region
When you test the corner points at (9, 3), (12, 0), (10, 0), and (7.5, 2.5), you should get an optimal return of $965 when you invest $7,500 in municipal bonds, $2,500 in CDs, and the remaining $2,000 in the high-risk account.
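A Python sketch can confirm this; the corner coordinates below are taken from the graph in the worked solution, and amounts are in thousands of dollars:

```python
# Yield in thousands of dollars; x = bonds, y = CDs,
# and 12 - x - y is the amount left for the high-risk account.
def yield_k(x, y):
    return 0.07 * x + 0.08 * y + 0.12 * (12 - x - y)

corners = [(9, 3), (12, 0), (10, 0), (7.5, 2.5)]
best = max(corners, key=lambda p: yield_k(*p))
print(best, round(yield_k(*best), 3))  # (7.5, 2.5) 0.965
```

The best corner gives a yield of 0.965 thousand dollars, i.e. $965, at x = 7.5 and y = 2.5 (that is, $7,500 in bonds and $2,500 in CDs).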

LINEAR PROGRAMMING----- FORM FOUR.


Sections: Optimizing linear systems, Setting up word problems

INTRODUCTION:

Linear programming is the process of taking various linear inequalities relating to some situation, and finding the "best" value obtainable under those conditions. A typical example would be taking the limitations of materials and labor, and then determining the "best" production levels for maximal profits under those conditions.
In "real life", linear programming is part of a very important area of mathematics called "optimization techniques". This field of study (or at least the applied results of it) is used every day in the organization and allocation of resources. These "real life" systems can have dozens or hundreds of variables, or more. In algebra, though, you'll only work with the simple (and graphable) two-variable linear case.
The general process for solving linear-programming exercises is to graph the inequalities (called the "constraints") to form a walled-off area on the x,y-plane (called the "feasibility region"). Then you figure out the coordinates of the corners of this feasibility region (that is, you find the intersection points of the various pairs of lines), and test these corner points in the formula (called the "optimization equation") for which you're trying to find the highest or lowest value.
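The recipe above (intersect the boundary lines pairwise, keep only the intersections inside the feasibility region, then test the optimization equation at each corner) can be sketched in Python. This is a toy helper under simplifying assumptions (no parallel-line corner cases beyond skipping them, a small tolerance for floating-point comparisons), not a general solver:

```python
from itertools import combinations

def corner_points(lines, feasible):
    """Intersect boundary lines (a, b, c), meaning ax + by = c, pairwise
    and keep only the intersections satisfying every constraint."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:                       # parallel lines: no intersection
            continue
        x = (c1 * b2 - c2 * b1) / det      # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if feasible(x, y):
            pts.append((x, y))
    return pts

# Toy system: x >= 0, y >= 0, x + y <= 4; maximize z = x + 2y.
lines = [(1, 0, 0), (0, 1, 0), (1, 1, 4)]
ok = lambda x, y: x >= -1e-9 and y >= -1e-9 and x + y <= 4 + 1e-9
pts = corner_points(lines, ok)
best = max(pts, key=lambda p: p[0] + 2 * p[1])
print(pts, best)  # corners (0,0), (0,4), (4,0); best is (0, 4)
```

The maximum of z = x + 2y over this toy region is 8, at the corner (0, 4), exactly as the corner-testing procedure predicts.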




  • Find the maximal and minimal value of z = 3x + 4y subject to the following constraints:
    • x + 2y <= 14, 3x - y >= 0, x - y <= 2
    The three inequalities listed above are the constraints. The area of the plane that they mark off will be the feasibility region. The formula "z = 3x + 4y" is the optimization equation. I need to find the (x, y) corner points of the feasibility region that return the largest and smallest values of z.
    My first step is to solve each inequality for the more-easily graphed equivalent forms:
      y <= -(1/2)x + 7, y <= 3x, y >= x - 2
    It's easy to graph the system:
      graph of inequalities, with lines labelled and feasibility region shaded in
    To find the corner points -- which aren't always clear from the graph -- I'll pair the lines (thus forming a system of linear equations) and solve:
      y = –(1/2)x + 7 and y = 3x:
        –(1/2)x + 7 = 3x
        –x + 14 = 6x
        14 = 7x
        x = 2, so y = 3(2) = 6: corner point at (2, 6)

      y = –(1/2)x + 7 and y = x – 2:
        –(1/2)x + 7 = x – 2
        –x + 14 = 2x – 4
        18 = 3x
        x = 6, so y = (6) – 2 = 4: corner point at (6, 4)

      y = 3x and y = x – 2:
        3x = x – 2
        2x = –2
        x = –1, so y = 3(–1) = –3: corner point at (–1, –3)
    So the corner points are (2, 6), (6, 4), and (–1, –3).
    Somebody really smart proved that, for linear systems like this, the maximum and minimum values of the optimization equation will always be on the corners of the feasibility region. So, to find the solution to this exercise, I only need to plug these three points into "z = 3x + 4y".
      (2, 6):    z = 3(2) + 4(6) = 6 + 24 = 30
      (6, 4):    z = 3(6) + 4(4) = 18 + 16 = 34
      (–1, –3):  z = 3(–1) + 4(–3) = –3 – 12 = –15
    Then the maximum of z = 34 occurs at (6, 4), and the minimum of z = –15 occurs at (–1, –3).
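The same corner test, as a quick Python check (the corners are the ones found above):

```python
# Evaluate z = 3x + 4y at the three corner points of the feasibility region.
z = lambda x, y: 3 * x + 4 * y

vals = {p: z(*p) for p in [(2, 6), (6, 4), (-1, -3)]}
print(vals)  # maximum 34 at (6, 4), minimum -15 at (-1, -3)
```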
    • Given the following constraints, maximize and minimize the value of z = –0.4x + 3.2y.
      • x >= 0, y >= 0, x <= 5, x + y <= 7, x + 2y >= 4, y <= x + 5
      First I'll solve the fourth and fifth constraints for easier graphing:
        x >= 0, y >= 0, x <= 5, y <= -x + 7, y >= -(1/2)x + 2, y <= x + 5
      The feasibility region looks like this:
        feasibility region
      From the graph, I can see which lines cross to form the corners, so I know which lines to pair up in order to verify the coordinates. I'll start at the "top" of the shaded area and work my way clockwise around the edges:
        y = –x + 7 and y = x + 5:
          –x + 7 = x + 5
          2 = 2x
          x = 1, so y = (1) + 5 = 6: corner at (1, 6)

        y = –x + 7 and x = 5:
          y = –(5) + 7 = 2: corner at (5, 2)

        x = 5 and y = 0:
          nothing to solve: corner at (5, 0)

        y = 0 and y = –(1/2)x + 2:
          –(1/2)x + 2 = 0
          2 = (1/2)x
          x = 4: corner at (4, 0)

        y = –(1/2)x + 2 and x = 0:
          y = –(1/2)(0) + 2 = 2: corner at (0, 2)

        x = 0 and y = x + 5:
          y = (0) + 5 = 5: corner at (0, 5)
      Now I'll plug each corner point into the optimization equation, z = –0.4x + 3.2y:
        (1, 6):  z = –0.4(1) + 3.2(6) = –0.4 + 19.2 = 18.8
        (5, 2):  z = –0.4(5) + 3.2(2) = –2.0 + 6.4   =   4.4

        (5, 0):  z = –0.4(5) + 3.2(0) = –2.0 + 0.0   = –2.0

        (4, 0):  z = –0.4(4) + 3.2(0) = –1.6 + 0.0   = –1.6

        (0, 2):  z = –0.4(0) + 3.2(2) = –0.0 + 6.4   =   6.4

        (0, 5):  z = –0.4(0) + 3.2(5) = –0.0 + 16.0 = 16.0
      Then the maximum is 18.8 at (1, 6) and the minimum is –2 at (5, 0).
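Checking those six evaluations in Python (rounding to one decimal place to sidestep floating-point noise):

```python
# Evaluate z = -0.4x + 3.2y at each corner of the feasibility region.
z = lambda x, y: -0.4 * x + 3.2 * y

corners = [(1, 6), (5, 2), (5, 0), (4, 0), (0, 2), (0, 5)]
vals = {p: round(z(*p), 1) for p in corners}
print(max(vals, key=vals.get), min(vals, key=vals.get))  # (1, 6) (5, 0)
```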

Given the inequalities, linear-programming exercises are pretty straightforward, if sometimes a bit long. The hard part is usually the word problems, where you have to figure out what the inequalities are. So I'll show how to set up some typical linear-programming word problems.
  • At a certain refinery, the refining process requires the production of at least two gallons of gasoline for each gallon of fuel oil. To meet the anticipated demands of winter, at least three million gallons of fuel oil a day will need to be produced. The demand for gasoline, on the other hand, is not more than 6.4 million gallons a day.
  • If gasoline is selling for $1.90 per gallon and fuel oil sells for $1.50/gal, how much of each should be produced in order to maximize revenue?
    The question asks for the number of gallons which should be produced, so I should let my variables stand for "gallons produced".



      x: gallons of gasoline produced
      y: gallons of fuel oil produced
    Since this is a "real world" problem, I know that I can't have negative production levels, so the variables can't be negative. This gives me my first two constraints: namely, x ≥ 0 and y ≥ 0.
    Since I have to have at least two gallons of gas for every gallon of oil, then x ≥ 2y.
    For graphing, of course, I'll use the more manageable form y ≤ (1/2)x.
    The winter demand says that y ≥ 3,000,000; note that this constraint eliminates the need for the "y ≥ 0" constraint. The gas demand says that x ≤ 6,400,000.

    I need to maximize revenue R, so the optimization equation is R = 1.9x + 1.5y. Then the model for this word problem is as follows:
      R = 1.9x + 1.5y, subject to:
      x ≥ 0
      x ≤ 6,400,000
      y ≥ 3,000,000
      y ≤ (1/2)x
    Using a scale that counts by millions (so "y = 3" on the graph means "y is three million"), the above system graphs as follows:
      feasibility region
    Taking a closer look, I can see the feasibility region a little better:
      close-up of feasibility region
When you test the corner points at (6.4m, 3.2m), (6.4m, 3m), and (6m, 3m), you should get a maximal solution of R = $16.96m at (x, y) = (6.4m, 3.2m).
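Working in millions (as the graph does), the corner test can be sketched in Python:

```python
# Revenue in millions of dollars; x and y in millions of gallons.
R = lambda x, y: 1.9 * x + 1.5 * y

corners = [(6.4, 3.2), (6.4, 3.0), (6.0, 3.0)]
best = max(corners, key=lambda p: R(*p))
print(best, round(R(*best), 2))  # (6.4, 3.2) 16.96
```

That is, revenue is maximized at $16.96 million by producing 6.4 million gallons of gasoline and 3.2 million gallons of fuel oil.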

Wednesday, April 16, 2014

PROBABILITY ----- FORM FOUR ----- BY. MWL. JAPHET MASATU

  1. Introduction
  2. Basic Concepts
  3. Conditional Probability Demo
  4. Gambler's Fallacy Simulation
  5. Permutations and Combinations
  6. Birthday Simulation
  7. Binomial Distribution
  8. Binomial Demonstration
  9. Poisson Distribution
  10. Multinomial Distribution
  11. Hypergeometric Distribution
  12. Base Rates
  13. Bayes' Theorem Demonstration
  14. Monty Hall Problem Demonstration
  15. Statistical Literacy
  16. Exercises


Probability is an important and complex field of study. Fortunately, only a few basic issues in probability theory are essential for understanding statistics at the level covered in this book. These basic issues are covered in this chapter.

The introductory section discusses the definitions of probability. This is not as simple as it may seem. The section on basic concepts covers how to compute probabilities in a variety of simple situations. The Gambler's Fallacy Simulation provides an opportunity to explore this fallacy by simulation. The Birthday Demonstration illustrates the probability of finding two or more people with the same birthday. The Binomial Demonstration shows the binomial distribution for different parameters. The section on base rates discusses an important but often-ignored factor in determining probabilities. It also presents Bayes' Theorem. The Bayes' Theorem Demonstration shows how a tree diagram and Bayes' Theorem result in the same answer. Finally, the Monty Hall Demonstration lets you play a game with a very counterintuitive result.


 

PROBABILITY ----- FORM FOUR BY. MWL. JAPHET MASATU.

INTRODUCTION TO PROBABILITY.
Learning Objectives
  1. Compute probability in a situation where there are equally-likely outcomes
  2. Apply concepts to cards and dice
  3. Compute the probability of two independent events both occurring
  4. Compute the probability of either of two independent events occurring
  5. Do problems that involve conditional probabilities
  6. Compute the probability that in a room of N people, at least two share a birthday
  7. Describe the gambler's fallacy
Probability of a Single Event
If you roll a six-sided die, there are six possible outcomes, and each of these outcomes is equally likely. A six is as likely to come up as a three, and likewise for the other four sides of the die. What, then, is the probability that a one will come up? Since there are six possible outcomes, the probability is 1/6. What is the probability that either a one or a six will come up? The two outcomes about which we are concerned (a one or a six coming up) are called favorable outcomes. Given that all outcomes are equally likely, we can compute the probability of a one or a six using the formula:

probability = (number of favorable outcomes) / (number of possible outcomes)

In this case there are two favorable outcomes and six possible outcomes. So the probability of throwing either a one or six is 1/3. Don't be misled by our use of the term "favorable," by the way. You should understand it in the sense of "favorable to the event in question happening." That event might not be favorable to your well-being. You might be betting on a three, for example.
The above formula applies to many games of chance. For example, what is the probability that a card drawn at random from a deck of playing cards will be an ace? Since the deck has four aces, there are four favorable outcomes; since the deck has 52 cards, there are 52 possible outcomes. The probability is therefore 4/52 = 1/13. What about the probability that the card will be a club? Since there are 13 clubs, the probability is 13/52 = 1/4.
Let's say you have a bag with 20 cherries: 14 sweet and 6 sour. If you pick a cherry at random, what is the probability that it will be sweet? There are 20 possible cherries that could be picked, so the number of possible outcomes is 20. Of these 20 possible outcomes, 14 are favorable (sweet), so the probability that the cherry will be sweet is 14/20 = 7/10. There is one potential complication to this example, however. It must be assumed that the probability of picking any of the cherries is the same as the probability of picking any other. This wouldn't be true if (let us imagine) the sweet cherries are smaller than the sour ones. (The sour cherries would come to hand more readily when you sampled from the bag.) Let us keep in mind, therefore, that when we assess probabilities in terms of the ratio of favorable to all potential cases, we rely heavily on the assumption of equal probability for all outcomes.
Here is a more complex example. You throw 2 dice. What is the probability that the sum of the two dice will be 6? To solve this problem, list all the possible outcomes. There are 36 of them since each die can come up one of six ways. The 36 possibilities are shown below.

Die 1 Die 2 Total   Die 1 Die 2 Total   Die 1 Die 2 Total
1 1 2   3 1 4   5 1 6
1 2 3   3 2 5   5 2 7
1 3 4   3 3 6   5 3 8
1 4 5   3 4 7   5 4 9
1 5 6   3 5 8   5 5 10
1 6 7   3 6 9   5 6 11
2 1 3   4 1 5   6 1 7
2 2 4   4 2 6   6 2 8
2 3 5   4 3 7   6 3 9
2 4 6   4 4 8   6 4 10
2 5 7   4 5 9   6 5 11
2 6 8   4 6 10   6 6 12

You can see that 5 of the 36 possibilities total 6. Therefore, the probability is 5/36.
If you know the probability of an event occurring, it is easy to compute the probability that the event does not occur. If P(A) is the probability of Event A, then 1 - P(A) is the probability that the event does not occur. For the last example, the probability that the total is 6 is 5/36. Therefore, the probability that the total is not 6 is 1 - 5/36 = 31/36.
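The complement rule is easy to verify with exact fractions (a sketch using Python's fractions module; the 5/36 comes from the dice table above):

```python
from fractions import Fraction

p_total_6 = Fraction(5, 36)   # P(two dice sum to 6), from the table above
p_not_6 = 1 - p_total_6       # complement rule: P(not A) = 1 - P(A)
print(p_not_6)  # 31/36
```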
Probability of Two (or more) Independent Events
Events A and B are independent events if the probability of Event B occurring is the same whether or not Event A occurs. Let's take a simple example. A fair coin is tossed two times. The probability that a head comes up on the second toss is 1/2 regardless of whether or not a head came up on the first toss. The two events are (1) first toss is a head and (2) second toss is a head. So these events are independent. Consider the two events (1) "It will rain tomorrow in Houston" and (2) "It will rain tomorrow in Galveston" (a city near Houston). These events are not independent because it is more likely that it will rain in Galveston on days it rains in Houston than on days it does not.
Probability of A and B
When two events are independent, the probability of both occurring is the product of the probabilities of the individual events. More formally, if events A and B are independent, then the probability of both A and B occurring is:
P(A and B) = P(A) x P(B)
where P(A and B) is the probability of events A and B both occurring, P(A) is the probability of event A occurring, and P(B) is the probability of event B occurring.
If you flip a coin twice, what is the probability that it will come up heads both times? Event A is that the coin comes up heads on the first flip and Event B is that the coin comes up heads on the second flip. Since both P(A) and P(B) equal 1/2, the probability that both events occur is
1/2 x 1/2 = 1/4
Let's take another example. If you flip a coin and roll a six-sided die, what is the probability that the coin comes up heads and the die comes up 1? Since the two events are independent, the probability is simply the probability of a head (which is 1/2) times the probability of the die coming up 1 (which is 1/6). Therefore, the probability of both events occurring is 1/2 x 1/6 = 1/12.
One final example: You draw a card from a deck of cards, put it back, and then draw another card. What is the probability that the first card is a heart and the second card is black? Since there are 52 cards in a deck and 13 of them are hearts, the probability that the first card is a heart is 13/52 = 1/4. Since there are 26 black cards in the deck, the probability that the second card is black is 26/52 = 1/2. The probability of both events occurring is therefore 1/4 x 1/2 = 1/8.
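The heart-then-black example can be checked with exact fractions (a sketch; the draws are independent because the first card is replaced):

```python
from fractions import Fraction

p_heart = Fraction(13, 52)    # P(first card is a heart) = 1/4
p_black = Fraction(26, 52)    # P(second card is black) = 1/2
print(p_heart * p_black)      # 1/8
```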
See the section on conditional probabilities on this page to see how to compute P(A and B) when A and B are not independent.
Probability of A or B
If Events A and B are independent, the probability that either Event A or Event B occurs is:
P(A or B) = P(A) + P(B) - P(A and B)
In this discussion, when we say "A or B occurs" we include three possibilities:
  1. A occurs and B does not occur
  2. B occurs and A does not occur
  3. Both A and B occur
This use of the word "or" is technically called inclusive or because it includes the case in which both A and B occur. If we included only the first two cases, then we would be using an exclusive or.
(Optional) We can derive the law for P(A-or-B) from our law about P(A-and-B). The event "A-or-B" can happen in any of the following ways:
  1. A-and-B happens
  2. A-and-not-B happens
  3. not-A-and-B happens.
The simple event A can happen if either A-and-B happens or A-and-not-B happens. Similarly, the simple event B happens if either A-and-B happens or not-A-and-B happens. P(A) + P(B) is therefore P(A-and-B) + P(A-and-not-B) + P(A-and-B) + P(not-A-and-B), whereas P(A-or-B) is P(A-and-B) + P(A-and-not-B) + P(not-A-and-B). We can make these two sums equal by subtracting one occurrence of P(A-and-B) from the first. Hence, P(A-or-B) = P(A) + P(B) - P(A-and-B).

Now for some examples. If you flip a coin two times, what is the probability that you will get a head on the first flip or a head on the second flip (or both)? Letting Event A be a head on the first flip and Event B be a head on the second flip, then P(A) = 1/2, P(B) = 1/2, and P(A and B) = 1/4. Therefore,
P(A or B) = 1/2 + 1/2 - 1/4 = 3/4.
If you throw a six-sided die and then flip a coin, what is the probability that you will get either a 6 on the die or a head on the coin flip (or both)? Using the formula,
P(6 or head) = P(6) + P(head) - P(6 and head)
             = (1/6) + (1/2) - (1/6)(1/2)
             = 7/12
An alternate approach to computing this value is to start by computing the probability of not getting either a 6 or a head. Then subtract this value from 1 to compute the probability of getting a 6 or a head. Although this is a complicated method, it has the advantage of being applicable to problems with more than two events. Here is the calculation in the present case. The probability of not getting either a 6 or a head can be recast as the probability of
(not getting a 6) AND (not getting a head).
This follows because if you did not get a 6 and you did not get a head, then you did not get a 6 or a head. The probability of not getting a six is 1 - 1/6 = 5/6. The probability of not getting a head is 1 - 1/2 = 1/2. The probability of not getting a six and not getting a head is 5/6 x 1/2 = 5/12. This is therefore the probability of not getting a 6 or a head. The probability of getting a six or a head is therefore (once again) 1 - 5/12 = 7/12.
If you throw a die three times, what is the probability that one or more of your throws will come up with a 1? That is, what is the probability of getting a 1 on the first throw OR a 1 on the second throw OR a 1 on the third throw? The easiest way to approach this problem is to compute the probability of
NOT getting a 1 on the first throw
AND not getting a 1 on the second throw
AND not getting a 1 on the third throw.
The answer will be 1 minus this probability. The probability of not getting a 1 on any of the three throws is 5/6 x 5/6 x 5/6 = 125/216. Therefore, the probability of getting a 1 on at least one of the throws is 1 - 125/216 = 91/216.
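Both complement-method calculations above can be reproduced exactly with fractions (a sketch):

```python
from fractions import Fraction

# P(6 or head) = 1 - P(no 6 AND no head)
p_6_or_head = 1 - Fraction(5, 6) * Fraction(1, 2)

# P(at least one 1 in three die throws) = 1 - P(no 1 on any throw)
p_at_least_one_1 = 1 - Fraction(5, 6) ** 3
print(p_6_or_head, p_at_least_one_1)  # 7/12 91/216
```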
Conditional Probabilities
Often it is required to compute the probability of an event given that another event has occurred. For example, what is the probability that two cards drawn at random from a deck of playing cards will both be aces? It might seem that you could use the formula for the probability of two independent events and simply multiply 4/52 x 4/52 = 1/169. This would be incorrect, however, because the two events are not independent. If the first card drawn is an ace, then the probability that the second card is also an ace would be lower because there would only be three aces left in the deck.
Once the first card chosen is an ace, the probability that the second card chosen is also an ace is called the conditional probability of drawing an ace. In this case, the "condition" is that the first card is an ace. Symbolically, we write this as:
P(ace on second draw | an ace on the first draw)
The vertical bar "|" is read as "given," so the above expression is short for: "The probability that an ace is drawn on the second draw given that an ace was drawn on the first draw." What is this probability? After an ace is drawn on the first draw, there are 3 aces left out of 51 total cards. This means that the probability that one of these aces will be drawn is 3/51 = 1/17.
If Events A and B are not independent, then P(A and B) = P(A) x P(B|A).
Applying this to the problem of two aces, the probability of drawing two aces from a deck is 4/52 x 3/51 = 1/221.
One more example: If you draw two cards from a deck, what is the probability that you will get the Ace of Diamonds and a black card? There are two ways you can satisfy this condition: (a) You can get the Ace of Diamonds first and then a black card or (b) you can get a black card first and then the Ace of Diamonds. Let's calculate Case A. The probability that the first card is the Ace of Diamonds is 1/52. The probability that the second card is black given that the first card is the Ace of Diamonds is 26/51 because 26 of the remaining 51 cards are black. The probability is therefore 1/52 x 26/51 = 1/102. Now for Case B: the probability that the first card is black is 26/52 = 1/2. The probability that the second card is the Ace of Diamonds given that the first card is black is 1/51. The probability of Case B is therefore 1/2 x 1/51 = 1/102, the same as the probability of Case A. Recall that the probability of A or B is P(A) + P(B) - P(A and B). In this problem, P(A and B) = 0 since a card cannot be the Ace of Diamonds and be a black card. Therefore, the probability of Case A or Case B is 1/102 + 1/102 = 2/102 = 1/51. So, 1/51 is the probability that you will get the Ace of Diamonds and a black card when drawing two cards from a deck.
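Both conditional-probability results can be checked with exact fractions (a sketch following the two cases above):

```python
from fractions import Fraction

# P(two aces) = P(ace first) * P(ace second | ace first)
p_two_aces = Fraction(4, 52) * Fraction(3, 51)

# Ace of Diamonds and a black card, in either order:
case_a = Fraction(1, 52) * Fraction(26, 51)   # AD first, then black
case_b = Fraction(26, 52) * Fraction(1, 51)   # black first, then AD
print(p_two_aces, case_a + case_b)  # 1/221 1/51
```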
Birthday Problem
If there are 25 people in a room, what is the probability that at least two of them share the same birthday? If your first thought is that it is 25/365 = 0.068, you will be surprised to learn it is much higher than that. This problem requires the application of the sections on P(A and B) and conditional probability.
This problem is best approached by asking what is the probability that no two people have the same birthday. Once we know this probability, we can simply subtract it from 1 to find the probability that two people share a birthday.
If we choose two people at random, what is the probability that they do not share a birthday? Of the 365 days on which the second person could have a birthday, 364 of them are different from the first person's birthday. Therefore the probability is 364/365. Let's define P2 as the probability that the second person drawn does not share a birthday with the person drawn previously. P2 is therefore 364/365. Now define P3 as the probability that the third person drawn does not share a birthday with anyone drawn previously given that there are no previous birthday matches. P3 is therefore a conditional probability. If there are no previous birthday matches, then two of the 365 days have been "used up," leaving 363 non-matching days. Therefore P3 = 363/365. In like manner, P4 = 362/365, P5 = 361/365, and so on up to P25 = 341/365.
In order for there to be no matches, the second person must not match any previous person and the third person must not match any previous person, and the fourth person must not match any previous person, etc. Since P(A and B) = P(A)P(B), all we have to do is multiply P2, P3, P4 ...P25 together. The result is 0.431. Therefore the probability of at least one match is 0.569.
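Multiplying P2 through P25 by hand is tedious, so here is a short Python sketch of the same calculation (assuming 365 equally likely birthdays, as the text does):

```python
# Multiply the "no match" probabilities P2..P25 described above,
# then take the complement to get P(at least one shared birthday).
p_no_match = 1.0
for k in range(1, 25):                # persons 2 through 25
    p_no_match *= (365 - k) / 365
p_shared = 1 - p_no_match
print(round(p_no_match, 3), round(p_shared, 3))  # 0.431 0.569
```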
Gambler's Fallacy
A fair coin is flipped five times and comes up heads each time. What is the probability that it will come up heads on the sixth flip? The correct answer is, of course, 1/2. But many people believe that a tail is more likely to occur after throwing five heads. Their faulty reasoning may go something like this: "In the long run, the number of heads and tails will be the same, so the tails have some catching up to do." The flaws in this logic are exposed in the simulation in this chapter.
You have a bag of marbles. There are 3 red marbles, 2 green marbles, 7 yellow marbles, and 3 blue marbles. What is the probability of drawing a yellow or red marble?
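A quick check of this exercise with exact fractions (a sketch; the counts come from the question above):

```python
from fractions import Fraction

marbles = {"red": 3, "green": 2, "yellow": 7, "blue": 3}
total = sum(marbles.values())                          # 15 marbles in all
p = Fraction(marbles["yellow"] + marbles["red"], total)
print(p)  # 2/3
```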