
Stumbling Intuition #1: The Monty Hall Problem

Shadow Wolf

Certified People sTabber
I can't drive, and I don't know how to care for a goat. It doesn't matter if I get a goat or a car; I'm still getting something I don't know what to do with. Might as well stick with the door I originally chose.
The car gets around, the goat gets you dinner. And it's been so long since I've had goat tacos, I might settle for that goat.:joycat:
 

Sedim Haba

Outa here... bye-bye!
Starting with this thread, I'm planning to initiate a series of threads about counterintuitive problems in mathematics (or programming, if I encounter suitable ones), both for fun and as an opportunity to engage my passion for math. :D

The original Monty Hall problem is worded thus:

"Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, 'Do you want to pick door No. 2?' Is it to your advantage to switch your choice?"



Monty Hall problem - Wikipedia

I posted the Wikipedia article to properly cite my source for the above wording, but since it contains the answer to the problem, I recommend not clicking the link before you have come up with your answer!

Before posting your answer, take your time to analyze it! This problem is an example of a veridical paradox, and it is so counterintuitive that Paul Erdős, one of the most prolific mathematicians of the 20th century, wasn't convinced of the correct solution until he saw computer simulations demonstrating its correctness.



Paradox - Wikipedia

Also, please only post your answer in spoiler tags so that you don't potentially reveal the solution to others!

So, if Monty Hall asked you the above question, would you open the door you initially chose, or would you switch to the other one?

Does not say what kind of car. Toy car? Wrecked car? Expensive car I can't even afford to insure, maintain (those cars can only be serviced by factory dealerships), or gas up?

I'm keeping the goat. I love my goats, always room for more.
 

lewisnotmiller

Grand Hat
Staff member
Premium Member
Considering that legions of mathematicians argued against this back in the day, I assume well over 99% of people get it wrong at first or are lying.

I agree with the OP...it's counterintuitive. It's also super easy to test, which of course I did...lol
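For anyone who wants to run the test themselves, here is a minimal Python sketch of such a simulation (the function name and door labels are my own, not from the thread):

```python
import random

def monty_hall(trials=100_000):
    """Monte Carlo estimate of win rates for 'stay' vs. 'switch'."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        pick = random.randrange(3)   # contestant's first choice
        # The host knowingly opens a goat door the contestant didn't pick.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        # Switching means taking the remaining closed door (labels sum to 0+1+2=3).
        switched = 3 - pick - opened
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

stay, switch = monty_hall()
print(f"stay won {stay:.3f}, switch won {switch:.3f}")
```

With 100,000 trials, the stay rate settles near 1/3 and the switch rate near 2/3.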
 

Shadow Wolf

Certified People sTabber
I agree with the OP...it's counterintuitive. It's also super easy to test, which of course I did...lol
I think what makes this tricky is realizing that the manipulation means this is not a free or fair random pick as we understand it, and never actually was. We think we're picking at random, and then the host's intervention makes it less random and more certain. If this were truly a free and fair random pick, the host would know nothing, would be able to offer nothing, and it would be a straightforward 1/3 chance of being right. But those are not the circumstances of this scenario.
 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
One should think so. After all, the chance of tails in a coin flip is 0.5, no matter what the previous outcome was. The events are independent. After the goat was revealed, it is a new experiment with two possible outcomes, each with a 0.5 chance.
That's where the error lies.
There are two possible outcomes, but the chances aren't equal.

The key point to consider is that the game show host has given you extra information about the doors you didn't choose. The host didn't open a door randomly; they knew which door hides the car and deliberately opened a door you didn't choose that hid a goat.
 

Subduction Zone

Veteran Member
I know the answer, but I invariably forget the reasoning behind it, even if I understood it at the time. :oops:
A non-mathematical explanation:

Think of your original pick as an attempt to find a goat. You know that Monty already knows where the car is. That means that no matter what, the box he reveals will contain a goat. You originally had two chances out of three of finding a goat. The odds of that choice do not change because of Monty's later actions. He can always reveal a goat in the remaining two boxes. So the odds of the car being in one of those two boxes do not change. That is the misdirection of the MHP. The action taken makes one mistakenly think that the odds have changed when they have not.

By switching boxes you are in effect getting to guess at two boxes instead of just one.

Or you could think of it this way. There are three boxes. Monty lets you choose two. You know that at least one of your boxes has a goat. Monty shows that at least one of your boxes has a goat. Would you change your two boxes for the other one? You already knew that you had at least one goat, how does showing you one goat change that? Remember, Monty knew which box has the car in it. He was going to show you a goat no matter what. I would keep the "two boxes". That is what one is doing when one switches in the first example. One is in effect "keeping two boxes".
 

Stevicus

Veteran Member
Staff member
Premium Member
They chose the door with the goat.

 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
I know the answer, but I invariably forget the reasoning behind it, even if I understood it at the time. :oops:
The way I think about it (in spoiler tags so that you don't need to see it unless you want to):

Assuming the game is fair and the car was placed randomly, the odds are 1/3 that the original door you picked had a car and 2/3 that you didn't.

Revealing the goat doesn't change those initial odds, but the extra information means that the whole 2/3 odds that the car is behind a door you didn't pick gets assigned to one door instead of two.
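That 1/3 vs. 2/3 bookkeeping can also be checked exactly by enumerating the three equally likely car positions (a sketch with arbitrary door labels; by symmetry, fixing the first pick at door 0 loses no generality):

```python
from fractions import Fraction

doors = [0, 1, 2]
pick = 0                      # fix the contestant's first choice at door 0
stay = switch = Fraction(0)
for car in doors:             # each position is equally likely: probability 1/3
    # Host opens a goat door that is neither the pick nor the car.
    # (When the pick hides the car, either goat could be opened;
    # the choice doesn't affect the totals, since switching loses either way.)
    opened = next(d for d in doors if d != pick and d != car)
    remaining = next(d for d in doors if d != pick and d != opened)
    stay += Fraction(1, 3) * (pick == car)
    switch += Fraction(1, 3) * (remaining == car)
print(stay, switch)  # 1/3 2/3
```

Using exact fractions avoids any floating-point rounding in the tally.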
 

Stevicus

Veteran Member
Staff member
Premium Member
Some may remember the episode of The Odd Couple where Monty Hall appears as a guest star. This is how they resolved the Monty Hall Problem:

 

Debater Slayer

Vipassana
Staff member
Premium Member
As I understand it, one can take a problem of likelihood and plug it into an equation. Maybe take an example like that and show me how you'd plug it into the equation.

Okay, the first thing to note here is that at the heart of Bayesian inference is the notion of looking at different events in terms of their likelihood or probability, even if they may seem extremely unlikely or almost impossible. That is, if you were to tell me that you encountered two aliens on your way home from work today, I wouldn't think of your statement in binary terms of "possible vs. impossible"; instead, Bayesian inference would have me thinking, "How likely is it that he encountered two aliens, and how likely is it that he's just bluffing?"

Now, further suppose that there are other events—which we will call variables in this case—that depend on the likelihood that you have indeed met two aliens, or the likelihood that you are bluffing. Those events would be dependent variables: when the probability of an event changes based on the occurrence of another, it is said to be a dependent variable. If you show me a bunch of scales and say, "These belong to the two aliens I met today," the question then has a new factor: how likely is it that you have indeed met two aliens given that you have shown me a bunch of reptilian-looking scales?

Bayes' theorem, which deals with conditional probability and how evidence affects the probability of two events, is one of the most significant tools in data science (and also medicine, but I'm focusing on my field of study because I'm more familiar with it) because, among other things, it allows you to continually improve your calculation of the probability that a data point will occur or has occurred based on the probability of another data point. It is worded thus:

P(A|B) = P(B|A) × P(A) / P(B)




At the core of this is reliance on observation and new evidence. In our example, let's call the event that you met two aliens A. Let's also call the event that you showed me the reptilian-looking scales B.

What we now need, in order to form an equation and plug numerical values into it, is to have the values of the probability of A, denoted by P(A), and the probability of B, denoted by P(B). Then, since the probability that you met two aliens changes based on the evidence (the scales), we will connect the two probabilities using Bayes' theorem and calculate the conditional probability that you met two aliens given that you have shown me reptilian scales. This is denoted by P(A|B): the probability of A given B.

Note that in probability, the likelihood of an event ranges from 0 (impossible or certain to not occur/to not have occurred) to 1 (certain to happen/has already happened). When we talk about the weather, we can't say that there's a 150% chance of rain tomorrow or that there's a -10% chance that tomorrow will be overcast: the most we can possibly say is 100% and the least is 0%, and that's in an ideal scenario. In reality, many situations just involve various levels of uncertainty and likelihood, such as 50%, 60%, etc.

So, in our example, let's say that there's a 40% chance, before considering any evidence, that you have indeed met two aliens. 40% in decimal form is 0.4. Let's also say that your probability of collecting the reptilian scales is 50%, which is 0.5. Finally, assume that this probability increases to 60%, or 0.6, provided that you have already met the two aliens: this is denoted by P(B|A), meaning the probability that you have collected the reptilian scales given that you have already met two aliens.

This gives us P(A|B) = (0.6 x 0.4)/(0.5) = 0.48, or 48%. Notice that before considering the evidence, which is the scales you collected, the probability that you had met two aliens was only 40%. It increased by 8 percentage points after considering the evidence at hand. (By the way, I made up this example from scratch, so I'm sure there are more robust examples online from the real world. I just used this one to illustrate the core concepts of Bayesian probability.)
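The arithmetic above can be checked in a couple of lines of Python (the variable names are mine; the numbers are the made-up ones from the example):

```python
p_a = 0.4          # P(A): prior probability of having met two aliens
p_b = 0.5          # P(B): probability of collecting the scales
p_b_given_a = 0.6  # P(B|A): scales collected, given the aliens were met

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.2f}")  # P(A|B) = 0.48
```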

I hope this helps! Of course, feel free to let me know if you have any other questions.
 

VoidCat

Pronouns: he/they/it/neopronouns
This is a bit too smart for me. And I don't grasp it enough to ask a question. I doubt I would ever understand it. But I enjoyed reading you geeking out over it. So that's nice.
 

PureX

Veteran Member
This strikes me more as a psychological problem than a math problem. The key being that "Monty" knows where the car is.
 
Last edited:

vulcanlogician

Well-Known Member
This is something I studied when I began looking more into gambling odds to sharpen my poker game.
I personally find it harder to explain to others than it is counterintuitive. Even after running a simulation to demonstrate to people that this actually works, I've still run into people who just will not accept that it works that way, even if they don't understand why. The numbers are right in front of them, but still they deny it.:facepalm:

It's counterintuitive to people because they are left with two remaining doors. From their perspective, it is a 50/50 choice. But they forget that something changes when the first goat is revealed. They don't take into account the fact that the revelation of the goat is additional information that they didn't have at the outset. Instead, it is more natural to think that the choice has changed from 1/3 to 1/2.

But, of course, that isn't the case...
 