  • Locked thread
moebius2778
May 3, 2013

GAINING WEIGHT... posted:

Fair enough, I can't argue with the brute numbers. I wonder what was at fault with my "third gender" conception - seems like it should come out the same, yeah? It just uses a different word for "girl that isn't the child the man was talking about", right?

Because it counts "GF" and "FG" as separate cases. If you translate it back as F = G, it turns out you're saying that "GG" and "GG" are separate cases.
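For anyone who wants to see the double-counting concretely, here is a quick enumeration (Python, my own sketch, not from the thread): relabeling "girl" as "F" doesn't merge the ordered outcomes GB and BG, but GG stays a single outcome however you label it.

```python
from itertools import product

# Enumerate all equally likely two-child families (order matters:
# first child, second child), then condition on "at least one girl".
families = list(product("BG", repeat=2))  # BB, BG, GB, GG

at_least_one_girl = [f for f in families if "G" in f]
both_girls = [f for f in at_least_one_girl if f == ("G", "G")]

# GB and BG are two distinct outcomes; GG is one outcome, so the
# conditional probability of "both girls" is 1/3, not 1/2.
print(len(at_least_one_girl))                    # 3 cases: BG, GB, GG
print(len(both_girls) / len(at_least_one_girl))  # 1/3
```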


wateroverfire
Jul 3, 2010

Phyzzle posted:

A man flips two coins. One comes up H, but the other one rolls under a desk. The other one has a 50% chance of being T.

A man flips two coins, and they both roll under a desk. His wife looks under the desk and says, "It's dark, but I can see one of them, and it's H". The other one now has a 67% chance of being T. For the man. Since he doesn't know which coin came up H, there are three equal possibilities for him:
TT, TH, and HT.

But since his wife is looking directly at one specific coin rather than hearing a report of "at least one H", the other coin still has a 50% chance of being T. But only for her.

Something seems amiss here . . .

You have the following possible combinations:

TT
TH
HT
HH

In the first case, the man sees one coin land H so the probability the other is T is 50%. So far so good.

In the second case, the wife reports that at least one coin came up H, eliminating TT and leaving

TH
HT
HH

Which you interpreted as being a 67% chance of T (prompting the "WTF, this doesn't seem right" moment). But 67% is the probability that at least one of two unknown flips comes up T, and in this case we know the result of one flip was H (whether the husband saw it or not, assuming he can trust his wife). Conditional on that information, only one flip is in doubt and the chance of T is 50%.

Or you could think of it this way: One of TH or HT is impossible if we know the result of one of the throws was H, because TH requires that the second coin comes up heads while HT requires that the first one does. If you eliminate one or the other, the possibilities are TH, HH or HT, HH. 50% chance the other coin came up T either way.

King of Bleh
Mar 3, 2007

A kingdom of rats.

Phyzzle posted:

A man flips two coins. One comes up H, but the other one rolls under a desk. The other one has a 50% chance of being T.

A man flips two coins, and they both roll under a desk. His wife looks under the desk and says, "It's dark, but I can see one of them, and it's H". The other one now has a 67% chance of being T. For the man. Since he doesn't know which coin came up H, there are three equal possibilities for him:
TT, TH, and HT.

But since his wife is looking directly at one specific coin rather than hearing a report of "at least one H", the other coin still has a 50% chance of being T. But only for her.

Something seems amiss here . . .

If the coins are marked in some way, then the wife is looking at "one specific coin" and the probabilities are as you said, due to the two parties having different information. If they aren't, then your statement of "specific coin" is false, and the probabilities are symmetric for both parties.

e: this very quickly turned into the "math is hard" thread, all we need is a point nine repeating argument for bingo

wateroverfire
Jul 3, 2010

King of Bleh posted:

If the coins are marked in some way, then the wife is looking at "one specific coin" and the probabilities are as you said, due to the two parties having different information. If they aren't, then your statement of "specific coin" is false, and the probabilities are symmetric for both parties.

e: this very quickly turned into the "math is hard" thread, all we need is a point nine repeating argument for bingo

I think the probabilities are the same for both parties regardless of marked coins.

King of Bleh
Mar 3, 2007

A kingdom of rats.

wateroverfire posted:

I think the probabilities are the same for both parties regardless of marked coins.

If one is red and one is blue, the wife has two bits of information and can rule out 2/4 possibilities. (both aren't tails, red isn't tails because she's looking at it).

Unless she communicates the color, the man only has one bit of information and can only rule out one possibility -- two tails. Thus, asymmetry.

Phyzzle
Jan 26, 2008

VitalSigns posted:

It's twice as likely for there to be HT/TH than TT. But if it's TT she's twice as likely to see a tails so they cancel out. Her husband knows she only saw one coin so he has the same information she does.

Ah, I see. Because the wife is not an accurate Heads detector, but is randomly sampling one of the coins, then that probability goes in.

wateroverfire
Jul 3, 2010

King of Bleh posted:

If one is red and one is blue, the wife has two bits of information and can rule out 2/4 possibilities. (both aren't tails, red isn't tails because she's looking at it).

Unless she communicates the color, the man only has one bit of information and can only rule out one possibility -- two tails. Thus, asymmetry.

I think that second part is not correct.

The possibilities for 2 independent coin flips before the event are:

pre:
Coin 1  Coin 2
   H      H 
   H      T
   T      H
   T      T
Say he throws the two coins and can't see the result. His wife reports that one coin is H.

He knows that TT is impossible. But he also knows that one of TH or HT is impossible given that one coin is, concretely, H.

So if it happens to be Coin 1 that's H, the possibilities left are HH, HT. If it happens to be Coin 2, the possibilities are HH, TH.

In either case the probability that the other coin comes up T is 50%, so it doesn't matter which coin is observed.

Alternately:

The probability that at least 1 coin comes up T from two independent flips is 67%, but once one coin is revealed to be heads there is only 1 uncertain flip left, so the probability is 50%.

wateroverfire fucked around with this message at 20:16 on Sep 5, 2016

King of Bleh
Mar 3, 2007

A kingdom of rats.
edit: actually, I think you may be right. The odds of randomly observing a head result are not the same as the odds of at least one head having been flipped (they're lower), so it raises the odds of the two-heads result accordingly.

Hmm

King of Bleh fucked around with this message at 23:51 on Sep 5, 2016

moebius2778
May 3, 2013

wateroverfire posted:

I think that second part is not correct.

The possibilities for 2 independent coin flips before the event are:

pre:
Coin 1  Coin 2
   H      H 
   H      T
   T      H
   T      T
Say he throws the two coins and can't see the result. His wife reports that one coin is H.

He knows that TT is impossible. But he also knows that one of TH or HT is impossible given that one coin is, concretely, H.

So if it happens to be Coin 1 that's H, the possibilities left are HH, HT. If it happens to be Coin 2, the possibilities are HH, TH.

In either case the probability that the other coin comes up T is 50%, so it doesn't matter which coin is observed.

Alternately:

The probability that at least 1 coin comes up T from two independent flips is 67%, but once one coin is revealed to be heads there is only 1 uncertain flip left, so the probability is 50%.

So, uh, what's the difference between the saying "there is at least one head" and "there is a head" in terms of amount of information conveyed when describing the result of two coin flips?

And if they're the same, what's the difference between saying "it is not the case that both coins are tails" and "there is at least one head"?

And if those are the same, are you saying that P(HH | !TT) = 0.5?

King of Bleh
Mar 3, 2007

A kingdom of rats.
My (current) understanding of the desk problem is that there is a difference between "out of both coins, there is at least one head" and "I randomly looked at one result and it is a head". The former has likelihood 75% and assumes knowledge of both flip results, but the latter only has likelihood 50%. Thus, I believe P(HH | random heads observation) is strictly greater than P(HH | !TT) .
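That difference can be checked numerically. Here's a quick simulation (Python, my own sketch, not from the post) comparing the two conditionals: "at least one head was flipped" versus "I looked at one random coin and it was a head".

```python
import random

random.seed(0)
trials = 100_000
hh_given_not_tt = [0, 0]   # [count of HH, count of conditioning event]
hh_given_seen_h = [0, 0]

for _ in range(trials):
    c1, c2 = random.choice("HT"), random.choice("HT")
    # Condition 1: "at least one head" (i.e. not TT).
    if (c1, c2) != ("T", "T"):
        hh_given_not_tt[1] += 1
        hh_given_not_tt[0] += (c1, c2) == ("H", "H")
    # Condition 2: one randomly chosen coin is observed and it's a head.
    seen, other = random.sample([c1, c2], 2)
    if seen == "H":
        hh_given_seen_h[1] += 1
        hh_given_seen_h[0] += other == "H"

print(hh_given_not_tt[0] / hh_given_not_tt[1])  # ≈ 1/3
print(hh_given_seen_h[0] / hh_given_seen_h[1])  # ≈ 1/2
```

So P(HH | random heads observation) really does come out strictly greater than P(HH | !TT).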

VitalSigns
Sep 3, 2011

wateroverfire posted:



Alternately:

The probability that at least 1 coin comes up T from two independant flips is 67%, but once one coin is revealed to be heads there is only 1 uncertain flip left, so the probability is 50%.

No the probability of getting at least one heads is 75%.

If you look at one coin at random then the probability the other is heads is 50%

If someone looks at both coins and deliberately shows you a tails, the probability the other is heads is 67%

botany
Apr 27, 2013

by Lowtax
oh god this thread is giving me flashbacks to me teaching an undergrad class about the monty hall problem. jesus that took a while.

VitalSigns
Sep 3, 2011

moebius2778 posted:

So, uh, what's the difference between the saying "there is at least one head" and "there is a head" in terms of amount of information conveyed when describing the result of two coin flips?

And if they're the same, what's the difference between saying "it is not the case that both coins are tails" and "there is at least one head"?

And if those are the same, are you saying that P(HH | !TT) = 0.5?

The difference isn't the wording, the difference is how we got that information: whether she just reports the first coin she sees and it happened to be heads, or she is looking for heads and only reports if she finds one.

OwlFancier
Aug 22, 2013

Phyzzle posted:

Yes.

What are the odds that the last coin flip is heads? What are the odds that one of the hundred coin flips is heads?

In a series where you have stated that the previous 99 were tails, 50/50, because that is the same question.

The question you are asking is "what is the probability of this next coin flip?", and the fact that you have stated the outcome of many previous coin flips does not affect that: the previous flips do not affect the outcome.

You aren't asking "what is the chance of at least 1 in 100 coin flips being heads" it's "I have flipped a coin 99 times and it has come up tails, what is the probability of the next one being heads?"

Which is either 50/50, or possibly "very unlikely if your previous 99 attempts were anything to go by"

CountFosco
Jan 9, 2012

Welcome back to the Liturgigoon thread, friend.

AARO posted:

Hilbert's hotel demonstrates the paradoxes which arise with an actual infinite.


William Lane Craig concludes "Hilbert’s Hotel is absurd. But if an actual infinite were metaphysically possible, then such a hotel would be metaphysically possible. It follows that the real existence of an actual infinite is not metaphysically possible."

I'm not sure if his conclusion is valid. I think there may be some problem with how they keep calling the hotel "full". Can a hotel with an infinite number of rooms ever really be full? There would always be an infinite number of rooms available no matter how many people checked in. If there are always an infinite number of rooms available how can you ever say the hotel is full?

Also in the first case the owner switches the guest in room #1 to room #2 and the guest in room #2 to room #3 on into infinity. He then moves the new guest into room #1. However, this "then" would never actualize as the owner would have to be switching guests into the room next door for an infinite amount of time.

Infinity leads to absurdities. Most people when they think about infinity, aren't thinking big enough.

I think we could call this thread "The Incoherence of the Goons."

VitalSigns
Sep 3, 2011

Infinite time spent moving guests isn't a problem because you can construct a solution that doesn't even require you to leave the front desk. Write a note that says "move into the next higher room number, give this note to the guest already in that room". Tell your new guest move into room 1 and give the note to the guest already there.

AARO
Mar 9, 2005

by Lowtax

VitalSigns posted:

Infinite time spent moving guests isn't a problem because you can construct a solution that doesn't even require you to leave the front desk. Write a note that says "move into the next higher room number, give this note to the guest already in that room". Tell your new guest move into room 1 and give the note to the guest already there.

But I assume the guest in room #1 can't move into room #2 until room #2 is vacated. Ad Infinitum. This process of moving guests to a higher numbered room would take an infinite amount of time. I suppose this is no problem for checking in one new guest but if you were to check in an infinite amount of new guests, that task could never be completed.

The other problem is in saying the Hotel is full. How can a hotel that possesses an infinite number of rooms ever be full? No matter how many guests are checked in there would always be an infinite number of rooms available.

Dr. Arbitrary
Mar 15, 2006

Bleak Gremlin
Suppose a LOT of guests show up to Hilbert's Hotel, an infinite number, one corresponding to every single real number to be exact.
Is there still going to be enough room for them all?

botany
Apr 27, 2013

by Lowtax
I mean, at some point you might as well chuck the hotel metaphor and just talk about the underlying mathematics - the natural numbers plus the natural numbers still map one-one onto the naturals, while the reals don't.
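The "reals don't" half is Cantor's diagonal argument, and the trick is easy to sketch on finite data (Python, my own illustration, not from the post): given any list of n binary sequences, build a sequence that differs from the k-th one at position k, so it can't appear anywhere in the list.

```python
# Cantor's diagonal trick, sketched on finite data: flip the k-th
# digit of the k-th sequence to get a sequence not in the list.
def diagonal(seqs):
    return [1 - seqs[k][k] for k in range(len(seqs))]

table = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
d = diagonal(table)
print(d)  # [1, 0, 1, 0] - differs from row k at index k
```

Any attempted list of all infinite binary sequences (and hence of all reals) misses its own diagonal, so no one-to-one map onto the naturals exists.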

Brutal Garcon
Nov 2, 2014



"I'm really bad at maths" isn't a paradox.

VitalSigns
Sep 3, 2011

AARO posted:

But I assume the guest in room #1 can't move into room #2 until room #2 is vacated. Ad Infinitum. This process of moving guests to a higher numbered room would take an infinite amount of time.
Okay, announce on the intercom: "all guests please move to the next higher room number, now", they do it simultaneously and you send the new guest to room 1.

AARO posted:

I suppose this is no problem for checking in one new guest but if you were to check in an infinite amount of new guests, that task could never be completed.

If an infinite number of guests arrive, no problem. Get on the intercom: "attention all guests: vacate your room, multiply your room number by two, this is your new room number, please proceed to your new room". Since there are just as many odd numbers as even numbers (an infinite number of each), you now have an infinite number of empty rooms, so each new guest can go to the lowest available odd numbered room.

AARO posted:

The other problem is in saying the Hotel is full. How can a hotel that possesses an infinite number of rooms ever be full? No matter how many guests are checked in there would always be an infinite number of rooms available.

It depends on the rule you use to assign guests to rooms. If you assign each guest a number (1, 2, 3,...) and match those up with rooms 1, 2, 3,... then you'll have infinitely many rooms, each with a guest, and no empty rooms. But you could assign them differently. You could assign each guest an even number (guests 1, 2, 3,... are in rooms 2, 4, 6,...) and you have infinitely many full rooms (the even numbers) and infinitely many empty rooms (the odd numbers). No guest will be without a room because for any given guest k, there exists a room 2k for her.

So if you say the hotel is "full", what you mean is that you've chosen the mapping function f(x) = x.

The weird thing to wrap your mind around is that the set of all positive integers is exactly the same size as the set of all positive even integers, even though the set of all positive integers contains all the even integers plus more (the odd integers). For a better explanation, read up on cardinality and countable sets.
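The "double your room number" move can be sanity-checked on any finite prefix of the hotel (a Python sketch of my own; the same rule works for every room number):

```python
# Guests originally in rooms 1..n end up in the even rooms,
# leaving every odd room free, and nobody is lost in the move.
def new_room(old_room):
    return 2 * old_room

occupied = {new_room(g) for g in range(1, 1001)}
odd_rooms = {r for r in range(1, 2001) if r % 2 == 1}

assert occupied.isdisjoint(odd_rooms)  # every odd room is now empty
assert len(occupied) == 1000           # same number of guests as before
print(sorted(odd_rooms)[:5])           # [1, 3, 5, 7, 9] await new guests
```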

Comstar
Apr 20, 2007

Are you happy now?

prick with tenure posted:

My favorite paradox is still the "unexpected examination" (often called the "unexpected hanging," thanks to this being Quine's favorite version). Here's the basic idea:

Is the professor an economist or is the student doing an Arts degree? This is important.

Phyzzle
Jan 26, 2008

Dzhay posted:

"I'm really bad at maths" isn't a paradox.

The paradox is that anyone is good at maths.

Seriously, that's called Quine's Paradox or something. That manipulating little symbols by human-made rules and syntax can match the behavior of nature so well.

botany
Apr 27, 2013

by Lowtax

Phyzzle posted:

The paradox is that anyone is good at maths.

Seriously, that's called Quine's Paradox or something. That manipulating little symbols by human-made rules and syntax can match the behavior of nature so well.

you're thinking of the Miracle of Applied Mathematics, which goes back to the physicist Eugene Wigner.

Inferior Third Season
Jan 15, 2005

Phyzzle posted:

The paradox is that anyone is good at maths.

Seriously, that's called Quine's Paradox or something. That manipulating little symbols by human-made rules and syntax can match the behavior of nature so well.
This thread has destroyed the very concept of what a paradox is, rendering the word "paradox" meaningless. But how is it that we can understand that the word "paradox" is without meaning when the word itself no longer means anything?

I call this the Paradox Paradox.

Phyzzle
Jan 26, 2008
Nah, the concept of paradox was already destroyed thoroughly enough that they added the third definition here:

http://www.merriam-webster.com/dictionary/paradox

: a statement that seems to say two opposite things but that may be true

I mean, "The Twin Paradox" in Relativity has been around since the 1920s, but it never actually indicated a contradiction.

wateroverfire
Jul 3, 2010

VitalSigns posted:

No the probability of getting at least one heads is 75%.

Yeah, I am a dumb.

VitalSigns posted:

If you look at one coin at random then the probability the other is heads is 50%

And this is right.

VitalSigns posted:

If someone looks at both coins and deliberately shows you a tails, the probability the other is heads is 67%

But I'm not following you here. Each flip is an independent event, right? If someone shows you that one flip was T, the probability that the other flip is H is still 50%. That's what it means for events to be independent of one another.

botany
Apr 27, 2013

by Lowtax

wateroverfire posted:

But I'm not following you here. Each flip is an independent event, right? If someone shows you that one flip was T, the probability that the other flip is H is still 50%. That's what it means for events to be independent of one another.

HH
HT
TH
TT

I show you a T. Viable options left:

HT
TH
TT

in two of those options the other coin is H, in one of them the other coin is T. 67%.

wateroverfire
Jul 3, 2010

botany posted:

HH
HT
TH
TT

I show you a T. Viable options left:

HT
TH
TT

in two of those options the other coin is H, in one of them the other coin is T. 67%.

IDK. Think of it this way: I show you a T, and we arbitrarily call that coin Coin 1.

The viable options are:
TT
TH

But not HT, because we are staring at Coin 1 and it is T rather than H. The probability that the other coin is H is 50% based on the viable options.

So then we say no, that T is arbitrarily Coin 2.

Then the viable options are:
TT
HT

But not TH, because we are staring at Coin 2 and we know it is T. So the probability that the other coin is H is 50% based on the viable options.

So whether Coin 1 or Coin 2 is revealed, the probability of the other coin being H is 50%. The outcome of the other flip is independent of the flip we observed.

botany
Apr 27, 2013

by Lowtax
alright honestly this has been explained over and over in the thread, but the example here is so simple that you can actually just do it yourself.

take 2 coins out of your wallet, throw them around the house like 20 times. if they land HH, ignore that throw. then make a list of how often TT comes up vs how often TH (or HT) comes up.
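If you'd rather not throw coins around the house, the same experiment can be automated (Python, my own sketch of exactly the procedure described above):

```python
import random

# botany's experiment: throw two coins many times, ignore the HH
# throws, and tally TT against TH-or-HT among what's left.
random.seed(1)
tt = mixed = 0
for _ in range(30_000):
    pair = (random.choice("HT"), random.choice("HT"))
    if pair == ("H", "H"):
        continue             # ignore that throw
    if pair == ("T", "T"):
        tt += 1
    else:
        mixed += 1

# Given "not HH", TT shows up about half as often as a mixed pair.
print(tt / (tt + mixed))     # ≈ 1/3
```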

wateroverfire
Jul 3, 2010

botany posted:

alright honestly this has been explained over and over in the thread, but the example here is so simple that you can actually just do it yourself.

take 2 coins out of your wallet, throw them around the house like 20 times. if they land HH, ignore that throw. then make a list of how often TT comes up vs how often TH (or HT) comes up.

Yeah but my dude, that does not address the probability in question.

twodot
Aug 7, 2005

You are objectively correct that this person is dumb and has said dumb things
This is an English problem and not a math problem. It is true that if you have a bunch of pairs of coin flips and eliminate all HH pairs, you get what would otherwise be a wonky distribution of Hs, but that shouldn't be at all surprising. The problem is that when you say "Bill has two children, one is a girl, what is the probability of the other being a boy?", it's not at all apparent whether Bill is a member of a set of people with two children from which we have excluded all people with two boys, or if you just pulled Bill off the street and interrogated him about the status of his children. For all I know, you were equally prepared to tell me "Bill has two children, one is a boy, what is the probability of the other being a girl?".

twodot fucked around with this message at 17:37 on Sep 6, 2016

Phyzzle
Jan 26, 2008

twodot posted:

This is an English problem and not a math problem.

Yeah, or an 'information' problem. There are some subtleties of information that seem difficult to hold onto.

"I examined both coins, and I'm willing to tell you for sure that at least one came up H."

"I found one coin [that randomly bounced to where I can see it], and it was H, so now I can tell you for sure that at least one came up H."

The conclusions are the same, but there is information lurking in the premises that you can use on that set of two coins.

King of Bleh
Mar 3, 2007

A kingdom of rats.

twodot posted:

This is an English problem and not a math problem. It is true that if you have a bunch of pairs of coin flips and eliminate all HH pairs, you get what would otherwise be a wonky distribution of Hs, but that shouldn't be at all surprising. The problem is that when you say "Bill has two children, one is a girl, what is the probability of the other being a boy?", it's not at all apparent whether Bill is a member of a set of people with two children from which we have excluded all people with two boys, or if you just pulled Bill off the street and interrogated him about the status of his children. For all I know, you were equally prepared to tell me "Bill has two children, one is a boy, what is the probability of the other being a girl?".

This is true and interesting: the criteria used to select Bill can be arbitrarily complex, and can thus modulate the actual probability arbitrarily. You could have picked Bill not from the set of theoretical 2-child parents, but from the set of all people named "Bill", in which case the answer would be wildly different.

wateroverfire posted:

IDK, Think of it this way. I show you a T, and we arbitrarily call that coin Coin 1.

The identity of each coin is not arbitrary, and that is what's tripping you up here. Consider: how do you decide which coin to show if both are tails, versus if only one is, and how does that affect the probabilities of the two mutually-exclusive scenarios you outlined?

King of Bleh fucked around with this message at 18:24 on Sep 6, 2016

VitalSigns
Sep 3, 2011


I flip two coins.
Let's say I show one to you randomly and it's tails. Let's list the independent ways this could happen:

TH and I show you the first coin (let's call this 1TH).
HT and I show you the second one (2HT)
TT and I show you the first coin (1TT)
TT and I show you the second coin (2TT)

25% chance of each case. Two of the cases the unseen coin is tails, the other two cases the unseen coin is heads. 50% probability.


============================================
I flip two coins.
I look at both and tell you "they aren't both heads", and to prove it I select a tails and show it to you. Let's list the independent ways this could happen:

TH
HT
TT

Each of these are equal probability, so there is a 67% chance the coin I didn't show you is heads. Which coin I show you isn't an independent event like it is in the first case, I will decide which coin to show you based on how the flip turned out.
============================================

Why are these situations different? In the first situation, if the coins land TT you're twice as likely to observe a tails when you look than you are if they land HT or TH.

In the second situation I will always show you tails regardless of whether the coins landed HT, TH, or TT. If they land TT I'm not more likely to show you tails because as long as they're not both heads we already know I will show you a tails 100% of the time.

If you already understand the Monty Hall problem, it's the same principle. If he randomly opens a second door and it happens to be a goat there's no advantage to switching (it's more likely you didn't pick the car, but he was more likely to show you a goat if you picked the car, these cancel). If he deliberately shows you a goat then you have a 67% chance to win if you switch (he is not more likely to show you a goat if you pick the car because he will always show you a goat).

VitalSigns fucked around with this message at 03:14 on Sep 7, 2016
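The two reveal protocols in the post above can be simulated side by side (Python, my own sketch): "random" shows a randomly chosen coin and we keep only the runs where it happened to be tails, while "deliberate" looks at both coins and shows a tails whenever one exists.

```python
import random

random.seed(2)

def flip_pair():
    return [random.choice("HT"), random.choice("HT")]

other_heads = {"random": [0, 0], "deliberate": [0, 0]}  # [heads, total]

for _ in range(100_000):
    # Protocol 1: reveal a randomly chosen coin, keep runs where it's T.
    coins = flip_pair()
    i = random.randrange(2)
    if coins[i] == "T":
        other_heads["random"][1] += 1
        other_heads["random"][0] += coins[1 - i] == "H"

    # Protocol 2: reveal a tails on purpose, whenever there is one.
    coins = flip_pair()
    if "T" in coins:
        i = coins.index("T")
        other_heads["deliberate"][1] += 1
        other_heads["deliberate"][0] += coins[1 - i] == "H"

for name, (h, n) in other_heads.items():
    print(name, round(h / n, 3))  # random ≈ 0.5, deliberate ≈ 0.667
```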

VitalSigns
Sep 3, 2011

wateroverfire posted:

But I'm not following you here. Each flip is an independent event, right? If someone shows you that one flip was T, the probability that the other flip is H is still 50%. That's what it means for events to be independent of one another.

TLDR from my last post, because this is the crux of your problem. There are three independent events here if I randomly choose a coin: how each coin landed, and which one I pull out to show you. If I look at the coins and choose to show you a tails, there are only two independent events, because I decide which coin to show you based on how the flip turned out.

If I randomly choose a coin, that gives you no information about the other one, so you have a 50-50 shot at guessing right. But if I choose to show you a coin because it's tails then you know that I didn't want to show you the other one (this might be because they're TT and I'm indifferent between them, but it's more likely they're TH or HT and I didn't want to show you the other because it's heads). You can now guess the other coin with a greater likelihood of being right (67% likelihood in fact).

VitalSigns fucked around with this message at 04:18 on Sep 7, 2016

Strom Cuzewon
Jul 1, 2010

Phyzzle posted:

Nah, the concept of paradox was already destroyed thoroughly enough that they added the third definition here:

http://www.merriam-webster.com/dictionary/paradox

: a statement that seems to say two opposite things but that may be true

I mean, "The Twin Paradox" in Relativity has been around since the 1920's, but it never actually indicated a contradiction.

Doesn't that mean we just have to be a bit more careful in distinguishing between logical paradoxes and intuitive ones?

wateroverfire
Jul 3, 2010

VitalSigns posted:

I flip two coins.
I look at both and tell you "they aren't both heads", and to prove it I select a tails and show it to you. Let's list the independent ways this could happen:

TH
HT
TT

Each of these are equal probability, so there is a 67% chance the coin I didn't show you is heads. Which coin I show you isn't an independent event like it is in the first case, I will decide which coin to show you based on how the flip turned out.

Hear me out, man, and follow the logic below to the end.

I know this looks intuitive but it's wrong. TH and HT are outcomes that become mutually exclusive when information is added about the outcome of one of the coins - even if we don't know which coin we have information about.

So say you tell me "at least 1 coin is T" and show me a T, so we end up with the table of results in your quote above.

1) If Coin 1 is T, the only possible outcomes are TH and TT. This is correct and should also be intuitive if you think about it, but if you disagree then show how and let's talk it through.

2) If Coin 2 is T, the only possible outcomes are HT and TT by the same reasoning.

Therefore, the probability that we are in a world in which the other coin is H is the following:

P(Coin you showed is Coin 1) * P(TH conditional on Coin1 being T) + P(Coin you showed is Coin 2)*P(HT conditional on Coin 2 being T)

We have only two coins so

P(Coin you showed is Coin 1) = 50%
P(Coin you showed is Coin 2) = 50%

Following 1), P(TH conditional on Coin1 being T) = 50%
Following 2), P(HT conditional on Coin 2 being T) = 50%

Plug everything in and 50%*50% +50%*50% = 25% + 25% = 50% probability the other coin is an H. Proving what we know from having defined the coin flips as independent events.

The above is NOT the Monty Hall problem in concept.

In Monty Hall, the values of the "flips" are contingent - there is only 1 Car and the other two "flips" must be Goat. So revealing a Goat gives you information about the potential values of the other "flips", whereas revealing a T in the coin flip problem gives you nothing.

edit: Notice, if you work through the formulas, it doesn't matter how I come by the information that one coin is T.

If you pick a coin at random and it happens to be T, the results tables are the same and the probability calculation is the same 50%.

If you deliberately show me a T, the results tables are the same and the probability calculation is the same 50%.

If you deliberately show me Coin 1 that happens to be a T, the results tables are the same and the P(Coin you showed me is Coin 2) is 0% while the P(Coin you showed me is Coin 1) is 100%. The probability calculation works out to the same 50%.

If you deliberately show me Coin 2 that happens to be a T, the results tables are the same and the P(Coin you showed me is Coin 1) is 0% while the P(Coin you showed me is Coin 2) is 100%. The probability calculation works out to the same 50%.

wateroverfire fucked around with this message at 14:56 on Sep 7, 2016

Phyzzle
Jan 26, 2008

wateroverfire posted:

Hear me out, man, and follow the logic below to the end.

I know this looks intuitive but it's wrong. TH and HT are outcomes that become mutually exclusive when information is added about the outcome of one of the coins - even if we don't know which coin we have information about.

So say you tell me "at least 1 coin is T" and show me a T, so we end up with the table of results in your quote above.

1) If Coin 1 is T, the only possible outcomes are TH and TT. This is correct and should also be intuitive if you think about it, but if you disagree then show how and let's talk it through.

2) If Coin 2 is T, the only possible outcomes are HT and TT by the same reasoning.

Therefore, the probability that we are in a world in which the other coin is H is the following:

P(Coin you showed is Coin 1) * P(TH conditional on Coin1 being T) + P(Coin you showed is Coin 2)*P(HT conditional on Coin 2 being T)

We have only two coins so

P(Coin you showed is Coin 1) = 50%
P(Coin you showed is Coin 2) = 50%

It's this step here. P(Coin you showed is Coin 2) is not 50%.

If VitalSigns looks at Coin 1 and sees that it's not tails, he moves on to Coin 2. He only shows you Coin 2 in the case of HT, and that case does not have a 50% chance of happening.
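That asymmetry can be checked numerically (Python, my own sketch): under the protocol "look at Coin 1; show it if it's T, otherwise show Coin 2 if it's T", the shown coin is Coin 2 only in the HT case, so given that a T was shown, P(shown coin is Coin 2) is 1/3, not 1/2.

```python
import random

random.seed(3)
shown_coin2 = total_shown = 0

for _ in range(90_000):
    c1, c2 = random.choice("HT"), random.choice("HT")
    if c1 == "T":
        total_shown += 1      # Coin 1 is shown
    elif c2 == "T":
        total_shown += 1
        shown_coin2 += 1      # Coin 2 is shown: this is exactly the HT case
    # HH: there is no tails to show

print(shown_coin2 / total_shown)  # ≈ 1/3, not 1/2
```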


Phyzzle
Jan 26, 2008

Strom Cuzewon posted:

Doesn't that mean we just have to be a bit more careful in distinguishing between logical paradoxes and intuitive ones?

Sure, though I might call logical paradoxes "inconsistencies" instead. Like Russell's Paradox on "all sets that don't contain themselves" showed an inconsistency in naive set theory.
