
View Full Version : Math/Logic Puzzles



FreakyCheeseMan
2013-03-06, 09:14 PM
I'll offer up one; everyone else feel free to submit your own, or solve one already offered. Spoiler solutions please, in case someone else wants to solve on their own.

Ten Pirates:
Ten Pirates (a captain, and first through ninth mates) happen upon a supply of one hundred gold pieces. It is up to the captain to propose how these pieces are to be distributed. Once the captain has made his proposal, the pirates vote on whether or not they accept it; if they do not, the captain is killed, and it is up to the first mate to come up with a new distribution, under the same conditions; if the first mate's proposal is rejected, the duty falls to the second mate, and so on. In the event of a tie vote, the proposal is accepted.

Assuming that every pirate both divides the loot and votes in their own self-interest, how will the gold be distributed?

The Extinguisher
2013-03-06, 09:37 PM
The eighth mate gets it all.

If a pirate rejects the proposal, they get a bigger share, so each pirate but the one making the proposal votes to reject it until there are only two left.
The eighth mate then gets to propose that he take it all, since a tie vote counts as an acceptance.

FreakyCheeseMan
2013-03-06, 09:54 PM
The eighth mate gets it all.

If a pirate rejects the proposal, they get a bigger share, so each pirate but the one making the proposal votes to reject it until there are only two left.
The eighth mate then gets to propose that he take it all, since a tie vote counts as an acceptance.

Flaw. Under your reasoning, pirates would have a reason to approve the proposals, in order to escape death. Consider the case of four pirates; obviously, the first votes yes; the third and fourth vote no. But, if your logic is correct, the second would actually vote yes, as in the three-pirate case, he would be killed.

Simpler explanation - consider the first vote, offered by the captain. Were your logic correct, the first through seventh mates would all end up dying if the captain's proposal was rejected, so it's in their best interests to approve it, even if it gives them nothing.

kurokotetsu
2013-03-06, 09:59 PM
The eighth mate gets it all.

If a pirate rejects the proposal, they get a bigger share, so each pirate but the one making the proposal votes to reject it until there are only two left.
The eighth mate then gets to propose that he take it all, since a tie vote counts as an acceptance.

How many "steps ahead" would the pirates think? If that is the obvious outcome, all the other pirates would try to stop it, as it benefits only one of them.

A ten-pieces-each split sounds logical at first glance: nobody gets killed and everyone gets a fair share. But let me think about it more; maybe there is a Nash equilibrium or an optimal strategy.

The Extinguisher
2013-03-06, 10:05 PM
Pirates dont think ahead :smalltongue:

Dr.Epic
2013-03-06, 10:07 PM
I show up, threaten them all with a ray gun, and take the gold.

snoopy13a
2013-03-06, 10:26 PM
If I was the captain, I'd keep all the gold and bet my life that the 2nd through 4th mates will approve because they are too scared of the 5th through 9th mates.

FreakyCheeseMan
2013-03-06, 10:53 PM
Figured I'd go ahead and post the solution.

This is gonna take a while, as the only way I know to do this is iterative.

For the base case - 1 pirate - he obviously takes all of the gold.

For n+1 pirates, the first pirate must pick ((n+1)/2 rounded up)-1 of the other pirates, selecting the ones who got the least from the case of n pirates, and give them each 1 more gold than they would otherwise have gotten, then keep the rest for himself. So... I'm gonna do this as a chart, why not.
{table=head]Captain | 1st | 2nd | 3rd | 4th | 5th | 6th | 7th | 8th | 9th
| | | | | | | | | 100
| | | | | | | | 100 | 0
| | | | | | | 99 | 0 | 1
| | | | | | 99 | 0 | 1 | 0
| | | | | 98 | 0 | 1 | 0 | 1
| | | | 98 | 0 | 1 | 0 | 1 | 0
| | | 97 | 0 | 1 | 0 | 1 | 0 | 1
| | 97 | 0 | 1 | 0 | 1 | 0 | 1 | 0
| 96 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1
96 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0
[/table]

The last line gives the answer. At each level, at least half of the pirates do better than they would in the case above them (which represents the result if they reject the proposal).

So, it turns out the captain does pretty well.
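The iterative construction in the chart can be sketched as a short script (an illustrative Python translation, not from the thread; `pirate_split` is a made-up name):

```python
def pirate_split(n_pirates, gold=100):
    """Build up from the 1-pirate base case. With a tie counting as
    acceptance, a proposer among n pirates needs ceil(n/2) - 1 votes
    besides his own, bought from whoever fares worst if he dies."""
    shares = [gold]  # base case: the last pirate alone keeps everything
    for n in range(2, n_pirates + 1):
        bribes_needed = (n + 1) // 2 - 1
        # bribe the cheapest pirates from the (n-1)-pirate outcome
        cheapest = sorted(range(n - 1), key=lambda i: shares[i])[:bribes_needed]
        new_shares = [shares[i] + 1 if i in cheapest else 0 for i in range(n - 1)]
        shares = [gold - sum(new_shares)] + new_shares  # proposer keeps the rest
    return shares

print(pirate_split(10))  # [96, 0, 1, 0, 1, 0, 1, 0, 1, 0] — the chart's last row
```

Running it reproduces every row of the chart above, with the ten-pirate row last.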


Next challenge - The Three Guards:
There are two roads ahead of you, one of which leads to death and one of which leads to happiness. The crossroads are guarded by three brothers, one of whom always lies, one who always tells the truth, and one who says whatever he feels like, at random. What is the minimum number of questions with which you can know for certain which road is correct, and what questions do you ask?

snoopy13a
2013-03-06, 11:04 PM
Question 1: Are you three brothers?

Truth-telling brother will answer yes
Lying brother will answer no
Random brother may say anything.

Normal scenario, the random brother says some random nonsense and I can ask the truth-telling brother which road leads to happiness (or ask the liar and take the other road).

Worst-case scenario, the random brother answers "yes" or "no." Thus, I have two yeses or two nos.

If two brothers say "yes," then I ask the brother who said "no" (who must be the lying brother) which road leads to happiness. I then take the opposite one.

If two brothers say "no," I ask the brother who said "yes," and take the road that he says leads to happiness.

So, two questions.
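This two-question strategy can be checked by brute force over every arrangement of brothers (a quick Python sketch under the stated assumptions; all helper names are made up):

```python
import itertools
import random

def question_one(roles, rng):
    """Ask all three: "Are you three brothers?" (a true statement).
    'T' answers yes, 'L' answers no, 'R' answers at random."""
    return [r == 'T' if r != 'R' else rng.random() < 0.5 for r in roles]

def choose_road(roles, good_road, rng):
    answers = question_one(roles, rng)
    if answers.count(True) == 2:
        # the lone "no" must be the liar: ask him, take the opposite road
        return 1 - (1 - good_road)   # liar names the bad road; we flip it
    # the lone "yes" must be the truth-teller: take the road he names
    return good_road

rng = random.Random(42)
assert all(choose_road(list(p), g, rng) == g
           for p in itertools.permutations('TLR')
           for g in (0, 1) for _ in range(500))
print("strategy always finds the good road")
```

The key invariant: since the liar always says "no" to a true statement, the yes-count is 1 or 2, and each case pins down a brother whose second answer can be decoded reliably.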

Coidzor
2013-03-06, 11:12 PM
I'll offer up one; everyone else feel free to submit your own, or solve one already offered. Spoiler solutions please, in case someone else wants to solve on their own.

Ten Pirates:
Ten Pirates (a captain, and first through ninth mates) happen upon a supply of one hundred gold pieces. It is up to the captain to propose how these pieces are to be distributed. Once the captain has made his proposal, the pirates vote on whether or not they accept it; if they do not, the captain is killed, and it is up to the first mate to come up with a new distribution, under the same conditions; if the first mate's proposal is rejected, the duty falls to the second mate, and so on. In the event of a tie vote, the proposal is accepted.

Assuming that every pirate both divides the loot and votes in their own self-interest, how will the gold be distributed?

I'm probably missing something crucial here, mind. If a pirate is dividing the loot with his own self-interest in mind, then he presumably wants to survive, so he's not going to make a deal that's certain to get him killed.

Since a tie results in a success for the pirate making the proposal, any given proposal only needs to actually please half of the pirates in terms of their share of the booty, so the captain could divide it equally amongst the first 5 pirates.

On the other hand, if the captain gets killed, the first mate becomes the pirate in charge, so there is some incentive to vote against the captain unless we rule out personal advancement (rather than gold pieces) as a motive for voting. The captain therefore has an incentive to buy the first mate off with a larger share than the others, so that the first mate doesn't side with the pirates who get nothing in the deal. But this leaves him vulnerable to the other pirates who do somewhat well under the arrangement deciding that they want a bigger piece of the pie. The first step only adds a potential two coins to the winning half, and the second step bumps it up to five extra coins in an equitable split among half of the pirates; this is tempered by wariness of the greed of those who would join the winning half, which, if it got out of hand, could get them the axe themselves.

...I think I lost where I was, so I'm thinking something like 15 to the captain, 13 to the first mate, 12 each to pirates 3-6, and 6 each to pirates 7-10. The captain has more to worry about from the people closest to him, and cannot predict whether his first mate will turn on him, so he hedges by not completely screwing pirate 10 and by extending a decent share to pirate 6, even though on the surface he needs neither of them: pirate 10 has nothing to look forward to if things get out of hand, and pirate 6 is being co-opted against voting down the proposal, so that he lands in the upper half of the pirates if the first mate gets his shot.

kurokotetsu
2013-03-06, 11:13 PM
Figured I'd go ahead and post the solution.

This is gonna take a while, as the only way I know to do this is iterative.

For the base case - 1 pirate - he obviously takes all of the gold.

For n+1 pirates, the first pirate must pick ((n+1)/2 rounded up)-1 of the other pirates, selecting the ones who got the least from the case of n pirates, and give them each 1 more gold than they would otherwise have gotten, then keep the rest for himself. So... I'm gonna do this as a chart, why not.
{table=head]Captain | 1st | 2nd | 3rd | 4th | 5th | 6th | 7th | 8th | 9th
| | | | | | | | | 100
| | | | | | | | 100 | 0
| | | | | | | 99 | 0 | 1
| | | | | | 99 | 0 | 1 | 0
| | | | | 98 | 0 | 1 | 0 | 1
| | | | 98 | 0 | 1 | 0 | 1 | 0
| | | 97 | 0 | 1 | 0 | 1 | 0 | 1
| | 97 | 0 | 1 | 0 | 1 | 0 | 1 | 0
| 96 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1
96 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 0
[/table]

The last line gives the answer. At each level, at least half of the pirates do better than they would in the case above them (which represents the result if they reject the proposal).

So, it turns out the captain does pretty well.


Next challenge - The Three Guards:
There are two roads ahead of you, one of which leads to death and one of which leads to happiness. The crossroads are guarded by three brothers, one of whom always lies, one who always tells the truth, and one who says whatever he feels like, at random. What is the minimum number of questions with which you can know for certain which road is correct, and what questions do you ask?
Interesting answer, and for half of the pirates it is an equilibrium, supposing that progression. But why accept it when you could push for a fairer distribution? The problem is that a Nash equilibrium (like this one) supposes just one step; the even-numbered mates could agree that unless the captain gives them more, they vote "no" and try their luck with the first mate, or at worst the second mate, if everyone thought the same. A fair distribution seems better in the long run for everybody. That solution also supposes that everybody follows the same strategy all the time, which isn't necessarily true; if the second mate would distribute 84/2/2/2/2/2/2/2/2, the proposed solution is better for no one but the captain.

And I would say one question is enough, very close to the "original" question:

"Which roads would your brothers tell me to take if I asked them which is the right road?"

Truthful brother: one answer that is a lie (predicting the liar) and one "I don't know" (he can't predict the random one).

Liar brother: a lie (about the truthful brother's answer) and, at random, one of the other paths (he can't know the random brother's answer, so he makes it up).

Random brother: he either answers with a lie, which could only be a single "I don't know" (because he does know what both of his brothers would tell me), or with the truth, which would point me to one right and one wrong path.

I follow the opposite of the road given by the guard who told me one road and one "I don't know": he is the truthful guard, and he is telling me which answer is the lie.

FreakyCheeseMan
2013-03-06, 11:17 PM
Question 1: Are you three brothers?

Truth-telling brother will answer yes
Lying brother will answer no
Random brother may say anything.

Normal scenario, the random brother says some random nonsense and I can ask the truth-telling brother which road leads to happiness (or ask the liar and take the other road).

Worst-case scenario, the random brother answers "yes" or "no." Thus, I have two yeses or two nos.

If two brothers say "yes," then I ask the brother who said "no" (who must be the lying brother) which road leads to happiness. I then take the opposite one.

If two brothers say "no," I ask the brother who said "yes," and take the road that he says leads to happiness.

So, two questions.



Works for me.

Crazy Game Show #1:
You're on a game show; the host presents you with three doors. Behind one of them is a $1,000 prize. Behind the other two is... nothing.

The way the show works is this: The contestant picks one door, but does not open it. The host then opens one of the other two doors, showing it to be empty (the host knows which door holds the prize, and will always choose an empty door.) The contestant is then given the option to stick with his current door, or pay $50 to trade. Is it in his best interests to trade?

FreakyCheeseMan
2013-03-06, 11:25 PM
So, Coidzor - not sure if you read over the answer I proposed, but I think it covers most of what you're describing - whoever makes the offer distributes it as greedily as he can.

kurokotetsu - I think that your proposal depends on at least a degree of trust. If you gave the pirates the ability to sign a binding public contract that they would act a certain way, it would change the layout - but as is, there's no incentive not to betray at the bottom levels, so the progression remains the same.

Actually, that gives me an idea for another puzzle.

kurokotetsu
2013-03-06, 11:26 PM
Works for me.

Crazy Game Show #1:
You're on a game show; the host presents you with three doors. Behind one of them is a $1,000 prize. Behind the other two is... nothing.

The way the show works is this: The contestant picks one door, but does not open it. The host then opens one of the other two doors, showing it to be empty (the host knows which door holds the prize, and will always choose an empty door.) The contestant is then given the option to stick with his current door, or pay $50 to trade. Is it in his best interests to trade?

Stick with it.

The expected value of your door is 1000 * .5 = $500.
The expected value of the other door is 950 * .5 = $475.

Stick with the better expected value when the probabilities are the same.

kurokotetsu - I think that your proposal depends on at least a degree of trust. If you gave the pirates the ability to sign a binding public contract that they would act a certain way, it would change the layout - but as is, there's no incentive not to betray at the bottom levels, so the progression remains the same.

Actually, that gives me an idea for another puzzle.

Not necessarily. If they didn't stick to the agreement, they would end up being ditched like the captain was, which is the worst-case scenario for them. So following the previous agreement is optimal for the proposer, and not only for him: the proposees who broke the agreement would probably end up being betrayed too. Also, you can always cut one of the lower members (the eighth or ninth mate), because whoever stands to move up in line from a betrayal has the best reason to betray. I doubt there is really an analytical solution to such an open-ended problem.

snoopy13a
2013-03-06, 11:27 PM
Crazy game show:

Actually, I've heard this one before.

It is in the person's best interest to trade.

Initially, the person had a 1/3 chance of winning.

Now after the game show host revealed an empty door, if the person does nothing, they'll still have a 1/3 chance of winning. If they trade, they have a 1/2 chance of winning.

This increases their expected value from $333 to $450 ($500 less $50).

This problem is very counter-intuitive.

FreakyCheeseMan
2013-03-06, 11:30 PM
Crazy game show:

Actually, I've heard this one before.

It is in the person's best interest to trade.

Initially, the person had a 1/3 chance of winning.

Now after the game show host revealed an empty door, if the person does nothing, they'll still have a 1/3 chance of winning. If they trade, they have a 1/2 chance of winning.

This increases their expected value from $333 to $450 ($500 less $50).

This problem is very counter-intuitive.



Correct in answer, a little bit off in explanation.

The person has a 1/3 chance of being correct initially; that chance is not modified by the host's actions. Since there is only one other door, their chances of success if they move to it are inverted, so switching gives them a 2/3 chance, not a 1/2 chance.

The expected value actually increases to $616 ($50 less than $666).
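The stick-vs-switch expected values can be confirmed with a quick simulation (an illustrative Python sketch, not from the thread):

```python
import random

def monty_trial(switch, prize=1000, fee=50):
    """One round with three doors: the host, who knows where the prize
    is, always opens an empty door the contestant didn't pick."""
    doors = [0, 1, 2]
    prize_door = random.choice(doors)
    pick = random.choice(doors)
    opened = random.choice([d for d in doors if d not in (pick, prize_door)])
    if switch:
        pick = next(d for d in doors if d not in (pick, opened))
        return (prize if pick == prize_door else 0) - fee
    return prize if pick == prize_door else 0

random.seed(0)
n = 100_000
stay = sum(monty_trial(False) for _ in range(n)) / n
swap = sum(monty_trial(True) for _ in range(n)) / n
print(f"stay ~ ${stay:.0f}, switch ~ ${swap:.0f}")  # near $333 vs $617
```

The stay average sits near $1000/3 and the switch average near $2000/3 minus the $50 fee, matching the 1/3 vs 2/3 argument.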

thubby
2013-03-06, 11:34 PM
Works for me.

Crazy Game Show #1:
You're on a game show; the host presents you with three doors. Behind one of them is a $1,000 prize. Behind the other two is... nothing.

The way the show works is this: The contestant picks one door, but does not open it. The host then opens one of the other two doors, showing it to be empty (the host knows which door holds the prize, and will always choose an empty door.) The contestant is then given the option to stick with his current door, or pay $50 to trade. Is it in his best interests to trade?

Yes.
On your first pick, 2/3 of the time you will pick a losing door; in those instances, switching automatically gets you the prize. Under this strategy you only lose the 1/3 of the time your first pick was right.

Given the price of switching, you could actually make the odds a lot worse and still come out ahead if this were a repeated test.

FreakyCheeseMan
2013-03-06, 11:35 PM
Crazy Game Show #2:
This one has two parts; the second part is harder.

Basic Setup: There is a game show in which there are two envelopes; one contains an amount of money, the other contains twice that amount. The contestant picks one envelope at random.

In the first part, the contestant opens the envelope, and sees that it contains $500. He is then given the chance to trade for the other, unopened envelope. Is it in his best interests to trade?

In the second part, the contestant is not allowed to open the envelope he picks initially, but is still given the option to trade. Is it in his best interests? (For full credit on this one, I want an explanation as to why the math from the first part does not apply.)

kurokotetsu
2013-03-06, 11:36 PM
Correct in answer, a little bit off in explanation.

The person has a 1/3 chance of being correct initially; that chance is not modified by the host's actions. Since there is only one other door, their chances of success if they move to it are inverted, so switching gives them a 2/3 chance, not a 1/2 chance.

The expected value actually increases to $616 ($50 less than $666).
OK, yeah, I forgot the first step. But now that I think of it, wouldn't there be an expected loss of -$333 (the chance of being right in the first place) if you switched, lowering the expected value of switching? Because then sticking would still be the right answer.

thubby
2013-03-06, 11:50 PM
Crazy Game Show #2:
This one has two parts; the second part is harder.

Basic Setup: There is a game show in which there are two envelopes; one contains an amount of money, the other contains twice that amount. The contestant picks one envelope at random.

In the first part, the contestant opens the envelope, and sees that it contains $500. He is then given the chance to trade for the other, unopened envelope. Is it in his best interests to trade?

In the second part, the contestant is not allowed to open the envelope he picks initially, but is still given the option to trade. Is it in his best interests? (For full credit on this one, I want an explanation as to why the math from the first part does not apply.)

You've misrepresented the problem: for a specific answer to arise, the amount known needs to be odd, or half of it does.

Wherever you're taking the problem from probably says to switch when the envelope is open. There is a theoretically infinite amount of money you could gain by switching, but only 500/2 different instances where it is less.

kurokotetsu
2013-03-06, 11:54 PM
Crazy Game Show #2:
This one has two parts; the second part is harder.

Basic Setup: There is a game show in which there are two envelopes; one contains an amount of money, the other contains twice that amount. The contestant picks one envelope at random.

In the first part, the contestant opens the envelope, and sees that it contains $500. He is then given the chance to trade for the other, unopened envelope. Is it in his best interests to trade?

In the second part, the contestant is not allowed to open the envelope he picks initially, but is still given the option to trade. Is it in his best interests? (For full credit on this one, I want an explanation as to why the math from the first part does not apply.)

I'm very rusty on probability, but with that setup both parts seem to have the same answer. If your envelope has x money in it, then the expected value of the other envelope is:

-x/2*.5 + 2x*.5 = .5*(3x/2) = 3x/4, which if x isn't 0 is less than the expected value of your envelope.

FreakyCheeseMan
2013-03-06, 11:56 PM
You've misrepresented the problem: for a specific answer to arise, the amount known needs to be odd, or half of it does.

Wherever you're taking the problem from probably says to switch when the envelope is open. There is a theoretically infinite amount of money you could gain by switching, but only 500/2 different instances where it is less.

The amount of money being odd or even has nothing to do with it - that isn't the answer intended, I just want you to work out the probabilities.

FreakyCheeseMan
2013-03-06, 11:59 PM
I'm very rusty on probability, but with that setup both parts seem to have the same answer. If your envelope has x money in it, then the expected value of the other envelope is:

-x/2*.5 + 2x*.5 = .5*(3x/2) = 3x/4, which if x isn't 0 is less than the expected value of your envelope.

Yeah, you've messed up some of the probabilities there. Let me clean it up for you.

.5X * .5 + 2X * .5 = .25X + X = 1.25X, so the expected value would be higher.
Mind, that doesn't answer the problem, except (sort of) the first one.

kurokotetsu
2013-03-07, 12:08 AM
Yeah, you've messed up some of the probabilities there. Let me clean it up for you.

.5X * .5 + 2X * .5 = .25X + X = 1.25X, so the expected value would be higher.
Mind, that doesn't answer the problem, except (sort of) the first one.Pretty sure that doesn't add up, because there is a "loss" in case of having the first envelop being the highest, that is the reason of the negative sign. There is half of the probability that you would lose half of the money you have, so the expected value is lower. And it solves the first one, as the other envelop has either 250$ (a loss of 250$) or 1000 (a win of 500) with the same probability...

Ah now I see an error, the prospect of a win isn't 2x, it is just x, so it goes:

-(x/2)(1/2)+(x)(1/2)=x/4 expected value (knowing or not knowing, I see not why should it have any effect).

DraPrime
2013-03-07, 12:15 AM
A tough one. Many cookies to the one who cracks it.
lwrsvbmdhsrckmlrolloaistqltigmicstdlrngbdtmrcupiot piobdpifwliuppaktssysuolmdlrs

FreakyCheeseMan
2013-03-07, 12:17 AM
Pretty sure that doesn't add up, because there is a "loss" in the case where the first envelope is the higher one; that is the reason for the negative sign. There is half a probability that you lose half of the money you have, so the expected value is lower. And it solves the first part, as the other envelope has either $250 (a loss of $250) or $1000 (a win of $500) with the same probability...

Ah, now I see an error: the prospect of a win isn't 2x, it is just x, so it goes:

-(x/2)(1/2) + (x)(1/2) = x/4 expected gain (knowing or not knowing; I see no reason why it should have any effect).

So, there are two ways to go about it; either you can analyze it as an objective measure of what you end up with (in which case the result is 1.25 X, so more than the 1 X you get without switching), or you can look at it as the gain/loss, in which case you get a gain of .25X.

Only problem was, in your first round, you sort-of combined the two ways, and ended up comparing the calculated gain to the base value - in which case, the fact that it was lower is (mostly) meaningless, it's just the fact that it was positive that matters.

Durr, you already figured that out.

So, suppose you don't open the envelope - you're predicting a gain of .25X, trading one closed envelope for another. But... after you've traded, doesn't the same math still apply? So wouldn't you want to trade back?

Blue Ghost
2013-03-07, 12:25 AM
In the first case, there is a 50% chance that the second envelope contains $1000, and a 50% chance that the second envelope contains $250, so the average payoff for the second envelope would be $625, so it would be advantageous to switch. This would apply for any known amount in the first envelope, so in the case where the first envelope's contents are known, it would be advantageous to switch.

I'm quite sure that in the second case, there is no advantage to switching. I'm not entirely sure the principle behind it, but I think it goes something like this: One envelope contains an unknown x amount, and the other contains 2x. If the amount in the first envelope is known, then the value of x is limited to two values with equal probability, and one can calculate the expected payoff from switching based on those probabilities. But if the amount in the first envelope is not known, then there are infinite possibilities for the change in payoff from switching, so the calculation must be kept in terms of x. Switching will either get you a net gain of x or a net loss of x, with no indication of what x is, so the net payoff of switching is zero. Is that right?

FreakyCheeseMan
2013-03-07, 12:29 AM
In the first case, there is a 50% chance that the second envelope contains $1000, and a 50% chance that the second envelope contains $250, so the average payoff for the second envelope would be $625, so it would be advantageous to switch. This would apply for any known amount in the first envelope, so in the case where the first envelope's contents are known, it would be advantageous to switch.

I'm quite sure that in the second case, there is no advantage to switching. I'm not entirely sure the principle behind it, but I think it goes something like this: One envelope contains an unknown x amount, and the other contains 2x. If the amount in the first envelope is known, then the value of x is limited to two values with equal probability, and one can calculate the expected payoff from switching based on those probabilities. But if the amount in the first envelope is not known, then there are infinite possibilities for the change in payoff from switching, so the calculation must be kept in terms of x. Switching will either get you a net gain of x or a net loss of x, with no indication of what x is, so the net payoff of switching is zero. Is that right?

*Reads and re-reads*

Can't quite tell, based on your wording.

So, clearly there can't be an advantage to switching one closed envelope for another - they're functionally identical, any math that says otherwise is wrong.

The Math That Says Otherwise:
The envelope you're holding contains X dollars. The other envelope contains either .5 X or 2X, so either a net gain of X, or a net loss of .5 X - so, switching increases your expected value by .25X

Can you describe exactly why that math is incorrect?

Blue Ghost
2013-03-07, 12:34 AM
*Reads and re-reads*

Can't quite tell, based on your wording.

So, clearly there can't be an advantage to switching one closed envelope for another - they're functionally identical, any math that says otherwise is wrong.

The Math That Says Otherwise:
The envelope you're holding contains X dollars. The other envelope contains either .5 X or 2X, so either a net gain of X, or a net loss of .5 X - so, switching increases your expected value by .25X

Can you describe exactly why that math is incorrect?



The value of X would be different depending on whether the envelope you're holding contains the larger or smaller amount, and we have to assume a constant X in calculations. The correct calculation would be that either the envelope you're holding contains X and the other contains 2X, or that the envelope you're holding contains 2X and the other contains X.

If the amount in your envelope is known, then you are limited to two values of X, and can calculate the expected value based on those two values. But if you do not have any limitations on X, you must keep calculations in terms of X.

FreakyCheeseMan
2013-03-07, 12:46 AM
The value of X would be different depending on whether the envelope you're holding contains the larger or smaller amount, and we have to assume a constant X in calculations. The correct calculation would be that either the envelope you're holding contains X and the other contains 2X, or that the envelope you're holding contains 2X and the other contains X.

If the amount in your envelope is known, then you are limited to two values of X, and can calculate the expected value based on those two values. But if you do not have any limitations on X, you must keep calculations in terms of X.

Yes.

"X" in this case is actually a probability distribution; the distribution for the larger values is different than that for the smaller values, so the actual math would be more like "I could gain X or lose .5 Y, where Y is, on average, worth 2 X."

Fun With Infinite Hotels:
This is a set of problems of escalating difficulty, see how many you can answer: some or all may be impossible, if so, explain why.

Problem 1: You are the manager of a hotel with infinitely many rooms; unfortunately, they're all booked. A man just walked in and demanded that you give him a room to himself; can you accommodate him? You're not allowed to kick anyone out, but you can re-arrange the guests to your heart's desire.

Problem 2: You are again the manager of an infinite hotel. Another infinite hotel, across the street, just burned down; thankfully everyone escaped, and they are now all seeking housing in your hotel (which is again full). Can you make room for everyone?

Problem 3: A tornado just hit the infinitely long street across town from yours, which contained infinitely many infinite hotels, all of which were destroyed. All of the patrons are seeking lodging in your hotel. Can you make it happen?

Problem 4: A new infinite-roomed hotel has been built across the street, and has been booked by an infinite swingers convention; unfortunately, strict hotel policies and in-room video cameras have put a damper on the convention. The head of the convention recently came to your hotel, and requested to use your rooms for liaisons. He requires that every possible combination of swingers (including solo acts) from the hotel across the street have a dedicated room within your hotel. Can you help him? Do you want to?

Astral Avenger
2013-03-07, 12:50 AM
Crazy Game Show #2:
This one has two parts; the second part is harder.

Basic Setup: There is a game show in which there are two envelopes; one contains an amount of money, the other contains twice that amount. The contestant picks one envelope at random.

In the first part, the contestant opens the envelope, and sees that it contains $500. He is then given the chance to trade for the other, unopened envelope. Is it in his best interests to trade?

In the second part, the contestant is not allowed to open the envelope he picks initially, but is still given the option to trade. Is it in his best interests? (For full credit on this one, I want an explanation as to why the math from the first part does not apply.)

Part 1) There are two possible outcomes of switching: $250 and $1000. Since each is equally likely, the average value of the other envelope is $1250/2 = $625 > $500. Ergo it is beneficial to trade.

Part 2) It does not matter. If the probability of picking either envelope is .5, then picking, not looking, and switching is the same as picking the other one initially. Thus, switching has no benefit.

(I may edit in my math in the next 24 hours, if I don't then its not going to happen)
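Part 1's answer can be sanity-checked with a quick Monte Carlo simulation; a sketch, assuming (as the answer does) that the two cases — smaller amount $250 vs. smaller amount $500 — are equally likely a priori:

```python
import random

random.seed(0)

other_when_500 = []
for _ in range(200_000):
    x = random.choice([250, 500])   # the smaller amount; both cases equally likely
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    if envelopes[0] == 500:         # condition on having opened a $500 envelope
        other_when_500.append(envelopes[1])

avg = sum(other_when_500) / len(other_when_500)
print(round(avg))                   # close to (250 + 1000) / 2 = 625
```

So, under the equal-probability assumption, the unopened envelope really is worth about $625 on average, and switching gains ~$125.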

FreakyCheeseMan
2013-03-07, 12:52 AM
Part 1) There are 2 possible outcomes of switching: $250 and $1000. As each is equally likely, the average value of the other envelope is $1250/2 = $625 > $500. Ergo it is beneficial to trade.

Part 2) It does not matter. If the probability of picking either envelope is .5, then picking, not looking, and switching is the same as picking the other one initially. Thus, switching has no benefit.

(I may edit in my math in the next 24 hours; if I don't, then it's not going to happen.)

Part 1: Correct.
Part 2: True, but insufficient. What is the flaw in applying the same logic as in part 1 (possible gain of X, possible loss of only .5X)?

Douglas
2013-03-07, 12:56 AM
The envelopes one
One envelope contains $X. The other contains $2X. The answer depends on how the value of X is chosen by the game show. The answer already presented for the first part presumes that the probabilities for X=250 and X=500 are equal. If they are, then the answer is correct - you should switch.

For an unknown value of X:
A) Assuming a finite probability distribution for X. Somewhere along the line of possible values for the amount of money in the first envelope, the probabilities for that amount and for half that amount are no longer equal. In the simple case of a uniform random distribution up to a maximum, if the amount of money in the envelope is above the cap on X then it must be the double value envelope and switching will always lose you money. This change in the probability of gain vs loss, and in particular the fact that it is dependent on the amount in the first envelope, reduces the expected gain exactly enough to make switching a zero sum proposition.
B) The only other possibility is that the probability distribution of X covers an infinite range. In this case, your expected gain is infinite either way. Infinite is not changed by multiplication with any finite positive number, so it doesn't matter.
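Case A above can be illustrated numerically; a sketch assuming a uniform prior on X up to a cap (the cap of 100 and the sample size are arbitrary choices for illustration):

```python
import random

random.seed(0)
CAP = 100                            # X (the smaller amount) is uniform on 1..CAP

gains, gains_above_cap = [], []
for _ in range(200_000):
    x = random.randint(1, CAP)
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    picked, other = envelopes
    gain = other - picked            # result of always switching
    gains.append(gain)
    if picked > CAP:                 # too large to be the smaller amount
        gains_above_cap.append(gain)

print(sum(gains) / len(gains))                 # close to 0 overall
print(all(g < 0 for g in gains_above_cap))     # True: switching up there always loses
```

This shows both halves of the argument: overall, always-switching is a wash, and the guaranteed losses above the cap are exactly what cancels the apparent gains below it.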

kurokotetsu
2013-03-07, 01:00 AM
It is coming back. I agree with your math at this point; I calculated the expected gain, not the expected value. My bad (it's been a couple of years since I've done this kind of problem). And indeed, switching is then the right option.

The problem is indeed that the math tells you to always switch, whether you know the amount or not. If you know the amount, you know that switching again will net you a specific amount, not an expected value, so the paradox dies there. But in the second case the expected amount is always greater when you do the calculation, so one should always switch.

But here I think there is merit in looking not at the expected gain, but at the expected value. If x is the amount in the envelope with less money, then the expected value of an envelope is:

x(1/2)+2x(1/2)=3x/2

That is independent of the envelope chosen. If both have the same expected value, then keeping your envelope is the right choice.

FreakyCheeseMan
2013-03-07, 01:01 AM
The envelopes one
One envelope contains $X. The other contains $2X. The answer depends on how the value of X is chosen by the game show. The answer already presented for the first part presumes that the probabilities for X=250 and X=500 are equal. If they are, then the answer is correct - you should switch.

For an unknown value of X:
A) Assuming a finite probability distribution for X. Somewhere along the line of possible values for the amount of money in the first envelope, the probabilities for that amount and for half that amount are no longer equal. In the simple case of a uniform random distribution up to a maximum, if the amount of money in the envelope is above the cap on X then it must be the double value envelope and switching will always lose you money. This change in the probability of gain vs loss, and in particular the fact that it is dependent on the amount in the first envelope, reduces the expected gain exactly enough to make switching a zero sum proposition.
B) The only other possibility is that the probability distribution of X covers an infinite range. In this case, your expected gain is infinite either way. Infinite is not changed by multiplication with any finite positive number, so it doesn't matter.

Correct on all counts. Let's give you a fun one.

Rainbow Hats Game:
This one is a step or six up from the problems offered thus far.

There is a game in which seven players sit in a circle and have a hat placed on their heads from behind. Hats may be one of seven colors, and colors may be repeated - so, three people could have an orange hat, everyone could have a blue hat, etc. Each player may see the color of everyone else's hat, but not their own. Players may strategize before the hats have been placed, but once the hats are on, they may not communicate in any way.

Simultaneously, each player makes a single guess as to the color of their own hat. If *any* player guesses his own hat color correctly, everyone wins. Is there any strategy by which a set of players could win every possible game?

Astral Avenger
2013-03-07, 01:01 AM
Part 1: Correct.
Part 2: True, but insufficient. What is the flaw in applying the same logic as in part 1 (possible gain of X, possible loss of only .5X)?

Before you look at the value of your envelope, it has an expected value (EV) of (x+.5x)/2=.75x, where x is the value of the larger amount.
As you do not know which envelope you have, the expected value of the other envelope is still (x+.5x)/2=.75x.
The flaw in applying the logic of the first part comes from the fact that now our expected gain equals our expected loss: the expected gain is (large amount - EV of envelope)=x-.75x=.25x.
The expected loss is (small amount - EV)=.5x-.75x=-.25x.
As both of these are weighted equally, the expected net change is simply the sum over 2, (EG+EL)/2. As EG+EL=0, there is no point in trading envelopes.
That a good enough explanation for ya? :smalltongue:

FreakyCheeseMan
2013-03-07, 01:04 AM
It is coming back. I agree with your math at this point; I calculated the expected gain, not the expected value. My bad (it's been a couple of years since I've done this kind of problem). And indeed, switching is then the right option.

The problem is indeed that the math tells you to always switch, whether you know the amount or not. If you know the amount, you know that switching again will net you a specific amount, not an expected value, so the paradox dies there. But in the second case the expected amount is always greater when you do the calculation, so one should always switch.

But here I think there is merit in looking not at the expected gain, but at the expected value. If x is the amount in the envelope with less money, then the expected value of an envelope is:

x(1/2)+2x(1/2)=3x/2

That is independent of the envelope chosen. If both have the same expected value, then keeping your envelope is the right choice.

So, it's easy to come up with different ways of calculating it that show there is no gain. The problem is to show the specific flaw in the method discussed.

kurokotetsu
2013-03-07, 01:09 AM
Yes.

"X" in this case is actually a probability distribution; the distribution for the larger values is different than that for the smaller values, so the actual math would be more like "I could gain X or lose .5 Y, where Y is, on average, worth 2 X."

Fun With Infinite Hotels:
This is a set of problems of escalating difficulty, see how many you can answer: some or all may be impossible, if so, explain why.

Problem 1: You are the curator of a hotel with infinite rooms; unfortunately, they're all booked. A man just walked in and demanded that you give him a room to himself; can you accommodate him? You're not allowed to kick anyone out, but you can re-arrange them to your heart's desire.

Problem 2: You are again curator of an infinite hotel. Another infinite hotel, across the street, just burned down, but thankfully everyone escaped, and are now seeking housing in your hotel (which is again full.) Can you make room for everyone?

Problem 3: A tornado just hit the infinitely long street across town from yours, which contained infinitely many infinite hotels, all of which were destroyed. All of the patrons are seeking lodging in your hotel. Can you make it happen?

Problem 4: A new infinite-roomed hotel has been built across the street, and has been booked by an infinite swingers convention; unfortunately, strict hotel policies and in-room video cameras have put a damper on the convention. The head of the convention recently came to your hotel, and requested to use your rooms for liaisons. He requires that every possible combination of swingers (including solo acts) from the hotel across the street have a dedicated room within your hotel. Can you help him? Do you want to?
Russian hotel? You really like Kolmogorov, don't you?

1. Ask all of your tenants to move to the next room: move the first to the second, the second to the third, and so on. Everyone has a room and the new arrival gets the first.

2. Move each tenant to the room numbered twice their own: the first to the second room, the second to the fourth, etc. All the odd-numbered rooms, of which there are infinitely many (as many as there are rooms in total), are free.

3...

Do I have to think of all the functions? More simply: in all the cases you gave, a countable number of guests is moving in. As you have a countable number of rooms, it is always possible to give everyone a room; the sets have the same cardinality, so there is a bijective function between them.


So, it's easy to come up with different ways of calculating it that show there is no gain. The problem is to show the specific flaw in the method discussed.
I would go for two possible values for the x in the expected gain.


Rainbow Hats Game:
This one is a step or six up from the problems offered thus far.

There is a game in which seven players sit in a circle and have a hat placed on their heads from behind. Hats may be one of seven colors, and colors may be repeated - so, three people could have an orange hat, everyone could have a blue hat, etc. Each player may see the color of everyone else's hat, but not their own. Players may strategize before the hats have been placed, but once the hats are on, they may not communicate in any way.

Simultaneously, each player makes a single guess as to the color of their own hat. If *any* player guesses his own hat color correctly, everyone wins. Is there any strategy by which a set of players could win every possible game?
What counts as a strategy? Do they place the hats, or are they placed at random?

Douglas
2013-03-07, 01:19 AM
Fun With Infinite Hotels:
This is a set of problems of escalating difficulty, see how many you can answer: some or all may be impossible, if so, explain why.

Problem 1: You are the curator of a hotel with infinite rooms; unfortunately, they're all booked. A man just walked in and demanded that you give him a room to himself; can you accommodate him? You're not allowed to kick anyone out, but you can re-arrange them to your heart's desire.

Solution: Move everyone down one room.

Problem 2: You are again curator of an infinite hotel. Another infinite hotel, across the street, just burned down, but thankfully everyone escaped, and are now seeking housing in your hotel (which is again full.) Can you make room for everyone?

Solution: For every room number N, move the occupant of room N to room 2N. You now have an infinite number of odd-numbered empty rooms to offer.

Problem 3: A tornado just hit the infinitely long street across town from yours, which contained infinitely many infinite hotels, all of which were destroyed. All of the patrons are seeking lodging in your hotel. Can you make it happen?

Solution: Put Patron 1 from Hotel 1 (P1H1) in room 1. P2H1 goes in room 2. P1H2 goes in room 3. Then P3H1, P2H2, P1H3, P4H1, P3H2, P2H3, P1H4, and so on.
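The diagonal order above also admits a simple closed form; a sketch (the function name `room` and the 1-based numbering are my own choices — diagonal d = hotel + patron - 1 holds d guests, preceded by 1 + 2 + ... + (d-1) guests on earlier diagonals):

```python
def room(hotel, patron):
    # Patron `patron` from hotel `hotel` sits on diagonal d = hotel + patron - 1,
    # after the d*(d-1)/2 guests on earlier diagonals, at offset `hotel`
    # within its own diagonal (H1, H2, ..., Hd).
    d = hotel + patron - 1
    return d * (d - 1) // 2 + hotel

# First few rooms in the order described: P1H1, P2H1, P1H2, P3H1, P2H2, P1H3, ...
print(room(1, 1), room(1, 2), room(2, 1), room(1, 3), room(2, 2), room(3, 1))
```

Since each diagonal's rooms follow the previous diagonal's contiguously, every room number is used exactly once.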

Problem 4: A new infinite-roomed hotel has been built across the street, and has been booked by an infinite swingers convention; unfortunately, strict hotel policies and in-room video cameras have put a damper on the convention. The head of the convention recently came to your hotel, and requested to use your rooms for liaisons. He requires that every possible combination of swingers (including solo acts) from the hotel across the street have a dedicated room within your hotel. Can you help him? Do you want to?

Solution: This depends on exactly what you mean by "combination". If a combination is limited to 2 (or any finite number, really) people maximum, then it can be done. Use a procedure similar to problem 3. If there is no limit on the size of a combination, then you cannot help him. Proof as follows: assign a number to each possible combination by using binary. Each member of the convention corresponds to one binary digit. If that member is present in a combination, his bit is 1. If not, it's 0. This produces a set of numbers in binary that corresponds to the real numbers, not the integers, and mapping from the reals to the integers is not possible. I could go through the proof of that too, but I think I've done enough already.
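The diagonal argument in that proof can be made concrete on a finite prefix of guests and rooms; a sketch, where the room-to-combination assignment is an arbitrary example of my own:

```python
def diagonal_combination(assignment, n):
    # Guest k joins the new combination exactly when he is NOT in the
    # combination assigned to room k, so the result differs from every
    # room's combination in at least one guest.
    return {k for k in range(n) if k not in assignment(k)}

# Arbitrary example assignment over guests/rooms 0..9: even rooms host
# the solo act {k}, odd rooms host the solo act {k + 1}.
n = 10
def assignment(k):
    return {k} if k % 2 == 0 else {k + 1}

missing = diagonal_combination(assignment, n)
print(missing)                                          # {1, 3, 5, 7, 9}
print(all(missing != assignment(k) for k in range(n)))  # True
```

However the rooms are assigned, the diagonal combination is one the assignment missed — which is exactly why no room-numbering scheme can cover all combinations.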


Rainbow Hats Game:
This one is a step or six up from the problems offered thus far.

There is a game in which seven players sit in a circle and have a hat placed on their heads from behind. Hats may be one of seven colors, and colors may be repeated - so, three people could have an orange hat, everyone could have a blue hat, etc. Each player may see the color of everyone else's hat, but not their own. Players may strategize before the hats have been placed, but once the hats are on, they may not communicate in any way.

Simultaneously, each player makes a single guess as to the color of their own hat. If *any* player guesses his own hat color correctly, everyone wins. Is there any strategy by which a set of players could win every possible game?
This sort of thing is a perennial favorite on the xkcd logic puzzles forum.

Assign a number to each color, 1 through 7. Assign a number to each person, 1 through 7. Each person will look around, add up the numbers for everyone else's hats, and guess the color that would make the sum including his own be exactly his assigned number above a multiple of 7.
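The mod-7 strategy can be verified exhaustively; a sketch with players and colors numbered 0-6 rather than 1-7 (an equivalent relabeling):

```python
from itertools import product

def correct_guessers(hats):
    # Player i guesses the color that makes the total of all seven hats
    # congruent to i (mod 7); player (total % 7) is therefore always right,
    # and every other player is always wrong.
    total = sum(hats)
    winners = []
    for i in range(7):
        seen = total - hats[i]          # sum of the six hats player i can see
        guess = (i - seen) % 7
        if guess == hats[i]:
            winners.append(i)
    return winners

# Every one of the 7^7 = 823,543 hat assignments has exactly one correct guess.
print(all(len(correct_guessers(h)) == 1 for h in product(range(7), repeat=7)))
```

The key design point: the true total is congruent to exactly one residue mod 7, so exactly one player's "bet" on that residue pays off, no matter the hats.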

kurokotetsu
2013-03-07, 01:26 AM
This is a set of problems of escalating difficulty, see how many you can answer: some or all may be impossible, if so, explain why.

Problem 1: You are the curator of a hotel with infinite rooms; unfortunately, they're all booked. A man just walked in and demanded that you give him a room to himself; can you accommodate him? You're not allowed to kick anyone out, but you can re-arrange them to your heart's desire.

Solution: Move everyone down one room.

Problem 2: You are again curator of an infinite hotel. Another infinite hotel, across the street, just burned down, but thankfully everyone escaped, and are now seeking housing in your hotel (which is again full.) Can you make room for everyone?

Solution: For every room number N, move the occupant of room N to room 2N. You now have an infinite number of odd-numbered empty rooms to offer.

Problem 3: A tornado just hit the infinitely long street across town from yours, which contained infinitely many infinite hotels, all of which were destroyed. All of the patrons are seeking lodging in your hotel. Can you make it happen?

Solution: Put Patron 1 from Hotel 1 (P1H1) in room 1. P2H1 goes in room 2. P1H2 goes in room 3. Then P3H1, P2H2, P1H3, P4H1, P3H2, P2H3, P1H4, and so on.

Problem 4: A new infinite-roomed hotel has been built across the street, and has been booked by an infinite swingers convention; unfortunately, strict hotel policies and in-room video cameras have put a damper on the convention. The head of the convention recently came to your hotel, and requested to use your rooms for liaisons. He requires that every possible combination of swingers (including solo acts) from the hotel across the street have a dedicated room within your hotel. Can you help him? Do you want to?

Solution: This depends on exactly what you mean by "combination". If a combination is limited to 2 (or any finite number, really) people maximum, then it can be done. Use a procedure similar to problem 3. If there is no limit on the size of a combination, then you cannot help him. Proof as follows: assign a number to each possible combination by using binary. Each member of the convention corresponds to one binary digit. If that member is present in a combination, his bit is 1. If not, it's 0. This produces a set of numbers in binary that corresponds to the real numbers, not the integers, and mapping from the reals to the integers is not possible. I could go through the proof of that too, but I think I've done enough already.


This sort of thing is a perennial favorite on the xkcd logic puzzles forum.

Assign a number to each color, 1 through 7. Assign a number to each person, 1 through 7. Each person will look around, add up the numbers for everyone else's hats, and guess the color that would make the sum including his own be exactly his assigned number above a multiple of 7.
Hotels: "Combination" to me suggests a finite number of partners. But you are right: with infinite partners the real numbers are involved, and Cantor's diagonalization shows that the reals are uncountable.

FreakyCheeseMan
2013-03-07, 01:27 AM
Russian hotel? You really like Kolmogorov, don't you?

1. Ask all of your tenants to move to the next room: move the first to the second, the second to the third, and so on. Everyone has a room and the new arrival gets the first.

2. Move each tenant to the room numbered twice their own: the first to the second room, the second to the fourth, etc. All the odd-numbered rooms, of which there are infinitely many (as many as there are rooms in total), are free.

3...

Do I have to think of all the functions? More simply: in all the cases you gave, a countable number of guests is moving in. As you have a countable number of rooms, it is always possible to give everyone a room; the sets have the same cardinality, so there is a bijective function between them.
Yes, all of the functions, that's what makes it fun.

And the fourth problem has a different cardinality.



What counts as a strategy? They place the hats or are they placed at random?


A strategy is something of the form "If you see this, guess this..." The hats are chosen and assigned at random, with no foreknowledge on the part of the players.

Gray Mage
2013-03-07, 01:32 AM
Rainbow Hats Game:
This one is a step or six up from the problems offered thus far.

There is a game in which seven players sit in a circle and have a hat placed on their heads from behind. Hats may be one of seven colors, and colors may be repeated - so, three people could have an orange hat, everyone could have a blue hat, etc. Each player may see the color of everyone else's hat, but not their own. Players may strategize before the hats have been placed, but once the hats are on, they may not communicate in any way.

Simultaneously, each player makes a single guess as to the color of their own hat. If *any* player guesses his own hat color correctly, everyone wins. Is there any strategy by which a set of players could win every possible game?

If the guess is made publicly, just have one person guess the color that is on the player to his left/right/across/whatever, then have that person guess that color. :smalltongue:

This is probably not the expected answer, though. :smalltongue:

FreakyCheeseMan
2013-03-07, 01:34 AM
If the guess is made publicly, just have one person guess the color that is on the player to his left/right/across/whatever, then have that person guess that color. :smalltongue:

Which is why I said "Simultaneously."

FreakyCheeseMan
2013-03-07, 01:35 AM
This is a set of problems of escalating difficulty, see how many you can answer: some or all may be impossible, if so, explain why.

Problem 1: You are the curator of a hotel with infinite rooms; unfortunately, they're all booked. A man just walked in and demanded that you give him a room to himself; can you accommodate him? You're not allowed to kick anyone out, but you can re-arrange them to your heart's desire.

Solution: Move everyone down one room.

Problem 2: You are again curator of an infinite hotel. Another infinite hotel, across the street, just burned down, but thankfully everyone escaped, and are now seeking housing in your hotel (which is again full.) Can you make room for everyone?

Solution: For every room number N, move the occupant of room N to room 2N. You now have an infinite number of odd-numbered empty rooms to offer.

Problem 3: A tornado just hit the infinitely long street across town from yours, which contained infinitely many infinite hotels, all of which were destroyed. All of the patrons are seeking lodging in your hotel. Can you make it happen?

Solution: Put Patron 1 from Hotel 1 (P1H1) in room 1. P2H1 goes in room 2. P1H2 goes in room 3. Then P3H1, P2H2, P1H3, P4H1, P3H2, P2H3, P1H4, and so on.

Problem 4: A new infinite-roomed hotel has been built across the street, and has been booked by an infinite swingers convention; unfortunately, strict hotel policies and in-room video cameras have put a damper on the convention. The head of the convention recently came to your hotel, and requested to use your rooms for liaisons. He requires that every possible combination of swingers (including solo acts) from the hotel across the street have a dedicated room within your hotel. Can you help him? Do you want to?

Solution: This depends on exactly what you mean by "combination". If a combination is limited to 2 (or any finite number, really) people maximum, then it can be done. Use a procedure similar to problem 3. If there is no limit on the size of a combination, then you cannot help him. Proof as follows: assign a number to each possible combination by using binary. Each member of the convention corresponds to one binary digit. If that member is present in a combination, his bit is 1. If not, it's 0. This produces a set of numbers in binary that corresponds to the real numbers, not the integers, and mapping from the reals to the integers is not possible. I could go through the proof of that too, but I think I've done enough already.


This sort of thing is a perennial favorite on the xkcd logic puzzles forum.

Assign a number to each color, 1 through 7. Assign a number to each person, 1 through 7. Each person will look around, add up the numbers for everyone else's hats, and guess the color that would make the sum including his own be exactly his assigned number above a multiple of 7.

Correct on both counts.

Fun bit of bragging: I believe I was among the first few hundred people to solve that rainbow hat problem. It was listed as an entry question for a math camp I attended ~7 years ago; that's probably how it got onto the xkcd forums. The head of the camp had come up with the problem, and checked around to make sure it was not commonly offered.

Let's try another, from that same qualifying quiz:

Quadrilateral Packing:
This one is absolutely brutal. If you haven't seen it before, do not expect to solve it.

Is it possible to perfectly pack 2,001 convex quadrilaterals into an equilateral triangle with sides of length 9, if every side of every quadrilateral must be at least 1 in length?

Gray Mage
2013-03-07, 01:37 AM
Which is why I said "Simultaneously."

Oops, missed that. Well, this makes it trickier. I'll need to think for a while.

FreakyCheeseMan
2013-03-07, 01:38 AM
Hotels: "Combination" to me suggests a finite number of partners. But you are right: with infinite partners the real numbers are involved, and Cantor's diagonalization shows that the reals are uncountable.

Rainbow: Not sure that works all that well. Say blue is 1. If everybody but one has a blue hat (let's say orange, 2), everybody would guess either color 7 (if they have a blue hat) or blue (if they have an orange hat).

I meant it to include infinite combinations, but yeah, what you're saying is right.

On the rainbows - you're missing the fact that each player gets their own number, which also factors into the equation. Thus, the blue hat players would each guess a different number.

Douglas
2013-03-07, 01:40 AM
Rainbow: Not sure that works all that well. Say blue is 1. If everybody but one has a blue hat (let's say orange, 2), everybody would guess either color 7 (if they have a blue hat) or blue (if they have an orange hat).
Blue is 1, orange is 2, 6 blue hats, 1 orange, the orange is on person 7. The guesses will be:
Person 1: Blue
Person 2: Orange
Person 3: color 3
Person 4: color 4
Person 5: color 5
Person 6: color 6
Person 7: Blue

Person 1 is correct, so they win.

The nuclear option for hard logic puzzles: Blue Eyes
A group of people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. No one knows the color of their own eyes. Every night at midnight, a ferry stops at the island. Any islanders who have figured out the color of their own eyes then leave the island on the ferry that night, and the rest stay. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (excluding themselves), but they cannot otherwise communicate (except in the single event described below, in which only the communication described there occurs). There are 201 people on the island, one of whom is known as the Guru. The Guru never lies. Everyone on the island knows all the rules in this paragraph.

Of the other 200 people on the island, 100 have blue eyes, 100 have brown, and the Guru has green. So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), but that does not tell him his own eye color; as far as he knows the totals could be 101 brown and 99 blue. Or 100 brown, 99 blue, and he could have red eyes.

The Guru is allowed to speak once (let's say at noon), on one day in all their endless years on the island. Standing before all the islanders, she says the following:

"I can see someone who has blue eyes."

Who leaves the island, and on what night, and why?

kurokotetsu
2013-03-07, 01:41 AM
I meant it to include infinite combinations, but yeah, what you're saying is right.

On the rainbows - you're missing the fact that each player gets their own number, which also factors into the equation. Thus, the blue hat players would each guess a different number.
Yeah, a bit late. Re-read it and saw the error. I'll try the quadrilateral one later (but I'm rubbish with that sort of thing; never been one for geometry and never studied tessellations).



The nuclear option for hard logic puzzles: Blue Eyes
A group of people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. No one knows the color of their own eyes. Every night at midnight, a ferry stops at the island. Any islanders who have figured out the color of their own eyes then leave the island on the ferry that night, and the rest stay. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (excluding themselves), but they cannot otherwise communicate (except in the single event described below, in which only the communication described there occurs). There are 201 people on the island, one of whom is known as the Guru. The Guru never lies. Everyone on the island knows all the rules in this paragraph.

Of the other 200 people on the island, 100 have blue eyes, 100 have brown, and the Guru has green. So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), but that does not tell him his own eye color; as far as he knows the totals could be 101 brown and 99 blue. Or 100 brown, 99 blue, and he could have red eyes.

The Guru is allowed to speak once (let's say at noon), on one day in all their endless years on the island. Standing before all the islanders, she says the following:

"I can see someone who has blue eyes."

Who leaves the island, and on what night, and why?
A couple of questions before leaving. First, does the guru speak only once, ever? Second, has anyone left the island before she speaks?

Douglas
2013-03-07, 01:51 AM
A couple of questions before leaving. First, the guru speaks only once, ever? Second, has anyone left the island before she speaks?
Only once, ever, and no one has ever left before.

FreakyCheeseMan
2013-03-07, 01:54 AM
The nuclear option for hard logic puzzles: Blue Eyes
A group of people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. No one knows the color of their own eyes. Every night at midnight, a ferry stops at the island. Any islanders who have figured out the color of their own eyes then leave the island on the ferry that night, and the rest stay. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (excluding themselves), but they cannot otherwise communicate (except in the single event described below, in which only the communication described there occurs). There are 201 people on the island, one of whom is known as the Guru. The Guru never lies. Everyone on the island knows all the rules in this paragraph.

Of the other 200 people on the island, 100 have blue eyes, 100 have brown, and the Guru has green. So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), but that does not tell him his own eye color; as far as he knows the totals could be 101 brown and 99 blue. Or 100 brown, 99 blue, and he could have red eyes.

The Guru is allowed to speak once (let's say at noon), on one day in all their endless years on the island. Standing before all the islanders, she says the following:

"I can see someone who has blue eyes."

Who leaves the island, and on what night, and why?

The wording strikes me as being ambiguous; the guru has given no new information, as everyone already knows that she could see at least one person with blue eyes. If it were "I am looking at a person with blue eyes" that would be different.

Proceeding from the lack of information, I would say that on the first day, no one leaves; everyone looks at the group around them, sees that the other color outnumbers their own, and picks that. On the second day, everyone would realize that everyone else had been wrong, deduce that the reason was that they were following the same logic, and thus that their own eye color must be among the observed minority, and everyone would leave.

Everyone except the guru. The guru would initially guess at random, then, on the second day, guess... hmm. The guru would realize that their own eye color was neither brown nor blue, because then the group they belonged to would be an even split, as they would see the same as the guru, and half of them would have gotten it right. So, the guru would proceed to guess colors at random; if they knew that green was a common pigment, they would leave on the second day with everyone else.

Actually, it's all ambiguous, because some of the probabilities aren't well defined - like the increased probability of there being an even distribution of eye color in such a contrived setting. :P

Douglas
2013-03-07, 01:57 AM
Further clarification: probability is not involved in this one. No guesses are allowed, the islanders do and must present proof to the ferryman. This is also among the "everyone knows" rules. Anyone who cannot present proof doesn't leave.

FreakyCheeseMan
2013-03-07, 01:58 AM
Further clarification: probability is not involved in this one. No guesses are allowed, the islanders do and must present proof to the ferryman. This is also among the "everyone knows" rules.

Well, that changes everything.

Forget everything I just said, then. :P

...except for my complaint about the statement containing no information. >_<

Douglas
2013-03-07, 02:06 AM
...except for my complaint about the statement containing no information. >_<
Which is incorrect. The puzzle does, in fact, have a solution. And that solution is not "nothing happens".

Hey, I said it was hard.:smallbiggrin:

FreakyCheeseMan
2013-03-07, 02:09 AM
...after careful consideration, I think that more information is required.

What are people's motivations? Does everyone (including the guru) care only about getting themselves off of the island as soon as possible? Does the guru want to get everyone off the island? Does the guru care who leaves when?

The guru's statement contained no information, except that the guru said it - which then raises the question "Why did the guru say it?", which requires knowledge of the guru's motivations.

EDIT: The statement itself - not the fact that it was made - contained no information. Everyone already knew the guru could see someone with blue eyes, as they could all see someone with blue eyes. So, what matters is not "The guru gave us this information" but "The guru chose to say this particular thing", which then requires knowledge of the guru's own motivations.

Douglas
2013-03-07, 02:13 AM
Nope, the Guru's (and everyone else's) motivations are irrelevant. The Guru's statement itself contains no new information, true, but there is something about it that does - and the information conveyed is quite subtle and hard to understand.

FreakyCheeseMan
2013-03-07, 02:19 AM
...I'm gonna give you the benefit of the doubt, for now, and just think aloud for a bit.

...the statement in itself contains no information; the statement was not intended to produce a result, as the motivations of the guru are not known (with certainty) to the islanders. For all they knew, the guru picked a statement at random out of the infinite set of true statements.

So, I'm looking for important things that could be gleaned from a statement that contains no information, made by a person who could be a raving (but perfectly honest) loon.

I get the sinking feeling that this is going to be the pink ping pong balls joke of logic puzzles.

FreakyCheeseMan
2013-03-07, 02:35 AM
Alright, the fact that it's posted on XKCD gives me enough faith to sleep on it, but I still think that motivation has to matter.

The original doesn't say anything about proof, just "Figures out".

*stares at you with beady, suspicious little eyes*

To my great shame, my brain itched too much and I gave in and looked up an answer. That is a good one.

FreakyCheeseMan
2013-03-07, 02:47 AM
Thanks for that; it does a soul good to be humbled, once in a while.

kurokotetsu
2013-03-07, 10:04 AM
After sleeping a little, I've at least arrived at this:

If one blue-eyed islander leaves, all the blue-eyed islanders leave (same for brown and green), because they are all perfect logicians and all have the same information. From that, everybody else at least knows that they don't have that eye colour.

Intuitively I say the Guru leaves, but I still have to work out the important bit: the why.

FreakyCheeseMan
2013-03-07, 10:30 AM
After sleeping a little, I've at least arrived at this:

If one blue-eyed islander leaves, all the blue-eyed islanders leave (same for brown and green), because they are all perfect logicians and all have the same information. From that, everybody else at least knows that they don't have that eye colour.

Intuitively I say the Guru leaves, but I still have to work out the important bit: the why.

The Guru can't leave, ever. She can see no one with her own eye color; her perspective would be exactly the same if she had red eyes, so she can never prove she has green.

kurokotetsu
2013-03-07, 10:53 AM
The Guru can't leave, ever. She can see no one with her own eye color; her perspective would be exactly the same if she had red eyes, so she can never prove she has green.

So there are an "infinite" number of possibilities? There are "red", "grey", "black" and so on that can't be eliminated by saying not-"x". I thought so, but it makes it harder, then. Back to thinking.

Wookieetank
2013-03-07, 11:48 AM
Let's try another, from that same qualifying quiz:

Quadrilateral Packing:
This one is absolutely brutal. If you haven't seen it before, do not expect to solve it.

Is it possible to perfectly pack 2,001 convex quadrilaterals into a 9-by-9 equilateral triangle, if every side of every quadrilateral must be at least 1 in length?

Short answer: no :smalltongue:

Long answer: Supposing that your 9 by 9 triangle is in inches (you and your naked numbers, you monster). From what I doodled and figured, it'll take 7 quads for a side, with an 8th being shared between 2 sides of the triangle. This will break up the first set of quads into 6 rhomboids (1 in sides each) and 1 trapezoid (2 in bottom, 1 in rest of the sides) where the trapezoids are in the corners (with one of the 1 in sides of the trapezoid being the start of the next side of the triangle). This gives you 21 quads for your first row, leaving you with a 7x7x7 equilateral triangle for your next row, a 5x5x5 for the third, 3x3x3 for a total of 39 quads and a 1x1x1 triangle for the middle (which is not a quad). If you didn't have to use the trapezoids for the corners (or fill the triangle completely) you could probably get an almost infinite number of rhombs in there of miniscule height.

FreakyCheeseMan
2013-03-07, 01:34 PM
Short answer: no :smalltongue:

Long answer: Supposing that your 9 by 9 triangle is in inches (you and your naked numbers, you monster). From what I doodled and figured, it'll take 7 quads for a side, with an 8th being shared between 2 sides of the triangle. This will break up the first set of quads into 6 rhomboids (1 in sides each) and 1 trapezoid (2 in bottom, 1 in rest of the sides) where the trapezoids are in the corners (with one of the 1 in sides of the trapezoid being the start of the next side of the triangle). This gives you 21 quads for your first row, leaving you with a 7x7x7 equilateral triangle for your next row, a 5x5x5 for the third, 3x3x3 for a total of 39 quads and a 1x1x1 triangle for the middle (which is not a quad). If you didn't have to use the trapezoids for the corners (or fill the triangle completely) you could probably get an almost infinite number of rhombs in there of miniscule height.

There's no reason you can't fit several thinner quadrilaterals into one corner of the triangle; in fact, you could fit all 2001 into a single corner as very thin rhombuses; the only problem is that it would not perfectly fill the triangle.

There are a lot of specific methods by which it will not work - but, the question either requires proof that *no* method will work, or a method that will.

Astral Avenger
2013-03-07, 02:02 PM
There's no reason you can't fit several thinner quadrilaterals into one corner of the triangle; in fact, you could fit all 2001 into a single corner as very thin rhombuses; the only problem is that it would not perfectly fill the triangle.

There are a lot of specific methods by which it will not work - but, the question either requires proof that *no* method will work, or a method that will.

Can the quadrilaterals overlap?

FreakyCheeseMan
2013-03-07, 02:16 PM
Can the quadrilaterals overlap?

No. "Perfect Packing" means that every bit of every quadrilateral must be inside the triangle; nothing can stick out, and no spaces can be left over.

An easier way to think of it might be "Could you cut a 9x9 equilateral triangle into 2001 convex quadrilaterals, with every side of every quadrilateral at least one unit in length, such that no pieces of the original triangle are left over?"

Wookieetank
2013-03-07, 02:30 PM
There's no reason you can't fit several thinner quadrilaterals into one corner of the triangle; in fact, you could fit all 2001 into a single corner as very thin rhombuses; the only problem is that it would not perfectly fill the triangle.

There are a lot of specific methods by which it will not work - but, the question either requires proof that *no* method will work, or a method that will.

You still end up with a triangle in there; you can't fill a 3-sided object completely with 4-sided ones, unless I'm doing something horrifically wrong.

kurokotetsu
2013-03-07, 02:45 PM
You still end up with a triangle in there; you can't fill a 3-sided object completely with 4-sided ones, unless I'm doing something horrifically wrong.

Yes, yes you can. Not all of them exactly the same shape, and not all of them trapezoids, but at least quickly I covered a triangle with four quadrilaterals (maybe I can refine them to four trapezoids, which may be more manageable in the problem). Now, the problem is the constraints of the problem, the specific amounts given. Both numbers being multiples of three gives me reason to think that it is possible in some convoluted way.

Had to look up the Blue Eyes solution. Once I read it, it was obvious how it is done.

Wookieetank
2013-03-07, 02:50 PM
Yes, yes you can. Not all of them exactly the same shape, and not all of them trapezoids, but at least quickly I covered a triangle with four quadrilaterals. Now, the problem is the constraints of the problem, the specific amounts given.

Had to look up the Blue Eyes solution. Once I read it, it was obvious how it is done.

Playing with trapezoids let me do it with three, which makes it quite obvious that the answer is yes. There was a reason I never really liked geometry (other than the proofs *shudder*); give me equations any day over shapes. :smallredface:

FreakyCheeseMan
2013-03-07, 02:59 PM
I actually recommend against trying to solve the quadrilateral one. That way lies madness.

Hell, I'm going crazy just trying to remember the solution.

Douglas
2013-03-07, 03:02 PM
Aha! I think I've got it:
Rough image worked up in Paint to illustrate the concept:
http://img221.imageshack.us/img221/8726/packing.png

All corners of the central trapezoids are intended to line up in a vertical line with the corresponding corners of all the trapezoids they're stacked on, and their tops and bottoms are supposed to be perfectly horizontal. I just didn't feel like being that precise in my sketching. Their top corners are 1 unit horizontally closer to the center than the bottom corners, and their bottom corners are 1 unit horizontally away from the triangle's bottom corners (the drawing is not accurately scaled).

It should be obvious that the specifications above guarantee the minimum side length for left and right sides of the trapezoids and for all sides of the diamond shard like shapes connecting them to the triangle corners. They also leave room for bottom side length of up to 7 and top side length of up to 5 for the trapezoids, which is plenty to spare.

That leaves the requirement that all quadrilaterals be convex. I worked out the math, and all that is required for that is that each line from the triangle corner has a slope ever so slightly more than twice as steep as the line immediately below it. The fact that it's a ratio, with no flat difference involved, means I can compress the array vertically as far as I want, stacking an unlimited number of trapezoid + diamond shard combos in the bottom of the triangle. The top of the triangle can then be divided as shown into three more quadrilaterals.

This method can pack any multiple of 3 into the triangle. 2001 is a multiple of 3. Problem solved.

kurokotetsu
2013-03-07, 03:11 PM
Playing with trapezoids let me do it with three, which makes it quite obvious that the answer is yes. There was a reason I never really liked geometry (other than the proofs *shudder*); give me equations any day over shapes. :smallredface:

Actually, I think it can be more or less solved then. Correct me if I'm wrong, but I think the triangle can be divided into three equal trapezoids by drawing a line parallel to each side that crosses through the centre (as it is equilateral, all centres are the same, so take your pick). Now each of the resulting trapezoids can be tessellated with smaller quadrilaterals.

But what has to be proven is that such a tessellation can be done with quadrilaterals of at least 1 in length (1 on each side, or only one side?) and with exactly 667 of them. That seems doable (if I'm on the right track).

And I'm with you, geometry isn't much of my thing. As a matter of fact, I enjoyed it more the fewer figures and constructions were involved (analytic geometry over classical, and differential the most).


Aha! I think I've got it:
Rough image worked up in Paint to illustrate the concept:
http://img221.imageshack.us/img221/8726/packing.png

All corners of the central trapezoids are intended to line up in a vertical line with the corresponding corners of all the trapezoids they're stacked on, and their tops and bottoms are supposed to be perfectly horizontal. I just didn't feel like being that precise in my sketching. Their top corners are 1 unit horizontally closer to the center than the bottom corners, and their bottom corners are 1 unit horizontally away from the triangle's bottom corners (the drawing is not accurately scaled).

It should be obvious that the specifications above guarantee the minimum side length for left and right sides of the trapezoids and for all sides of the diamond shard like shapes connecting them to the triangle corners. They also leave room for bottom side length of up to 7 and top side length of up to 5 for the trapezoids, which is plenty to spare.

That leaves the requirement that all quadrilaterals be convex. I worked out the math, and all that is required for that is that each line from the triangle corner has a slope ever so slightly more than twice as steep as the line immediately below it. The fact that it's a ratio, with no flat difference involved, means I can compress the array vertically as far as I want, stacking an unlimited number of trapezoid + diamond shard combos in the bottom of the triangle. The top of the triangle can then be divided as shown into three more quadrilaterals.

This method can pack any multiple of 3 into the triangle. 2001 is a multiple of 3. Problem solved.

Hmm, that seems to work, but at least it should be added that the trapezoids you build have to be 3 at the top and at most 5 on the lower parallel side, to ensure that all the sides are over 1. With the Archimedean property you can show that all sides are over 1, no matter how many trapezoids you form.

FreakyCheeseMan
2013-03-07, 03:38 PM
Aha! I think I've got it:
Rough image worked up in Paint to illustrate the concept:
http://img221.imageshack.us/img221/8726/packing.png

All corners of the central trapezoids are intended to line up in a vertical line with the corresponding corners of all the trapezoids they're stacked on, and their tops and bottoms are supposed to be perfectly horizontal. I just didn't feel like being that precise in my sketching. Their top corners are 1 unit horizontally closer to the center than the bottom corners, and their bottom corners are 1 unit horizontally away from the triangle's bottom corners.

It should be obvious that the specifications above guarantee the minimum side length for left and right sides of the trapezoids and for all sides of the diamond shard like shapes connecting them to the triangle corners. They also leave room for bottom side length of up to 7 and top side length of up to 5 for the trapezoids, which is plenty to spare.

That leaves the requirement that all quadrilaterals be convex. I worked out the math, and all that is required for that is that each line from the triangle corner has a slope ever so slightly more than twice as steep as the line immediately below it. The fact that it's a ratio, with no flat difference involved, means I can compress the array vertically as far as I want, stacking an unlimited number of trapezoid + diamond shard combos in the bottom of the triangle. The top of the triangle can then be divided as shown into three more quadrilaterals.

This method can pack any multiple of 3 into the triangle. 2001 is a multiple of 3. Problem solved.

Hmm... I don't think there's a solution that simple, and I recall this problem as having a lot of close-but-not-quites. Can you give me hard numbers on the heights of those vertical lines?

For the record, the "Official" solution (Which I only half-remember) goes like this:
Start by drawing two vertical dotted lines up from the base of the triangle, 3 and 6 units from the edge. Then, draw a solid line from the lower left corner to where the right-hand dotted line meets the upper right edge. Next, draw another line from the lower right hand corner, to where the previous line met the left-hand dotted line; continue to draw lines in this way, alternating corners, until you have 667 triangles. A cursory examination will show that every side of every triangle is at least three in length.

The part I can't remember is how you then break down those triangles into 3 quadrilaterals each.

kurokotetsu
2013-03-07, 03:54 PM
Hmm... I don't think there's a solution that simple, and I recall this problems as having a lot of close-but-not-quites. Can you give me hard numbers on the heights of those vertical lines?

For the record, the "Official" solution (Which I only half-remember) goes like this:
Start by drawing two vertical dotted lines up from the base of the triangle, 3 and 6 units from the edge. Then, draw a solid line from the lower left corner to where the right-hand dotted line meets the upper right edge. Next, draw another line from the lower right hand corner, to where the previous line met the left-hand dotted line; continue to draw lines in this way, alternating corners, until you have 667 triangles. A cursory examination will show that every side of every triangle is at least three in length.

The part I can't remember is how you then break down those triangles into 3 quadrilaterals each.


Three trapezoids each; that can be done as I said in a previous post, and those trapezoids could be shown to have the right dimensions.

But Douglas' answer still seems right to me. Let's say the upper side of each trapezoid is 3 and the lower 5. By construction we know that each "left over" bit is 1 in length. By drawing a vertical line, it can be shown via the triangle inequality that each of the slanted sides of the trapezoids is at least 1 (as one side of the triangle is 1, the slanted side is over 1, no matter how "flat" the trapezoids get; 1 is the limit, where they would stop being trapezoids and become flat lines), so all the central trapezoids can be shown to have length at least 1 on every side. The side quadrilaterals follow a similar argument: the lengths of the lines that connect to the vertices all decrease, but for the same reason they tend to 1 while always staying larger. As all the quadrilaterals share sides, all sides are over 1. I wonder if I explained myself clearly.

FreakyCheeseMan
2013-03-07, 04:15 PM
Three trapezoids each; that can be done as I said in a previous post, and those trapezoids could be shown to have the right dimensions.

But Douglas' answer still seems right to me. Let's say the upper side of each trapezoid is 3 and the lower 5. By construction we know that each "left over" bit is 1 in length. By drawing a vertical line, it can be shown via the triangle inequality that each of the slanted sides of the trapezoids is at least 1 (as one side of the triangle is 1, the slanted side is over 1, no matter how "flat" the trapezoids get; 1 is the limit, where they would stop being trapezoids and become flat lines), so all the central trapezoids can be shown to have length at least 1 on every side. The side quadrilaterals follow a similar argument: the lengths of the lines that connect to the vertices all decrease, but for the same reason they tend to 1 while always staying larger. As all the quadrilaterals share sides, all sides are over 1. I wonder if I explained myself clearly.

It definitely works in terms of length, but I'm not convinced about them all being convex. Been doing some back-of-the-envelope calculations on the limits of the sizes, but my brain is all meh and lazy.

Honestly, the reason I doubt it is because I know that a lot of people who were much, much smarter than me got stumped by it for weeks, and they would have found a "simple" solution just by fooling around; also, when I was working on it, I ran into any number of things that "Seemed" like they should work, but did not. Not exactly a rigorous proof, but enough to make me want to check the hard numbers.

Douglas
2013-03-07, 04:34 PM
Argument for convexity:
The top part is obvious, and so are all the trapezoids. The top 2 corners and bottom corner of each diamond shard are also obvious. The only corner that needs more careful attention is the middle corner of each diamond shard.

Looking at just the ones on the left, that corner being convex is equivalent to the side right of the corner being steeper than the side left of the corner. If you don't care about side lengths, this is trivially easy to satisfy. The only problem is ensuring that this corner can be convex without breaking the length requirement for the top and right sides of the shard. Given the vertical alignment of the corners and the triangle inequality, the right side must be longer than the top side, so it is sufficient to show that the top side is long enough.

Slope of the top line is B, slope of bottom line is A. At horizontal distance 1 from the triangle corner, which is where these lines meet the corners, they are at heights B and A. The right side of the shard must have slope greater than A, yet take at least 1 horizontal unit to cover the vertical distance B - A.

B - A (vertical distance) > 1 * A (horizontal distance times slope)
B - A > A
B > 2A

Set B to double A plus an arbitrarily small difference, and there is room for the corner to be convex while still maintaining length requirements. It is purely a ratio of vertical factors, so it can be compressed vertically without limit.
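For what it's worth, the slope condition can be checked numerically. This is only a sketch under my reading of the construction: each line's slope is (2 + ε) times the one below it, each shard's middle corner sits 1 horizontal unit from the triangle corner, and the shard's right side spans 1 horizontal unit; those coordinates are my assumptions, not stated in the post.

```python
import math

eps = 0.01
s = [0.001]                      # slope of the lowest line (arbitrary, small)
for _ in range(700):             # enough lines for 667 trapezoid/shard pairs
    s.append(s[-1] * (2 + eps))  # each slope slightly more than double the last

for a, b in zip(s, s[1:]):
    right_slope = (b - a) / 1.0          # right side covers B - A over 1 unit
    assert right_slope > a               # middle corner is convex
    top_len = math.hypot(1.0, b - a)     # top side spans 1 horizontal unit
    assert top_len >= 1.0                # so it is automatically long enough

# Scaling every slope by a constant preserves B > 2A, so the whole array
# can be compressed vertically as far as needed.
assert all(b > 2 * a for a, b in zip(s, s[1:]))
```

Since B > 2A is a pure ratio, the check passes no matter what the starting slope is, which matches the "compress vertically without limit" step.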

FreakyCheeseMan
2013-03-07, 04:51 PM
...yeah, I think that works. Huh.

And I think I'm out of puzzles.

Douglas
2013-03-07, 05:29 PM
Have another hat puzzle, then.
Three people will each have a hat placed on them from behind. Each hat is either black or white with equal probability. Each person can see the other two hats but not his own. All three people must simultaneously either guess their own hat color or decline to guess. In order to win, there must be at least one correct guess and zero incorrect guesses.

The three people cannot communicate once the hats are placed, but are allowed to strategize beforehand. What are the best odds they can get, and how? Hint: 50% by having a single designated guesser is not the answer.

FreakyCheeseMan
2013-03-07, 05:35 PM
Have another hat puzzle, then.
Three people will each have a hat placed on them from behind. Each hat is either black or white with equal probability. Each person can see the other two hats but not his own. All three people must simultaneously either guess their own hat color or decline to guess. In order to win, there must be at least one correct guess and zero incorrect guesses.

The three people cannot communicate once the hats are placed, but are allowed to strategize beforehand. What are the best odds they can get, and how? Hint: 50% by having a single designated guesser is not the answer.

Probabilities:
WWW - 1/8
WWB - 3/8
WBB - 3/8
BBB - 1/8

If you see two the same, guess the opposite; if you see two different, keep quiet. 75% of games will be won.
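For what it's worth, the 75% claim can be brute-forced over all eight hat assignments (a quick Python sketch; 0 and 1 stand for the two colors):

```python
from itertools import product

def play(hats):
    # Strategy: if you see two matching hats, guess the opposite color;
    # if you see two different hats, stay silent.
    guesses = []
    for i in range(3):
        a, b = (hats[j] for j in range(3) if j != i)
        if a == b:
            guesses.append((i, 1 - a))
    # Win: at least one guess made, and every guess correct.
    return bool(guesses) and all(hats[i] == g for i, g in guesses)

wins = sum(play(h) for h in product((0, 1), repeat=3))
print(wins, "of 8")  # prints: 6 of 8
```

The strategy loses exactly the two all-same assignments (where all three guess, all wrongly) and wins the six mixed ones, so 6/8 = 75%.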

FreakyCheeseMan
2013-03-07, 05:43 PM
Oh, came up with another one.

Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.

DaedalusMkV
2013-03-07, 07:49 PM
Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.

Depends on if the truth and lie brothers can accurately predict the random brother's choices. If they cannot, we simply ask each brother in turn which path their brothers would say leads to victory; truth brother will answer with the wrong path and "I don't know", lie brother will answer with the wrong path and some sort of gibberish answer because he cannot tell a foolproof lie about the random brother ("I don't know" is the truth, but selecting either path at random also has a 50% chance of telling the truth, violating his always lies principle). Random brother answers a random path both times, since whether he lies or tells the truth his brothers would always select a path. You select the opposite path of the one selected by whichever brother says "I don't know" or gives a gibberish answer in response to one brother's statement. This is basically one question asked to each brother, but you only need to ask until you get one "I don't know" or gibberish response, at which point you have verified the truth or lie brother respectively. Minimum questions asked is between 1 and 2, with only one question necessary unless you wind up asking the random brother the first question in which case you will need to ask one of the other brothers.

If the truth and lie brothers can accurately predict the random brother's choices, things get a little bit more difficult. Using the above method gets you no information whatsoever. Here, we ask them in order to state which path the random brother will say leads to victory. Whichever of the three answers disagrees with the other two is the liar (whether the random brother tells the truth or lies about his answer is irrelevant, because the truth and lie brothers must have known in advance which he would pick and will pick the same and the opposite answer he does, respectively); we can then ask him which path leads to victory and take the opposite of his answer. Here we must ask four questions, since we cannot accurately determine which brother is which until all brothers have been questioned, and we cannot gain any relevant information on the paths until we have determined the role of at least one brother.

Those are the best answers I can come up with, anyways.

kurokotetsu
2013-03-07, 09:47 PM
Oh, came up with another one.

Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.

My first answer yields a number of three at most (the "What would your brothers say?" approach), but maybe there is a more efficient one.

thubby
2013-03-07, 11:32 PM
Oh, came up with another one.

Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.


2
"what would your non-random brother(s) say"
automatically identifies the random brother

then ask it again of one of the other 2 and go the other direction.

Quantum Glass
2013-03-08, 01:38 AM
Oh, came up with another one.

Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.

One.

Indicating a specific path as "Path A," you walk up to any brother and ask, "Is exactly one of the following statements true: 'Your answer to this question will be a lie'; 'Path A will lead me to victory'?"

This creates a scenario where, regardless of the brother you choose, you'll always get the same answer. If the brother answers yes, Path A is the safe one. If he answers no, Path B.
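The question can be checked with a four-row truth table (a Python sketch; it assumes the brother evaluates the whole question either entirely honestly or entirely dishonestly):

```python
def answer(a_safe, lying):
    # The question: "Is exactly one of these true: (a) your answer to this
    # question will be a lie; (b) Path A leads to victory?"
    exactly_one = lying != a_safe   # XOR of (a) and (b)
    # A truthful brother reports the statement; a lying brother negates it.
    return exactly_one if not lying else not exactly_one

# Whatever the brother's disposition, "yes" means Path A is safe:
for a_safe in (True, False):
    for lying in (True, False):
        assert answer(a_safe, lying) == a_safe
```

The liar's negation cancels his own contribution to the XOR, which is why both dispositions give the same answer.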

FreakyCheeseMan
2013-03-08, 01:51 AM
One.

Indicating a specific path as "Path A," you walk up to any brother and ask, "Is exactly one of the following statements true: 'Your answer to this question will be a lie'; 'Path A will lead me to victory'?"

This creates a scenario where, regardless of the brother you choose, you'll always get the same answer. If the brother answers yes, Path A is the safe one. If he answers no, Path B.

Huh. Well done, so long as the random brother is entirely honest or dishonest within the scope of a single question.

bookguy
2013-03-08, 11:35 PM
Here's my favorite logic puzzle (not too hard in my opinion, but then again I found the guru problem obvious):

5 Couples
Joe and his wife go to a party. The only other people to attend are 4 other couples. All of the couples arrive simultaneously. When they arrive, everyone shakes hands with everyone they don't know. Then, Joe asks each of the 9 other partygoers how many hands they shook. Each one (truthfully) gives a different answer, from 0 to 8. How many hands did Joe's wife shake?

Douglas
2013-03-09, 01:00 AM
Here's my favorite logic puzzle (not too hard in my opinion, but then again I found the guru problem obvious):

5 Couples
Joe and his wife go to a party. The only other people to attend are 4 other couples. All of the couples arrive simultaneously. When they arrive, everyone shakes hands with everyone they don't know. Then, Joe asks each of the 9 other partygoers how many hands they shook. Each one (truthfully) gives a different answer, from 0 to 8. How many hands did Joe's wife shake?
4

The person who shook 8 hands must have shaken hands with everyone except his or her spouse. This person cannot be Joe's wife, because then she would have shaken hands with everyone else Joe asked and no one could have answered 0. The only person who could have shaken 0 hands must be this person's spouse.

The person who shook 7 hands must have shaken hands with everyone except his or her spouse and the 0 hands person. This person cannot be Joe's wife, because then she would have shaken hands with everyone else Joe asked except the 0 hands person, and in combination with the 8 hands person this means no one could have answered 1. The 1 hand person must be the 7 hands person's spouse.

Similar logic applies for 6 and 2, and 5 and 3. That leaves 4 as the only possibility for how many hands Joe's wife shook.

Incidentally, Joe also shook precisely 4 hands. More specifically, both he and his wife shook the hands of the people who shook 8, 7, 6, and 5 hands. They apparently do a lot of things together, because they knew the exact same set of people present. The other couples do not, to varying degrees. Joe and his wife know precisely one person in each couple.
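For what it's worth, the deduced configuration can be written out and checked directly. This is only a sketch: the concrete rule "two labeled guests shook hands iff their answers sum to more than 8" is my reconstruction of the arrangement, not something stated in the puzzle.

```python
# Guests labeled by their answers 0..8, plus Joe ("J"); Joe's wife answered 4.
people = list(range(9)) + ["J"]

def shook(a, b):
    # Joe (and, it turns out, his wife, label 4) shook hands with 5, 6, 7, 8.
    if "J" in (a, b):
        other = a if b == "J" else b
        return other >= 5
    # Labeled guests shook hands iff their answers sum to more than 8;
    # spouses' answers sum to exactly 8, so spouses never shook hands.
    return a + b > 8

degree = {p: sum(shook(p, q) for q in people if q != p) for p in people}
assert all(degree[k] == k for k in range(9))  # everyone's count matches
assert degree["J"] == degree[4] == 4          # Joe and his wife both shook 4
```

The assertions pass, confirming that the answers 0 through 8 and Joe's (and his wife's) count of 4 are mutually consistent.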

FreakyCheeseMan
2013-03-09, 01:38 PM
Here's my favorite logic puzzle (not too hard in my opinion, but then again I found the guru problem obvious):

5 Couples
Joe and his wife go to a party. The only other people to attend are 4 other couples. All of the couples arrive simultaneously. When they arrive, everyone shakes hands with everyone they don't know. Then, Joe asks each of the 9 other partygoers how many hands they shook. Each one (truthfully) gives a different answer, from 0 to 8. How many hands did Joe's wife shake?

Easy. 4. 0 and 8 have to be married to each other; apart from those two, 7 had to shake hands with everyone they weren't married to except 0; as 1 didn't shake hands with anyone but 8, 1 and 7 are married. Work your way along those lines: each person is married to 8 minus their number, the higher numbers shook hands with Joe, the lower ones didn't, and 4 is married to him.

FreakyCheeseMan
2013-03-09, 02:54 PM
This one is kind of complicated, but let's see if anyone can get it.

Poison Brownies
"Poison Brownies" is a two player game played over a grid of brownies; the brownie in the lower left is poisoned. On each turn, every player must select one brownie; they eat that brownie, as well as everything above and to the right of it (so, eating the poison brownie would mean eating the rest of the board, as well.) Whoever eats the poison brownie loses.

On an MxN board, assuming both players play perfectly, who will win - the player who goes first, or the player who goes second? (Feel free to give your answer as a function of M and N.) Proof required for credit.

DaedalusMkV
2013-03-09, 03:11 PM
This one is kind of complicated, but let's see if anyone can get it.

Poison Brownies
"Poison Brownies" is a two player game played over a grid of brownies; the brownie in the lower left is poisoned. On each turn, every player must select one brownie; they eat that brownie, as well as everything above and to the right of it (so, eating the poison brownie would mean eating the rest of the board, as well.) Whoever eats the poison brownie loses.

On an MxN board, assuming both players play perfectly, who will win - the player who goes first, or the player who goes second? (Feel free to give your answer as a function of M and N.) Proof required for credit.


The first player will always win so long as the starting 'grid' is not made up of only a single poisoned brownie. If M and N are both greater than 1, he selects the brownie immediately above and to the right of the poison brownie, leaving two rows of brownies extending to the right and left of it. He then duplicates whatever move player 2 makes, leaving the opposite row with the same number of brownies remaining as the row player 2 picked from. Regardless of how many brownies player 2 takes and from which rows, all non-poison brownies will be eaten after player 2 takes the last non-poisoned brownie from one of the rows, leaving player 2 to take the poisoned one, since whichever row player 2 picks from, player 1 always has more remaining brownies in the opposite row.

If M or N is 1, player 1 has an easier time of it, since he can simply pick all but the poisoned brownie in the first move.

Player 2 can only win this 'game' if player 1 does not have any non-poisoned brownies to pick on his first move of the game.

Douglas
2013-03-09, 03:48 PM
The first player will always win so long as the starting 'grid' is not made up of only a single poisoned brownie. If M and N are both greater than 1, he selects the brownie immediately above and to the right of the poison brownie, leaving two rows of brownies extending to the right and left of it. He then duplicates whatever move player 2 makes, leaving the opposite row with the same number of brownies remaining as the row player 2 picked from. Regardless of how many brownies player 2 takes and from which rows, all non-poison brownies will be eaten after player 2 takes the last non-poisoned brownie from one of the rows, leaving player 2 to take the poisoned one, since whichever row player 2 picks from, player 1 always has more remaining brownies in the opposite row.

If M or N is 1, player 1 has an easier time of it, since he can simply pick all but the poisoned brownie in the first move.

Player 2 can only win this 'game' if player 1 does not have any non-poisoned brownies to pick on his first move of the game.

Not quite.
Your strategy assumes that M and N are equal. If they are not, player 2 can effectively swap the roles in your strategy on his first move by making them equal.

Capt Spanner
2013-03-09, 08:15 PM
This one is kind of complicated, but let's see if anyone can get it.

Poison Brownies
"Poison Brownies" is a two player game played over a grid of brownies; the brownie in the lower left is poisoned. On each turn, every player must select one brownie; they eat that brownie, as well as everything above and to the right of it (so, eating the poison brownie would mean eating the rest of the board, as well.) Whoever eats the poison brownie loses.

On an MxN board, assuming both players play perfectly, who will win - the player who goes first, or the player who goes second? (Feel free to give your answer as a function of M and N.) Proof required for credit.

Solution:

The first player wins. Let the poison brownie be in position [1,1] of the MxN grid.

The first player chooses brownie [2,2]. This leaves three brownies: [1,1], [2,1] and [1,2].

The second player chooses one of [1,2] or [2,1] leaving two brownies.

The first player chooses the non-poisoned brownie and the second player must choose the poisoned brownie and dies.

The first player dies of a sugar overdose a few days later.

Advanced Pirating Skills

Much like the pirates from earlier, the pirates on the ship divide up loot as follows:
1. The highest ranking pirate proposes a way to split the loot.
2. All the pirates (including the highest ranking one) vote on whether or not to accept the split.
3. If the majority vote in favour or the vote is split the loot is divided as proposed. Otherwise the highest ranking pirate is killed and all the pirates move up one place in the ranking and the process is repeated.

Assume that:
1. The pirates are perfect logicians.
2. The pirates have the following priorities, from most important to least important:
2a) Survival (i.e. don't get killed);
2b) Loot, get as much of it as possible without getting killed;
2c) Blood, kill other pirates as long as it does not get you killed and as long as you don't lose loot over it.

The scenario:
There are six pirates and a single gold piece. How is the gold split up?

FreakyCheeseMan
2013-03-09, 08:55 PM
Solution:

The first player wins. Let the poison brownie be in position [1,1] of the MxN grid.

The first player chooses brownie [2,2]. This leaves three brownies: [1,1], [2,1] and [1,2].

The second player chooses one of [1,2] or [2,1] leaving two brownies.

The first player chooses the non-poisoned brownie and the second player must choose the poisoned brownie and dies.

The first player dies of a sugar overdose a few days later.

That only works on a 2x2 grid.



Advanced Pirating Skills

Much like the pirates from earlier, the pirates on the ship divide up loot as follows:
1. The highest ranking pirate proposes a way to split the loot.
2. All the pirates (including the highest ranking one) vote on whether or not to accept the split.
3. If the majority vote in favour or the vote is split the loot is divided as proposed. Otherwise the highest ranking pirate is killed and all the pirates move up one place in the ranking and the process is repeated.

Assume that:
1. The pirates are perfect logicians.
2. The pirates have the following priorities, from most important to least important:
2a) Survival (i.e. don't get killed);
2b) Loot, get as much of it as possible without getting killed;
2c) Blood, kill other pirates as long as it does not get you killed and as long as you don't lose loot over it.

The scenario:
There are six pirates and a single gold piece. How is the gold split up?


Same method as before. G means the pirate gets gold, D means the pirate gets dead, numbers mean the pirate gets to see that many other pirates dead. Each row represents the results after a vote, not the proposal put forth.
For fun, I computed the result if "Ties go to murder"; the real chart is next.
{table] 1 | 2 | 3 | 4 | 5 | 6 | Explanation
| | | | | G | One pirate, gives the gold to himself.
| | | | D | G | No way to convince the second pirate, first pirate dies.
| | | G | 0 | 0 | Second pirate doesn't wanna die, votes yes.
| | D | G | 1 | 1 | Needs to convince two other people, can't.
| 0 | 0 | 0 | 0 | G | Second pirate will vote in favor in order to live, last one can be bribed.
D | 1 | 1 | 1 | 1 | G | Need four votes, no one's in danger of dying, only one gold piece to give. [/table]

Actually, either of the last two could end up with the gold, depending on whom the first mate decided to bribe. Now the real one.

{table] 1 | 2 | 3 | 4 | 5 | 6 | Explanation
| | | | | G | One pirate, gets the gold
| | | | G | 0 | 1-1 tie, first pirate gets gold
| | | 0 | 0 | G | 2-1, first pirate bribes third.
| | 0 | G | 0 | 0 | 2-2 ties, first pirate bribes second (or third)
| D | 1 | G | 1 | 1 | 3-2, can't buy enough votes, pirates vote for blood.
0 | 0 | G | 0 | 0 | 0 | 3-3 tie, second pirate doesn't want to buy, third one can be bought. [/table]

Any of the latter four pirates can end up with the gold, depending on who bribes whom.
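
Tables like these, and the original ten-pirate puzzle at the top of the thread, are just backward induction. Here is a minimal Java sketch of that induction for the simpler variant only (ties pass, pirates care purely about gold, a pirate votes yes only when offered strictly more than his continuation value, and the proposer buys the cheapest votes); all names are mine, and the blood-lust priority from the six-pirate variant is deliberately not modelled.

```java
import java.util.*;

public class Pirates {
    // split(n, g): proposed division for n pirates and g gold under
    // backward induction. Index 0 is the current proposer (the captain);
    // index i is the pirate i ranks below him. Assumes g is large enough
    // to buy the needed votes; ties among equally cheap pirates are
    // broken arbitrarily by the sort.
    static int[] split(int n, int g) {
        int[] prev = {g};                      // one pirate keeps everything
        for (int m = 2; m <= n; m++) {
            final int[] cont = prev;           // payoffs if the proposer dies
            Integer[] idx = new Integer[m - 1];
            for (int i = 0; i < m - 1; i++) idx[i] = i;
            Arrays.sort(idx, Comparator.comparingInt(a -> cont[a]));
            int[] cur = new int[m];
            int needed = (m + 1) / 2 - 1;      // yes-votes to buy beyond his own
            int spent = 0;
            for (int k = 0; k < needed; k++) {
                cur[idx[k] + 1] = cont[idx[k]] + 1;  // 1 more than continuation
                spent += cont[idx[k]] + 1;
            }
            cur[0] = g - spent;                // proposer keeps the rest
            prev = cur;
        }
        return prev;
    }

    public static void main(String[] args) {
        // Classic result for the opening puzzle: the captain keeps 96
        // and pays 1 gold each to the 2nd, 4th, 6th and 8th mates.
        System.out.println(Arrays.toString(split(10, 100)));
    }
}
```

For ten pirates and one hundred gold pieces this reproduces the standard answer [96, 0, 1, 0, 1, 0, 1, 0, 1, 0].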

Capt Spanner
2013-03-09, 09:31 PM
That only works on a 2x2 grid.


... D'oh! You're right.





Same method as before. G means the pirate gets gold, D means the pirate gets dead, numbers mean the pirate gets to see that many other pirates dead. Each row represents the results after a vote, not the proposal put forth.
For fun, I computed the result if "Ties go to murder"; the real chart is next.
{table] 1 | 2 | 3 | 4 | 5 | 6 | Explanation
| | | | | G | One pirate, gives the gold to himself.
| | | | D | G | No way to convince the second pirate, first pirate dies.
| | | G | 0 | 0 | Second pirate doesn't wanna die, votes yes.
| | D | G | 1 | 1 | Needs to convince two other people, can't.
| 0 | 0 | 0 | 0 | G | Second pirate will vote in favor in order to live, last one can be bribed.
D | 1 | 1 | 1 | 1 | G | Need four votes, no one's in danger of dying, only one gold piece to give. [/table]

Actually, either of the last two could end up with the gold, depending on whom the first mate decided to bribe. Now the real one.

{table] 1 | 2 | 3 | 4 | 5 | 6 | Explanation
| | | | | G | One pirate, gets the gold
| | | | G | 0 | 1-1 tie, first pirate gets gold
| | | 0 | 0 | G | 2-1, first pirate bribes third.
| | 0 | G | 0 | 0 | 2-2 ties, first pirate bribes second (or third)
| D | 1 | G | 1 | 1 | 3-2, can't buy enough votes, pirates vote for blood.
0 | 0 | G | 0 | 0 | 0 | 3-3 tie, second pirate doesn't want to buy, third one can be bought. [/table]

Any of the latter four pirates can end up with the gold, depending on who bribes whom.

Correct.

FreakyCheeseMan
2013-03-09, 10:44 PM
Poison Brownies Solution:
In all but the 1x1 case, the first player wins.

Proof by contradiction: Suppose that the second player had a winning strategy for some game; that would mean that, no matter what move the first player made, the second player would have a response that would put them into a winning position.

The first player takes the single brownie in the top-right corner. The second player then makes a move that puts them in the winning position. As any move involves eating the top-right brownie (if it is still there), this winning position is indistinguishable from the one that would have arisen had the first player made that move to begin with. Thus, if the first player had made that move as his opening move, he would be in a winning position, so a winning strategy cannot exist for the second player.

Quantum Glass
2013-03-10, 01:03 AM
Poison Brownies Solution:
In all but the 1x1 case, the first player wins.

Proof by contradiction: Suppose that the second player had a winning strategy for some game; that would mean that, no matter what move the first player made, the second player would have a response that would put them into a winning position.

The first player takes the single brownie in the top-right corner. The second player then makes a move that puts them in the winning position. As any move involves eating the top-right brownie (if it is still there), this winning position is indistinguishable from the one that would have arisen had the first player made that move to begin with. Thus, if the first player had made that move as his opening move, he would be in a winning position, so a winning strategy cannot exist for the second player.

I'm suspicious of the assertion, "If there's no winning strategy for the second player, the first player always wins."

It seems like it'd be more dependent on the number of brownies than the play order--assume that the first player takes the brownie directly above and to the right of the poisoned brownie, leaving a sort of, "L" shape. Assuming both M and N are greater than one, if the second player takes the furthest, say, right brownie, then it'd be no different than if the second player had started the game and N had been, "N-1."

I'm too tired to work it out, though.

Alternatively, if the first player chooses the poisoned brownie, he'll probably survive, if only because after eating that many brownies he'd likely throw up. Even if he didn't, he'd probably welcome death at that point.

FreakyCheeseMan
2013-03-10, 02:06 AM
I'm suspicious of the assertion, "If there's no winning strategy for the second player, the first player always wins."

It seems like it'd be more dependent on the number of brownies than the play order--assume that the first player takes the brownie directly above and to the right of the poisoned brownie, leaving a sort of, "L" shape. Assuming both M and N are greater than one, if the second player takes the furthest, say, right brownie, then it'd be no different than if the second player had started the game and N had been, "N-1."

I'm too tired to work it out, though.

Alternatively, if the first player chooses the poisoned brownie, he'll probably survive, if only because after eating that many brownies he'd likely throw up. Even if he didn't, he'd probably welcome death at that point.

I didn't prove that assertion, but it is true - it's one of the basic assertions of Combinatorial Game Theory. (We are assuming optimal play. If the first player makes mistakes, all bets are off.)

Proof:
Some games are clearly "Winning Positions" for the player who is about to take their turn; we'll call these N Positions, because the "Next" player will win. For instance, having only two brownies remaining is an N Position; the next player will take the non-poisoned one, and the player after that will take the other. There is also clearly such a thing as P Positions, from which the previous player will win - for instance, there being only a single brownie, or an L-shaped position whose row and column are of equal length.

By these definitions, a few things arise trivially. First, if it is possible to move to a P Position, then you are in an N Position - you can take that choice, at which point you become the Previous Player, and are winning. Second, if all of your options move you to N Positions, you must be in a P Position - no matter what you do, after your turn, the Next player (currently the Previous Player) will be winning.

It is sufficient to prove that any possible state of the game is either an N Position or a P Position. We'll do this recursively, based on the maximum number of moves remaining in the game; as every player must take at least one brownie per turn, this number must be finite.

The base case is a single poisoned brownie, which is clearly a P Position. Now consider the case of any tray with multiple brownies remaining; the current player's options all reduce this to a board with fewer brownies, which, by the recursive property, will be either N or P Positions. By the above statements, if there is a single move that will put the game into a P Position, the current game is an N Position. If not, then all of the options lead to N Positions, and the current game is a P Position.

Thus, all games are either P Positions or N Positions. This implies that there must be a winning strategy for one player or the other; if the starting board is an N Position, it exists for player one, if the starting board is a P Position, it exists for player two.

As for your specific assumptions, they work out if the first player takes the brownie above and to the right - in fact, for games where N does not equal M, such a move will leave Player 2 with a winning strategy. However, there is no reason to assume that Player 1 does that.
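
The N/P recursion above translates directly into a memoised search. Below is a Java sketch (names mine; this brownie game is usually called Chomp) that labels a position N exactly when some bite leads to a P position, and can be used to confirm that every board except 1x1 is a first-player win on small boards:

```java
import java.util.*;

public class Chomp {
    static Map<List<Integer>, Boolean> memo = new HashMap<>();

    // rows.get(i) = brownies left in row i, bottom row first, always
    // non-increasing going up; the poison brownie is at row 0, column 0.
    // Returns true when the position is an N Position (next player wins).
    static boolean nextPlayerWins(List<Integer> rows) {
        Boolean cached = memo.get(rows);
        if (cached != null) return cached;
        boolean win = false;
        // Try every non-poison bite (i, j); a bite truncates row i and
        // every row above it to length j.
        for (int i = 0; i < rows.size() && !win; i++) {
            for (int j = (i == 0 ? 1 : 0); j < rows.get(i); j++) {
                List<Integer> next = new ArrayList<>(rows);
                for (int k = i; k < next.size(); k++)
                    next.set(k, Math.min(next.get(k), j));
                while (!next.isEmpty() && next.get(next.size() - 1) == 0)
                    next.remove(next.size() - 1);      // drop emptied rows
                if (!nextPlayerWins(next)) { win = true; break; }
            }
        }
        memo.put(rows, win);
        return win;
    }

    static boolean firstPlayerWins(int m, int n) {
        List<Integer> full = new ArrayList<>();
        for (int i = 0; i < m; i++) full.add(n);
        return nextPlayerWins(full);
    }

    public static void main(String[] args) {
        for (int m = 1; m <= 4; m++)
            for (int n = 1; n <= 4; n++)
                System.out.println(m + "x" + n + ": first player wins = "
                        + firstPlayerWins(m, n));
    }
}
```

Only the 1x1 board comes out as a second-player win, consistent with the strategy-stealing argument; the search also tells you which opening bites actually win, which the argument alone does not.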

Jay R
2013-03-10, 12:06 PM
It's an iterative problem. Assume N >= M

1. If M = N, then eat the (2,2) brownie. You are now left with two equal legs. Whatever he does to one, you do to the other, and he will be left with the last brownie.

2. If M =/= N, then either making M = N (by eating the (1,M+1) brownie) or eating the (2,2) brownie is a losing first move, since in either case, the other player can return two equal legs to you.

3. Result 2 is still true even in the middle of the game, with any number of bites taken, as long as M =/= N and the (2,2) brownie still exists. Therefore, in a starting game with M =/= N, whoever either evens the sides or eats the (2,2) brownie loses.

From there I'm not sure how to generalize. I've compiled all the winning and losing positions for games up to 3x5, but I'm not finding the pattern.

Teddy
2013-03-10, 12:08 PM
Advanced Pirating Skills

Much like the pirates from earlier, the pirates on the ship divide up loot as follows:
1. The highest ranking pirate proposes a way to split the loot.
2. All the pirates (including the highest ranking one) vote on whether or not to accept the split.
3. If the majority vote in favour or the vote is split the loot is divided as proposed. Otherwise the highest ranking pirate is killed and all the pirates move up one place in the ranking and the process is repeated.

Assume that:
1. The pirates are perfect logicians.
2. The pirates have the following priorities, from most important to least important:
2a) Survival (i.e. don't get killed);
2b) Loot, get as much of it as possible without getting killed;
2c) Blood, kill other pirates as long as it does not get you killed and as long as you don't lose loot over it.

The scenario:
There are six pirates and a single gold piece. How is the gold split up?


The third pirate can always win a vote by giving his coin to any one of the fourth, fifth or sixth pirates, because 2 out of 4 is a split vote, which passes. This means that only the first and second pirates run the risk of getting killed.

The second pirate could buy a vote with the gold, but this wouldn't save him anyway, because 2 out of 5 is a minority, and none of those below him run the risk of getting killed. The only way for him to survive is to vote for the first pirate and hope he wins.

The first pirate (the captain) already has 2 votes secured, one from himself and one from the second pirate. By buying himself a vote from any one of the other 3 (it doesn't matter whom) with the single gold piece, he gets 3 out of 6 votes, and wins.


In case a true majority is needed, the captain will be killed, and the second pirate will give the gold piece to any one of the fourth, fifth or sixth pirates, counting the fear-vote from the third pirate as his as well.

kurokotetsu
2013-03-10, 02:51 PM
This one is kind of complicated, but let's see if anyone can get it.

Poison Brownies
"Poison Brownies" is a two player game played over a grid of brownies; the brownie in the lower left is poisoned. On each turn, every player must select one brownie; they eat that brownie, as well as everything above and to the right of it (so, eating the poison brownie would mean eating the rest of the board, as well.) Whoever eats the poison brownie loses.

On an MxN board, assuming both players play perfectly, who will win - the player who goes first, or the player who goes second? (Feel free to give your answer as a function of M and N.) Proof required for credit.
The first player always wins.
If the matrix is such that M=N, then just eat the (2,2) brownie first, taking the poison one to be (1,1), and that guarantees victory.

If M=/=N, then suppose N>M for simplicity (the solutions are symmetrical, as eating the upper brownies of a matrix is the same as eating the right brownies of the transposed matrix). We can easily see that the first player wins in the 2x3 case. Now use complete induction (to simplify this non-rigorous proof): take the MxN matrix and assume all smaller cases are won by the first player. The first player eats the top-most brownie (M,N). Whatever brownie the second player eats next, the resulting board is one the first player could have reached with a single opening bite of his own (any bite removes the top-most brownie anyway), and by the induction hypothesis those smaller boards are wins for the player who produced them; hence the first player wins.

Another possible strategy is to show that the first player wins on an Nx(N+1) matrix. The first move on an MxN board would then be to eat brownies so that the resulting board matches the position after the optimal move on an Nx(N+1) matrix. Once reduced to that, the first player wins.

That isn't a very rigorous proof, and the "official solution" (with the backing theorem) is probably stronger but that is the solution I came up with.

And allow me to do a generalization of the problem (I haven't got the solution to what I'm proposing but it may be an interesting idea)

Extended Poison Brownies
"Poison Brownies" is a two player game played over a grid of brownies; the brownie in the lower left is poisoned. On each turn, every player must select one brownie; they eat that brownie, as well as everything above and to the right of it (so, eating the poison brownie would mean eating the rest of the board, as well.) Whoever eats the poison brownie loses.

On an MxN board, assuming both players play perfectly, who will win - the player who goes first, or the player who goes second? (Feel free to give your answer as a function of M and N.) Proof required for credit.

After that game consider the following:

a) There are n different grids of brownies, each of which is an Mn x Nn board. If the first grid contains the poison brownie, then who wins?

b) The same as a), but the poison brownie is in an unknown grid. Who wins?

My thought (or my ideas of solution without rigorous proof, please feel free to find flaws, or even better come up with your own):

a) If n is even, then the second player wins, as he can reduce the game to one where he is the first player in the n-1 case. If n is odd, then the first player wins (he reduces the game to n-1, where he is the second player).

b) There is no winning strategy. The grids will all be reduced to 1x1 grids, and then it is only luck.

FreakyCheeseMan
2013-03-12, 09:29 AM
As described, these games are equivalent to "1 MxN grid with the lower left brownie missing, plus a number of other grids of varying sizes; first person without a legal move loses." That might make it easier to work with...

I'm trying to remember the rules for adding combinatorial games together.

Astral Avenger
2013-03-14, 08:24 PM
I need to make some brownies now...

Going with the logic puzzle aspect of this thread:
(don't look at the second one until you have decided for the first :smallamused:)
Does a list of all lists that do not contain themselves contain itself?

Does a list of all and only lists that do not contain themselves contain itself?

Pie Guy
2013-03-14, 09:33 PM
Going with the logic puzzle aspect of this thread:
(don't look at the second one until you have decided for the first :smallamused:)
Does a list of all lists that do not contain themselves contain itself?

Does a list of all and only lists that do not contain themselves contain itself?

You've disproved math. High five.

kurokotetsu
2013-03-14, 09:39 PM
I need to make some brownies now...

Going with the logic puzzle aspect of this thread:
(don't look at the second one until you have decided for the first :smallamused:)
Does a list of all lists that do not contain themselves contain itself?

Does a list of all and only lists that do not contain themselves contain itself?

That is also math, my dear friend, or isn't Set Theory part of Math now?

First, there is no definite answer: this is Russell's Paradox, and phrased that way it has neither a positive nor a negative answer. The second seems identical. Unless you are going for the slight ambiguity of the first: since it doesn't say "only", it may also contain some lists that do contain themselves, but that seems tricky and not reliant on logic.

My Extended Brownies seem troublesome. I've done some more thinking since my initial idea, and it doesn't seem that what I thought is right (unless all the other grids are 1x1), but the possibility that a sub-optimal move in one grid may be optimal in the whole problem makes the problem really hard to analyze.

Also, I'll look for a puzzle that I remember having some fun with a while ago.


You've disproved math. High five. High five... a century late though.

Astral Avenger
2013-03-14, 10:33 PM
That is also math, my dear friend, or isn't Set Theory part of Math now?

First, there is no definite answer: this is Russell's Paradox, and phrased that way it has neither a positive nor a negative answer. The second seems identical. Unless you are going for the slight ambiguity of the first: since it doesn't say "only", it may also contain some lists that do contain themselves, but that seems tricky and not reliant on logic.
[snip]

High five... a century late though.

Semantics, you either love 'em or hate 'em, either way you got it.
The first one must contain itself, the second one is a paradox.

kurokotetsu
2013-03-15, 02:14 AM
Semantics, you either love 'em or hate 'em, either way you got it.
The first one must contain itself, the second one is a paradox. Sorry for sounding harsh. I don't hate them... but after working with sets for a while, the first wording just seems natural, and it feels kind of pointless to get caught up on semantics.

Here is a fun one. I remember doing it as a teen in high school. It isn't easy but, from what I remember, the key was being thorough, and checking you are not making any assumptions, so maybe some of the trickier ones posted here are harder. It usually comes with a "statistic" in the worst sense (a made up number I'm sure) but I won't insult anyone's intelligence by giving it credit.

Einstein's Riddle
1. In a street there are five houses, painted five different colours.
2. In each house lives a person of different nationality
3. These five homeowners each drink a different kind of beverage, smoke different brand of cigar and keep a different pet.

THE QUESTION: WHO OWNS THE FISH?

HINTS

1. The Brit lives in a red house.
2. The Swede keeps dogs as pets.
3. The Dane drinks tea.
4. The Green house is next to, and on the left of the White house.
5. The owner of the Green house drinks coffee.
6. The person who smokes Pall Mall rears birds.
7. The owner of the Yellow house smokes Dunhill.
8. The man living in the centre house drinks milk.
9. The Norwegian lives in the first house.
10. The man who smokes Blends lives next to the one who keeps cats.
11. The man who keeps horses lives next to the man who smokes Dunhill.
12. The man who smokes Blue Master drinks beer.
13. The German smokes Prince.
14. The Norwegian lives next to the blue house.
15. The man who smokes Blends has a neighbour who drinks water.
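
Anyone who would rather let the machine do the bookkeeping can brute-force the fifteen hints over permutations with early pruning. In this Java sketch (all constant, class and method names are mine), each array maps an attribute value to its house position, 0 through 4, left to right:

```java
import java.util.*;

public class Einstein {
    static final int BRIT = 0, SWEDE = 1, DANE = 2, NORWEGIAN = 3, GERMAN = 4;
    static final int RED = 0, GREEN = 1, WHITE = 2, YELLOW = 3, BLUE = 4;
    static final int TEA = 0, COFFEE = 1, MILK = 2, BEER = 3, WATER = 4;
    static final int PALLMALL = 0, DUNHILL = 1, BLENDS = 2, BLUEMASTER = 3, PRINCE = 4;
    static final int DOGS = 0, BIRDS = 1, CATS = 2, HORSES = 3, FISH = 4;

    static void permute(int[] a, int k, List<int[]> out) {
        if (k == a.length) { out.add(a.clone()); return; }
        for (int i = k; i < a.length; i++) {
            int t = a[k]; a[k] = a[i]; a[i] = t;
            permute(a, k + 1, out);
            t = a[k]; a[k] = a[i]; a[i] = t;
        }
    }

    // Each int[] maps a value to its house position (e.g. col[RED] is the
    // index of the red house). Hints are checked as early as possible so
    // that most permutations are pruned before the inner loops run.
    static String solve() {
        List<int[]> ps = new ArrayList<>();
        permute(new int[]{0, 1, 2, 3, 4}, 0, ps);
        for (int[] col : ps) {
            if (col[GREEN] + 1 != col[WHITE]) continue;                    // 4
            for (int[] nat : ps) {
                if (nat[NORWEGIAN] != 0) continue;                         // 9
                if (nat[BRIT] != col[RED]) continue;                       // 1
                if (Math.abs(nat[NORWEGIAN] - col[BLUE]) != 1) continue;   // 14
                for (int[] drink : ps) {
                    if (drink[MILK] != 2) continue;                        // 8
                    if (drink[TEA] != nat[DANE]) continue;                 // 3
                    if (drink[COFFEE] != col[GREEN]) continue;             // 5
                    for (int[] smoke : ps) {
                        if (smoke[DUNHILL] != col[YELLOW]) continue;       // 7
                        if (smoke[BLUEMASTER] != drink[BEER]) continue;    // 12
                        if (smoke[PRINCE] != nat[GERMAN]) continue;        // 13
                        if (Math.abs(smoke[BLENDS] - drink[WATER]) != 1) continue; // 15
                        for (int[] pet : ps) {
                            if (pet[DOGS] != nat[SWEDE]) continue;         // 2
                            if (pet[BIRDS] != smoke[PALLMALL]) continue;   // 6
                            if (Math.abs(smoke[BLENDS] - pet[CATS]) != 1) continue;    // 10
                            if (Math.abs(pet[HORSES] - smoke[DUNHILL]) != 1) continue; // 11
                            String[] names = {"Brit", "Swede", "Dane", "Norwegian", "German"};
                            for (int n = 0; n < names.length; n++)
                                if (nat[n] == pet[FISH]) return names[n];
                        }
                    }
                }
            }
        }
        return "no solution";
    }

    public static void main(String[] args) {
        System.out.println(solve() + " owns the fish");
    }
}
```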

Frozen_Feet
2013-03-15, 07:22 AM
Here is a collection of rather simple probability math problems, with real-life application! :smalltongue:

Assume that without any protection, each intercourse has an 8,5% chance of causing pregnancy.
With typical use of a condom, the chance is 2,1%. With perfect use, it is 0,5%.
With typical use of pills, the chance is 0,8%. With perfect use, it is 0,03%.

By which age is a woman 50% likely to have become pregnant at least once, if she starts her active sex life:

a) at 13 years of age.
b) at 15 years of age.
c) at 17 years of age.
d) at 19 years of age.
e) at 21 years of age.

And has sex:

a) once per six months.
b) once per month.
c) once per two weeks.
d) once per week.
e) once per day.

Using:

a) no protection.
b) condom only.
c) pill only.
d) both pill and condom. (assume multiplicative protection).

a) Typically.
b) Perfectly.

?

Teddy
2013-03-15, 08:28 AM
Einstein's Riddle
1. In a street there are five houses, painted five different colours.
2. In each house lives a person of different nationality
3. These five homeowners each drink a different kind of beverage, smoke different brand of cigar and keep a different pet.

THE QUESTION: WHO OWNS THE FISH?

HINTS

1. The Brit lives in a red house.
2. The Swede keeps dogs as pets.
3. The Dane drinks tea.
4. The Green house is next to, and on the left of the White house.
5. The owner of the Green house drinks coffee.
6. The person who smokes Pall Mall rears birds.
7. The owner of the Yellow house smokes Dunhill.
8. The man living in the centre house drinks milk.
9. The Norwegian lives in the first house.
10. The man who smokes Blends lives next to the one who keeps cats.
11. The man who keeps horses lives next to the man who smokes Dunhill.
12. The man who smokes Blue Master drinks beer.
13. The German smokes Prince.
14. The Norwegian lives next to the blue house.
15. The man who smokes Blends has a neighbour who drinks water.

The German.

Took me approximately 10 minutes to solve. Windows Journal is a really useful tool in situations such as these.


Here is a collection of rather simple probability math problems, with real-life application! :smalltongue:

Assume that without any protection, each intercourse has an 8,5% chance of causing pregnancy.
With typical use of a condom, the chance is 2,1%. With perfect use, it is 0,5%.
With typical use of pills, the chance is 0,8%. With perfect use, it is 0,03%.

By which age is a woman 50% likely to have become pregnant at least once, if she starts her active sex life:

a) at 13 years of age.
b) at 15 years of age.
c) at 17 years of age.
d) at 19 years of age.
e) at 21 years of age.

And has sex:

a) once per six months.
b) once per month.
c) once per two weeks.
d) once per week.
e) once per day.

Using:

a) no protection.
b) condom only.
c) pill only.
d) both pill and condom. (assume multiplicative protection).

a) Typically.
b) Perfectly.

?

There were a bit too many repeated calculations to bother doing this by hand, so I wrote a small Java program to do it for me instead:


import java.util.Scanner;

public class Contraception {

    public static void main(String[] args) {

        double noneProb = 0.085;
        double condomProb = 0.021;
        double pillProb = 0.008;
        double perfCondomProb = 0.005;
        double perfPillProb = 0.0003;

        double prob = 1.0;
        double timesUntil50Perc = 0;

        int startAge = 0;
        int timesPerYear = 0;
        boolean usesCondoms = false;
        boolean usesPills = false;
        boolean usesPerfectly = false;

        Scanner scan = new Scanner(System.in);

        System.out.println("Input age at start of active sex life:");
        startAge = scan.nextInt();

        System.out.println("Input average rate of intercourse:\n" +
                "1: Once per six months\n" +
                "2: Once per month\n" +
                "3: Once per two weeks\n" +
                "4: Once per week\n" +
                "5: Once per day");
        switch (scan.nextInt()) {
            case 1:
                timesPerYear = 2;
                break;
            case 2:
                timesPerYear = 12;
                break;
            case 3:
                timesPerYear = 26;
                break;
            case 4:
                timesPerYear = 52;
                break;
            case 5:
                timesPerYear = 365;
                break;
            default:
        }

        System.out.println("Input types of contraceptives used:\n" +
                "1: None\n" +
                "2: Condoms\n" +
                "3: Pills\n" +
                "4: Condoms and pills");
        switch (scan.nextInt()) {
            case 1:
                break;
            case 2:
                usesCondoms = true;
                break;
            case 3:
                usesPills = true;
                break;
            case 4:
                usesCondoms = true;
                usesPills = true;
                break;
            default:
        }

        if (usesCondoms || usesPills) {

            System.out.println("Input use of contraceptives:\n" +
                    "1: Regular\n" +
                    "2: Perfect");
            switch (scan.nextInt()) {
                case 1:
                    break;
                case 2:
                    usesPerfectly = true;
                    break;
                default:
            }
        }

        if (usesCondoms) {
            if (usesPerfectly) {
                prob *= perfCondomProb;
            } else {
                prob *= condomProb;
            }
        }

        if (usesPills) {
            if (usesPerfectly) {
                prob *= perfPillProb;
            } else {
                prob *= pillProb;
            }
        }

        if (!usesCondoms && !usesPills) {
            prob *= noneProb;
        }

        timesUntil50Perc = Math.log(0.5) / Math.log(1.0 - prob);
        System.out.println("This woman will have a 50% likelihood of having " +
                "become pregnant at least once at the age of " +
                (startAge + (int) (timesUntil50Perc / timesPerYear)));

        scan.close();
    }

}

kurokotetsu
2013-03-15, 03:24 PM
Here is a collection of rather simple probability math problems, with real-life application! :smalltongue:

Assume that without any protection, each intercourse has an 8,5% chance of causing pregnancy.
With typical use of a condom, the chance is 2,1%. With perfect use, it is 0,5%.
With typical use of pills, the chance is 0,8%. With perfect use, it is 0,03%.

By which age is a woman 50% likely to have become pregnant at least once, if she starts her active sex life:

a) at 13 years of age.
b) at 15 years of age.
c) at 17 years of age.
d) at 19 years of age.
e) at 21 years of age.

And has sex:

a) once per six months.
b) once per month.
c) once per two weeks.
d) once per week.
e) once per day.

Using:

a) no protection.
b) condom only.
c) pill only.
d) both pill and condom. (assume multiplicative protection).

a) Typically.
b) Perfectly.

?

Let P be the probability of getting pregnant in a single intercourse, and P* the probability of not getting pregnant; then obviously P* = 1 - P. After n intercourses, the probability of never having gotten pregnant is (1-P)^n. We can also see that n = t*t0, where t is the elapsed time and t0 is the frequency of intercourse per unit of time. The age A is A = A0 + t, where A0 is the starting age. We thus end up with the horrible-looking P* = (1-P)^((A-A0)*t0). Solving for A gives A = A0 + ln(P*)/(t0*ln(1-P)); set P* = 0.5, plug in the values for the different conditions, and voilà. A program or even a spreadsheet should be enough. Just measure time in years, so no unit discrepancies creep into the calculation.
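To make the formula concrete, here is one of the asked-for cases worked out (class and method names are mine, not from the thread): no protection (P = 0.085), once per week (t0 = 52 per year), starting at age 15.

```java
public class PregnancyAge {
    // A = A0 + ln(0.5) / (t0 * ln(1 - P)), with t0 measured per year.
    static double ageAt50Percent(double p, double timesPerYear, double startAge) {
        // Solve (1 - p)^n = 0.5 for n, then convert intercourses to years.
        double n = Math.log(0.5) / Math.log(1.0 - p);
        return startAge + n / timesPerYear;
    }

    public static void main(String[] args) {
        // No protection, once per week, starting at 15:
        // n is only about 7.8 intercourses, i.e. roughly 8 weeks.
        double age = ageAt50Percent(0.085, 52.0, 15.0);
        System.out.printf("50%% likelihood at age %.2f%n", age); // about 15.15
    }
}
```

Striking result: with no protection at all, the 50% mark is passed within the first couple of months, whatever the starting age.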


The German.

Took me approximately 10 minutes to solve. Windows Journal is a really useful tool in situations such as these.

Right, but it is better to try and do it by hand.

Teddy
2013-03-15, 03:30 PM
Right, but it is better to try and do it by hand.

I did, albeit through a tablet pen. Windows Journal is little more than a graph paper in computerised form, with the added functionality of being able to quickly move around different parts of your scribbles without having to erase anything (oh how I miss this functionality on physical paper. I've almost begun to draw circles around misplaced text out of pure habit). Since much of Einstein's riddle is about puzzling (although this is one of the less trial-and-error-requiring examples I've seen), that feature proved to be rather handy.

Frozen_Feet
2013-03-15, 03:37 PM
An Excel spreadsheet, that I was expecting. But a Java program...? Congrats, Sir. But how do I get it to run? :smalltongue: *is Java inept*

Teddy
2013-03-15, 04:20 PM
An Excel spreadsheet, that I was expecting. But a Java program...? Congrats, Sir. But how do I get it to run? :smalltongue: *is Java inept*

Well, I just happen to like programming...

Anyway, you're going to need a Java compiler for the task, and if you're Java inept, chances are great that you don't have one lying around. Now, if you happen to have one anyway, you could just copy the code into a .java (plain text) file, compile it (for example with "javac FILENAME.java") and run the resulting class file (by typing "java FILENAME", without the .class extension, in the command prompt while standing in the source folder).

That said, the algorithm is anything but complex, and basically boils down to asking the user to input each parameter and then runs the computations.

kurokotetsu
2013-03-15, 04:56 PM
I did, albeit through a tablet pen. Windows Journal is little more than a graph paper in computerised form, with the added functionality of being able to quickly move around different parts of your scribbles without having to erase anything (oh how I miss this functionality on physical paper. I've almost begun to draw circles around misplaced text out of pure habit). Since much of Einstein's riddle is about puzzling (although this is one of the less trial-and-error-requiring examples I've seen), that feature proved to be rather handy.

Ah, I don't use tablets, so I didn't know the program. It is indeed just puzzling, and if well written it shouldn't require trial and error.


An Excel spreadsheet, that I was expecting. But a Java program...? Congrats, Sir. But how do I get it to run? :smalltongue: *is Java inept*

You don't actually need to run it, just read the code; as long as you know a little programming there isn't anything that strange in there (I've never touched Java in my life, but it can be read easily with knowledge of C). From what I see, it works, but it misses the "condoms and pills" possibility (I see no "if (usesCondoms && usesPills)" branch), and depending on how Java compiles there might be a problem with accessing that option (it will run both "if" blocks and probably just stick with the latter probability), but the formula is sound (and the same that I posted) and the probabilities are correct. I see no reason why, with some slight modifications, it shouldn't calculate all the probabilities the problem asked for.

Teddy
2013-03-15, 05:14 PM
You don't actually need to run it, just read the code; as long as you know a little programming there isn't anything that strange in there (I've never touched Java in my life, but it can be read easily with knowledge of C). From what I see, it works, but it misses the "condoms and pills" possibility (I see no "if (usesCondoms && usesPills)" branch), and depending on how Java compiles there might be a problem with accessing that option (it will run both "if" blocks and probably just stick with the latter probability), but the formula is sound (and the same that I posted) and the probabilities are correct. I see no reason why, with some slight modifications, it shouldn't calculate all the probabilities the problem asked for.

Actually, it allows for the "condoms and pills" option just by walking through both if cases. The *= operator is a combined multiplication and assignment, multiplying the left and right terms and assigning the result to the left term again. By walking through both if cases, you'll multiply the initial 1 with both probabilities, as specified.
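For anyone who, like kurokotetsu, hasn't met the operator: a minimal illustration (class and method names are mine), using the typical-use probabilities from the problem statement.

```java
public class CompoundAssign {
    static double combine(double a, double b) {
        double prob = 1.0;
        prob *= a;  // shorthand for: prob = prob * a
        prob *= b;  // shorthand for: prob = prob * b
        return prob;
    }

    public static void main(String[] args) {
        // Typical condom use (2.1%) and typical pill use (0.8%) together
        // give a combined per-intercourse chance of 0.021 * 0.008 = 0.000168.
        System.out.println(combine(0.021, 0.008));
    }
}
```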

kurokotetsu
2013-03-15, 05:31 PM
Actually, it allows for the "condoms and pills" option just by walking through both if cases. The *= operator is a combined multiplication and assignment, multiplying the left and right terms and assigning the result to the left term again. By walking through both if cases, you'll multiply the initial 1 with both probabilities, as specified.

Never used that operator. Then the program is sound indeed.

Since we are going through the classics, let's go to Graph Theory

The Seven Bridges of Königsberg
In the city of Königsberg there were two islands in the river. Connecting the islands and both shores there were seven bridges, connected like this:

http://upload.wikimedia.org/wikipedia/commons/5/5d/Konigsberg_bridges.png

There is no other way to have access to the islands. The problem is to find a path that crosses each of the seven bridges once and only once. When crossing a bridge it must be crossed all the way, and you don't have to start and end at the same spot. A proof is necessary, as the answer is well known.

Douglas
2013-03-15, 10:01 PM
Bridges
No such route is possible.

Consider the area north of the river. This area has 3 bridges connecting to it, therefore any qualifying path must enter and exit that area a total of 3 times. 3 is an odd number, so the number of times entering and exiting that area cannot be equal - if they were, their sum would be even. The only way this can be satisfied is if either the starting or ending location is in this area.

The same argument can be made for the area south of the river, and for the eastern of the two islands. This means there are 3 separate areas that each must contain a starting or ending point. There are only 2 such points to distribute, however, so such an arrangement is impossible.

Edit: Wait, the western island has 5, which is also odd. So you actually have 4 areas that each require a start/end point, making it extra impossible.
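Douglas's parity argument can be mechanized: read the number of bridges touching each land mass off the picture and count the odd ones. A walk using every bridge exactly once exists in a connected multigraph only if 0 or 2 vertices have odd degree. A minimal check (names are mine):

```java
public class Konigsberg {
    static int oddDegreeCount(int[] degrees) {
        int odd = 0;
        for (int d : degrees) if (d % 2 != 0) odd++;
        return odd;
    }

    public static void main(String[] args) {
        // Bridges touching: north bank, south bank, east island, west island
        int[] degrees = {3, 3, 3, 5};
        int odd = oddDegreeCount(degrees);
        // An Euler path needs exactly 0 or 2 odd-degree land masses; here all 4 are odd.
        System.out.println("odd-degree areas: " + odd
                + ", walk possible: " + (odd == 0 || odd == 2)); // 4, false
    }
}
```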

kurokotetsu
2013-03-16, 12:30 AM
Bridges
No such route is possible.

Consider the area north of the river. This area has 3 bridges connecting to it, therefore any qualifying path must enter and exit that area a total of 3 times. 3 is an odd number, so the number of times entering and exiting that area cannot be equal - if they were, their sum would be even. The only way this can be satisfied is if either the starting or ending location is in this area.

The same argument can be made for the area south of the river, and for the eastern of the two islands. This means there are 3 separate areas that each must contain a starting or ending point. There are only 2 such points to distribute, however, so such an arrangement is impossible.

Edit: Wait, the western island has 5, which is also odd. So you actually have 4 areas that each require a start/end point, making it extra impossible.

Correct. Although there may also be zero such points to distribute, if a route starts and ends in the same area.

Another Graph Theory problem:

Water, electricity and gas
Suppose there are three cottages on a plane and each needs to be connected to the gas, water, and electric companies. Using a third dimension or sending any of the connections through another company or cottage is disallowed. Is there a way to make all nine connections without any of the lines crossing each other?

Demidos
2013-03-16, 01:56 AM
Another Graph Theory problem:

Water, electricity and gas
Suppose there are three cottages on a plane and each needs to be connected to the gas, water, and electric companies. Using a third dimension or sending any of the connections through another company or cottage is disallowed. Is there a way to make all nine connections without any of the lines crossing each other?

Two possible outcomes:

First, assume each company maintains three different sites from which they distribute their product. Problem easily solvable!

Assuming this blatantly cheesy answer is not accepted, the solution should be as follows:
Assume E to signify the electric company, and G the gas company, while the C's each signify a cottage, and any lines are a connection. Discount water for now.
............C............
........../....\........
......../........\......
.....E.--.C.--.G....
.......\........./.......
.........\...../.........
............C............

Take a moment to assure yourself that the above (with a little wire distortion) is essentially the only way to connect the three cottages to the two utilities. Now, consider where to place the water plant: any given open region adjoins only two of the cottages, so the third cottage can never be reached without a crossing. Therefore the problem is impossible, and has no solution other than the cheesy one presented above.
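For completeness, the region argument above mirrors the standard Euler-formula proof that this graph (K3,3) is non-planar: a connected planar graph satisfies V - E + F = 2, and in a bipartite graph every face is bounded by at least 4 edges, which forces E <= 2V - 4. A minimal check (helper name is mine):

```java
public class Utilities {
    // Necessary condition for a connected bipartite graph to be planar:
    // every face needs >= 4 edges and each edge borders 2 faces, so 4F <= 2E;
    // combined with Euler's V - E + F = 2 this gives E <= 2V - 4.
    static boolean bipartitePlanarPossible(int v, int e) {
        return e <= 2 * v - 4;
    }

    public static void main(String[] args) {
        // K3,3: 3 cottages + 3 utilities = 6 vertices, 9 connections.
        // 9 > 2*6 - 4 = 8, so no planar drawing exists.
        System.out.println(bipartitePlanarPossible(6, 9)); // false
    }
}
```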

Emmerask
2013-03-16, 09:25 AM
Technically you did not disallow for a company to have two facilities :smalltongue:


......-----C............
.../....../.W..\........
./....../....|...\......
W....E.--.C.--.G....
.\.....\........./.......
...\.....\...../.........
....------C............

Demidos
2013-03-16, 03:27 PM
Edit: Emmerask...shush! :smallsigh: :smallamused:


Oh, came up with another one.

Three Guards (Hard):
There are two paths before you; one leads to victory, one leads to death. There are three brothers, one who always lies, one who always speaks the truth, and one who does either, at random. What is the least number of questions by which you can be certain of choosing the correct path? Each question must be asked to a single, specific guard.


One. :smallwink:

Indicating a specific path as "Path A," you walk up to any brother and ask, "Is exactly one of the following statements true: 'Your answer to this question will be a lie'; 'Path A will lead me to victory'?"

This creates a scenario where, regardless of the brother you choose, you'll always get the same answer. If the brother answers yes, Path A is the safe one. If he answers no, Path B.


Huh. Well done, so long as the random brother is entirely honest or dishonest within the scope of a single question.

I believe this answer to be incorrect. Logic, as well as a new suggested set of questions, spoilered below.

Reason for believing the above answer to be incorrect:
Assuming you did your math as follows:
Variables:
Brothers Truthteller (1), Liar (2), and Random (3)
and parts of the question Your answer will be a lie (a), Path _ will be correct (b) and the final answer given (c):
Assuming that they consider the answers to the first two statements truthfully, and then make a judgement on how to answer given they answered the two-part-statement truthfully.

To explain the logic above:
Assume the asker asks the question above about Path A, and the correct answer is indeed path A
...a...b...c
1.N..Y...Y
2.Y...Y...Y
3.?...?...Y/N --> In this case, if the random brother says no, you lose.

Already the logic breaks down: The random brother may answer either Yes or no, regardless of what path is the correct one. Following the same logic with the same question but a different correct path, while your answer would work for the truthteller and liar (both would answer no) there is no guarantee about what the random brother would answer (yes or no).


Suggested answer:

Two.

Ask the guards-->
"Assuming that you are either the truth-telling brother or the lying brother, and assuming that your random brother was not your brother, who would your brother say was the one who wasn't related to you?"

Assuming they point out one of the other two brothers, pick to question the brother they pointed to next, and ask:

"Assuming that you are either the truth-telling brother or the lying brother, and assuming that your random brother was not your brother, which path would your brother tell me to take to victory?"

Take the opposite path from the one pointed out. You win.

Logic:
When you originally decide to ask a guard, you have a 1/3 chance of asking any one of them. When you use the (admittedly rather convoluted) logic of the first question, you should have the guard pointing at the non-random brother -- Assume you asked either the truth-teller or the liar. If you ask the truthteller, he would point you to the one the liar would point you to -- the liar himself (as the liar would lie to make the random guard part of the family). Assuming you asked the liar, through a similar chain of logic, you would arrive at the truthful guard. At this point, the question is identical to the ever-popular two guard question, and can again be solved through a progression of the same logic.
In the case that you asked the random brother first, the possible answers can be condensed down to two choices -- either he points to one of the other two brothers, or he does not. If he does, then you can safely talk to the one he pointed to, as either of the truth-telling or lying brothers would tell you the correct route given the phrasing of the second question. If he answers something different, you can safely assume that he is the random brother, and proceed to question either of the other brothers in any case.

/logicrant



Anyone feeling up to checking my horribly convoluted work? The wording in the questions is somewhat clunky, but I didn't really see how it could be prettied up much more than it already is.

Teddy
2013-03-16, 05:05 PM
Continuing on the guards, here's another of them:
One guard:
At a crossroad, there are two paths before you; one leads to your heart's greatest desire, the other leads to torturous death. There is also a single guard positioned at the crossroad, and only he knows which path leads where (there may have been others, but they took the correct path years ago :smallwink:). He'll gladly answer your questions, but there are two caveats, the guard can only answer each question with a single "yes" or "no", and before each question, he'll randomly decide whether to respond with a lie or the truth. Which question should you ask to determine the correct path?


Anyone feeling up to checking my horribly convoluted work? The wording in the questions is somewhat clunky, but I didn't really see how it could be prettyied up much more than it already is.

Alternative solution, saves you 242 characters (:smallwink:). It also works on a strict yes-or-no setup (Warning: this solution spoils the answer to the problem I posted above, so don't read this if you want to solve it on your own first):
Walk up to any brother, point at any of the remaining two and ask the following question:
"If I asked you if this man gives me non-random answers to my questions, would you say yes?"

If he says yes, walk over to the brother you pointed at, and if he says no, walk over to the last brother. Now point at either of the paths and ask the following question:
"If I asked you if this path leads to victory, would you say yes?"

If he says yes, take the path, otherwise, take the other path.


Rationale: A double negative is a positive, so by baking all your questions into an "If I asked you [...], would you say yes?", you can get both the truthsayer and the liar to answer the nested question truthfully, and no matter what the random brother says, you'll only have consistent brothers to pick from, so his answer doesn't matter.
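Teddy's double-negative rationale can be checked exhaustively. A minimal simulation, modelling each consistent brother as a boolean liar flag (all names here are mine, not from the thread):

```java
public class NestedQuestion {
    // What a consistent brother answers to a plain yes/no question about a fact.
    static boolean direct(boolean liar, boolean fact) {
        return liar ? !fact : fact;
    }

    // His answer to "If I asked you <fact>, would you say yes?"
    static boolean nested(boolean liar, boolean fact) {
        boolean wouldSayYes = direct(liar, fact); // the truthful answer to the nested question
        return liar ? !wouldSayYes : wouldSayYes; // he may lie about *that*, cancelling out
    }

    public static void main(String[] args) {
        for (boolean liar : new boolean[]{false, true})
            for (boolean fact : new boolean[]{false, true})
                // Both brothers always end up reporting the fact itself.
                System.out.println("liar=" + liar + " fact=" + fact
                        + " answer=" + nested(liar, fact));
    }
}
```

In all four cases the reply equals the underlying fact, which is exactly the double-negation claim.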

PirateMonk
2013-03-16, 09:33 PM
I need to make some brownies now...

Going with the logic puzzle aspect of this thread:
(don't look at the second one until you have decided for the first :smallamused:)
Does a list of all lists that do not contain themselves contain itself?

Does a list of all and only lists that do not contain themselves contain itself?

If lists must be finite, any set containing all lists which do not contain themselves is infinite and therefore cannot be listed.

If lists are allowed to be infinite, any set containing all lists which do not contain themselves is uncountable and therefore cannot be listed.

DaedalusMkV
2013-03-17, 02:15 AM
Continuing on the guards, here's another of them:
One guard:
At a crossroad, there are two paths before you; one leads to your heart's greatest desire, the other leads to torturous death. There is also a single guard positioned at the crossroad, and only he knows which path leads where (there may have been others, but they took the correct path years ago :smallwink:). He'll gladly answer your questions, but there are two caveats, the guard can only answer each question with a single "yes" or "no", and before each question, he'll randomly decide whether to respond with a lie or the truth. Which question should you ask to determine the correct path?


"If you were responding the opposite of the way you have chosen to respond, which way would you tell me to go to find victory?"

If he has chosen to tell the truth, he will respond with the wrong path (as he would have lied if he had chosen the opposite response). If he has chosen to lie, he will also respond with the wrong path (as he will give the opposite of the correct response he would have given had he chosen to tell the truth). It's essentially the same scenario as the two brothers scenario, as we can split the guard himself into two separate binary states.
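DaedalusMkV's filter can be verified the same way by enumerating the guard's two modes. A small sketch, assuming the guard lies strictly within a single question (names are mine):

```java
public class RandomGuard {
    // The path the guard would name as "victory" if he were in the given mode.
    static String innerAnswer(boolean truthful, String correct, String other) {
        return truthful ? correct : other;
    }

    // His reply to "If you were responding the opposite of the way you have
    // chosen to respond, which way would you tell me to go to find victory?"
    static String reply(boolean truthfulNow, String correct, String other) {
        String opposite = innerAnswer(!truthfulNow, correct, other);
        // If he is lying right now, he misreports even this hypothetical.
        return truthfulNow ? opposite : (opposite.equals(correct) ? other : correct);
    }

    public static void main(String[] args) {
        // In either mode he names the wrong path, so you take the other one.
        System.out.println(reply(true, "left", "right"));  // right
        System.out.println(reply(false, "left", "right")); // right
    }
}
```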

Teddy
2013-03-17, 04:00 AM
"If you were responding the opposite of the way you have chosen to respond, which way would you tell me to go to find victory?"

If he has chosen to tell the truth, he will respond with the wrong path (as he would have lied if he had chosen the opposite response). If he has chosen to lie, he will also respond with the wrong path (as he will give the opposite of the correct response he would have given had he chosen to tell the truth). It's essentially the same scenario as the two brothers scenario, as we can split the guard himself into two separate binary states.


Correct, but asking him to respond differently from what he'd normally do is somewhat redundant; you may as well ask him how he'll respond to the question to start with, and he'll give you the truthful answer.

DaedalusMkV
2013-03-17, 01:28 PM
Correct, but asking him to respond differently from what he'd normally do is somewhat redundant; you may as well ask him how he'll respond to the question to start with, and he'll give you the truthful answer.


No, we need that as a filter. Much as with the original brothers' case, we need to have two binary states (truth and lies) making reference to each other to get a reliable result. If you just ask him "What path leads to victory", we cannot be sure of the answer. Only by running it through the filter of the opposite position can we remove any uncertainty from the result while accomplishing the task with one question, since asking him about the opposite position cancels out both states. Short of adding (frankly, cheating) clauses like "If you've chosen to lie in response to this question", that solution should be the only way to do it.

kurokotetsu
2013-03-17, 02:06 PM
If lists must be finite, any set containing all lists which do not contain themselves is infinite and therefore cannot be listed.

If lists are allowed to be infinite, any set containing all lists which do not contain themselves is uncountable and therefore cannot be listed.

What? Sorry, but that answer doesn't make any sense at all. How do you know that the lists are uncountable? There could be more. And even if they are uncountable, why can they not be listed? I can make lists using all of the real numbers; there could be a pi list and a square-root-of-two list, and each would still be a list. I frankly don't see how your argument works.

Also, as I previously mentioned, this is Russell's Paradox, which doesn't care about infinite or finite, only about the vague definition of a set. Using ZFC we can easily prove that the object we are talking about is not really a set (or "list" as we are saying here). It has nothing to do with the cardinality of the set.

And the first part doesn't make sense. Finite lists: let A={B,C}, and let B={X | X is an element of A and X isn't an element of X}. B is finite (at most it has B and C) and it still yields the same paradox as presented before, yet it doesn't fit your argument that "the list must be infinite".


Technically you did not disallow for a company to have two facilities :smalltongue:


......-----C............
.../....../.W..\........
./....../....|...\......
W....E.--.C.--.G....
.\.....\........./.......
...\.....\...../.........
....------C............

It is implicit in the problem. Two facilities is the same as two companies, so the problem implicitly states that there is only one facility, as there is only one company. The correct answer has been given.

Teddy
2013-03-17, 02:24 PM
No, we need that as a filter. Much as with the original brothers' case, we need to have two binary states (truth and lies) making reference to each other to get a reliable result. If you just ask him "What path leads to victory", we cannot be sure of the answer. Only by running it through the filter of the opposite position can we remove any uncertainty from the result while accomplishing the task with one question, since asking him about the opposite position cancels out both states. Short of adding (frankly, cheating) clauses like "If you've chosen to lie in response to this question", that solution should be the only way to do it.


You misunderstand me, we do indeed need to filter the answer, but we don't need to tell him to answer oppositely to what he would, when he could answer equally to what he would. The question "What would you answer if I, with this very question, asked you whether this is the correct path?" (while pointing at one of the two paths) would yield the truthful answer no matter if he's decided to lie or tell the truth, since a double negative equals a positive.

All in all, there's nothing wrong with your solution, but it makes it clear that you think heavily in terms of the problem with the two guards even when it isn't necessary.

enderlord99
2013-03-17, 04:23 PM
Here's one I came up with:

The easily bored guard, and his absent friend

There are two paths, you don't know which one to take, the guards know, one lies, other tells the truth, yaddayaddayadda. Only one guard is currently on duty, and you don't know which it is. You may ask any number of questions, as long as you do not use more than ten syllables, and do not say the same word multiple times. Questions may not reference themselves, but may reference each other. What question(s) do you ask?

Teddy
2013-03-17, 04:48 PM
Here's one I came up with:

The easily bored guard, and his absent friend

There are two paths, you don't know which one to take, the guards know, one lies, other tells the truth, yaddayaddayadda. Only one guard is currently on duty, and you don't know which it is. You may ask any number of questions, as long as you do not use more than ten syllables, and do not say the same word multiple times. Questions may not reference themselves, but may reference each other. What question(s) do you ask?

"Your brother says this is path to win, yes?" :smallwink:

enderlord99
2013-03-17, 06:03 PM
"Your brother says this is path to win, yes?" :smallwink:

How do you know the guards are brothers in the first place? Also, note that the guard can answer with anything - not just yes or no - as long as it could be considered a reasonable answer, regardless of whether or not it is actually a correct one (for example, "Which direction should I travel in?" yields either "Left" or "Right" with no guarantee of accuracy), so the guard might say "I don't know what that means" or, if he's the liar, "That made perfect sense!" After all, it was pretty confusing grammar. Good try, though. :smallsmile: The answer I was originally looking for includes multiple questions, by the way.

Also, I should probably clarify that the final rule applies to questions, not guards (in case that was unclear).

Astral Avenger
2013-03-20, 02:39 PM
Here's one I came up with:

The easily bored guard, and his absent friend

There are two paths, you don't know which one to take, the guards know, one lies, other tells the truth, yaddayaddayadda. Only one guard is currently on duty, and you don't know which it is. You may ask any number of questions, as long as you do not use more than ten syllables, and do not say the same word multiple times. Questions may not reference themselves, but may reference each other. What question(s) do you ask?

What would other guard say, is this path safe?
or How would other guard answer, this safe path?

Fortuna
2013-03-23, 04:30 PM
On the absent guard:
Would the other guard call this path safe?

Really, it's more of a test of your language skills than a logic puzzle.

Here's an old favourite. Explain the order of this list of numbers:

8, 18, 11, 15, 5, 4, 14, 9, 19, 1, 7, 17, 6, 16, 10, 13, 3, 12, 20, 2.

enderlord99
2013-03-23, 04:37 PM
Really, it's more of a test of your language skills than a logic puzzle.

Seeing as the non-present guard doesn't need to be mentioned, yet everyone mentions him anyway, I would say that logic is, in fact, necessary. Remember that you could ask multiple questions...

Teddy
2013-03-23, 06:09 PM
Seeing as the non-present guard doesn't need to be mentioned, yet everyone does anyway, I would say that logic is, in fact, necessary. Remember that you could have multiple questions...

Removing constraints generally doesn't make a problem any more logic-demanding...

enderlord99
2013-03-23, 06:22 PM
Removing constraints generally doesn't make a problem any more logic-demanding...

Those are separate statements. The second was just a hint (my original solution requires exactly 2 questions, neither of which mentioned the other guard at all.)

Astral Avenger
2013-03-24, 10:19 AM
Those are separate statements. The second was just a hint (my original solution requires exactly 2 questions, neither of which mentioned the other guard at all.)

Does one plus three equal four?
(determines if he is lying (there was no random guard in this one right?))
Is this the safe path?

Lorsa
2013-03-25, 06:00 AM
Here's an old favourite. Explain the order of this list of numbers:

8, 18, 11, 15, 5, 4, 14, 9, 19, 1, 7, 17, 6, 16, 10, 13, 3, 12, 20, 2.

They are in alphabetical order. Well, if you write them out in English, anyway. In another language they might not be quite so alphabetical.
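Lorsa's answer can be verified mechanically by sorting 1 through 20 by their English names (a small sketch; names are mine):

```java
import java.util.Arrays;
import java.util.Comparator;

public class AlphabeticalNumbers {
    static final String[] NAMES = {"zero", "one", "two", "three", "four", "five",
            "six", "seven", "eight", "nine", "ten", "eleven", "twelve", "thirteen",
            "fourteen", "fifteen", "sixteen", "seventeen", "eighteen", "nineteen",
            "twenty"};

    static Integer[] sortedByName() {
        Integer[] nums = new Integer[20];
        for (int i = 0; i < 20; i++) nums[i] = i + 1;
        // Compare the numbers by their English spelling, not their value.
        Arrays.sort(nums, Comparator.comparing((Integer n) -> NAMES[n]));
        return nums;
    }

    public static void main(String[] args) {
        // Reproduces the puzzle's list: 8, 18, 11, 15, 5, 4, 14, 9, 19, 1, ...
        System.out.println(Arrays.toString(sortedByName()));
    }
}
```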

Lorsa
2013-03-25, 07:14 AM
Having read through parts of the thread now (this is very interesting although a bit tedious with all the guards iterations that essentially have the same answer) I wonder if the People with Unknown Eyecolor at an Island got solved? Here are my thoughts on the matter if I understood the problem right.

Everyone with blue eye color will leave that midnight. They can go to the ferryman and exclaim "I have blue eyes 'cause the Guru said so!". That would be their proof. Half of them would be wrong and thus turned away. Regardless of what "proof" the ferryman needs, if the Guru proclaimed he could see someone with blue eyes, going to the ferryman and saying you have blue eyes is really your best bet.

Also some comments on the first page problems (since that's where I started looking).

On the pirate and loot distribution, I think it fails as a logic puzzle. If you are the 8th mate and the captain wants to give you 1 gold piece, you could theoretically miss out on that 1 gold piece by declining, but since 1 gold piece is a very tiny amount and 100 gold (your potential gain) is a very large amount, you would vote no and the captain would be killed. The problem has so many solutions that there isn't one "right" answer. Your best bet as a captain would be to distribute everything equally. If you give away only 1 coin to a person, he might decline your offer in the hope that the first mate will take the hint and give him more.

On the two-envelope problem, if you get $500 then switching is always best, as the net gain is higher than the net loss. If the sum is unknown, however, then you might as well keep it. There are two sums, X and Y, and there's a 50% chance you have chosen Y (the higher sum). If you follow some form of math that says there is some gain to be had by switching, then it means an infinite number of switches of the unopened envelopes would give you an infinite increase in net gain. And that's just not possible with there only being two fixed values. So you clearly can't use the same reasoning as when you know the sum.

Oh, and with the 3 guards and one being random, if you ask the lying guard what his brothers will say the only thing he can answer about the random brother is "I know what he will say". Randomly choosing one answer and hoping it will be wrong does not constitute lying. If he say "Path A" and then the random brother would end up saying "Path A" then his answer was not a lie which was impossible in the scenario. The truthful brother can say "I don't know what my (random) brother would say" and the lying one can say "I know what my (random) brother would say". Neither of them can make any form of statement of his actual answer.

Lorsa
2013-03-25, 07:58 AM
Also, a comment on the infinite hotel.

While it's quite easy to see that you could, in theory, fit any number of new people simply by switching rooms of those already living in the hotel, I've always found those solutions fail to consider the fact that there will always be 1 (or more) person in the corridor. Fitting the one new guest for example, means you take one person, throw him out in favor of your new guest and then have him knock in the door of his neighbor, throw him out etc etc. All this leads to is that there forever is at least 1 person in the corridor, which doesn't really feel like "having room for the 1 new person". With infinite number of new guests, there will be an infinite number of people running around in the corridors at any given time.

Teddy
2013-03-25, 09:10 AM
Also, a comment on the infinite hotel.

While it's quite easy to see that you could, in theory, fit any number of new people simply by switching rooms of those already living in the hotel, I've always found those solutions fail to consider the fact that there will always be 1 (or more) person in the corridor. Fitting the one new guest for example, means you take one person, throw him out in favor of your new guest and then have him knock in the door of his neighbor, throw him out etc etc. All this leads to is that there forever is at least 1 person in the corridor, which doesn't really feel like "having room for the 1 new person". With infinite number of new guests, there will be an infinite number of people running around in the corridors at any given time.

This is why you link the PA system to every room in the hotel. Just announce that everyone should move to the room with a number one step above theirs, and you'll be done in a matter of seconds (during which an infinite number of guests occupy the corridor rather than just one).
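The simultaneous-reassignment idea can be written down as a pair of room-number functions. A minimal Python sketch (only a finite prefix of the infinite room list can be printed, so the cutoff `N` is purely illustrative; the function names are mine):

```python
# Hilbert-hotel reassignments announced over the PA, as room-number maps.

def room_after_one_new_guest(n: int) -> int:
    """Every current guest in room n moves to room n + 1, freeing room 1."""
    return n + 1

def room_after_infinitely_many_new_guests(n: int) -> int:
    """Every current guest in room n moves to room 2n, freeing every odd room."""
    return 2 * n

N = 10  # illustrative cutoff; the real hotel has a room for every n >= 1
old_rooms = range(1, N + 1)
print([room_after_one_new_guest(n) for n in old_rooms])
print([room_after_infinitely_many_new_guests(n) for n in old_rooms])
```

Both maps are injective, so no two guests are ever sent to the same room; the second one frees infinitely many rooms at once.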

Douglas
2013-03-25, 11:10 AM
Having read through parts of the thread now (this is very interesting although a bit tedious with all the guards iterations that essentially have the same answer) I wonder if the People with Unknown Eyecolor at an Island got solved? Here are my thoughts on the matter if I understood the problem right.

Everyone with blue eye color will leave that midnight. They can go to the ferryman and exclaim "I have blue eyes 'cause the Guru said so!". That would be their proof. Half of them would be wrong and thus turned away. Regardless of what "proof" the ferryman needs, if the Guru proclaimed he could see someone with blue eyes, going to the ferryman and saying you have blue eyes is really your best bet.
The fact that half of them would be wrong should tell you instantly that it is not, in fact, proof. If your solution even contains the possibility of someone going to the ferryman and being wrong about his eye color, then your solution fails.

Lorsa
2013-03-25, 12:33 PM
The fact that half of them would be wrong should tell you instantly that it is not, in fact, proof. If your solution even contains the possibility of someone going to the ferryman and being wrong about his eye color, then your solution fails.

Well, yes, you are right, it's not a proof. So here is my next suggestion:

The person with blue eyes leaves the island. At this particular time, there are only two people still left: a person with blue eyes and the Guru. If the Guru will speak on only one day of all the years on the island, then they would all reach the conclusion that the best chance of leaving is killing off everyone else, so that's what they did; the person left alive happened to have blue eyes, and on the midnight after the Guru had spoken, she left with the ferryman.

Jay R
2013-03-25, 01:01 PM
The nuclear option for hard logic puzzles: Blue Eyes
A group of people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. No one knows the color of their own eyes. Every night at midnight, a ferry stops at the island. Any islanders who have figured out the color of their own eyes then leave the island on the ferry that night, and the rest stay. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (excluding themselves), but they cannot otherwise communicate (except in the single event described below, in which only the communication described there occurs). There are 201 people on the island, one of whom is known as the Guru. The Guru never lies. Everyone on the island knows all the rules in this paragraph.

Of the 201 people on the island, 100 have blue eyes, 100 have brown, and the Guru has green. So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), but that does not tell him his own eye color; as far as he knows the totals could be 101 brown and 99 blue. Or 100 brown, 99 blue, and he could have red eyes.

The Guru is allowed to speak once (let's say at noon), on one day in all their endless years on the island. Standing before all the islanders, she says the following:

"I can see someone who has blue eyes."

Who leaves the island, and on what night, and why?

If only one person had blue eyes, he'd leave on the first night after the guru spoke, since he can't see anybody else with blue eyes. If only two people had blue eyes, then on day two, each would realize that the other didn't leave, and therefore they both saw blue eyes. So they both leave on the second night after the guru spoke.

If three people had blue eyes, and nobody left on night two, each would realize that the other two saw two blue-eyed people, and all three would leave on night three ATGS (After The Guru Spoke).

Lather, rinse, repeat.

All 100 blue-eyed people leave on Night 100 ATGS.

(After that, each brown-eyed person knows that his or her eyes aren't blue, but since they all see both brown and green eyes, there's no way to determine their own eye color.)

Douglas
2013-03-25, 01:51 PM
Well, yes, you are right, it's not a proof. So here is my next suggestion:

The person with blue eyes leaves the island. At this particular time, there are only two people still left: a person with blue eyes and the Guru. If the Guru will speak on only one day of all the years on the island, then they would all reach the conclusion that the best chance of leaving is killing off everyone else, so that's what they did; the person left alive happened to have blue eyes, and on the midnight after the Guru had spoken, she left with the ferryman.
"The" person with blue eyes? Which one? There are 100 of them (though none of them actually know that at the start), and the Guru didn't point to any particular one of them.

I have no idea how you think killing each other off would help achieve leaving the island.

The solution does not involve any non-mental actions by anyone other than leaving (or not leaving) the island.


If only one person had blue eyes, he'd leave on the first night after the guru spoke, since he can't see anybody else with blue eyes. If only two people had blue eyes, then on day two, each would realize that the other didn't leave, and therefore they both saw blue eyes. So they both leave on the second night after the guru spoke.

If three people had blue eyes, and nobody left on night two, each would realize that the other two saw two blue-eyed people, and all three would leave on night three ATGS (After The Guru Spoke).

Lather, rinse, repeat.

All 100 blue-eyed people leave on Night 100 ATGS.

(After that, each brown-eyed person knows that his or her eyes aren't blue, but since they all see both brown and green eyes, there's no way to determine their own eye color.)
Correct.

Lorsa
2013-03-25, 03:02 PM
Well, I was trying to think outside the box. If you killed off everyone else before the Guru spoke obviously his words would be directed at you.

The solution seems quite simple, yet brilliant. I obviously need to practice my logical skills a bit. :smallsmile:

Jay R
2013-03-26, 08:22 AM
Well, I was trying to think outside the box. If you killed off everyone else before the Guru spoke obviously his words would be directed at you.

The solution seems quite simple, yet brilliant. I obviously need to practice my logical skills a bit. :smallsmile:

Giggle. If you were planning to solve a logic puzzle with mass murder, then I suggest that you need to work on your social skills.

Lorsa
2013-03-26, 06:24 PM
Giggle. If you were planning to solve a logic puzzle with mass murder, then I suggest that you need to work on your social skills.

Or maybe I've been playing too many roleplaying games?

Androgeus
2013-03-26, 06:55 PM
Well, I was trying to think outside the box. If you killed off everyone else before the Guru spoke obviously his words would be directed at you.

The solution seems quite simple, yet brilliant. I obviously need to practice my logical skills a bit. :smallsmile:

Just make sure you move the bodies out of sight first.

Frozen_Feet
2013-03-31, 08:30 PM
Well, I was trying to think outside the box.

Thinking outside the box would be to ignore the guru, go to the beach, and look at your reflection in the water. They're on an island, after all. At night, they could make out their reflection in the water by the ferryman's light - the ferryman will need some kind of light to steer the ferry, so this is possible. :smallwink:

Astral Avenger
2013-04-04, 09:38 PM
A casino has a game where a coin is flipped. You must pay any positive, real amount of money to play. If the coin is heads, you get double what you paid. If the coin is tails, you lose what you paid.

Edit: The coin has equal probability of being heads or tails.
You can play multiple times.

1) Is this game fair?
2) Is there a winning strategy?
3) People have gotten 1 and 2 right, so, what is the average amount you need to win $X using the strategy that Goosefeather and Jay R explained?

@Frozen_Feet

Thinking outside the box would be to ignore the guru, go to the beach, and look at your reflection in the water. They're on an island, after all. At night, they could make out their reflection in the water by the ferryman's light - the ferryman will need some kind of light to steer the ferry, so this is possible. :smallwink:

It's possible for there to be enough light to steer a boat without there being enough light to make out color in the reflection on the (not necessarily flat) water. I like the approach to the problem, though.

factotum
2013-04-05, 01:28 AM
1) Is this game fair?
2) Is there a winning strategy?


1) Assuming the coin is a perfectly fair coin (i.e. comes up heads half the time and tails half the time), then yes.

2) No. You can't force the coin to come down on heads or tails at will.

Goosefeather
2013-04-05, 01:36 AM
A casino has a game where a coin is flipped. You must pay any positive, real amount of money to play. If the coin is heads, you get double what you paid. If the coin is tails, you lose what you paid.

1) Is this game fair?
2) Is there a winning strategy?
3) *will be edited in when someone gets the first two*




1) Depends what you mean by 'fair', really. It's certainly gameable.
2) Yes. Bet x amount of money. If you win, great, stop there. If not, bet 2x. If you win, you have wagered a total of 3x and won 4x, a net profit. If you lose, bet 4x. If you win, you have wagered a total of 7x to win 8x, and profited x. If you lose, bet 8x, for a total bet of 15x and possible win of 16x. Continue doubling your bet until you eventually win, and you will have profited x. Of course, this presumes sufficient funds to keep on doubling without running out of money, but as the betting goes on, assuming a fair coin, the chances of your making a net profit of your original bet x approach 1.
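This doubling scheme (the classic martingale) can be simulated in a few lines of Python, assuming a fair coin and the unlimited bankroll the post itself flags as the catch. A minimal sketch, not a recommendation:

```python
import random

# Simulation of the doubling strategy: bet x, double after every loss,
# stop at the first win. Assumes an unlimited bankroll.

def martingale(x: float, rng: random.Random) -> tuple[float, float]:
    """Play until the first win. Returns (total_wagered, net_profit)."""
    bet = x
    wagered = 0.0
    while True:
        wagered += bet
        if rng.random() < 0.5:       # heads: this bet pays back double
            return wagered, 2 * bet - wagered
        bet *= 2                     # tails: lost, double and try again

rng = random.Random(42)
for _ in range(5):
    wagered, profit = martingale(1.0, rng)
    print(f"wagered ${wagered:.0f} to profit ${profit:.0f}")
# profit is always exactly the original stake x, however long the losing run
```

After k losses and then a win, the total staked is x(2^(k+1) - 1) and the winning bet pays back x·2^(k+1), so the net is always exactly x; only the amount put at risk along the way varies.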

Jay R
2013-04-05, 11:12 AM
A casino has a game where a coin is flipped. You must pay any positive, real amount of money to play. If the coin is heads, you get double what you paid. If the coin is tails, you lose what you paid.

1) Is this game fair?
2) Is there a winning strategy?
3) *will be edited in when someone gets the first two*

The game is fair, in the sense that each individual bet has expected value equal to zero.

There is only a winning strategy if the casino has infinite money, the gambler has infinite money, and there are infinite goods in the universe (so infinite money has any meaning).

Assume either the casino or the gambler has a limit to the money they can bet. This limit could be their net worth, or any lesser amount beyond which they will not gamble.

Assume the lower of these two limits is 2 to the power n. Then the strategy of doubling the bet after each loss, until a single win wipes out all previous losses plus one dollar of profit, can involve at most n bets (of 1, 2, 4, ..., 2^(n-1) dollars, totalling 2^n - 1). So the outcomes are:

Lose 2^n - 1 dollars: Probability = 1/(2^n)
Win 1 dollar: Probability = 1 - 1/(2^n)

Total expected value = 0.

It's a fair game, but I wouldn't risk any chance of losing 1 million dollars just for a near-certain chance to win one dollar.
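The capped outcome table can be checked exactly with rational arithmetic. A sketch assuming, as above, a bankroll of 2^n dollars and therefore at most n doubled bets (the function name is mine):

```python
from fractions import Fraction

# Exact outcome distribution for the doubling strategy with a bankroll of
# 2**n dollars, i.e. at most n bets of 1, 2, ..., 2**(n-1).

def capped_martingale(n: int):
    """Return [(payoff, probability), ...] for the capped doubling strategy."""
    p_bust = Fraction(1, 2**n)       # lose all n flips in a row
    max_loss = 2**n - 1              # 1 + 2 + ... + 2**(n-1)
    return [(-max_loss, p_bust), (1, 1 - p_bust)]

for n in (3, 10, 20):
    outcomes = capped_martingale(n)
    ev = sum(v * p for v, p in outcomes)
    print(f"n={n}: {outcomes}, expected value = {ev}")
# expected value is exactly 0 for every n: fair, but with a real chance of ruin
```

Using `Fraction` instead of floats keeps the expected value exactly zero rather than zero-up-to-rounding.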

Teddy
2013-04-05, 11:26 AM
A casino has a game where a coin is flipped. You must pay any positive, real amount of money to play. If the coin is heads, you get double what you paid. If the coin is tails, you lose what you paid.

1) Is this game fair?
2) Is there a winning strategy?
3) *will be edited in when someone gets the first two*

1) Depends on your definition of fair. I can't quite tell what you're getting at, but the casino can always make the game profitable for itself by making you select a side first and then placing the coin with the selected side downwards.
2) Always bet on the side pointing upward when the coin is about to be tossed (this strategy assumes the casino isn't employing the strategy above). You should get somewhere close to 101% in return on the invested sum if you break it up enough to make chance statistically insignificant. So if you invest $1000 split over 1000 coin flips, you should walk home with $1010. The stock exchange should provide both greater excitement and returns than this game, in other words. :smallwink:

Astral Avenger
2013-04-06, 06:25 PM
A casino has a game where a coin is flipped. You must pay any positive, real amount of money to play. If the coin is heads, you get double what you paid. If the coin is tails, you lose what you paid.

The coin has equal probability of being heads or tails.
You can play multiple times.

1) Is this game fair?
2) Is there a winning strategy?
3) People have gotten 1 and 2 right, so, what is the average amount you need to win $X using the strategy that Goosefeather and Jay R explained?

Reposted for people to find the third question more easily.

Jay R
2013-04-07, 10:37 PM
3) People have gotten 1 and 2 right, so, what is the average amount you need to win $X using the strategy that Goosefeather and Jay R explained?

I repeat, "There is only a winning strategy if the casino has infinite money, the gambler has infinite money, and there are infinite goods in the universe (so infinite money has any meaning)."

To win $1, you need:
$1 to start the bidding,
$2 more with probability 1/2,
$4 more with probability 1/4,
...
$2^n with probability 1/(2^n)
...

The expected value of this is the sum of an infinite sequence of ones, which diverges to infinity.

If you have any finite amount of money less than 2^n for some n, then you have a 1/(2^n) chance of losing everything.

In short, this system wins $1 only on the assumption that you have so much money that winning $1 will mean nothing to you.
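That "sum of an infinite sequence of ones" can be made concrete: in round k you stake 2^k dollars and reach that round with probability 1/2^k, so each round contributes exactly $1 to the expected outlay. A quick Python check (the function name is mine):

```python
from fractions import Fraction

# Expected total amount wagered by the doubling strategy, truncated after a
# finite number of rounds: each round contributes stake * P(reach it) = $1.

def expected_stake(rounds: int) -> Fraction:
    """Expected total wagered over the first `rounds` rounds."""
    return sum(Fraction(2**k) * Fraction(1, 2**k) for k in range(rounds))

for rounds in (1, 10, 100):
    print(f"first {rounds} rounds: expected stake = ${expected_stake(rounds)}")
# the expected outlay grows without bound as the number of rounds grows
```

So the expected cost of guaranteeing that $1 win is unbounded, matching the conclusion above.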

Lorsa
2013-04-08, 05:13 AM
In short, this system wins $1 only on the assumption that you have so much money that winning $1 will mean nothing to you.

In theory, if you could borrow an infinite amount of money interest free, you might still care about $1, since the borrowed money isn't yours to keep. Or suppose the casino lets you run into negatives, knowing they'll just bill you afterwards. But gambling at a casino (or any gambling) is rather stupid unless you actually enjoy playing rather than winning, in which case you are paying for an experience.

Jay R
2013-04-08, 09:15 AM
In theory, if you could borrow an infinite amount of money interest free, you might still care about $1, since the borrowed money isn't yours to keep. Or suppose the casino lets you run into negatives, knowing they'll just bill you afterwards. But gambling at a casino (or any gambling) is rather stupid unless you actually enjoy playing rather than winning, in which case you are paying for an experience.

I promise you that if I could borrow an infinite amount of money interest free, I would not care about $1.

Also, no casino running this game could last a day if it let me run an infinite tab, because in that case, nobody would ever lose.

Gambling at a casino because you care about winning is only stupid because casinos are careful to make sure that every bet has a negative expected value. This bet has both an expected value of zero and an unbeatable winning strategy on the assumption of infinite money (or infinite credit).