

Editorial of Guess it Right and Manhattan Rectangle (Feb Long Challenge)


Consider a concrete case to understand the basic pattern in this problem. Suppose Chef has 5 moves, $N = 10$ and $K = 10$, so he has 15 boxes in front of him, and 10 boxes are added after every choice Chef makes. To describe Chef's moves I will use 0 if he guesses a box and 1 if he removes $xK$ boxes (where $xK < N$). The optimal string is 1010101010: with any other move pattern he ends up increasing the number of boxes (the denominator) so much that the probability decreases drastically. So Chef always guesses a box right after removing boxes, whenever he can.

If you compute the probability as explained above, you will see the answer is the sum of a geometric progression (GP), excluding the first move in some cases where $N < K$ at the start, and the last move in some cases where he cannot remove boxes and must guess from $N + K$ boxes. We can calculate the GP sum in $O(\log m)$ time, so the overall time complexity is $O(T \log m)$.
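The GP sum can be evaluated in $O(\log m)$ with fast modular exponentiation. Here is a minimal Python sketch, assuming the answer is reported modulo $10^9+7$ (the exact modulus, and how the ratio $r$ and the number of terms $g$ come out of $N$, $K$ and the move count, depend on the problem statement):

```python
MOD = 10**9 + 7  # assumed modulus; check the problem statement

def gp_sum(r, g, mod=MOD):
    """Sum of the geometric progression 1 + r + r^2 + ... + r^(g-1) (mod mod),
    computed in O(log g) via fast exponentiation."""
    r %= mod
    if r == 1:
        return g % mod
    num = (pow(r, g, mod) - 1) % mod
    # modular inverse of (r - 1) by Fermat's little theorem (mod is prime)
    den = pow(r - 1, mod - 2, mod)
    return num * den % mod
```

For example, `gp_sum(2, 5)` evaluates $1+2+4+8+16 = 31$.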

Now the next question, Manhattan Rectangle, which is much easier. Let $M = 10^9$. First I asked 4 queries from $(0,0)$, $(0,M)$, $(M,0)$, $(M,M)$. The four equations are:

$d_1 = x_1 + y_1$
$d_2 = x_1 + M - y_2$
$d_3 = M - x_2 + y_1$
$d_4 = M - x_2 + M - y_2$

Here $(x_1, y_1)$ is the vertex of the rectangle nearest to the origin and $(x_2, y_2)$ is its upper-right vertex. Then $k = 0.5(y_1+y_2) = 0.5(d_1 - d_2 + M)$.

Now ask a query at $(0, k)$, which gives you the distance $x_1$: since $0.5(y_1+y_2)$ is the $y$-coordinate of the midpoint of the rectangle's left vertical side, the nearest point of the rectangle to $(0, k)$ is $(x_1, 0.5(y_1+y_2))$, and the distance to it is $x_1$. Now substitute $x_1$ into the first 4 equations and you will get the other variables.
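The five-query strategy can be checked against a local simulator. This is only a sketch: `make_oracle` is a hypothetical stand-in for the judge's interactor, assumed to answer each query with the Manhattan distance from the query point to the nearest point of the hidden rectangle.

```python
M = 10**9  # coordinate bound used in the problem

def make_oracle(x1, y1, x2, y2):
    """Hidden rectangle [x1,x2] x [y1,y2]; answers queries with the
    Manhattan distance to its nearest point (assumed judge behaviour)."""
    def query(x, y):
        dx = max(x1 - x, x - x2, 0)
        dy = max(y1 - y, y - y2, 0)
        return dx + dy
    return query

def find_rectangle(query):
    d1 = query(0, 0)  # x1 + y1
    d2 = query(0, M)  # x1 + M - y2
    d3 = query(M, 0)  # M - x2 + y1
    d4 = query(M, M)  # 2M - x2 - y2; not needed below, usable as a sanity check
    # k = (y1 + y2) / 2, the mid-height of the left side; the floor from
    # integer division still lands between y1 and y2, so the query returns x1.
    k = (d1 - d2 + M) // 2
    x1 = query(0, k)  # fifth query: horizontal distance to the left side
    y1 = d1 - x1
    y2 = x1 + M - d2
    x2 = M - (d3 - y1)
    return x1, y1, x2, y2
```

For instance, `find_rectangle(make_oracle(3, 5, 7, 11))` recovers `(3, 5, 7, 11)`.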

asked 13 Feb, 16:18 by faizan2700

edited 13 Feb, 16:29

Hi @faizan2700, what do you mean by "excluding the first [move] in some cases" in the Guess It Right explanation? We know that N < K for all test cases; it's a given constraint, and on the last move we will definitely have to choose a box. Also, can you explain in detail how the GP sum comes into play?


answered 14 Feb, 02:28 by pskrunner14

Hi all,

here is a solution given by rajput1999.


answered 14 Feb, 18:00 by himalaya7

edited 14 Feb, 18:05

You are awesome!


answered 14 Feb, 19:28 by pool123

> the optimal string is 1010101010 to maximize the probability because if he chooses any other move pattern he will end up increasing the number of boxes or the denominator so much that the probability will decrease drastically.

But can you prove it? I solved the problem after convincing myself with some experiments but I couldn't prove the optimality of the moves.
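One way to run such experiments is an exact brute-force DP over (boxes, moves left). The recurrence below is only my reading of the rules as discussed in this thread, so treat the model as an assumption: a wrong guess keeps the guessed box in play and adds $K$ boxes ($n \to n+K$), and a remove move takes out $xK$ boxes for any $x$ with $xK < n$.

```python
from fractions import Fraction
from functools import lru_cache

K = 10  # boxes added after every move (example value)

@lru_cache(maxsize=None)
def best(n, m):
    """Maximum winning probability with n boxes and m moves left."""
    if m == 0:
        return Fraction(0)
    # Option 1: guess now; on a wrong guess K new boxes appear.
    res = Fraction(1, n) + Fraction(n - 1, n) * best(n + K, m - 1)
    # Option 2: spend the move removing x*K boxes (x*K < n).
    x = 1
    while n - x * K >= 1:
        res = max(res, best(n - x * K, m - 1))
        x += 1
    return res
```

Under this model, `best(20, 2)` picks remove-then-guess (probability 1/10) over two guesses in a row (49/600), matching the 10 pattern.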


answered 15 Feb, 20:44 by gil_vegliach

edited 15 Feb, 20:46

It's easy. Take a page and frame the equations. I proved that it's not optimal to make $2$ or more random guesses in a row if $K+N > 2N$, which translates to $K>N$. The maths required is simple equation framing, inequalities, and perhaps a quadratic depending on what result you want. Give it a try; if still not clear, ping someone.

(15 Feb, 21:56) vijju123

If you mean as in the editorial, I don't think the proof is complete as it doesn't consider the "tail probabilities" and I don't think applying induction is so straightforward.

(19 Feb, 16:01) gil_vegliach

Basically, if the probability of removing boxes and then taking a guess is greater than or equal to the probability of making $2$ guesses in a row, I removed boxes. This led to the condition $K>N$. I don't understand the "tail" probabilities, but we can see that after removing boxes, the other probabilities will only increase, not decrease, w.r.t. making $2$ guesses in a row.

(19 Feb, 16:54) vijju123

I get the general proof strategy. My issue is that when you guess you add very little probability but, at the same time, you make it very probable that you guess wrong and take the "tail probability" with one move less. Call $p(N+K, M)$ the probability with $N+K$ boxes and $M$ moves. With two guesses this becomes $1/(N+K) + ((N+K-1)/(N+K)) \cdot (1/(N+2K) + ((N+2K-1)/(N+2K)) \cdot p(N+3K, M-2))$. With one remove and one guess this becomes $1/N + ((N-1)/N) \cdot p(N+K, M-2)$. You want to prove the second is greater than the first. The editorial addresses the main part of the inequality but not the "tail + multipliers" part, imho.

(19 Feb, 20:02) gil_vegliach