Help with the May Cook-Off BURGER problem

I was trying to upsolve this problem.

I came up with the following approach (which still seems correct to me):


Y needs to be divisible by X in order to reach the required sum at all.

Moreover, we need to find some n such that X * (pow(2, n) - 1) == Y (the sum of the GP), which implies pow(2, n) == 1 + Y / X. Of course, we can also split Y across several such terms.

That’s what I do: I compute int current_n = log2(1 + Y / X) (floored by default) and subtract the sum of this GP up to current_n terms, i.e. X * (pow(2, current_n) - 1), from Y, repeating until Y becomes 0.

I also insert each current_n into a set, to keep the set of burger-eating streak lengths unique, as asked in the problem.

The moment some current_n occurs a second time, we can’t take that length again; this is the greedy approach, and it has to be optimal!

The time complexity is logarithmic per test case, because each iteration subtracts the largest possible GP sum (at most Y), essentially removing one bit from Y.
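The loop described above can be sketched in pure integer arithmetic (the function name and failure convention are my own: it returns the chosen streak lengths, or an empty vector when Y is not divisible by X or when the greedy would reuse a length):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Sketch of the greedy described above, not a verified solution.
// Works on rem = Y / X, so each streak of length n contributes (2^n - 1).
vector<long long> greedyStreaks(long long X, long long Y) {
    if (Y % X != 0) return {};          // the sum is always a multiple of X
    long long rem = Y / X;
    set<long long> used;                // streak lengths taken so far
    vector<long long> lengths;
    while (rem > 0) {
        // largest n with 2^n - 1 <= rem, i.e. n = floor(log2(rem + 1)),
        // computed with a bit trick instead of double log2/pow
        long long n = 63 - __builtin_clzll((unsigned long long)(rem + 1));
        if (!used.insert(n).second) return {};  // length reused: greedy reports failure
        lengths.push_back(n);
        rem -= (1LL << n) - 1;
    }
    return lengths;
}
```

For example, greedyStreaks(3, 21) yields the single length 3 (3 + 6 + 12 = 21), while greedyStreaks(1, 2) fails because it would need length 1 twice.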



I even learnt about stress testing and tried it, comparing my solution against an accepted one to find the bug, but I think my tests were weak: I couldn’t find any counter case.
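Since stress testing came up: here is a hedged sketch of a small comparator (all names, bounds, and the brute force are my own assumptions; it checks only the greedy subtraction on rem = Y / X, under the assumption that distinct streak lengths n contribute (2^n - 1) each):

```cpp
#include <bits/stdc++.h>
using namespace std;

// The greedy from the post, in integer arithmetic (no double pow/log2).
bool greedyFeasible(long long rem) {
    set<long long> used;
    while (rem > 0) {
        long long n = 63 - __builtin_clzll((unsigned long long)(rem + 1));
        if (!used.insert(n).second) return false;  // length reused
        rem -= (1LL << n) - 1;
    }
    return true;
}

// Exhaustive check over every subset of lengths {1..maxN};
// only viable for small rem.
bool bruteFeasible(long long rem, int maxN = 10) {
    for (int mask = 1; mask < (1 << maxN); ++mask) {
        long long s = 0;
        for (int n = 1; n <= maxN; ++n)
            if (mask >> (n - 1) & 1) s += (1LL << n) - 1;
        if (s == rem) return true;
    }
    return rem == 0;
}

// First rem in [lo, hi] where the two disagree, or -1 if none.
long long firstMismatch(long long lo, long long hi) {
    for (long long rem = lo; rem <= hi; ++rem)
        if (greedyFeasible(rem) != bruteFeasible(rem)) return rem;
    return -1;
}
```

An exhaustive sweep like firstMismatch(1, 500) is a stronger test than random pairs on small values. If it finds nothing, it may also be worth probing very large Y separately, since double-based log2/pow can round the exponent incorrectly near 64-bit limits.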

Any help would be appreciated: please point out a counter test, or the bug in the code / approach!

Thanks in advance!