 # Why TLE in Snakeeat?

I used binary search and sorting in an efficient manner in the Snakeeat problem.

How can I optimize my code further?

From the looks of it, it seems you are trying this:

1. For each query, find the critical index after which all snakes are >= k.
2. From that index, traverse backwards, making the snake just before the critical index (of length < k) eat the required number of snakes to grow to length k.
3. Repeat step 2 until all snakes are exhausted.

This will cause a TLE. How will you cope with test cases where the length of ALL snakes is k-1? Your step 1 will provide no optimization, and step 2 will be repeated ~10^5 / 2 times. So for 10^5 queries, this makes ~5 x 10^9 operations for a single test case. Remember that T can range from 1 to 5, so the operation count will be even higher.

Try to reduce the O(NQ) worst case to O(Q logN). I did it via an additional use of the binary search concept. The problem is to be appreciated, because it is rare to find such an apt problem, one that doesn't yield to "incorrect concept but fine optimisation". Using the correct concept will immediately net you an AC and make this problem trivial. In a way, it tested implementation skills a lot.

People have ways to reduce NQ to QlogN. I don't know anything about trees or such, so I'm not sure if the standard algorithms help here (help from other users is requested here).

I used the binary search concept twice for this. Take pen and paper and do some maths! You will get the concept really quickly; it's nothing exceptional! (If you want to solve it like I did.)

[Well, yes, most of this answer is dedicated to telling you why you got a TLE. I don't think my method was one of the 'expected methods', so I think it's better to wait for a better answer on the code-optimization part.]


I used Collections.sort for sorting, which takes O(N logN) time, stored the lengths of the snakes in a TreeSet, and used TreeSet.floor() to get the largest number <= the required length.
I guess in the worst case my solution goes to O(Q*Q), but I do have a termination condition that should take care of most cases.
Can someone look over my code and suggest what I could have done better with more or less the same logic?
Thank you. https://www.codechef.com/viewsolution/13752798

@vivek96 Modify the binary search a little.
Preprocess the sums of the array's elements in reverse order (suffix sums). Let the array of sums be Sum.

1. Find the critical index i at which arr[i] >= k. (Hint: use a built-in search function such as lower_bound() in C++.)
2. Now, for the remaining elements [0, i - 1], search for the index j such that the required number of snakes, (i - 1 - j), reach length >= k. Remember that these (i - 1 - j) snakes are going to feed on snakes of smaller lengths.

Find the j that satisfies k * (i - 1 - j) <= Sum[j] + j.


@vivek96 Use a HashMap to store the queries which have already been answered; if a query is repeated, produce the result from the HashMap. For example, if one query has k = 5 and later there is another query with k = 5, you can give the answer immediately.


@sonu_628 A HashMap is not useful in this case, as K can vary over [0, 1e9] and the queries may all be distinct.

@devilhector If the values of K are distinct, the time complexity reduces automatically, since the length of each linear search shrinks each time. The time cost comes from repeated values of K. There are solutions that passed the tests with a HashMap using this same logic.

Can anyone tell me a test case for which my code fails? I used sorting and binary search, and I tried all the combinations of test cases I could think of; all of them pass. I got a WA for this code - https://www.codechef.com/viewsolution/13767521 . Any help is welcome. Thanks.

I also used binary search to find the index of the element if it exists, and then searched backwards, i.e. slid the window accordingly, since we know that if a[i] is not able to eat enough snakes, then neither is a[i-1]. If the element is found, the answer is the number of elements to its left (including itself) plus the number of snakes that can achieve the length. And in the case where all elements are equal and the query equals that value, why do we need to sort and binary search at all? Am I right?

For the case where no snake of length >= k exists, your optimisation would fail. You would then linearly traverse at most half the array, which might give you TLE.

I did not sort again; one sort was enough. After that I used the binary search concept twice: once to find the critical index (after which all snakes are >= k), and once to calculate the number of snakes that will reach length k after eating other snakes.

I also sorted the array only once, and I binary search for each query. Now tell me, how can I modify my code?

Is the way I check the eating of snakes the right one?

To check that, can you please give a more elaborate description of what you did? The while loops after the binary search are confusing me a bit. I would appreciate comments/details on them.

BTW, you can see my code here:

https://www.codechef.com/viewsolution/13691858

OK. I check whether k is found in the array or not. If it is not found, I search backwards, i.e. slide the window accordingly (using start and end pointers), since we know that if a[i] is not able to eat enough snakes, then neither is a[i-1]. If k is found in the array, the answer is the number of elements to its left (including itself) plus the number of snakes that can achieve length k (checked with the start/end sliding-window approach).

Did you use formulae?

Yes, I used formulae. Sliding past the elements, as I said, can go up to half the array length, and that is enough to cause TLE in the worst case.

Why the floor value? And what is the use of TreeMap in this question?

TreeSet has a function floor(n) that returns the greatest element less than or equal to n. This helps me find the largest element that is less than or equal to the required length.

It gives me a NullPointerException when I try to run your code.