I solved the two small cases like this:

**My solution for case 1 (small case):** simply iterate over the dishes and find the sublist with the maximum sum. (You can do it in O(n) using Kadane's algorithm or one of the solutions mentioned here: https://cp-algorithms.com/others/maximum_average_segment.html; I used algorithm #2.)
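For concreteness, here is a minimal sketch of Kadane's algorithm in C++ (illustrative only; the function name `maxSubarraySum` is mine, not from my submission):

```cpp
#include <algorithm>
#include <vector>

// Kadane's algorithm: maximum sum over all non-empty contiguous sublists.
// Assumes the input has at least one element; if every value is negative,
// the answer is the largest single element.
long long maxSubarraySum(const std::vector<long long>& a) {
    long long best = a[0], cur = a[0];
    for (std::size_t i = 1; i < a.size(); ++i) {
        cur = std::max(a[i], cur + a[i]);  // extend the current run or restart at i
        best = std::max(best, cur);
    }
    return best;
}
```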

**My solution for case 2 (large but positive):** since all values are positive, you'd always want to pick the entire range of dishes. So we only need to find which dishes have `c` in `[c-k, c+k]` and sum up their `d` (and we need to do this in `O(log(n))` time).

- To find the sum of dishes in the range `[c-k, c+k]`, we take the sum of dishes with `c <= c+k` and subtract the sum of dishes with `c <= c-k-1`.
- So we just need the sum of `d` over dishes with `c` less than or equal to a given `C`. To do that, we keep all dishes in a binary search tree (sorted by their `c`), and at each node we store the sum of `d` over all dishes in that subtree. Any time you add a dish, as you walk down from the root to find the place for the new node, add its `d` to every node you visit. Then you can query the sum of `d` over dishes with `c <= C` in `O(log(n))` time. (BTW, I got lucky and got this problem accepted without having to balance the BST; they could have included a sorted test case, where my solution's queries would degrade to `O(n)`.)
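The augmented-BST idea above can be sketched like this (an unbalanced tree, as in my solution; the names `Node`, `insert`, `sumUpTo`, and `sumInRange` are my own for illustration):

```cpp
#include <cstddef>

// Unbalanced BST keyed by cost c; every node also stores the total d of
// its entire subtree, which makes prefix-sum queries by cost possible.
struct Node {
    long long c, d;      // key (cost) and this dish's deliciousness
    long long subtreeD;  // sum of d over the whole subtree rooted here
    Node *left = nullptr, *right = nullptr;
    Node(long long c_, long long d_) : c(c_), d(d_), subtreeD(d_) {}
};

// Insert a dish; add its d to every node on the root-to-leaf path,
// so subtree sums stay correct without a second pass.
Node* insert(Node* root, long long c, long long d) {
    if (!root) return new Node(c, d);
    root->subtreeD += d;
    if (c < root->c) root->left  = insert(root->left, c, d);
    else             root->right = insert(root->right, c, d);
    return root;
}

// Sum of d over all dishes with cost <= C, walking one root-to-leaf path.
long long sumUpTo(Node* root, long long C) {
    long long total = 0;
    while (root) {
        if (root->c <= C) {
            // This node and its entire left subtree qualify.
            total += root->d + (root->left ? root->left->subtreeD : 0);
            root = root->right;
        } else {
            root = root->left;
        }
    }
    return total;
}

// Sum of d over dishes with cost in [lo, hi], via the subtraction trick.
long long sumInRange(Node* root, long long lo, long long hi) {
    return sumUpTo(root, hi) - sumUpTo(root, lo - 1);
}
```

As noted, inserting keys in sorted order degenerates this tree into a linked list, so each query becomes `O(n)`; a balanced BST (or a Fenwick tree over compressed costs) avoids that.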

my code: https://www.codechef.com/viewsolution/21981982

I would love to know how to solve the hard case.