Tags: Dynamic Programming, Recursion with memoization.
Tester: Murali, Kaustav
This is just an extension of Fibonacci sequence with extra memory and different initial conditions.
How many ways can we get to N? We can reach N from N-1, N-2 and N-3 (those are the allowed step/jump sizes). If we denote the number of ways of reaching N by F(N), then
F(N) = F(N-1) + F(N-2) + F(N-3)
All we need to do now is initialize the values of F(1), F(2) and F(3) (these are given in the examples) and implement a straightforward recursive function.
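A minimal sketch of that naive recursion. The base values below (for jumps of size 1, 2, 3) are an assumption; substitute the initial conditions given in the problem's examples.

```python
def ways(n):
    """Number of ways to reach step n with jumps of 1, 2 or 3 (naive recursion)."""
    # Assumed initial conditions -- replace with the ones from the examples:
    if n == 1:
        return 1  # 1
    if n == 2:
        return 2  # 1+1, 2
    if n == 3:
        return 4  # 1+1+1, 1+2, 2+1, 3
    # Recurrence from the editorial: F(N) = F(N-1) + F(N-2) + F(N-3)
    return ways(n - 1) + ways(n - 2) + ways(n - 3)
```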
Let’s examine the order of the algorithm: each call to F makes 3 recursive calls, each of which makes another 3, and so on down to the base cases. The recursion tree therefore has on the order of 3^N nodes, so computing F(N) this way is asymptotically O(3^N).
For SubTask1, N = 10 and T = 10, and 3^10 ~= 6x10^4.
So the total work is about 6x10^5 < 10^7 operations (you can do roughly 10^7 calculations per second).
This is good enough to pass our time limit of 1s.
For SubTask2, N = 1000. 3^1000 is an astronomically large number, so the naive recursion is nowhere near good enough.
But notice that to calculate F(i), we recompute F(i-1), F(i-2) and F(i-3) from scratch, and each of those recomputes its own subproblems, again and again. This repeated work is unnecessary. Let’s just cache these values.
Take an array of size N and store F(i) as soon as it is computed; whenever a value is requested again, return it from the array instead of recursing.
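The cached version, sketched with a dictionary as the memo (an array of size N works identically). Base values are the same assumed initial conditions as before; for N near 1000 you may also need to raise Python's recursion limit with sys.setrecursionlimit.

```python
def ways_memo(n, memo=None):
    """Number of ways to reach step n, memoized: each F(i) is computed once."""
    if memo is None:
        memo = {1: 1, 2: 2, 3: 4}  # assumed initial conditions from the examples
    if n not in memo:
        # Each F(i) is filled in exactly once, so total work is O(N).
        memo[n] = ways_memo(n - 1, memo) + ways_memo(n - 2, memo) + ways_memo(n - 3, memo)
    return memo[n]
```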
This is a linear-time algorithm, so the total complexity is N*T = 1000*1000 = 10^6.
Now for SubTask3, we have upper bounds of N = 10^6 and T = 10^6. Even with the linear algorithm, the total work is 10^6 * 10^6 = 10^12. Clearly TLE! Also, a recursive solution may run out of stack space depending on the environment.
But we don’t really need to recompute F(N) for every test case. We can compute F(1) through F(1000000) once, store them in an array, and answer each query with a plain lookup. The complexity is 10^6 (precomputation) + 10^6 (lookups) = O(10^6).
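A sketch of the precompute-then-lookup approach, built iteratively so there is no recursion stack to worry about. The base values are the same assumed initial conditions, and the modulus is an assumption too: problems with N up to 10^6 typically ask for the count modulo a prime, since the exact values grow astronomically; drop it if exact values are required.

```python
def build_table(max_n, mod=10**9 + 7):
    """Precompute F(1..max_n) bottom-up; answer each test case by indexing."""
    f = [0] * (max_n + 1)
    f[1], f[2], f[3] = 1, 2, 4  # assumed initial conditions from the examples
    for i in range(4, max_n + 1):
        f[i] = (f[i - 1] + f[i - 2] + f[i - 3]) % mod  # mod is an assumption
    return f

# Usage: table = build_table(10**6), then each of the T queries is table[n].
```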