Complexity of Bytelandian gold coins?

What is the time and space complexity of my solution to Bytelandian gold coins? Problem link: COINS
My solution: solution.

Note: by XeY I mean X*10^Y.

I am storing the DP values in a map, and the program executes in less than a second for ‘num’ (input) up to 1e15.

Since the program isn’t giving a segmentation fault or exceeding the time limit, I guess the number of iterations it performs for inputs up to 1e15 is less than 4e8.

Can someone tell me the time complexity of the solution?

O(\log^2{n}). That’s because \lfloor\frac{\lfloor n/a \rfloor}{b}\rfloor = \lfloor \frac{n}{ab}\rfloor. If you don’t understand why the fact above proves it, you can ask me.
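
If you want to convince yourself of that floor identity empirically, here is a quick brute-force check (my own sketch, not part of any solution in this thread); integer division in C++ is floor division for non-negative operands:

```cpp
// Sanity check of the identity floor(floor(n/a)/b) == floor(n/(a*b))
// for the divisors 2, 3, 4 used by the coins recursion.
#include <bits/stdc++.h>
using namespace std;

int main()
{
    for (long long n = 0; n <= 1000000; ++n)
        for (long long a = 2; a <= 4; ++a)
            for (long long b = 2; b <= 4; ++b)
                assert((n / a) / b == n / (a * b));  // holds for every tested n
    cout << "identity verified for n up to 1e6\n";
    return 0;
}
```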


Let me try; I will ask if I have any doubts.

I proved it in the following way; correct me if I am wrong.
For numbers less than or equal to 1, we only need a constant number of operations.
\lfloor n/2^x \rfloor \ge 1 implies 2^x \le n, hence x = O(\log{n}); similarly for powers of 3.
Also \lfloor n/(2^x 3^y) \rfloor \ge 1, thus x + y\log_2{3} \le \log_2{n}, and the number of pairs (x, y) satisfying this is O(\log^2{n}). Since 4 = 2^2, we don’t need to handle it separately when counting the numbers the recursion reaches. Since we are using a map, the overall complexity is O(\log^2{n} \cdot \log(\log^2{n})).

Oops, I forgot about the map.
Our actual complexity is O(\log^2{n}\log{(\log^2{n})})
That extra factor is negligible anyway.
The actual proof is that there are O(\log{n}) values of the form \lfloor n/2^k\rfloor, O(\log{(n/3)}) values of the form \lfloor n/(3\cdot 2^k)\rfloor, and so on. The sum is O(\log^2{n}).
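
Spelling that sum out (my reading of the argument, grouping the states \lfloor n/(2^k 3^j)\rfloor by the power of 3):

\sum_{j=0}^{\lfloor\log_3{n}\rfloor}\left(\left\lfloor\log_2{\tfrac{n}{3^j}}\right\rfloor+1\right)\;\le\;\left(\log_3{n}+1\right)\left(\log_2{n}+1\right)\;=\;O(\log^2{n}).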

I guess it should be O(\log{n}\cdot\log{(\log{n})}). What is wrong in my proof?

How is the sum \log^2{n}? I ran the code to count the number of points it reaches for n = 64, and it’s 12, which is 2\log_2{(64)}. Hence the total number of points is O(\log{n}).

Code to find number of elements in map
#include<bits/stdc++.h>
#define ll long long
using namespace std;

map<ll,ll> dp;

// Maximum dollars obtainable from a coin of value num.
// dp memoises the answer for every state the recursion visits.
ll solve(ll num)
{
    if(num==0)
        return 0;

    if(dp[num]!=0)   // already computed (the answer for num >= 1 is always >= 1)
        return dp[num];

    // Either keep the coin as num dollars, or exchange it for coins of
    // value num/2, num/3 and num/4 and sell those optimally.
    if(num>(solve(num/4)+solve(num/3)+solve(num/2)))
        dp[num] = num;
    else
        dp[num] = dp[num/4]+dp[num/3]+dp[num/2];

    return dp[num];
}

int main()
{
    ll num;

    while(cin>>num)
    {
        dp.clear();
        cout<<solve(num)<<" ";
        cout<<dp.size()<<"\n";   // number of distinct states memoised for this num
        cout.flush();
    }

    return 0;
}
Output (n, then answer and map size):
1000000000 → 4243218150, 241
100000000  → 364906343, 191
10000000   → 29854363, 148
1000000    → 2566393, 107
100000     → 203708, 75
10000      → 16615, 48
1000       → 1370, 28
100        → 120, 14
10         → 10, 5
Plot of map.size() versus \log{(n)} (plot omitted).

Hence the total number of points is O(\log^2{n}) :upside_down_face:


What is wrong in my proof then?
Edit: you are right. I understand now what you meant by the sum.

I think O(\log^2{n}) is not a tight bound. What is the asymptotically tight bound for the number of points?

I think it is a very tight bound.

Let us compare my sum to O(\log^2{n}) (plot omitted).

As you can see, it levels out at a constant.
Comparing my function to the actual answer for 10^i, for i = 0 to 120 (table omitted):

The last column is the ratio of the actual answer to my bound. It fluctuates at the start and then slowly goes up to 97\%. I think O(\log^2{n}) is the correct number of points, but the time complexity will be worse because dividing big numbers also takes more time.
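
For anyone who wants to reproduce the comparison, here is a rough sketch (my own code, restricted to n \le 10^{18} so that plain 64-bit integers suffice; the table above went up to 10^{120} using big integers):

```cpp
// Sketch: compare the number of memoised states with the bound
// sum over j of (floor(log2(n/3^j)) + 1), for n = 10^i.
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

set<ll> seen;

void visit(ll n)                     // collect every state the recursion touches
{
    if (n == 0 || seen.count(n)) return;
    seen.insert(n);
    visit(n / 2); visit(n / 3); visit(n / 4);
}

ll boundSum(ll n)                    // sum over j of (floor(log2(n / 3^j)) + 1)
{
    ll s = 0;
    for (ll p = 1; p <= n; p *= 3)
    {
        ll q = n / p, bits = 0;
        while (q > 1) { q /= 2; ++bits; }
        s += bits + 1;
        if (p > n / 3) break;        // next power of 3 would exceed n
    }
    return s;
}

int main()
{
    ll n = 1;
    for (int i = 1; i <= 18; ++i)
    {
        n *= 10;
        seen.clear();
        visit(n);
        cout << "n=1e" << i << "  states=" << seen.size()
             << "  bound=" << boundSum(n)
             << "  ratio=" << double(seen.size()) / double(boundSum(n)) << "\n";
    }
    return 0;
}
```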


On implementing the memo array as a static array (this answer: CodeChef: Practical coding for everyone) instead of allocating it dynamically with the new keyword (this solution: CodeChef: Practical coding for everyone), I’m getting TLE. Can you please tell me why that is?

You shouldn’t be declaring an array of size 10^9 in the first place. But the compiler optimises it so you don’t notice.
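
One common alternative, in case it helps (just a sketch under my own assumptions, not what either linked solution does): memoise small arguments in a modest static array and fall back to a map for the large, sparse ones.

```cpp
// Sketch: hybrid memoisation for the coins recursion.
// Small arguments live in a flat array, large ones in a map.
// The cutoff LIM is an arbitrary choice, not taken from any linked solution.
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

const ll LIM = 1 << 20;      // ~1e6 entries (8 MB), instead of a 10^9-sized array
ll small_memo[LIM];          // 0 means "not computed yet"
map<ll, ll> big_memo;

ll solve(ll n)
{
    if (n < 12) return n;    // for n < 12, exchanging never beats keeping the coin
    if (n < LIM && small_memo[n]) return small_memo[n];
    if (n >= LIM) {
        auto it = big_memo.find(n);
        if (it != big_memo.end()) return it->second;
    }
    ll best = max(n, solve(n / 2) + solve(n / 3) + solve(n / 4));
    if (n < LIM) small_memo[n] = best; else big_memo[n] = best;
    return best;
}

int main()
{
    ll n;
    while (cin >> n) cout << solve(n) << "\n";
    return 0;
}
```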


thanks a lot… :smiley:

The graph suggests that it’s a tight bound. Do you know a mathematical way to prove that the sum is \Theta(\log^2{n})? It is O(\log^2{n}); we need to prove it is \Omega(\log^2{n}).
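
One possible lower-bound sketch (my own argument, not from this thread): consider only the divisors d = 2^x 3^y with d \le \sqrt{n}. There are \Theta(\log^2{n}) such pairs (x, y), every \lfloor n/d\rfloor is a state the recursion actually visits, and for two such divisors d_1 < d_2 the quotients are distinct, because

\frac{n}{d_1}-\frac{n}{d_2}=\frac{n\,(d_2-d_1)}{d_1 d_2}\ge\frac{n}{d_1 d_2}>1

(note d_1 d_2 < n since d_1 < d_2 \le \sqrt{n}), so \lfloor n/d_1\rfloor > \lfloor n/d_2\rfloor. Hence the number of distinct states is \Omega(\log^2{n}), and together with the upper bound it is \Theta(\log^2{n}).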