I solved this problem by two methods:
1) Using a dynamic array (the “new” keyword) - Solution
2) Using a vector - Solution
In the dynamic array solution, the execution time is 0.11 sec.
In the vector solution, the execution time is 0.03 sec.
Can anybody tell me why the execution time of the dynamic array solution is higher than that of the vector solution, even though both use dynamic memory allocation?
Yes. However, your two solutions are very different: vector::resize does not necessarily cause a reallocation (if the requested size is less than the current capacity). In fact, in that case, vector::resize will be \mathcal{O}(1). (Oops - this last bit was wrong; see below.)
No, it’s not a reallocation.
It gives the vector a size equal to the input, and the vector doesn’t need to be initialized with 0 since that happens by default - but then why is execution longer with the dynamic array?
If the requested size is less than the current capacity, then no reallocation occurs.
If the requested size is less than the current size, then vector::resize will be \mathcal{O}(1). (<-- Whoops - that’s not right either!)
Edit:
No, that’s not quite right, either: for a general vector<T>, resizing to a smaller size has to destroy the excess Ts. However, for ints, whose destruction is a no-op, I can imagine that a smart Standard Library implementation might skip the destruction and do the whole process in \mathcal{O}(1).
In general, no - but I would speculate that a vector of a primitive type, when asked to resize to less than its current size, would perform this in \mathcal{O}(1). I might be wrong about this, though.
Edit:
Whoo - based on a quick test, it looks like this speculation might be correct!
Compare the times taken when N is decreasing with when it is increasing:
#include <iostream>
#include <vector>
#include <numeric>

using namespace std;

int main()
{
    vector<int> v;

    for (int N = 1'000'000'000; N >= 1; N--)
    //for (int N = 1; N <= 1'000'000'000; N++)
    {
        v.resize(N);
        cout << "size: " << v.size() << endl;
    }
}
I am sure vector.resize() is done at every “t” and the whole vector is initialized to 0, because I am not clearing the vector (dp.clear()) for every “t”, and I still got AC.
Well check this CodeChef: Practical coding for everyone and CodeChef: Practical coding for everyone
Memory allocation is costly. If you reallocate memory for each t, it will take more time.
Always reuse the same memory if possible.
This is also cache-friendly, whereas it might not be when we reallocate again and again.
Yeah, it looks like the value of dp[i] for a given i doesn’t actually depend on the value of n for that testcase, based on the contents of the sol function. And your solution