PREREQUISITES: Greedy Algorithms
We have two arrays, A and B, each of size N. In one step we can choose any two indices l and r such that 1 ≤ l ≤ r ≤ N and increment B[i] by 1 for l ≤ i ≤ r.
Find the minimum number of steps required to make the two arrays identical.
There are several approaches to this problem. We will discuss the greedy strategy here; there is also a nice divide-and-conquer solution, which is left for the reader to explore.
Let's compute the difference array C, where C[i] = A[i] - B[i]. Note that since each step only increments B, the arrays can be made identical only if C[i] ≥ 0 for all i; we assume this holds.
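For instance (a made-up example), with A = [3, 4, 1] and B = [1, 1, 1] we get C = [2, 3, 0]: position 1 still needs 2 increments, position 2 needs 3, and position 3 needs none.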
Let's assume that we have somehow made the arrays identical for the first i-1 elements, and let the number of steps performed so far be STEPS. Now there are two cases to consider.
Case 1: C[i] ≤ C[i-1]
In this case, the number of times the ith element has to be incremented is at most the number of times the (i-1)th element has to be incremented. So whenever a range covers the (i-1)th element, we can greedily extend it to cover the ith element as well, spending no extra effort on the ith element. This implies there is no change in the value of STEPS.
Case 2: C[i] > C[i-1]
In this case, the number of times the ith element has to be incremented is greater than the number of times the (i-1)th element has to be incremented. To minimize the extra effort, we first extend every range covering the (i-1)th element to also cover the ith element; this accounts for C[i-1] of the required increments, leaving C[i] - C[i-1]. For this difference alone we must use ranges that start at the ith element, without any help from the (i-1)th element. This implies STEPS increases by C[i] - C[i-1].
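Putting the two cases together (and treating C[0] as 0, so the first element contributes C[1] steps), the answer is the sum of max(0, C[i] - C[i-1]) over all i. Below is a minimal C++ sketch of this greedy; the input format (N, then A, then B) is an assumption, and it presumes A[i] ≥ B[i] for all i.

#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;                      // assumed input format: N, then A, then B
    vector<long long> a(n), b(n);
    for (auto &x : a) cin >> x;
    for (auto &x : b) cin >> x;

    long long steps = 0, prev = 0; // prev = C[i-1], with C[0] treated as 0
    for (int i = 0; i < n; i++) {
        long long c = a[i] - b[i]; // C[i]: increments still needed at position i
        if (c > prev)              // Case 2: pay only for the extra C[i] - C[i-1]
            steps += c - prev;
        prev = c;                  // Case 1 (c <= prev) adds nothing
    }
    cout << steps << '\n';
    return 0;
}

On the example above (C = [2, 3, 0]) this outputs 2 + 1 + 0 = 3 steps, achievable with the ranges [1,2], [1,2], [2,2].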
Time Complexity: O(N)