AMR16H-Editorial

PROBLEM LINK:

Contest
Practice

Setter:
Tester:
Editorialist: Abhishek Pandey

DIFFICULTY:

Simple/Easy

PRE-REQUISITES:

Greedy , Arrays, Looping

PROBLEM:

Given an array with N elements, we have to maximize the total time taken to verify all transactions. The verification mechanism is as follows: we initially have two pointers, one at the start of the array and one at the end, each pointing to the array element at its position. The minimum of the two pointed-to elements is taken and verified, and the corresponding pointer is moved one step forward (backward for the pointer at the end of the array). On the addition of each new element, the entire chain is re-verified, and hence the contributions of earlier elements are counted again. We have to print the maximum possible verification time over all permutations of the given array (i.e. we may re-order/sort the array).

QUICK EXPLANATION:

By working through some small manual test cases, we observe that a greedy approach works here. It's best to put the transactions taking longer first, so that their contributions are counted again and again. Some of you must have already guessed that the optimal permutation of the array is {A_2}, {A_3}, ..., {A_n}, {A_1}, where {A_i} denotes the ith largest element of the array. This is because, of the two pointed-to elements, the minimum one is added, so {A_1} is always going to be added last.

Now we simply sort the array and sum up the contribution of every element. (Discussed below).

The complexity is O(N log N): N log N for sorting, plus O(N) for a single traversal of the array.

EXPLANATION:

The question is a little on the lengthier side, but it rewards the one who patiently reads it through.

Since this is easier to explain with numbers, let us take an example array-

A = [1,2,3,4,5,6]

Now, we have to find the maximum time taken for verification of all 6 transactions. Remember, the key part of the question is-
Once the new block has been added, the entire verified chain is verified again, spending time equal to the sum of the verification times of each block in the current verified chain.

It is easy to see that, the earlier an element is added, the greater will be its contribution in verification time. But what will exactly be its contribution?

-Contribution of ith added element in verification time-

So let's see how this mechanism works. Initially, the pointers point towards element 1 (start) and element 6 (end). The minimum of the two is 1, so 1 will be added; the time taken for its verification is 1. Then this 1 will be verified again when 2 is added, again when 3 is added, and so on till we add 6. It's easy to see that 1's verification time is counted a total of 6 times, after which 2 is added and its time is counted 5 times, the next added element's verification time is counted 4 times, and so on.

Formally stating, we observe that the ith added element’s verification time gets counted (N-i+1) times. (Eg- The contribution of first added element is counted N times)

[The above formula assumes 1-based indexing]

Order of adding elements to maximize time-

Now we know how many times the ith added element gets counted.

Also, we see that in the above array configuration, we are adding 1 first, then 2, then 3 and so on. We ought to ask ourselves one thing. Is it optimal? Will adding in this order yield the maximum time?

No! In fact, adding like that yields the minimum time. We already saw that earlier-added elements/transactions contribute more towards the total time. So naturally, we want to add the bigger ones first so that they get counted again and again and yield a greater time. (E.g. if 5 gets added first, it contributes 5*6=30 seconds to the verification time, while adding 1 first contributes only 1*6=6 seconds in total.)

Recall the mechanism again: of the two elements being pointed to, the minimum one is added. So it's obvious that 6 (i.e. the maximum element) is going to be added last and will contribute only once. Hence, the only way to maximize the time is to add the second largest element first, then the third largest, then the fourth largest, and so on, finally adding the largest element. This order of adding elements yields the maximum time (as shown above). It's not hard to see that this order of addition is possible for this permutation of array A-

A= [5,4,3,2,1,6]

Here the contribution of 5 is counted 6 times, that of 4 is counted 5 times, ..., and that of 1 is counted 2 times. Finally, the contribution of 6 is counted only once. This gives a total of 5*6 + 4*5 + 3*4 + 2*3 + 1*2 + 6 = 76 seconds.

IMPLEMENTATION

Now that our logic is clear, we can easily implement the solution. We sort the array first, so that elements are ordered by value. Then, starting from the second largest element, we keep adding each element's contribution; the contribution of the largest element is added separately at the end. Formally, the answer is calculated as-

Max Time = {A_2}*N + {A_3}*(N-1) + .... + {A_n}*2 + {A_1}, where {A_i} denotes the ith largest element.

There are multiple other ways to implement it as well. For example, one may use the sort function to only partially sort the array (from index 1 to index N-1, using 1-based indexing) and then count the contribution accordingly.

I would like to request anybody who has taken a different approach/implementation to share it here as well and let the community appreciate it :).

SOLUTION:

Editorialist

RELATED PROBLEMS AND CONCEPTS:

I had given some links and questions for practicing greedy in my other editorial (for the problem Mancunian Hoards Black Money), so you can check it out :).

There is another interesting concept I would like to discuss. The question mentions that there are 2 pointers pointing to elements of the array, and so on. This is actually a very nice and neat algorithm/trick known as the Two Pointer Algorithm. Some related problems, if you'd like to practice, are-

Blocked Website
Maximum Unique Segment

COMMON ERRORS:

Still getting WA? Make sure you haven't fallen into the common pitfalls given below-

1. Look at this code-

int arr[n];
....
long long int ans=arr[n-1];//Assume input + ascending sort is done
for(i=0;i<n-1;i++)
    ans+= arr[i]*(i+2);

Can someone spot what's wrong here? It's this line: ans+= arr[i]*(i+2);. If you declare arr[] and n as int, then during the multiplication the intermediate result is computed in int, and overflow occurs. This overflowed result then gets stored into your answer!

For more info, try reading about implicit and explicit type casting. The correct way is ans+=(long long int)arr[i]*(i+2); [The code snippet is for C++, but the concept is applicable to most languages.]