PROBLEM LINK:
Author: Sergey Nagin

DIFFICULTY: HARD

PREREQUISITES: Dynamic Programming and Maths.

PROBLEM: Two arrays are almost equal if the rank of the element at each index is the same in both arrays. Let F(P1, P2) denote the number of pairs (l, r) such that P1[l..r] is almost equal to P2[l..r] and P1[l..r] contains not more than E inversions. Your aim is to output the sum of F(P1, P2) over all pairs of permutations P1, P2 of n elements.

Observation: Sereja calls two arrays A and B of length n almost equal if for every i (1 <= i <= n), CA(A[i]) = CB(B[i]), where CX(x) equals the number of indices j (1 <= j <= n) such that X[j] < x. This is essentially saying that the relative order of elements in A and B must be the same. For example, (4, 5, 3) is almost equal to (2, 10, 1) because the order of elements in both is (2, 3, 1): the rank of the first element is 2 in both arrays, the rank of the second element is 3, and that of the third element is 1. In other words, CA(A[i]) returns the rank of the i-th element in array A. It is clear from the above that only the ranks of the elements matter, not the numbers themselves.

Explanation: Let's first consider a different problem (we will come back to the original problem later): count dp[i][j], the number of permutations of length i with exactly j inversions.

Base Case: dp[i][0] = 1
DP state: dp[i][j] = dp[i-1][j] + dp[i-1][j-1] + ... + dp[i-1][j-(i-1)]

Pseudo Code
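A minimal sketch of the recurrence above (my own Python, not the setter's code; the actual solution would also reduce everything modulo the problem's modulus, which is omitted here). Inserting element i into a permutation of length i-1 creates between 0 and i-1 new inversions, so the inner sum is a sliding window and can be evaluated with prefix sums of the previous row:

```python
def inversion_dp(n, max_inv):
    """dp[i][j] = number of permutations of length i with exactly j inversions.

    dp[i][j] = sum_{k=0}^{i-1} dp[i-1][j-k], computed in O(n * max_inv)
    via prefix sums of the previous row.
    """
    dp = [[0] * (max_inv + 1) for _ in range(n + 1)]
    dp[0][0] = 1
    for i in range(1, n + 1):
        # prefix[j] = dp[i-1][0] + ... + dp[i-1][j-1]
        prefix = [0] * (max_inv + 2)
        for j in range(max_inv + 1):
            prefix[j + 1] = prefix[j] + dp[i - 1][j]
        for j in range(max_inv + 1):
            lo = max(0, j - (i - 1))          # oldest term still in the window
            dp[i][j] = prefix[j + 1] - prefix[lo]
    return dp
```

For instance, `inversion_dp(4, 6)[3][1]` counts the two permutations of length 3 with exactly one inversion, (1, 3, 2) and (2, 1, 3).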
Let's get back to the original problem. Once you have calculated the above dp, the original problem is easy to solve if you remember the observations we made.

Let us first solve the problem for the case l = r. The number of inversions is 0 when l = r, so the total number of such cases is:

(C[n][1] * C[n][1]) * (F[n-1] * F[n-1]) * n = F[n] * F[n] * n

Reason: The total number of permutations is n!, so the total number of distinct pairs (P1, P2) is n! * n!, and in each pair we can choose any of the n positions as l.

Now let us solve the more general case r = l + d - 1, where d >= 1. The answer for this case is:

sum_dp[d][E'] * (C[n][d] * C[n][d]) * (F[n-d] * F[n-d]) * (n-d+1)

where E' = min(E, (d * (d-1))/2), because the maximum number of inversions in a subarray of length d is bounded by (d * (d-1))/2.

Reason: Choose any d elements for each of the two arrays (C[n][d] ways each), arrange the remaining elements in F[n-d] ways in each array, and choose the starting position l in (n-d+1) ways. We still need to arrange the d chosen elements so that their order is the same in both arrays; if we fix the arrangement in the first array, the arrangement in the second array is uniquely determined [proof is left as an exercise]. So we can choose the arrangement of the d elements in the first array in sum_dp[d][E'] ways [remember that sum_dp[d][E'] denotes the number of permutations of length d whose total number of inversions is at most E']. Now you can brute-force over d to solve the problem.

Pseudo Code
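The whole solution can be sketched as below (plain Python with my own naming, not the setter's code; the quadratic dp fill and big-integer arithmetic stand in for the prefix-sum and modular versions a real submission would need):

```python
from math import comb, factorial

def solve(n, E):
    """Sum of F(P1, P2) over all pairs of permutations P1, P2 of length n,
    i.e. the number of (P1, P2, l, r) with P1[l..r] almost equal to
    P2[l..r] and at most E inversions in P1[l..r]."""
    max_inv = n * (n - 1) // 2
    # dp[d][j] = number of permutations of length d with exactly j inversions
    dp = [[0] * (max_inv + 1) for _ in range(n + 1)]
    dp[0][0] = 1
    for i in range(1, n + 1):
        for j in range(max_inv + 1):
            # the new element creates k in 0..i-1 extra inversions
            dp[i][j] = sum(dp[i - 1][j - k] for k in range(min(i, j + 1)))
    total = 0
    for d in range(1, n + 1):
        e = min(E, d * (d - 1) // 2)   # a length-d window has <= d(d-1)/2 inversions
        sum_dp = sum(dp[d][:e + 1])    # shared order patterns with <= e inversions
        # choose d values for each array, order the rest, pick the start l
        total += (sum_dp * comb(n, d) ** 2
                  * factorial(n - d) ** 2 * (n - d + 1))
    return total
```

For n = 2, E = 0 this gives 9: eight single-element windows plus the one pair where both permutations are increasing.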
The get function is as defined above. Complexity: O(n^3), since the total number of inversions for a given n is O(n^2).
AUTHOR'S and TESTER'S SOLUTIONS:
asked 14 Jul '14, 15:03

Hi, a great question followed by a nice editorial. It would be great if you could tell me why exactly we need to take into account the number of starting positions and the number of ways of arranging the remaining elements. sum_dp[d][E] gives me the total number of permutations of length d that have at most E inversions. The total number of distinct pairs (l, r) that can be selected from two arrays (where r = l + d - 1) is basically the total number of ways of selecting d out of n elements, given by C[n][d] * C[n][d]. Won't the product of the above two suffice? Thanks and regards, LoneCoder answered 19 Jul '14, 19:06

Can we solve this problem if the limit on the number of inversions (E) were on the whole permutation rather than on the subarrays? answered 15 Jul '14, 19:56

The time limit was too strict here, I think... I got TLE. I replaced x % mod by if (x >= mod) x -= mod and got AC after the contest. Or maybe mod operations are simply too expensive!! answered 14 Jul '14, 17:53
Yes, it was tight, but there were 10 days for this and many people still solved it, so it's not like it was really hard to optimize.
(14 Jul '14, 18:06)
Well... @xellos0, yes, that's right... I should have started working on this problem earlier... my bad, I started just this morning :( ... never mind... this was a great question and a great contest!
(14 Jul '14, 18:17)
This is not the point; you shouldn't have to. Look at Topcoder for examples of quality in problem setting: if your solution meets the expected complexity criteria, it will never TLE on Topcoder...
(14 Jul '14, 22:37)
I agree. I had the right complexity, but I had to resubmit a bunch of times trying to optimize the constant factor in order to get my solution to pass. It just felt like this problem was artificially difficult, and would have been better as a medium with a less strict time limit.
(14 Jul '14, 22:39)
There's a difference between a contest that takes less than 2 hours with tiebreaking by time and one that takes 10 days with time irrelevant, you know. And of course, as I said it's just a medium problem with a bit strict time limit.
(14 Jul '14, 22:52)
To xellos0: did it ever happen to you that the problem you've been trying to solve had an asymptotically better solution than the one you came up with? Same here: if you have reasonably written code that gets TLE, you're not rushing to optimize the crap out of it; you're trying to find a better algorithm (which probably does not exist in this case...)
(14 Jul '14, 23:35)
If I run my code locally and see that it works in around 1.5 seconds, while the time limit is 1 second, then no, I wouldn't try to find a better algorithm. I'd try to get rid of obviously slow modulo operations and try to only count up to N(N-1)/2 inversions for each N and not 125000. Because that's what I did.
(14 Jul '14, 23:53)
This is when you know the time limits well; I found them very confusing in this contest: for example, the problem "Game of Numbers" states that the time limit is 1 sec, and yet the slowest accepted solution took over 30 seconds. When you see that, you don't trust the descriptions anymore, and you don't have a good reference. Also, it is pointless to compare a local run vs the test machine; they could be very different...
(15 Jul '14, 00:07)
By googling a bit, you can find out what the time shown in the scoreboard means. Your definition of very different seems to be very loose. Local runs can tell you if your algorithm has a chance of passing, and it's very unlikely to get a time more than 2 times slower or faster than when run on the test machine, if you're using a normal computer. And when nothing else works, you can run your code 30 times locally, take the running times' average, do the same for another code that got AC on another problem, and compute the expected time on the test machine. It just needs a bit of effort.
(15 Jul '14, 00:23)
I'll give you an example: the same problem I mentioned, "Game of Numbers". I was generating random test cases at double the size of the limits: 1 ≤ N ≤ 1000 (instead of 400), 1 ≤ Ai, Bi ≤ 10^9. No test case locally took more than 45 sec, and my solution got AC with about 26 seconds. And by the way, I have a 3-year-old laptop.
(15 Jul '14, 00:31)
Or not all test cases were random. We can assume how much an algorithm should take, but not how all test cases will look. Also, with 5 test cases that take 45 seconds, that runtime is the same as on your laptop.
(15 Jul '14, 00:38)
I meant I ran all MAXT = 10 consecutively in 45 sec. Anyway, if you believe that the problem setter did a great job on this one, up to the highest quality standards, I guess I'll leave you at that...
(15 Jul '14, 00:44)
Test cases as in files. I guess you believe you didn't solve this problem because of the problem setter's mistake, not yours. I guess I'll leave you at that... unless you want to pingpong more, then I'll be your guest :D
(15 Jul '14, 00:49)
Maybe after the next contest :D
(15 Jul '14, 00:53)

This was definitely not a problem to be labeled "hard", given the number of successful solutions. Medium-hard, at BEST. It was just changing the order of summation, a DP counting permutations with a given length and number of inversions, and binomial coefficients to finish it.
And that is exactly how I did it, and got TLE!!! http://www.codechef.com/viewsolution/4324780 Good job, problem setters!!! Shouldn't you allocate to a problem at least 10 times as much time as your baseline code needs??? Why not make n < 300 instead of 500? Wouldn't it still exclude brute-force solutions? Have you even tried submitting in other languages to see how they'd do?