PROBLEM LINK:
Author: Lalit Kundu

DIFFICULTY:
HARD

PREREQUISITES:
Max Flow and Maths

PROBLEM:
You are given two arrays denoted by A_{1}, A_{2}, ..., A_{N} and B_{1}, B_{2}, ..., B_{N}. Find the largest number of moves that can be performed.

EXPLANATION:
Let us first find the (i, j) pairs, each counted once, that satisfy B_{j} > A_{i} and GCD(A_{i}, B_{j}) != 1. Let us store the gcd of each such pair in a vector U. But wait, we are not done with the solution yet!

AUTHOR'S AND TESTER'S SOLUTIONS:
asked 14 Jul '14, 15:03

@mohit2104: The edge capacity of the resulting flow graph is 1, and Dinic's algorithm has a stronger time bound for networks with unit capacities: O(min(V^(2/3), E^(1/2)) * E). http://en.wikipedia.org/wiki/Dinic's_algorithm. I feel the complexity analysis should also be explained properly in the editorial! Also, the upper bound on the number of nodes can be reduced: source = 1; max number of nodes in (SET I + SET II) = 400*400; each element can be connected to at most 9 dummy nodes = 9 * 400 * 400; destination = 1. Total: 1 + 400*400 + 9*400*400 + 1. The same can be done for the number of edges! answered 14 Jul '14, 15:48
Ooh! I wish I had known that an hour earlier. I thought it was O(V^2 * E). Anyway, thanks. I will read about it :)
(14 Jul '14, 15:55)

The dummy nodes aren't necessary if you use a better implementation of max flow, because the number of distinct GCDs won't be as big as 400*400. I'm not sure about my analysis, but here it is: to achieve 400*400 distinct GCDs, the smallest distinct numbers would have to be powers of 2 (the smallest prime). So we'd need 400 different powers of two on each side; the biggest number on each side would be 2^400, way bigger than the 1e9 limit, and we would get at most (400*400)/2 pairs, since A[i] < B[j] (first set of pairs) or A[i] > B[j] (second set of pairs). Important: since I used only distinct GCDs as vertices for the U and V sets, the capacity from the source to each U vertex and from each V vertex to the sink was the number of pairs with that GCD (easily done using a hashtable/map), so the construction of the graph actually ran in O(N^2 log(N^2)). My solution got AC using edges from U directly to V and the push-relabel algorithm: 4252168 answered 14 Jul '14, 18:08

We can greatly reduce the graph's size by grouping pairs with the same gcd on each side. The graph then becomes: s to S1, where the capacities are the numbers of occurrences of each gcd in S1; S1 to S2 wherever gcd(s1, s2) > 1 (with s1 in S1, s2 in S2), with capacity equal to the number of occurrences of s1; and finally S2 to the sink, with capacities equal to the numbers of occurrences of each gcd in S2. If you run Dinic's on this graph instead of the original one, the max flow is nearly instant. I think this solution is conceptually easier than the prime-based solution. answered 14 Jul '14, 18:41

Exact same method, still TLE. I used Dinic's algorithm for max flow. I tried a few times but left it afterwards, as I was unsure about the running-time complexity of max flow. Even with the dummy nodes, my analysis was telling me it would be a TLE. How is the max-flow complexity decided? For me, the worst-case analysis of the above code would result in a TLE. answered 14 Jul '14, 15:15
@mohit2104 Oh, you should have tried the push-relabel algorithm. I was sure about the number of vertices, so when I TLE'd I changed the max flow from Dinic's to push-relabel. Unfortunately, here on CodeChef it happens a lot (problems that are solvable using max flow, but only if you use push-relabel). Push-relabel runs in O(V^3) in the worst case, but on random graphs the expected running time is much lower than that. From Stanford's notebook: it solves random problems with 10000 vertices and 1000000 edges in a few seconds, though it is possible to construct test cases that achieve the worst case.
(14 Jul '14, 17:41)

I used Ford-Fulkerson but got TLE in Java.. :( grmmphh!!! answered 24 Jul '14, 04:10
