# Editorial - TABLECOV

#1

Author: Kanstantsin Sokal
Tester: Pavel Sheftelevich
Editorialist: Pawel Kacprzak

### DIFFICULTY:

Medium-Hard

### PREREQUISITES:

Dilworth theorem, dynamic programming, flows for partial credit solution

### PROBLEM:

Given two integers $N, M$ and a two-dimensional array of non-negative integers $A[1 \ldots N][1 \ldots M]$, your task is to count the **minimum number** of **valid Ghariyal-paths** needed to cover array $A$ in such a way that at least $A[i][j]$ paths contain cell $(i, j)$. The definition of a valid Ghariyal-path is given in the problem statement. Intuitively, it is just a path from $A[1][1]$ to $A[N][M]$ which, in a single step, goes from the current cell either to the cell adjacent to it on the right or to the cell adjacent to it below. From now on, we will call a **Ghariyal-path** just a **valid path**.

### QUICK EXPLANATION:

Use a reduction to the maximum flow problem in order to solve the first two subtasks; a fast flow algorithm is needed to pass the second one.

### EXPLANATION:

### SUBTASKS 1 AND 2

We will use a graph approach in order to solve the first two subtasks.

Let’s think of the input array as a graph G with N \cdot M vertices corresponding to cells in the array. There is an edge between vertices v and u in G if and only if there exists a valid path on array A which goes from the cell corresponding to vertex v to the cell corresponding to vertex u in a single step. Notice that a vertex can have at most 2 incoming edges and also at most 2 outgoing edges. Finally, let’s call the vertex corresponding to A[1][1] a source and denote it as s. Similarly, let’s call the vertex corresponding to A[N][M] a sink and denote it as t. It sounds like flows are involved here, doesn’t it?

Next, we will transform the graph into the graph G' as follows:

First, for each v in G, we create two vertices v_1 and v_2. Next, for each edge (u, v) incoming to v in G, we add edge (u_2, v_1) incoming to v_1 in G'. Then, for each edge (v, u) outgoing from v in G, we add edge (v_2, u_1) outgoing from v_2 in G'. Finally, for each v in G, we add an edge (v_1, v_2) with demand equal to A[i][j], where (i, j) is the cell corresponding to v. We call this value a demand, because we demand A[i][j] valid paths going through this edge. For any other edge, we set the demand to 0, because the statement requires no particular number of paths going from one cell to another. For an edge (v, u) in G', let’s denote its demand as d(v, u).

Now the problem is reduced to finding the minimum number of valid paths in G' from s to t such that at least d(v, u) paths go through the edge (v, u). This problem is very well known and has many solutions and applications. It can be solved using a reduction to a maximum-flow problem. Please check the following resources for more details: here, here or here. You can find other references by searching for the phrase “maximum flow with edge demands”. Notice that in our problem we don’t need a maximum flow; we need to find the minimum one fulfilling all the demands. However, this can be easily simulated. One way to do it is to modify G' by assigning very big capacities to all existing edges, then adding a new source connected to the original source by an edge of capacity C, and a new sink with an incoming edge of capacity C from the original sink. Then we can use binary search to find the smallest possible C for which there exists a flow satisfying all demands in the modified graph G'.
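As a concrete sketch of this reduction, here is a minimal Python implementation. It is not the author’s or tester’s code and all function names are my own: it uses Edmonds-Karp as the max-flow routine, the standard excess-balancing reduction for edge demands (lower bounds), and a binary search over C, so it is only intended for small inputs like the first subtasks:

```python
from collections import deque, defaultdict

def max_flow(n, edges, s, t):
    # Edmonds-Karp: repeatedly augment along a shortest residual path.
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v, c in edges:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)          # reverse edge exists with residual capacity 0
    flow = 0
    while True:
        parent = {s: s}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[e] for e in path)
        for u, v in path:
            cap[(u, v)] -= aug
            cap[(v, u)] += aug
        flow += aug

def flow_of_value_exists(n, arcs, s, t, value):
    # arcs: (u, v, low, high). Standard reduction: route the mandatory
    # `low` units up front, balance the resulting excesses and deficits
    # through a super source S / super sink T, and check saturation.
    arcs = arcs + [(t, s, value, value)]   # close the circulation
    excess = [0] * n
    edges = []
    for u, v, low, high in arcs:
        edges.append((u, v, high - low))
        excess[v] += low
        excess[u] -= low
    S, T = n, n + 1
    need = 0
    for v in range(n):
        if excess[v] > 0:
            edges.append((S, v, excess[v]))
            need += excess[v]
        elif excess[v] < 0:
            edges.append((v, T, -excess[v]))
    return max_flow(n + 2, edges, S, T) == need

def min_paths_by_flow(A):
    # Build G': split every cell into v_in -> v_out with demand A[i][j].
    N, M = len(A), len(A[0])
    BIG = 10 ** 9
    v_in = lambda i, j: 2 * (i * M + j)
    v_out = lambda i, j: 2 * (i * M + j) + 1
    arcs = []
    for i in range(N):
        for j in range(M):
            arcs.append((v_in(i, j), v_out(i, j), A[i][j], BIG))
            if j + 1 < M:
                arcs.append((v_out(i, j), v_in(i, j + 1), 0, BIG))
            if i + 1 < N:
                arcs.append((v_out(i, j), v_in(i + 1, j), 0, BIG))
    s, t = v_in(0, 0), v_out(N - 1, M - 1)
    lo, hi = 0, sum(map(sum, A))   # sum of all demands always suffices
    while lo < hi:                 # binary search the smallest feasible C
        mid = (lo + hi) // 2
        if flow_of_value_exists(2 * N * M, arcs, s, t, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

print(min_paths_by_flow([[1, 2], [3, 1]]))  # 5 (= demand 2 + demand 3)
```

The binary search is valid because feasibility is monotone in C: with huge capacities, any feasible set of C paths can always be extended by one more s-t path.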

Depending on the max-flow algorithm used, a solution may pass only the first subtask or both of the first two. The list of some reasonable algorithms to use is here.

### SUBTASK 3

In the last subtask, $N$ can be as big as $1000$, so any solution based on flows is going to exceed the time limit. We need a cleverer approach here.

We will define a partial order on the cells of the table. Let’s define a relation \leq over the cells in such a way that for two, possibly equal, cells x = (i_1, j_1) and y = (i_2, j_2), we have x \leq y if and only if there is a valid path visiting both x and y, and visiting x not later than y.

Notice that \leq defines a partial order on the cells. In fact, \leq is:

1. Reflexive: x = y \Rightarrow x \leq y. If x and y are the same cell, there trivially exists a path visiting them both and visiting x not later than y.

2. Antisymmetric: x \neq y and x \leq y \Rightarrow \neg(y \leq x). If there is a path visiting cell x and after that visiting a different cell y, then there is no path visiting y before visiting x. This is true because each valid path only goes right or down at any point.

3. Transitive: x \leq y and y \leq z \Rightarrow x \leq z. If there is a path visiting x and visiting y not before x, and there is a path visiting y and visiting z not before y, then there is a path visiting x and visiting z not before x. This is also true for our valid paths defined over the table.
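On the array, comparability under \leq has a simple closed form: x \leq y exactly when y lies weakly below and to the right of x. A small illustrative helper (the function name is mine, for illustration only):

```python
def comparable(x, y):
    # Cells are (row, col) pairs. Two cells are comparable iff one is
    # weakly below and to the right of the other, i.e. some valid
    # right/down path visits both of them.
    (i1, j1), (i2, j2) = x, y
    return (i1 <= i2 and j1 <= j2) or (i2 <= i1 and j2 <= j1)

print(comparable((1, 2), (2, 1)))  # False: neither cell can reach the other
```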

In order to solve the problem using this approach, we need the famous Dilworth’s theorem, which states that:

In a finite partial order, the size of a maximum antichain is equal to the minimum number of chains needed to cover all elements of the partial order.

Here a chain is a set of elements that are mutually comparable; in our case, any valid path forms a single chain and vice versa. On the other hand, an antichain is a set of mutually incomparable elements; in our case, any set of mutually unreachable cells forms an antichain, and any antichain is formed by a set of mutually unreachable cells. In other words, there is a bijection between the chains of the \leq relation and the valid paths over the array, and a bijection between the antichains of the \leq relation and the sets of mutually unreachable cells in the array.

In order to use the theorem in the problem, we can transform array A a little bit. Let’s create a new array B where each cell A[i][j] is represented by A[i][j] cells B_{i,j,1}, B_{i,j,2}, \ldots, B_{i,j,A[i][j]}, without any assigned values, in such a way that two distinct cells x = (i_1, j_1) and y = (i_2, j_2) are in the relation x \leq y in array A if and only if cells B_{i_1, j_1, c} and B_{i_2, j_2, d} are in the relation \leq in table B for every integer c in the range [1, A[i_1][j_1]] and every integer d in the range [1, A[i_2][j_2]]. Moreover, any two cells B_{i, j, c} and B_{i, j, d} with c \neq d are not comparable in the relation \leq. This transformation lets us apply Dilworth’s theorem very directly. Notice that now each path covering cell (i, j) in the original table covers one cell B_{i, j, c} in table B for some value c.

Now, the theorem can be reformulated as follows:

The minimum number of paths to cover all the cells of array B equals the maximum cardinality of a set of mutually unreachable cells in the array B.

Using our transformation of array A to array B, the above version is equivalent to the one below, defined for table A:

The minimum number of paths covering each cell (i, j) at least A[i][j] times equals the maximum sum of values in a set of mutually unreachable cells in array A, where the maximum is taken over all such sets.
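On a tiny table, this reformulation can be sanity-checked by brute force, enumerating every set of mutually unreachable cells (the function name is mine, for illustration only):

```python
from itertools import combinations

def max_antichain_sum(A):
    # Exhaustive search (tiny tables only): best total value over sets
    # of mutually unreachable cells, i.e. no cell in the set is weakly
    # below and to the right of another.
    N, M = len(A), len(A[0])
    cells = [(i, j) for i in range(N) for j in range(M)]
    def unreachable(a, b):
        return (not (a[0] <= b[0] and a[1] <= b[1])
                and not (b[0] <= a[0] and b[1] <= a[1]))
    best = 0
    for r in range(1, len(cells) + 1):
        for sub in combinations(cells, r):
            if all(unreachable(a, b) for a, b in combinations(sub, 2)):
                best = max(best, sum(A[i][j] for i, j in sub))
    return best

print(max_antichain_sum([[1, 2], [3, 1]]))  # 5: cells (1,2) and (2,1)
```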

This observation leads us to a very simple and efficient solution. We just need to compute the greatest sum of values over sets of mutually unreachable cells in array A. This can be done by a simple dynamic programming approach:

for i = N down to 1:
    for j = 1 to M:
        dp[i][j] = max(dp[i + 1][j - 1] + A[i][j], max(dp[i][j - 1], dp[i + 1][j]))


The answer to the problem is the value dp[1][M].

What this dp does is either take the value A[i][j] into the antichain, in which case it can take neither the value A[i][j - 1] nor A[i + 1][j] into the antichain, or skip the value A[i][j], in which case it takes the better of the two subproblems dp[i][j - 1] and dp[i + 1][j].

The total time complexity of this approach is clearly O(N \cdot M).
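For completeness, here is the dynamic programming above as a runnable Python sketch (0-based arrays internally; the function name is my own):

```python
def min_paths(A):
    # Maximum-weight antichain: dp[i][j] = best sum over cells in
    # rows i..N and columns 1..j (1-based), as in the editorial.
    N, M = len(A), len(A[0])
    dp = [[0] * (M + 2) for _ in range(N + 2)]
    for i in range(N, 0, -1):          # i = N down to 1
        for j in range(1, M + 1):      # j = 1 to M
            dp[i][j] = max(dp[i + 1][j - 1] + A[i - 1][j - 1],
                           dp[i][j - 1], dp[i + 1][j])
    return dp[1][M]                    # whole table considered

print(min_paths([[1, 2], [3, 1]]))  # 5: the antichain {(1,2),(2,1)} sums 2+3
```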

### AUTHOR'S AND TESTER'S SOLUTIONS:

Author's solution can be found [here][333]. Tester's solution can be found [here][444].

#2

Alternate solution for first 2 subtasks
Let's build two arrays a[N][M] and b[N][M], where a[i][j] denotes the maximum value of array[i][k] over k >= j, and b[i][j] denotes the maximum value of array[k][j] over k >= i.

Now let's start traversing from (1, 1): go right to (x, y + 1) if a[x][y + 1] is positive, else go down to (x + 1, y). At the end, subtract 1 from each cell on the path and rebuild arrays a and b.

Naively implemented this is O(N * M * ANSWER), but it can be optimized: instead of subtracting 1 from each cell on the path, subtract the minimum value on the path from each cell of the path.
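A small Python sketch of this greedy (my own reconstruction of the post, not the linked solution): it recomputes the row-suffix maxima on the fly instead of maintaining arrays a and b, and subtracts at least 1 per peeled path so the loop always progresses even when the path's minimum is 0:

```python
def greedy_paths(A):
    # Peel one monotone path at a time, going right whenever the current
    # row still holds a positive cell ahead, otherwise going down.
    A = [row[:] for row in A]
    N, M = len(A), len(A[0])
    count = 0
    while any(v > 0 for row in A for v in row):
        x = y = 0
        path = [(0, 0)]
        while (x, y) != (N - 1, M - 1):
            if y == M - 1:
                x += 1                       # rightmost column: forced down
            elif x == N - 1:
                y += 1                       # bottom row: forced right
            elif max(A[x][y + 1:]) > 0:      # a[x][y+1] > 0 in the post
                y += 1
            else:
                x += 1
            path.append((x, y))
        # batched subtraction; at least 1 so progress is guaranteed
        dec = max(1, min(A[i][j] for i, j in path))
        count += dec
        for i, j in path:
            A[i][j] = max(0, A[i][j] - dec)
    return count

print(greedy_paths([[1, 2], [3, 1]]))  # 5, matching the DP answer
```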

https://www.codechef.com/viewsolution/9267034

#3

@pkacprzak Can’t access author’s solution! Could you please update the link!

#4

@sarvagya3943 It seems that it hasn’t been uploaded yet. However, it’s for sure the same as the one by the tester.