TELNET - Editorial

PROBLEM LINK:

Practice

Author: Vaibhav Tulsyan

Tester: tncks0121

DIFFICULTY:

Simple

PREREQUISITES:

Dynamic Programming

PROBLEM:

You are given a grid in which cell (i, j) has a value H_{i, j}. You need to find a sequence of cells (i_1, j_1), (i_2, j_2), \ldots, (i_k, j_k) such that H_{i_r, j_r} = r for all 1 \leq r \leq k, and the sum of Manhattan distances between (i_r, j_r) and (i_{r+1}, j_{r+1}) over 1 \leq r < k is minimized.

EXPLANATION:

Define f(i, j) as the minimum total distance of a valid sequence of cells with values 1, 2, \ldots, H_{i, j} that ends at cell (i, j). Then we can compute f with a recurrence. Let t = H_{i, j}.

  • If t = 1, f(i, j) = 0.
  • If t > 1, f(i, j) = \displaystyle \min_{(i', j'):H_{i',j'} = t - 1} (f(i', j') + |i - i'| + | j - j'| ).

That is, we iterate over all possible previous cells (i', j') — the cells whose value is t - 1 — and add the Manhattan distance from (i', j') to (i, j).

For an N \times N grid, the overall complexity is O(N^4): there are O(N^2) cells, and computing the f value of each one takes O(N^2) in the worst case.

The answer is \displaystyle \min_{(i, j) : H_{i, j} = k} f(i, j).
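
Below is a minimal C++ sketch of this DP. The input format (grid size N, target value k, then the grid row by row) and the output of -1 when no valid sequence exists are assumptions made for illustration, not taken from the problem statement.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n, k;
    cin >> n >> k;                       // assumed input format: N, k, then the N x N grid
    vector<vector<int>> H(n, vector<int>(n));
    vector<vector<pair<int, int>>> cells(k + 1);   // cells[t] = positions with value t
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            cin >> H[i][j];
            if (H[i][j] >= 1 && H[i][j] <= k)
                cells[H[i][j]].push_back({i, j});
        }

    const long long INF = LLONG_MAX / 4;
    vector<vector<long long>> f(n, vector<long long>(n, INF));

    // Base case: every cell with value 1 starts a sequence at cost 0.
    for (auto [i, j] : cells[1]) f[i][j] = 0;

    // Transition: f(i, j) = min over cells (i', j') with value t - 1
    //             of f(i', j') + |i - i'| + |j - j'|.
    for (int t = 2; t <= k; t++)
        for (auto [i, j] : cells[t])
            for (auto [pi, pj] : cells[t - 1])
                if (f[pi][pj] < INF)
                    f[i][j] = min(f[i][j], f[pi][pj] + abs(i - pi) + abs(j - pj));

    // Answer: the cheapest cell carrying the value k (assumed: print -1 if none is reachable).
    long long ans = INF;
    for (auto [i, j] : cells[k]) ans = min(ans, f[i][j]);
    cout << (ans == INF ? -1 : ans) << "\n";
    return 0;
}
```

Grouping cells by value means the transition for value t only scans cells holding t - 1, which matches the recurrence above; the worst case remains O(N^4) when most cells share two consecutive values.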

AUTHOR'S AND TESTER'S CODES:

The author’s code can be found here

The tester’s code can be found here