### PROBLEM LINK:

**Author:** Jingbo Shang

**Tester:** Kevin Atienza

**Editorialist:** Kevin Atienza

### DIFFICULTY:

Medium-hard

### PREREQUISITES:

Dynamic programming, bitmasks

### PROBLEM:

There is an H \times W grid. You are at (1,1), the bottom-left cell. There may be rocks in some cells, but never at (1,1).

The rocks fall in discrete time steps. Specifically, a rock at (x, y) will be at (x, y - 1) at the next second, and will disappear once y becomes 0. You may move left or right (or stay put) to avoid being hit by a rock. At every second, you first make your move, and then the rocks fall one cell down. You may only move to a cell within the grid if there is no rock at your destination cell both before and after the rocks fall.

How many different grids are there in which you can survive, if you always play optimally once the layout is revealed to you?

### QUICK EXPLANATION:

Let M be a subset of \{1,\ldots,W\}, and R be a *nonempty* subset of M (R and M can be represented with bitmasks). For 1 \le h \le H, define f(h,M,R) as the number of survivable h \times W grids such that in the topmost row, M is the set of cells not containing a rock, and R is the set of cells reachable from the start. The answer is clearly the sum of f(H,M,R) for all pairs (M,R). We can also express f(h,M,R) recursively by enumerating the possible sets M' for the second topmost row, and computing the new set R' based on R, M and M'. The base case is f(1,M,R) = 1 if R = \{1\} and 0 otherwise.

There are O(3^W H) valid arguments for f(h,M,R), and each one can be computed in O(2^W), so the overall running time is O(6^W H).

### EXPLANATION:

This problem strongly suggests a DP + bitmask approach.

The general idea of our solution is to generate the grid row by row, from bottom to top, and count the number of possibilities depending on the rows we've already placed. The hard part is determining how to count the possibilities given the choices we've already made. Intuitively, this works because the rocks fall one row at a time, which means we only need to check the current and next row to determine where we can go. (Although the optimal strategy itself may require looking ahead several rows.)

Since we’ll be building the grid row by row, it’s important to keep track of which cells don’t contain rocks in the current row. However, this is not enough! Consider for example the following case:

```
##.#
.#.#
.###
```

If you look at it two consecutive rows at a time, it *looks* like there is always a way to get from each row to the next, and thus that you can survive all the way to the last row. However, this view ignores the fact that the second free cell in the second row isn't reachable! Thus, we also need to keep track of which cells are *reachable*, in addition to which cells are free.

Throughout this algorithm, we fix the value W. For the current row, let M be the set of columns containing free cells, and R be the set of columns containing reachable cells. Clearly, R is a subset of M, and M is a subset of \{1, 2, \ldots, W\}.

Let's define f(h,M,R) as the number of survivable grids of height h whose topmost row has free-cell set M and reachable set R. Clearly, the answer is the sum of f(H,M,R) over all pairs (M,R) with R nonempty. We also have the following base case:

f(1,M,R) = 1 if R = \{1\}, and f(1,M,R) = 0 otherwise.

This is because only the cell (1,1) is initially reachable.

Now, we need to express f(h,M,R) in terms of a sum involving the values f(h-1,M',R').

Suppose M' and R' are the sets of columns containing free cells and reachable cells at row h-1, respectively, and M and R are the corresponding sets for row h. When is (M,R) "compatible" with (M',R'), in the sense that (M,R) is a valid new state given the previous state (M',R')? Well, R must be exactly the set of cells you can reach at row h, assuming you start from one of the cells in R', given the sets of free cells in the previous and current rows: M' and M respectively. In fact, given M, M' and R', the set R is uniquely determined: you can compute every cell reachable at row h directly from the rules of the problem.

Here's a more explicit formula: R is the union of three sets R_{-1}, R_0 and R_1, where:

R_{-1} = M \cap M' \cap (R'+1)

R_0 = M \cap M' \cap R'

R_1 = M \cap M' \cap (R'-1)

In other words:

R = M \cap M' \cap [(R'+1) \cup R' \cup (R'-1)]

where we define R'+c = \{x + c : x \in R'\}.

Intuitively, R_{-1} is the set of cells you can reach by moving one step to the right, while R_0 and R_1 correspond to staying put and moving one step to the left, respectively.
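If columns 1 to W are stored as bits 0 to W-1 of an integer, the whole computation of R collapses into a few bit operations. Here is a sketch (the function name and argument order are ours):

```python
def next_reachable(M, Mp, Rp, W):
    """Compute R = M ∩ M' ∩ [(R'+1) ∪ R' ∪ (R'-1)] as bitmasks,
    where Mp = M' and Rp = R'. Column c is bit c-1, so R'+1 is a
    left shift and R'-1 is a right shift."""
    full = (1 << W) - 1  # the bitmask of {1, 2, ..., W}
    return M & Mp & (((Rp << 1) | Rp | (Rp >> 1)) & full)
```

For example, with W = 2, M = M' = \{1,2\} and R' = \{1\}, both columns of the new row are reachable: `next_reachable(0b11, 0b11, 0b01, 2)` yields `0b11`.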

Armed with this, we now have the following recurrence:

f(h,M,R) = \sum_{(M',R')} f(h-1,M',R')

where the sum ranges over all pairs (M',R') such that R = M \cap M' \cap [(R'+1) \cup R' \cup (R'-1)].

This gives us an algorithm to compute f(h,M,R) for all pairs (M,R) given f(h-1,M',R') for all pairs (M',R'):

- Initialize all f(h,M,R) = 0.
- For all pairs (M',R') and sets M, let R = M \cap M' \cap [(R'+1) \cup R' \cup (R'-1)]. Then add the value f(h-1,M',R') to f(h,M,R).

Finally, we can now perform this for all values h up to H, and we can then extract the answer from the last set of f values!
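The whole procedure can be sketched in Python as follows, mapping columns 1 to W onto bits 0 to W-1 of an integer bitmask. (This is an illustrative sketch, not the author's reference solution; it returns the plain count, while the actual problem may ask for the answer modulo some number.)

```python
def count_survivable_grids(H, W):
    """Count the H x W rock layouts (cell (1,1) always free) that the
    player can survive, using the f(h, M, R) dynamic program."""
    FULL = (1 << W) - 1
    START = 1  # bitmask of {column 1}; column c is bit c-1

    # Base case: f(1, M, {1}) = 1 for every M containing column 1.
    f = {(M, START): 1 for M in range(FULL + 1) if M & START}

    for _ in range(H - 1):
        g = {}
        for (Mp, Rp), val in f.items():  # (M', R') of the row below
            # (R'+1) ∪ R' ∪ (R'-1) as a bitmask
            reach = ((Rp << 1) | Rp | (Rp >> 1)) & FULL
            for M in range(FULL + 1):    # choose the new row's free cells
                R = M & Mp & reach       # R is forced by M, M', R'
                if R:                    # R = ∅ means the player is dead
                    g[(M, R)] = g.get((M, R), 0) + val
        f = g

    # Sum f(H, M, R) over all pairs (M, R) with R nonempty.
    return sum(f.values())
```

As a sanity check, `count_survivable_grids(1, 2)` gives 2 (the cell (2,1) is free or not), and for W = 1 the answer is 1 for every H, since any rock in the single column would eventually fall on the player.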

How fast does this run? First, let’s count how many valid (M,R) pairs there are. Note that R must be a subset of M, and M a subset of \{1 \ldots W\}. Thus, there are 2^W possible values for M. Naïvely, we know that there are 2^W choices for R, so there are at most 4^W pairs overall.

But we can find the exact value! Suppose M has k elements. Then there are 2^k possible values for R, and {W \choose k} ways to choose the k elements of M itself. Thus, the total number of ways to choose (M,R) is:

\sum_{k=0}^{W} {W \choose k} 2^k

which by the binomial theorem we know to be equal to (2 + 1)^W = 3^W. Thus, there are 3^W pairs (M,R)!
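This identity is easy to verify numerically, for instance:

```python
from math import comb

# Check that sum_k C(W, k) * 2^k = 3^W for small W.
for W in range(12):
    assert sum(comb(W, k) * 2**k for k in range(W + 1)) == 3**W
```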

Now, back to the running time. Consider a single application of the step above to compute the values f(h,M,R) given the values f(h-1,M',R'). The slowest part is enumerating all pairs (M',R') and sets M, of which there are 3^W \cdot 2^W = 6^W possibilities. Assuming we can operate on subsets of \{1 \ldots W\} in O(1) time each, which is possible when the sets are implemented as *bitmasks*, the running time of a single step is O(6^W). Since we will do this H times, the overall running time is O(6^W H). This is not polynomial time, but W is small enough in the problem that this will pass the time limit.

As a final bit of trivia, we mention that it is possible to solve this problem in O(W4^WH) time. We’ll leave it to the reader to discover!

### Time Complexity:

O(6^W H) or O(W 4^W H)