### PROBLEM LINK:

**Author:** Hanlin Ren

**Tester:** Jingbo Shang

**Editorialist:** Hanlin Ren

### DIFFICULTY:

Cakewalk

### PREREQUISITES:

None

### PROBLEM:

A permutation $p$ is said to be *good* if $p_i \ne i$ for every $i \in [1, N]$. Find the lexicographically smallest *good* permutation of length $N$.

### QUICK EXPLANATION:

The answer is $(2, 1, 4, 3, 6, 5, \dots)$: split the identity permutation into blocks of length 2 and swap each block. When $N$ is odd, the last three elements $(N-2, N-1, N)$ become $(N-1, N, N-2)$ instead.

### EXPLANATION:

#### Guessing the answer

Let’s print the answer for small $n$’s. (When the input consists of only a single integer, it often helps to print the answers for small inputs and look for a pattern.)

| $n$ | the answer |
|---|---|
| 2 | 2, 1 |
| 3 | 2, 3, 1 |
| 4 | 2, 1, 4, 3 |
| 5 | 2, 1, 4, 5, 3 |
| 6 | 2, 1, 4, 3, 6, 5 |
| 7 | 2, 1, 4, 3, 6, 7, 5 |
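As a sanity check, the table above can be generated by exhaustive search over all permutations, which is fine for small $n$. A minimal sketch in Python (the helper name `smallest_good` is my own, not from the editorial):

```python
from itertools import permutations

def smallest_good(n):
    """Smallest good permutation of 1..n by exhaustive search (small n only)."""
    # itertools.permutations yields tuples in lexicographic order,
    # so the first good one found is the lexicographically smallest.
    for p in permutations(range(1, n + 1)):
        if all(p[i] != i + 1 for i in range(n)):  # p_i != i (1-indexed)
            return list(p)
    return None  # no good permutation exists (only happens for n = 1)

for n in range(2, 8):
    print(n, smallest_good(n))
```

Running this prints exactly the rows of the table above.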

Can you find the pattern?

For even $n$, we split the identity permutation $(1,2,\dots,n)$ into blocks of length 2 and swap the two elements of every block:

```
1 2 3 4 5 6
1 2|3 4|5 6
2 1|4 3|6 5
2 1 4 3 6 5
```

For odd $n$, things are slightly more complicated: the last block has length 3, and “swapping” this block changes $N-2, N-1, N$ into $N-1, N, N-2$.

```
1 2 3 4 5 6 7
1 2|3 4|5 6 7
2 1|4 3|6 7 5
2 1 4 3 6 7 5
```
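Both cases translate directly into code. Here is a short Python sketch of this construction (the function name `good_permutation` is my own choice):

```python
def good_permutation(n):
    """Lexicographically smallest good permutation of 1..n (assumes n >= 2)."""
    p = list(range(1, n + 1))
    # swap blocks of length 2: (1,2) -> (2,1), (3,4) -> (4,3), ...
    for i in range(0, n - 1, 2):
        p[i], p[i + 1] = p[i + 1], p[i]
    if n % 2 == 1:
        # the last block has length 3: change n-2, n-1, n into n-1, n, n-2
        p[n - 3:] = [n - 1, n, n - 2]
    return p
```

For example, `good_permutation(7)` gives `[2, 1, 4, 3, 6, 7, 5]`, matching the table.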

#### Why?

First notice that when $N \ge 2$, there is always a solution. (For $N = 1$ there is none, since $p_1$ would have to be 1.)

Also, to make things simpler, we assume that N\ge 4. When N < 4, we can use brute force to compute the answer.

When lexicographical order comes into a problem, it’s usually useful to think “position by position”. Let p be the permutation we want.

First, can p[1]=1? Obviously the answer is no.

Then, can p[1]=2? Yes! It’s easy to see that **there exists a good permutation that starts with 2** (e.g. $(2,3,4,\dots,N,1)$), so the lexicographically smallest one must start with 2.

Next we need to determine p[2]. Can p[2]=1? The answer is yes. Since $N-2 \ge 2$, there is a *good* permutation of $1 \sim (N-2)$, say $q$. Add 2 to every element of $q$ so it becomes a permutation of $3 \sim N$, and append it after $(2,1)$. This gives the permutation $(2,1,q[1]+2,q[2]+2,\dots,q[N-2]+2)$, which is obviously *good*. So p[2]=1. To make the whole permutation lexicographically smallest, we only need to make $q$ the smallest, which is a subproblem of size $N-2$. This is why the answer always begins with something like “$2\ 1\ 4\ 3\dots$”.
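The argument above gives the recurrence: the answer for $N$ is $(2, 1)$ followed by the answer for $N-2$ shifted up by 2, with sizes 2 and 3 as base cases. A hedged sketch (function name mine):

```python
def smallest_good_rec(n):
    """Recursive construction following the subproblem argument (n >= 2)."""
    if n == 2:
        return [2, 1]
    if n == 3:
        return [2, 3, 1]
    # prepend (2, 1), then solve the size-(n-2) subproblem on values 3..n
    return [2, 1] + [x + 2 for x in smallest_good_rec(n - 2)]
```

For example, `smallest_good_rec(7)` expands to `[2, 1]` + shifted `[2, 1, 4, 5, 3]`, i.e. `[2, 1, 4, 3, 6, 7, 5]`.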

### ALTERNATIVE SOLUTION:

There is a simple brute force solution that actually runs in O(N^2) time:

```
p = [array of length n, uninitialized]
u = [array of 0's] # u[x] = 1 iff x already appears in p[1..(i-1)]
dfs(i)
    if i > n
        output p and stop  # the first permutation found is the smallest
    for j = 1 to n
        if (not u[j]) and (j != i)
            p[i] = j
            u[j] = 1
            dfs(i+1)
            u[j] = 0
```

This brute force simply searches the positions one by one, and prunes the search tree as soon as something goes wrong. Why is it O(N^2)? The algorithm actually does the following:

- Try p[1]=1. It’s not valid. Then try p[1]=2.
- Try p[2]=1. It’s OK.
- Try p[3]=1,2,3, all invalid. Then p[3]=4.
- Try p[4]=1,2, invalid. Then p[4]=3.
- …

In step i, we enumerate O(i) invalid numbers. When n is even, no backtracking happens; when n is odd, the only backtrack happens when i=n-1. Therefore the complexity is O(1+2+\dots+n)=O(n^2).
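For concreteness, the pseudocode above can be turned into runnable Python along these lines (a sketch; it returns the first permutation found instead of “outputting and stopping”):

```python
def brute_force(n):
    """O(n^2) backtracking: returns the first (= lexicographically
    smallest) good permutation of 1..n found, or None."""
    p = [0] * (n + 1)          # p[1..n], index 0 unused
    used = [False] * (n + 1)   # used[x]: x already placed in p[1..i-1]

    def dfs(i):
        if i > n:
            return True        # all positions filled: answer found
        for j in range(1, n + 1):
            if not used[j] and j != i:
                p[i] = j
                used[j] = True
                if dfs(i + 1):
                    return True  # keep the first solution found
                used[j] = False
        return False

    return p[1:] if dfs(1) else None
```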

How could we improve this to O(n)? We simply maintain the smallest number J with u[J]=0; the i-th step then costs only O(1) enumerations (amortized), rather than O(i).

```
p = [array of length n, uninitialized]
u = [array of 0's] # u[x] = 1 iff x already appears in p[1..(i-1)]
J = 1 # smallest J with u[J] = 0
dfs(i)
    if i > n
        output p and stop
    for j = J to n
        if (not u[j]) and (j != i)
            p[i] = j
            u[j] = 1
            # maintain new J
            old_J = J
            while J <= n and u[J] == 1 do
                J = J + 1
            dfs(i+1)
            u[j] = 0
            J = old_J # restore J
```
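A runnable Python version of this optimized search, as a sketch (here J is passed down the recursion instead of being restored globally, which has the same effect):

```python
def brute_force_fast(n):
    """Backtracking that starts each scan at J, the smallest unused value."""
    p = [0] * (n + 1)          # p[1..n], index 0 unused
    used = [False] * (n + 1)   # used[x]: x already placed

    def dfs(i, J):
        if i > n:
            return True        # answer found
        j = J                  # start from the smallest unused value
        while j <= n:
            if not used[j] and j != i:
                p[i] = j
                used[j] = True
                nj = J         # advance J past used values for the next level
                while nj <= n and used[nj]:
                    nj += 1
                if dfs(i + 1, nj):
                    return True
                used[j] = False
            j += 1
        return False

    return p[1:] if dfs(1, 1) else None
```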

However, I think this solution is more complex than the pattern-finding one. What do you think?

### AUTHOR’S AND TESTER’S SOLUTIONS:

Author’s solution can be found here.

Tester’s solution can be found here.

Author’s solution for “Alternative Solution” can be found here.