PROBLEM LINK:
Practice
Contest: Division 1
Contest: Division 2
Contest: Division 3
Contest: Division 4
Author: raysh07
Tester: satyam_343
Editorialist: iceknight1093
DIFFICULTY:
Simple
PREREQUISITES:
None
PROBLEM:
You’re given an array A of length N that’s strictly increasing.
In one move you can choose any integer and insert it anywhere into A.
Find the minimum number of moves needed to turn A into an arithmetic progression.
EXPLANATION:
An arithmetic progression is defined by its starting element and common difference.
Since we’re aiming to insert as few elements as possible, it’s clearly optimal for the starting element to always be A_1 (since there’s no point inserting extra elements before it).
Thus, we only need to decide what the common difference will be.
To start with, let’s look at just A_1 and A_2.
Suppose we decide to insert some k elements between them (k \ge 0).
Then, if the common difference of the resulting AP is d, we obtain the following relation between A_1 and A_2:
A_2 = A_1 + (k+1)\cdot d
This follows from A_2 being the (k+2)-th element of the AP (A_1 is the first, and there are k more elements between them).
This can be rearranged to d = \frac{A_2 - A_1}{k + 1}, so d is uniquely determined once k is chosen.
In particular, d will definitely be a divisor of (A_2 - A_1), since we can insert only integers.
Further, d and k are inversely correlated: that is, if we want k to be small (i.e. the number of inserted elements to be small), then d should be large, and vice versa.
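To make this concrete, here is a small sketch using a hypothetical pair of adjacent elements (A_1 = 2, A_2 = 14); it simply lists every valid common difference d together with the number of insertions k it forces, illustrating that d must divide the gap and that a larger d means a smaller k.

import math

# Illustrative sketch with a hypothetical pair of adjacent elements.
A1, A2 = 2, 14
gap = A2 - A1
# Every divisor d of the gap is a valid common difference,
# and it forces exactly k = gap // d - 1 insertions between A1 and A2.
for d in range(1, gap + 1):
    if gap % d == 0:
        print(f"d = {d}: insert {gap // d - 1} elements")
# The largest choice d = gap needs 0 insertions; d = 1 needs gap - 1.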
Now, observe that this logic in fact applies to any pair of adjacent elements: that is, if we look at any A_i and A_{i+1}, the common difference of the final AP must be a divisor of (A_{i+1} - A_i).
We now know that the common difference of the final AP must be a divisor of each of
(A_2 - A_1), (A_3 - A_2), \ldots, (A_N - A_{N-1})
As noted earlier, minimizing the number of inserted elements is equivalent to maximizing the common difference d.
Since d must be a divisor of every (A_{i+1} - A_i), by definition the largest possible choice is
d = \gcd(A_2 - A_1, A_3 - A_2, \ldots, A_N - A_{N-1})
This gives us a solution: choose this value of d, and then compute the number of insertions needed.
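For instance, with the hypothetical array A = (2, 6, 14), the adjacent differences are 4 and 8, so the largest valid common difference is d = \gcd(4, 8) = 4.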
To compute this, note that between A_i and A_{i+1} we have d\cdot (k+1) = A_{i+1} - A_i, so k = \frac{A_{i+1} - A_i}{d} - 1.
Add this up across all 1 \le i \lt N to obtain the final answer.
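Continuing the hypothetical example A = (2, 6, 14) with d = 4: between 2 and 6 we need \frac{4}{4} - 1 = 0 insertions, and between 6 and 14 we need \frac{8}{4} - 1 = 1 insertion (the element 10), giving a total of 1 move and the final AP (2, 6, 10, 14).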
TIME COMPLEXITY:
\mathcal{O}(N \log\max (A)) per testcase.
CODE:
Editorialist's code (PyPy3)
import math

for _ in range(int(input())):
    n = int(input())
    a = list(map(int, input().split()))

    # The common difference must divide every adjacent gap,
    # so the best (largest) choice is the gcd of all gaps.
    d = a[1] - a[0]
    for i in range(2, n):
        d = math.gcd(d, a[i] - a[i-1])

    # Between a[i-1] and a[i] we need (gap / d) - 1 insertions.
    ans = 0
    for i in range(1, n):
        ans += (a[i] - a[i-1]) // d - 1
    print(ans)