# LPC - Editorial

Author: iceknight1093
Tester: apoorv_me
Editorialist: iceknight1093

# DIFFICULTY:

1311

# PREREQUISITES:

None

# PROBLEM:

You’re given a string S of length N and another string K of length M, both containing the digits 0-9 only.
Find the minimum number of moves needed to make K appear as a substring of S, where a move consists of either incrementing or decrementing a digit of S.

# EXPLANATION:

Suppose we fix which length-M substring of S is going to turn into K; say the substring starting at index L.
Let’s try to find the minimum number of moves we need to turn this substring into K.
That isn’t very hard:

• Notice that for each i = 1, 2, 3, \ldots, M, we’d like to turn S_{L+i-1} into K_i.
• To do that, we can repeatedly increment or decrement S_{L+i-1} until it equals K_i, with digits wrapping around (incrementing 9 gives 0, and decrementing 0 gives 9).
One direction takes |S_{L+i-1} - K_i| moves, while the other takes 10 - |S_{L+i-1} - K_i| moves (the values are small, so you can even just run a loop to compute these, instead of using math).
So, converting S_{L+i-1} to K_i costs the minimum of these two numbers.
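The per-digit cost above can be sketched as a small helper (the function name `digit_cost` is illustrative, not from the original solution):

```python
def digit_cost(a: int, b: int) -> int:
    """Minimum moves to turn digit a into digit b, where a move is
    +1 or -1 with wrap-around on the cycle 0-9."""
    diff = abs(a - b)
    return min(diff, 10 - diff)  # go whichever way around the cycle is shorter

print(digit_cost(1, 9))  # forward takes 8 moves, backward (1 -> 0 -> 9) takes 2
```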

Hence, for a fixed L, the required cost is

\sum_{i=1}^M \min(|S_{L+i-1} - K_i|, 10 - |S_{L+i-1} - K_i|)

This is easy to compute in \mathcal{O}(M) time.
Now, just try every possible starting position L, i.e., all 1 \leq L \leq N-M+1.
Compute the cost for each of them; the final answer is the minimum of all these costs.

# TIME COMPLEXITY

\mathcal{O}(N\cdot M) per testcase.

# CODE:

Editorialist's code (Python)

```python
for _ in range(int(input())):
    n, m = map(int, input().split())
    s = input()
    pat = input()
    ans = 10**9
    for i in range(n - m + 1):      # starting index of the candidate substring
        cur = 0
        for j in range(m):
            # ord() differences of digit characters equal the digit differences
            d1, d2 = ord(s[i + j]), ord(pat[j])
            cur += min(abs(d1 - d2), 10 - abs(d1 - d2))
        ans = min(ans, cur)
    print(ans)
```