Problem Link :
Author : Evgeny Shulgin
Tester : Misha Chorniy
Editorialist : Anand Jaisingh
PreRequisites :
Rectangular Sums
Problem :
Not repeated here.
Explanation :
First of all, this problem should not really be very new for experienced participants. Most of the techniques required to solve this problem may be standard by now.
Let’s try and observe the structure of any good square. Looking at it, we can make the following observations :

All rows in it with odd index have the same structure.

All rows in it with even index have the same structure.

There are two possible options for odd rows, and the mask for even rows is exactly the inverse mask of the odd rows.

Masks of type 1 are of the form 010101... and masks of type 2 are of the form 101010..., so mask 1 is the inverse mask of mask 2 and vice versa.
Now, further in this problem, for each cell (x,y) and each possible side length l, we are going to find the minimum cost to make the square with top-left corner (x-l+1,y-l+1) and bottom-right corner (x,y) a good square. Let the number of changes required be z. Then we can do something like ans[z]=max(ans[z],l), i.e. we record the maximum side length of a square we can make in exactly z changes.
Later we can do something like ans[z]=max(ans[z],ans[z-1]) from left to right, i.e. compute the maximum side length of a square we can make in \le z changes. Then we can just output answers to queries in O(1). So, now let's solve the subproblem: for each cell (x,y) and each possible side length l, find the minimum number of changes to make the square (x-l+1,y-l+1),(x,y) a good square.
The minimum cost for a square is:
(sum of Hamming distances between the mask of type 1 and all rows inside the square having odd index) + (sum of Hamming distances between the mask of type 2 and all rows inside the square having even index), or vice versa.
For example, consider you want to make the following square a good square :
Then, there are two cases :

For all rows with an arrow in front of them, we sum the Hamming distances between these rows and the mask of type 1, plus the sum of Hamming distances between the rows that do not have arrows in front of them and the mask of type 2.

For all rows with an arrow in front of them, we sum the Hamming distances between these rows and the mask of type 2, plus the sum of Hamming distances between the rows that do not have arrows in front of them and the mask of type 1.
The minimum among the two is the answer we require.
So, to pass subtask 1, we can fix the bottom-right corner (x,y) and the side length L. This square has top-left corner (x-L+1,y-L+1). Then we can find the minimum over both cases mentioned above by looping over all even and odd rows separately and summing the Hamming distances.
The pseudo code for it would look something like :
for(int i=1;i<=n;i++)
{
    for(int j=1;j<=m;j++)
    {
        for(int len=1;len<=min(i,j);len++)
        {
            // now we find the min cost for the square (i-len+1,j-len+1),(i,j):
            // for all odd indexed rows try mask 1 and for all even indexed
            // rows try mask 2, or vice versa
        }
    }
}
However, the above approach is not enough to pass subtask 2. But we can do some precomputation and maintain prefix sums to avoid doing the same work again and again. Let odd1[i][j] denote the sum of Hamming distances between mask 1 and all odd rows located in the rectangle with top-left corner (1,1) and bottom-right corner (i,j). Similarly, we can construct tables even1[i][j], even2[i][j] and odd2[i][j].
For example, odd1[3][3] would store the sum of Hamming distances between the mask of type 1 and the length-3 prefixes of all odd rows with row index \le 3.
All these tables can be built in O(N \cdot M). Now, once we fix the bottom right and side length, it is possible to find the minimum cost required to transform this into a good square in O(1) operations. But how ?
We can use something called rectangular sum queries; you can read about it here. We apply this to each of the matrices odd1, odd2, even1 and even2. The answer is the minimum over assigning mask 1 to odd rows and mask 2 to even rows, or vice versa, but now we just query these tables for sums in O(1) instead of looping again and again.
Reading the code that implements the statements above should be straightforward now. That's it.
Time Complexity : O(N \cdot M \cdot Min(N,M))