PROBLEM LINK:

Author: Sergey Nagin
Tester: Jingbo Shang
Editorialist: Hanlin Ren

DIFFICULTY:

Challenge

PREREQUISITES:

Greedy, Heuristics, etc.

PROBLEM:

You are given a matrix $A$ of size $n\times n$. You need to construct a matrix $B$ of the same size. Initially all elements of $B$ are $0$. In one operation you choose $9$ parameters $a_1,a_2,b_1,b_2,c_1,c_2,d,l,r$, and for all $l\le i\le r$, let $$j=\left\lfloor\frac{a_1i^3}{a_2}\right\rfloor+\left\lfloor\frac{b_1i^2}{b_2}\right\rfloor+\left\lfloor\frac{c_1i}{c_2}\right\rfloor+d,$$ then set $B_{i,j}\gets 1$. You need to use as few operations as possible to construct a $B$ such that $A$ and $B$ differ in at most $100$ cells.

EXPLANATION:

I'll list some common approaches (those used by more than one contestant) below.

The most trivial one

It's easy to generate a valid solution. Let $a_1=b_1=c_1=0$, $a_2=b_2=c_2=1$, $d=y$, $l=r=x$; then one such operation sets $B_{x,y}\gets 1$. This approach also stays within the $N^2$ operations limit. It gets 3.21 pts. A solution. If you choose to ignore the last $100$ $1$'s, you can get 3.261 pts, which is slightly higher. A solution.

Other approaches

Let's find consecutive $1$'s in a column. Any maximal (non-extendable) consecutive segment of $1$'s can be handled with a single operation. This approach gives 36.365 pts. A solution. We can, of course, stop the algorithm when at most $100$ $1$'s remain; however, this only improves the score slightly, to 37.512 pts. A solution. A smarter approach is to sort all non-extendable consecutive segments of $1$'s and ignore the shortest of them until $100$ cells have been ignored. This gives a bigger improvement, to 43.704 pts. A solution. Another approach: when we meet $a_{i,j}=0$ with $a_{i-1,j}=1$ and $a_{i+1,j}=1$, we modify $a_{i,j}$ to be $1$, merging the two adjacent runs. Of course such a modification can't be done more than $100$ times, and some contestants chose to allow at most one modification per column.
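The segment-based approaches above can be sketched as follows (a minimal Python illustration with hypothetical helper names, not any contestant's actual code; rows and columns are 0-indexed, and each operation is emitted as the 9-tuple $(a_1,a_2,b_1,b_2,c_1,c_2,d,l,r)$ with all polynomial coefficients zero):

```python
def runs_per_column(A):
    """Return (length, col, l, r) for every maximal vertical run of 1s."""
    n = len(A)
    runs = []
    for j in range(n):
        i = 0
        while i < n:
            if A[i][j] == 1:
                l = i
                while i + 1 < n and A[i + 1][j] == 1:
                    i += 1
                runs.append((i - l + 1, j, l, i))
            i += 1
    return runs

def segment_operations(A, budget=100):
    """One constant operation per run; skip the shortest runs as long as
    their total length fits in the allowed budget of mismatched cells."""
    runs = sorted(runs_per_column(A))          # shortest runs first
    ignored = 0
    ops = []
    for length, j, l, r in runs:
        if ignored + length <= budget:
            ignored += length                  # leave these cells uncovered
        else:
            # a1=0, a2=1, b1=0, b2=1, c1=0, c2=1, d=j, rows l..r
            ops.append((0, 1, 0, 1, 0, 1, j, l, r))
    return ops
```

With `budget=0` this is the plain 36.365-pt approach; a positive budget reproduces the sort-and-ignore refinement.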
If we haven't used up the budget of $100$ different cells, we then apply the sorting approach above. This gives 43.915 pts. A solution.

ALTERNATIVE SOLUTION:

Please feel free to discuss your approaches. As this is a challenge problem, there must be a bunch of different approaches!
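The gap-patching trick described above (flipping an isolated $0$ between two $1$'s so that two runs merge into one) might look like this; a hedged sketch with illustrative names, charging each flip against the 100-cell mismatch budget:

```python
def patch_gaps(A, budget=100, per_column_limit=1):
    """Flip A[i][j] from 0 to 1 when A[i-1][j] == A[i+1][j] == 1, merging
    two vertical runs so one operation suffices instead of two.
    Returns the number of flips performed; mutates A in place."""
    n = len(A)
    flips = 0
    for j in range(n):
        used = 0
        for i in range(1, n - 1):
            if flips >= budget or used >= per_column_limit:
                break
            if A[i][j] == 0 and A[i - 1][j] == 1 and A[i + 1][j] == 1:
                A[i][j] = 1
                flips += 1
                used += 1
    return flips
```

After patching, the segment extraction and sorting proceed exactly as before, with the remaining budget reduced by the number of flips.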
This question is marked "community wiki".
asked 07 Sep '17, 19:17

Let me be the first one to share my nontrivial ideas. We choose some of the valid functions $$ f(x) = \left\lfloor \frac {a_1 x^3}{a_2} \right\rfloor + \left\lfloor \frac {b_1 x^2}{b_2} \right\rfloor + \left\lfloor \frac {c_1 x}{c_2} \right\rfloor + d, $$ and build a table over every point (i, j), i.e. the table stores, for each (i, j), which functions pass through it. Then we process all points (i, j) with value 1 in some order. Valid functions are chosen empirically (by simply trying every candidate). When we process a point (i, j), we greedily pick the function through it that covers the most points with value 1. Points are processed from right to left along the x axis, and from the middle outward along the y axis. Here is my best code. I'd like to invite the top three coders @ceilks @mugurelionut @dreamoon4 to share their approaches. answered 11 Sep '17, 20:56
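A minimal sketch of this kind of greedy, under simplifying assumptions: only linear candidates $j=\lfloor c_1 i/c_2\rfloor + d$ are enumerated (the post also uses quadratic and cubic terms), and instead of the per-point ordering above, each step simply takes the globally best candidate. All names are illustrative, not from the poster's code:

```python
def trace(n, c1, c2, d):
    """Board cells (i, j) on the line j = floor(c1*i/c2) + d."""
    cells = []
    for i in range(n):
        j = (c1 * i) // c2 + d
        if 0 <= j < n:
            cells.append((i, j))
    return cells

def best_segment(cells, A, uncovered):
    """Longest contiguous piece of a trace lying entirely on 1-cells,
    scored by how many of those 1s are still uncovered."""
    best_gain, best_cells = 0, []
    cur = []
    for cell in cells + [None]:            # None flushes the final run
        if cell is not None and A[cell[0]][cell[1]] == 1:
            cur.append(cell)
        else:
            gain = sum(1 for c in cur if c in uncovered)
            if gain > best_gain:
                best_gain, best_cells = gain, list(cur)
            cur = []
    return best_gain, best_cells

def greedy_cover(A, slopes=((0, 1), (1, 1), (-1, 1), (1, 2))):
    """Repeatedly apply the operation covering the most uncovered 1s."""
    n = len(A)
    uncovered = {(i, j) for i in range(n) for j in range(n) if A[i][j] == 1}
    ops = []
    while uncovered:
        best, best_gain = None, 0
        for c1, c2 in slopes:
            for d in range(-n, n):
                gain, cells = best_segment(trace(n, c1, c2, d), A, uncovered)
                if gain > best_gain:
                    best_gain, best = gain, (c1, c2, d, cells)
        if best is None:
            break
        c1, c2, d, cells = best
        uncovered -= set(cells)
        ops.append((c1, c2, d))          # plus the run's l..r in a full solution
    return ops
```

On a diagonal matrix this finds the single operation $c_1=c_2=1$, $d=0$; the real solution differs mainly in the candidate family and the point-by-point processing order.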
I really appreciate your efforts on the solutions of challenge problems. I started attempting challenge problems only because I kept coming across your editorials/discussions, and they made me realize that by thinking a bit more optimally I can get a significant (nonzero :p) score in them. :)
(11 Sep '17, 21:09)

I pregenerated a set of linear and quadratic functions (all with d=0), for which I precomputed all their values. Out of these I subselect some functions for every test. I think I keep almost all possible linear functions (with d=0 there are only approx. 6K); I filtered the quadratic functions much more: I keep only those with abs(b1/b2) and abs(c1/c2) below some small threshold (I think ~24K of them).

Then I run a greedy algorithm. I pick a random point and I choose the function passing through it which covers the most yet-uncovered points. When iterating through the selected functions, I consider their correct value of d (the value which makes them pass through the current point, if it's within the appropriate range). When there aren't too many points left to cover, I also consider a small subset of cubic functions. Choosing the "base" point randomly is important: I tried iterating in some fixed order (top-to-bottom, left-to-right, and all their variations), and the results were from 10% to 4% worse. I only cover 1s and I don't allow the functions to go outside of the board. I stop when there are <= 100 1s left to cover. I also remove redundant functions, if any (there were cases when some functions added early on became redundant by the end).

This greedy takes ~0.8 seconds on 20 test cases. I tried a lot to make it faster, but in the end that's all I could do. There are several reasons why making it faster would have improved my score: 1) I could have tried more functions in the available time (though the improvement would not have been that large); 2) I could have run more iterations of stage 2, described below.

In the remaining time, I try to improve the solution found by the greedy for the top 10 test cases with the largest scores (essentially those with many 1s). I select the F functions which cover the smallest number of unique 1s (i.e. the least efficient ones) and remove them from the solution.
Then I run the greedy algorithm to "repair" the solution (again, until <= 100 uncovered 1s remain). It's possible that a "repair" actually increases the size of the solution; that's fine, I accept the changes anyway, in order to reach more diverse solutions. I could run only ~10 iterations with F=4 for each of the top 10 tests in the remaining ~0.2 seconds, and this improved my score by ~2.5% (I implemented it on the last day of the contest).

Given just 0.15 more seconds, my solution could obtain an extra 1% improvement on my local tests (which I think are a good approximation of the official ones, because any improvement I obtained locally was also an improvement on the official tests). Given 1 more second, I could get more than a 2% improvement, and with 9 more seconds, more than 4%. So, again, I tried to optimize the code for speed, in order to be able to run as many iterations as possible, but in the end I reached my limit :) Anyway, it seems to me that the problem could have allowed more interesting approaches if the time limit had not been so tight (1 second for 20 test cases is very tight!). answered 13 Sep '17, 04:32
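This remove-and-repair loop is essentially a ruin-and-recreate heuristic. A toy version of the same idea on abstract set cover (illustrative names and interface, not the contestant's code; here "functions" become index sets and "1s" become universe elements):

```python
import random

def greedy_cover(universe, sets, partial=()):
    """Plain greedy set cover, optionally starting from a partial solution."""
    chosen = list(partial)
    covered = set().union(*(sets[i] for i in chosen)) if chosen else set()
    while covered != universe:
        i = max(range(len(sets)), key=lambda k: len(sets[k] - covered))
        if not sets[i] - covered:
            break                                # nothing left can help
        chosen.append(i)
        covered |= sets[i]
    return chosen

def ruin_and_recreate(universe, sets, iters=10, F=1, seed=0):
    """Drop the F least efficient sets (fewest uniquely covered elements),
    repair with greedy, and accept the repair even if it is larger."""
    rng = random.Random(seed)
    best = greedy_cover(universe, sets)
    cur = list(best)
    for _ in range(iters):
        snapshot = list(cur)
        def unique(i):
            others = set().union(set(), *(sets[j] for j in snapshot if j != i))
            return len(sets[i] - others)
        cur = sorted(snapshot, key=unique)[F:]   # ruin: drop least efficient
        rng.shuffle(cur)
        cur = greedy_cover(universe, sets, cur)  # recreate: repair
        if len(cur) < len(best):
            best = list(cur)
    return best
```

On instances where plain greedy is trapped by one large but inefficient set, a few ruin-and-recreate iterations recover the smaller cover, which mirrors how removing F functions and repairing can shrink the operation count.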

No approach for a 100-point solution? Or will it be added later?