PROBLEM LINK:
Author: Full name
Tester: Full name
Editorialist: Oleksandr Kulkov

DIFFICULTY:
HARD

PREREQUISITES:
Moment-generating functions, FFT, computing functions on polynomials

PROBLEM:
There are $k$ chefs and $m$ workers. For each worker we know integers $a$ and $b$ such that the worker can pick from $a$ to $b$ fruits. The workers form $n$ groups, each group sharing the same $a$ and $b$. Each worker chooses a number from his $[a,b]$ uniformly at random and picks that many fruits. If the total number of fruits picked during the day is $S$, then the goodness of the day is $F(S)=S^k$. You have to find the expected value of $F$ modulo $998244353$. Additionally, we know that $n\cdot k \leq 10^5$.

QUICK EXPLANATION:
Bring all the stuff you know about combinatorics and probability theory, add some stuff you don't know, and blend it in a mixer until you get the solution.

EXPLANATION:
$p=998244353$ is a prime and $p-1=2^{23} \cdot 7 \cdot 17$, so there is a root of unity of order $2^{23}$ modulo $p$ and we can use this modulus for FFT, which is a strong hint here. Let's find a convenient form for the answer. Say each group is described by a triple $\{a_i,b_i,c_i\}$, where $c_i$ is the number of workers in the group. Let's find the moment-generating function of the random variable $S$: $$M_S(t) = \mathbb E[e^{tS}]=\sum\limits_{k=0}^\infty \dfrac{t^k\,\mathbb E(S^k)}{k!}$$ We only need the coefficient of $t^k$ in the expansion of $M_S(t)$, multiplied by $k!$, to obtain the answer. Note that the amounts of fruit picked by the workers are independent random variables, so the moment-generating function of $S=\sum\limits_{i=1}^m s_i$ is the product of the moment-generating functions of the individual workers. Let's find it for a worker with given numbers $a$ and $b$.
$$M_{s_i}(t)=\mathbb E[e^{ts_i}]=\dfrac{1}{b-a+1}\sum\limits_{k=a}^b e^{tk}=\dfrac{1}{b-a+1}\cdot \dfrac{e^{ta}-e^{t(b+1)}}{1-e^t}$$ Now for the whole moment-generating function it holds that: $$M_S(t) = \dfrac{1}{\prod_{i=1}^n(b_i-a_i+1)^{c_i}}\prod\limits_{i=1}^n\left[\dfrac{e^{ta_i}-e^{t(b_i+1)}}{1-e^t}\right]^{c_i}$$ The coefficient on the left is constant and can be computed in advance. The product on the right can be computed efficiently using polynomial $\log$ and $\exp$: $$\prod\limits_{i=1}^n\left[\dfrac{e^{ta_i}-e^{t(b_i+1)}}{1-e^t}\right]^{c_i} = \exp\left(\sum\limits_{i=1}^n c_i \left[ta_i+\log\left(1-e^{t(b_i-a_i+1)}\right)-\log\left(1-e^t\right)\right]\right)$$ For each summand you can explicitly compute the first $k$ coefficients of the expansions of $1-e^{t(b_i-a_i+1)}$ and $1-e^t$ using the formula $e^x = \sum\limits_{k=0}^\infty \dfrac{x^k}{k!}$. After that you can perform $\log$ and $\exp$ as described in this and this article. This gives an $O(nk \log nk)$ solution. There are a few things to note before applying $\log$ directly. Since the constant term of $\log$ would have to be evaluated manually, you want it to be $0$, because you work in a field of residues; hence the argument of your $\log$ must have constant term $1$. To obtain this, note that both $1-e^{t(b_i-a_i+1)}$ and $1-e^t$ are divisible by $t$, and after reducing the fraction by $t$ its constant term becomes $b_i-a_i+1$. Dividing each fraction by $b_i-a_i+1$ makes its constant term $1$, and the factors $(b_i-a_i+1)^{c_i}$ that we divided out exactly cancel the prefactor $\frac{1}{\prod_{i=1}^n(b_i-a_i+1)^{c_i}}$.
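The whole pipeline can be sketched as follows. This is a minimal illustration with naive $O(k^2)$ series arithmetic (an FFT-based $\log$/$\exp$ is needed for the stated $O(nk\log nk)$ bound); function and parameter names are mine, not from the reference solutions. Each fraction is reduced by $t$ and divided by its constant term $b_i-a_i+1$ before taking $\log$, which cancels the prefactor of $M_S(t)$.

```python
MOD = 998244353

def inv(x):
    return pow(x, MOD - 2, MOD)

def series_mul(a, b, k):
    # truncated product, keeping coefficients of t^0 .. t^k
    res = [0] * (k + 1)
    for i in range(min(len(a), k + 1)):
        if a[i]:
            for j in range(min(len(b), k + 1 - i)):
                res[i + j] = (res[i + j] + a[i] * b[j]) % MOD
    return res

def series_inv(a, k):
    # b with a * b = 1 (mod t^{k+1}); needs a[0] != 0
    b = [inv(a[0])] + [0] * k
    for i in range(1, k + 1):
        s = sum(a[j] * b[i - j] for j in range(1, min(i, len(a) - 1) + 1))
        b[i] = -s % MOD * b[0] % MOD
    return b

def series_log(a, k):
    # log of a series with a[0] == 1: integrate a' / a
    da = [(j * a[j]) % MOD for j in range(1, min(len(a), k + 1))]
    q = series_mul(da, series_inv(a, k), k)
    return [0] + [q[j] * inv(j + 1) % MOD for j in range(k)]

def series_exp(a, k):
    # exp of a series with a[0] == 0, via the recurrence from e' = a' * e
    e = [1] + [0] * k
    for i in range(1, k + 1):
        s = sum(j * a[j] % MOD * e[i - j] for j in range(1, min(i, len(a) - 1) + 1))
        e[i] = s % MOD * inv(i) % MOD
    return e

def expected_kth_moment(groups, k):
    # groups: list of (a, b, c) triples as in the editorial; assumes k >= 1
    fact = [1] * (k + 2)
    for i in range(1, k + 2):
        fact[i] = fact[i - 1] * i % MOD
    acc = [0] * (k + 1)
    for a, b, c in groups:
        w = b - a + 1
        # (1 - e^{wt})/t and (1 - e^t)/t, first k+1 coefficients
        num = [-pow(w, j + 1, MOD) * inv(fact[j + 1]) % MOD for j in range(k + 1)]
        den = [-inv(fact[j + 1]) % MOD for j in range(k + 1)]
        f = series_mul(num, series_inv(den, k), k)   # constant term is w
        g = [x * inv(w) % MOD for x in f]            # normalize to constant term 1;
        lg = series_log(g, k)                        # the 1/w factors cancel the
        lg[1] = (lg[1] + a) % MOD                    # prefactor; add the t*a_i term
        for j in range(k + 1):
            acc[j] = (acc[j] + c * lg[j]) % MOD
    return series_exp(acc, k)[k] * fact[k] % MOD

# one worker uniform on {1,2} plus two workers uniform on {0,1}: E[S^2] = 7
assert expected_kth_moment([(1, 2, 1), (0, 1, 2)], 2) == 7
```

The small assert at the end can be verified by hand: enumerating all $2^3$ outcomes gives $\mathbb E(S^2) = 56/8 = 7$.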
ALTERNATIVE SOLUTION 1:
There is also a simpler $O(nk \log m \log k)$ solution that avoids $\log$ and $\exp$: compute the fractions in the multipliers using the formula for polynomial inverses (found in the same articles as $\log$ and $\exp$) and raise them to the power $c_i$ by binary exponentiation, though you need a good constant factor to pass with this solution.

ALTERNATIVE SOLUTION 2:
This is the one I wrote at first. Afterwards I realized there is a simpler approach, which is now the main part of the editorial. To get the expected value of $S^k$ we should find $M_S^{(k)}(0)$, since: $$M_S(t) = \sum\limits_{n=0}^\infty \dfrac{t^n\,\mathbb E(S^n)}{n!} \implies M_S^{(k)}(t) = \sum\limits_{n=0}^{\infty} \dfrac{t^n\,\mathbb E(S^{k+n})}{n!} \implies M_S^{(k)}(0) = \mathbb E(S^k)$$
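As a quick sanity check of the single-worker closed form and of the relation $\mathbb E(s^k) = k!\,[t^k]\,M_{s_i}(t)$, here is a small script with exact rationals (the helper name is mine, not from any reference solution):

```python
from fractions import Fraction
from math import factorial

def mgf_from_closed_form(a, b, k):
    # Taylor coefficients of M_s(t) = (e^{ta} - e^{t(b+1)}) / ((b-a+1)(1 - e^t)),
    # after reducing numerator and denominator by t, with exact rationals
    num = [Fraction(a ** (j + 1) - (b + 1) ** (j + 1), factorial(j + 1))
           for j in range(k + 1)]
    den = [Fraction(-1, factorial(j + 1)) for j in range(k + 1)]
    q = [Fraction(0)] * (k + 1)
    for i in range(k + 1):  # power-series long division num / den
        q[i] = (num[i] - sum(den[j] * q[i - j] for j in range(1, i + 1))) / den[0]
    return [c / (b - a + 1) for c in q]

# k! times the t^k coefficient equals E[s^k] for s uniform on {a, ..., b}
a, b, k = 2, 5, 3
coeffs = mgf_from_closed_form(a, b, k)
assert factorial(k) * coeffs[k] == Fraction(sum(v ** k for v in range(a, b + 1)), b - a + 1)
```

Here the division recovers $1 + \frac{7}{2}t + \dots$, i.e. $\mathbb E(s)=7/2$ and $\mathbb E(s^3)=56$ for $s$ uniform on $\{2,\dots,5\}$, matching the direct computation.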
NOTE ON POWER SUMS:
In alternative solution 2 you have to compute $S_p=1^p+2^p+\dots+n^p$. There is a more direct way to compute these via the exponential generating function: $$S(x) = \sum\limits_{p=0}^\infty \dfrac{S_p}{p!}x^p=\sum\limits_{k=1}^n \sum\limits_{p=0}^\infty \dfrac{k^px^p}{p!}=\sum\limits_{k=1}^n e^{kx}=\dfrac{e^x-e^{x(n+1)}}{1-e^x}$$ The fraction on the right is exactly the closed form we got in alternative solution 2; once again you need to reduce the numerator and denominator by $x$ and find the inverse series of the denominator, which is still $O(k \log k)$.

AUTHOR'S AND TESTER'S SOLUTIONS:
Author's solution can be found here. Tester's solution can be found here.
asked 20 Jul '18, 07:05

Additionally, the first solution is related to Bernoulli numbers, and the alternative solution is related to Stirling numbers of the second kind. If you find the formulas above tough to read or understand, take a look at these numbers: many of the formulas in the solution are actually equivalent to identities about them. Author's (alternative) solution in a simpler way: calculate $\sum\binom S0,\sum\binom S1,\dots,\sum\binom Sk$ instead of $\sum S^k$ (divide by $\prod(b_i-a_i+1)^{c_i}$ at the end to get the expectation). To write $S^k$ as a combination of $\binom S0,\binom S1,\dots,\binom Sk$, we use Stirling numbers of the second kind. Define $P(x)=\sum_{i=0}^kx^i\sum\binom Si$ and $Q(a,b,x)=\sum_{i=0}^kx^i\sum_{j=a}^b\binom ji$. If we add a new worker $(a,b)$, the new $P'(x)=P(x)\cdot Q(a,b,x)$. That means we only need $\prod Q^{c_i}(a_i,b_i,x)$. Every single $Q$ can be calculated directly because $\sum_{j=a}^b\binom ji=\binom {b+1}{i+1}-\binom a{i+1}$. Finally, we can use FFT and divide and conquer to get an $O(nk\log n\log k)$ solution. As described in the editorial, we can also use $\ln$ and $\exp$ to get a faster but harder to implement $O(nk\log nk)$ solution. Most contestants' (first) solution in a simpler way: define polynomials $P_2(x)=\sum_{i=0}^k\frac{x^i}{i!}\sum S^i$ and $Q_2(a,b,x)=\sum_{i=0}^k\frac{x^i}{i!}\sum_{j=a}^bj^i$. To get $Q_2$ efficiently, we use Bernoulli numbers. Again, if we add a new worker $(a,b)$, the new $P_2'(x)=P_2(x)\cdot Q_2(a,b,x)$, and the rest is exactly the same as the first solution. To calculate Stirling numbers efficiently, we use their explicit formula and FFT; see the author's code. To calculate Bernoulli numbers, we use polynomial inverses, because the exponential generating function of the Bernoulli numbers is $\frac x{e^x-1}$. Things described above may look different from the editorial, but they lead to the same formulas. answered 23 Jul '18, 07:32
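A minimal check of the two identities this answer relies on (the hockey-stick sum for $Q$ and the Stirling-number expansion of $S^k$), using nothing beyond `math.comb` and the textbook recurrence; helper names are mine:

```python
from math import comb, factorial

def stirling2_row(k):
    # S2(k, 0..k) via the recurrence S2(k, i) = i*S2(k-1, i) + S2(k-1, i-1)
    row = [1]
    for _ in range(k):
        new = [0] * (len(row) + 1)
        for i, v in enumerate(row):
            new[i] += i * v     # new element joins one of the i existing blocks
            new[i + 1] += v     # new element opens a block of its own
        row = new
    return row

# (1) hockey stick: sum_{j=a}^{b} C(j, i) = C(b+1, i+1) - C(a, i+1)
a, b, i = 3, 9, 2
assert sum(comb(j, i) for j in range(a, b + 1)) == comb(b + 1, i + 1) - comb(a, i + 1)

# (2) S^k = sum_i S2(k, i) * i! * C(S, i), which converts sums of C(S, i)
#     back into sums of S^k
S, k = 7, 4
assert S ** k == sum(stirling2_row(k)[i] * factorial(i) * comb(S, i) for i in range(k + 1))
```

Identity (2) is exactly why maintaining the coefficients of $P(x)$ suffices: once all $\sum\binom Si$ are known, one weighted sum per $k$ recovers $\sum S^k$.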

QUICK EXPLANATION: Bring all stuff you know about combinatorics and probability theory, add some stuff you don't know, blend it in mixer until you get the solution.
*scrolls down to see the mathematical formulas*
....
......
*dies from Cancermathics*