# MRZRLT - Editorial

Setter, Tester and Editorialist: arvindpunk
P.S. Big thanks to Lebossle for helping me out with the proof.

### Difficulty:

Hard

### Prerequisites:

Probability Distributions, Expectation

### Problem:

Given N guns, each having S_i chambers and B_i bullets in random chambers with equal probability, find the maximum expected value of the number of tries it takes to fire the first bullet.

Formally, for each of the N guns: given S_i objects, B_i of which denote success, find the expected number of trials until the first success when drawing without replacement. Report the maximum over all such expectation values.

• 1 ≤ T ≤ 10
• 1 ≤ N ≤ 10^5
• 1 ≤ B_i ≤ S_i ≤ 10^9

### Explanation:

Assume we are dealing with a gun having n chambers and m bullets.

In terms of probability, n is the number of objects, while m denotes the number of success events. This scenario reminds us of the classic geometric distribution. But the geometric distribution applies only when the success probability stays constant across trials, i.e., when all trials are independent. That is not the case here.

Let X be a random variable that holds the number of trials, then,

• For 0 failure(s) and 1 success,
Pr(X = 1) = \frac{m}{n} ​
• For 1 failure(s) and 1 success at the end,
Pr(X = 2) = \left(\frac{n - m}{n}\right)\frac{m}{n-1} ​
• For 2 failure(s) and 1 success at the end,
Pr(X = 3) = \left(\frac{n - m}{n}\cdot\frac{n - m - 1}{n - 1}\right)\frac{m}{n-2} ​

and so on.

We can clearly see that neither the failure probability nor the success probability stays constant across trials.

This is where the negative hypergeometric distribution (NHG) is useful. It deals with exactly these cases: computing probabilities when sampling without replacement.

In the negative hypergeometric distribution, samples are drawn until r failures have been found, and the distribution describes the probability of finding k successes in such a sample.

This may seem quite different from our scenario but we can modify a few things to reach the required answer.

In NHG, trials are taken until r failures have been reached. This implies that the last trial will always be a failure. If we take r = 1, then it follows that the only failure will be in the last trial. We can clearly see that this is quite close to our case, except inverted: in our case, the only success (firing a bullet) is in the last trial. So we swap the roles, treating the m bullets as "failures" and the n - m empty chambers as "successes".

Hence, in this inverted setup, the number of success events is n - m.

For a random variable X that follows NHG distribution, the probability mass function (pmf) is given by

Pr(X = k) = \frac{\binom{k + r - 1}{k}\binom{n - r - k}{n - m - k}}{\binom{n}{n - m}} for k = 0, 1, 2, \ldots, n - m

where k is the number of success events.

Now, the expectation E[X] of the above distribution is given by (this follows from the standard derivation of the NHG mean)

E[X] = \frac{r(n - m)}{n - (n - m) + 1} = \frac{r(n-m)}{m + 1}

For r = 1,

E[X] = \frac{n-m}{m+1}

It might look like we have reached the final answer, but remember: this gives the expected number of successes and does not include the failure at the end. Our case requires the final (failing) trial to be counted as well, and since the trial count is always one more than the number of successes, we simply add 1 to the expected value:

E[X] + 1 = \frac{n - m}{m + 1} + 1 = \frac{n+1}{m+1}

which is the final required answer.

All that is left to do is find the maximum possible expectation value for all the guns.

### Time complexity:

Calculating expectation for each case is O(1) and there are N guns, so a total resulting time complexity of O(N) per test case.

### Solution:

Setter’s Solution:

```cpp
#include <iostream>
#include <algorithm>
#include <iomanip>
#include <vector>
using namespace std;

int main() {
    int t;
    cin >> t;
    cout << fixed << setprecision(7);
    while (t--) {
        int n;
        cin >> n;
        vector<double> A(n), B(n);           // A = chambers S_i, B = bullets B_i
        for (int i = 0; i < n; i++) cin >> A[i];
        for (int i = 0; i < n; i++) cin >> B[i];
        double ans = 0.0;
        for (int i = 0; i < n; i++) {
            // expected trials for gun i: (S_i + 1) / (B_i + 1)
            ans = max(ans, (A[i] + 1.0) / (B[i] + 1.0));
        }
        cout << ans << endl;
    }
    return 0;
}
```
