
E - Cheese Editorial by evima


Let us solve the case \(k=N-1\).

For each \(0 \leq i \leq N-1\), let us fix the number \(a_i\) of throws of cheese from coordinate \(i\). Here, we fix only the numbers of throws and leave the order free; as the observation below shows, the order turns out to be irrelevant. Therefore, at the end of the calculation, we will multiply the result by the probability that this distribution of throws is achieved.

Furthermore, let us also fix the number \(x\) of times cheese passes coordinate \(-0.1\).

Then, for each \(0 \leq i \leq N-1\), we can find the number \(b_i\) of times cheese passes coordinate \(i+0.1\) as \(b_i=x-i+\sum_{j \leq i} a_j\): such cheese either passed coordinate \(-0.1\) or was thrown from a coordinate at most \(i\), and exactly \(i\) pieces have been eaten before reaching \(i+0.1\) (one on each of the segments \(j \to j+1\) with \(j < i\), as described below).

For \(0 \leq i < N-1\), one of the \(b_i\) pieces of cheese going \(i \to i+1\) is eaten by mice, which happens with probability \(1-1/2^{b_i}\).

For \(i=N-1\), none of the \(b_i\) pieces of cheese are eaten by mice, which happens with probability \(1/2^{b_i}\).

It is possible for \(b_i\) to become negative along the way, but there is no need to handle this case: since \(b_{i+1}=b_i-1+a_{i+1}\), the value can only decrease by one at a time, so before becoming negative it must equal \(0\) at some earlier index, and that factor \(1-1/2^{0}=0\) already makes the result \(0\).
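To make the above concrete, here is a minimal sketch (in Python, with exact fractions; the function name and the brute-force style are our own choices, not part of the intended solution) that evaluates this product of probabilities for one fixed distribution \(a_i\) and one fixed \(x\).

```python
from fractions import Fraction

def product_for_fixed_throws(a, x):
    """Product of the probabilities above for a fixed throw distribution
    a[0..N-1] and a fixed number x of passes of coordinate -0.1.
    Illustrative sketch only (exact fractions instead of the modular
    arithmetic an actual submission would use)."""
    n = len(a)
    prob = Fraction(1)
    prefix = 0
    for i in range(n):
        prefix += a[i]
        b = x - i + prefix  # number of passes of coordinate i + 0.1
        if i < n - 1:
            # one of the b pieces going i -> i+1 is eaten: prob 1 - 1/2^b
            # (if b is 0 here, the factor is 0, matching the remark above)
            prob *= 1 - Fraction(1, 2) ** b
        else:
            # none of the b pieces is eaten: prob 1/2^b
            prob *= Fraction(1, 2) ** b
    return prob
```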

Note that multiplying all these probabilities does not yet give the answer for fixed \(a_i\) and \(x\): the product also counts configurations that involve independent cycles, that is, cheese trajectories with no starting or ending point. However, since adding one such cycle multiplies the probability by \(1/2\), these cases can be excluded. Specifically, let us find the (infinite) sum over all \(x \geq 0\). In this sum, for any valid solution, its combination with \(i\) extra cycles carries weight \(1/2^i\) relative to the solution itself. Since \(1+1/2+1/4+\cdots=2\), the weight of each valid solution is exactly doubled, so we can multiply the sum above by \(1/2\) to count only the valid ones.
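In other words, if a valid solution contributes probability \(P\), then after summing over all \(x\) its combinations with \(i = 0, 1, 2, \ldots\) extra cycles contribute

\[
P + \frac{P}{2} + \frac{P}{4} + \cdots = 2P,
\]

so halving the total leaves exactly \(P\) per valid solution.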

Consider finding this infinite sum of probabilities over all \(x \geq 0\). Letting \(y=1/2^x\), all the probabilities above can be expressed as polynomials in \(y\), so in the end it suffices to sum \(y^p=1/2^{xp}=(1/2^p)^x\) over all \(x \geq 0\) for each exponent \(p\).
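For each exponent \(p\), this is a geometric series. Note that the factor \(1/2^{b_{N-1}} = 2^{(N-1)-\sum_j a_j}\, y\) always contributes one power of \(y\), so only exponents \(p \geq 1\) appear and every series converges:

\[
\sum_{x \geq 0} y^p = \sum_{x \geq 0} \left(\frac{1}{2^p}\right)^x = \frac{1}{1 - 2^{-p}} \qquad (p \geq 1).
\]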

Finally, we have to unfix the \(a_i\) and sum over them as well. Again, we define \(y\) as above and perform a DP that maintains polynomials in \(y\).
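Below is a hedged sketch of such a DP (Python, exact fractions). The problem-specific pieces are abstracted away: `total_throws` and the per-coordinate factor `weight(i, t)`, which is meant to make up the probability of achieving the throw distribution mentioned at the beginning, are assumptions of this sketch rather than something stated in this editorial, and an actual solution would work modulo a prime instead of with fractions.

```python
from fractions import Fraction

def sum_over_throws(n, total_throws, weight):
    """DP over the a_i, maintaining polynomials in y = 1/2^x.

    weight(i, t) is a hypothetical per-coordinate factor for throwing t
    pieces from coordinate i (its product over i, possibly times a global
    constant, should give the probability of the throw distribution).
    Polynomials are lists of Fractions; index = power of y.
    """
    # dp[s] = polynomial accumulated so far, where s = a_0 + ... + a_i
    dp = {0: [Fraction(1)]}
    for i in range(n):
        ndp = {}
        for s, poly in dp.items():
            for t in range(total_throws - s + 1):  # choose a_i = t
                s2 = s + t
                if i == n - 1 and s2 != total_throws:
                    continue  # all throws must come from some coordinate
                w = weight(i, t)
                c = Fraction(2) ** (i - s2)  # 1/2^{b_i} = c * y
                new = [Fraction(0)] * (len(poly) + 1)
                if i < n - 1:
                    # multiply by w * (1 - c*y)
                    for p, cp in enumerate(poly):
                        new[p] += w * cp
                        new[p + 1] -= w * cp * c
                else:
                    # multiply by w * c * y
                    for p, cp in enumerate(poly):
                        new[p + 1] += w * cp * c
                acc = ndp.setdefault(s2, [])
                if len(acc) < len(new):
                    acc.extend([Fraction(0)] * (len(new) - len(acc)))
                for p, cp in enumerate(new):
                    acc[p] += cp
        dp = ndp
    # sum over all x >= 0 (geometric series per power of y),
    # then halve to cancel the independent cycles
    total = Fraction(0)
    for poly in dp.values():
        for p, cp in enumerate(poly):
            if p >= 1:
                total += cp / (1 - Fraction(1, 2) ** p)
    return total / 2
```

The state is the prefix sum \(\sum_{j \leq i} a_j\), because the factor for coordinate \(i\) depends on the \(a_j\) only through that sum.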

Each \(k\) takes \(O(N^4)\) time, for a total of \(O(N^5)\). An \(O(N^6)\) solution with a reasonably small constant factor can also pass.
