F - Do you like query problems? Editorial by evima


Instead of the sum, let us deal with the expected value.

Let us first consider \(E(t,i) := E[a_i \text{ after processing } t \text{ queries}]\).

Then, let \(P(t,i,v) := P[v \leq a_i \text{ after processing } t \text{ queries}]\) and \(X(t,i,v)\) be the corresponding indicator function.

Since \(a_i\) only takes values among \(0,\ldots,M-1\), we have \(E(t,i) = \sum_{v=1}^{M-1} P(t,i,v)\).
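(For reference, this is the standard tail-sum identity for a random variable taking values in \(\{0,\ldots,M-1\}\):

\[
E[a_i] = \sum_{x=0}^{M-1} x \cdot P[a_i = x] = \sum_{x=0}^{M-1} \sum_{v=1}^{x} P[a_i = x] = \sum_{v=1}^{M-1} P[a_i \geq v].
\]
)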

Let us fix \(t,i,v\). We will say that the \(k\)-th query matters when \(l_k \leq i \leq r_k\) and it is either a “max” query with \(v_k \geq v\) or a “min” query with \(v_k < v\).

That is, a query is said to matter when it can affect \(X(t, i, v)\). A max query that matters always leaves \(a_i \geq v\), and a min query that matters always leaves \(a_i < v\).

If there is no query that matters, \(a_i\) never reaches \(v\) (it starts at \(0 < v\)), so \(X(t,i,v) = 0\). If there is at least one query that matters, \(X(t, i, v) = 1\) if and only if the last such query is a max query; conditioned on this case, the last mattering query is a max query with probability \((M-v)/M\) and a min query with probability \(v/M\). Regardless of \(v\), a query matters with probability \(p_i := \frac{i(N+1-i)}{\binom{N+1}{2}} \cdot \frac{M}{2M+1}\), so we have \(P(t,i,v) = [1 - (1-p_i)^t] \cdot \frac{M-v}{M}\), from which we have \(E(t,i) = [1 - (1-p_i)^t] \cdot \frac{M-1}{2}\).
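Written out, summing \(P(t,i,v)\) over \(v\) gives the stated value of \(E(t,i)\):

\[
E(t,i) = \sum_{v=1}^{M-1} P(t,i,v) = \bigl[1 - (1-p_i)^t\bigr] \cdot \frac{1}{M} \sum_{v=1}^{M-1} (M - v) = \bigl[1 - (1-p_i)^t\bigr] \cdot \frac{M-1}{2}.
\]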

In the \((t+1)\)-th query, \(E(t, i)\) contributes to the answer when the query is a “sum” query and the segment contains \(i\), which happens with probability \(q_i := \frac{i(N+1-i)}{\binom{N+1}{2}} \cdot \frac{1}{2M+1}\).

In the end, the expected value can be computed as \(\sum_{t=0}^{Q-1} \sum_{i=1}^{N} E(t,i) \cdot q_i\). After substituting the values computed above, we get a formula with \(NQ\) terms. For a fixed \(i\), the sum over \(t\) has a form similar to a geometric series and can be computed in \(O(\log Q)\) time with fast exponentiation, so we can find the answer in \(O(N \log Q)\) time.
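Concretely, for a fixed \(i\) we need \(\sum_{t=0}^{Q-1} \bigl[1-(1-p_i)^t\bigr] = Q - \frac{1-(1-p_i)^Q}{p_i}\), where \((1-p_i)^Q\) is evaluated by fast exponentiation in \(O(\log Q)\) time. Below is a minimal sketch of the whole computation, assuming arithmetic modulo \(998244353\) and assuming that the required output is the expectation multiplied by the number of equally likely query sequences, \(\bigl(\binom{N+1}{2}(2M+1)\bigr)^Q\); both assumptions concern the original statement and are not taken from this editorial.

```cpp
#include <cstdio>
#include <cstdint>

using u64 = std::uint64_t;
const u64 MOD = 998244353;  // assumed modulus

// fast exponentiation, O(log e)
u64 mpow(u64 b, u64 e) {
    u64 r = 1;
    b %= MOD;
    while (e) {
        if (e & 1) r = r * b % MOD;
        b = b * b % MOD;
        e >>= 1;
    }
    return r;
}
// modular inverse via Fermat's little theorem (MOD is prime)
u64 inv(u64 a) { return mpow(a, MOD - 2); }

int main() {
    long long N, M, Q;
    if (std::scanf("%lld %lld %lld", &N, &M, &Q) != 3) return 0;

    u64 inv2  = inv(2);
    u64 pairs = (u64)(N % MOD) * ((N + 1) % MOD) % MOD * inv2 % MOD;  // C(N+1,2) segments
    u64 types = (2 * (u64)(M % MOD) + 1) % MOD;                       // 2M+1 choices of (type, v)
    u64 invPairs = inv(pairs), invTypes = inv(types);

    u64 ans = 0;
    for (long long i = 1; i <= N; ++i) {
        // probability that a random query's segment contains position i
        u64 seg = (u64)(i % MOD) * ((N + 1 - i) % MOD) % MOD * invPairs % MOD;
        u64 p = seg * (u64)(M % MOD) % MOD * invTypes % MOD;  // p_i: the query matters
        u64 q = seg * invTypes % MOD;                          // q_i: a sum query containing i

        // S = sum_{t=0}^{Q-1} [1 - (1-p)^t] = Q - (1 - (1-p)^Q) / p
        // (the sketch assumes p_i is nonzero modulo MOD)
        u64 one_minus_p = (1 + MOD - p) % MOD;
        u64 geo = (1 + MOD - mpow(one_minus_p, (u64)Q)) % MOD * inv(p) % MOD;
        u64 S = ((u64)(Q % MOD) + MOD - geo) % MOD;

        // sum over t of E(t,i) * q_i, using E(t,i) = [1-(1-p_i)^t] * (M-1)/2
        u64 e = S * (u64)((M - 1) % MOD) % MOD * inv2 % MOD;
        ans = (ans + e * q) % MOD;
    }

    // convert the expectation into the sum over all query sequences (assumption)
    ans = ans * mpow(pairs * types % MOD, (u64)Q) % MOD;
    std::printf("%llu\n", (unsigned long long)ans);
    return 0;
}
```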
