E - 本棚の整理 / Organizing the Bookshelf (Editorial by admin)
Overview
This problem asks us to find the minimum number of operations needed to sort an array using only adjacent element swaps. This reduces to finding the inversion count of the array.
Analysis
Key Insight
The operation of swapping two adjacent elements is the same operation performed in “bubble sort.” It is well known that the number of swaps required to complete bubble sort equals the inversion count of the array.
What is the inversion count? It is the number of pairs \((i, j)\) in an array with \(i < j\) and \(A_i > A_j\).
For example, for the array \([3, 1, 2]\):
- \((1, 2)\): \(A_1 = 3 > A_2 = 1\) → inversion pair
- \((1, 3)\): \(A_1 = 3 > A_3 = 2\) → inversion pair
- \((2, 3)\): \(A_2 = 1 < A_3 = 2\) → not an inversion pair
Therefore, the inversion count is \(2\). Indeed, we can sort it with 2 adjacent swaps: \([3,1,2] \to [1,3,2] \to [1,2,3]\).
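The bubble-sort connection mentioned above can be checked directly. A minimal sketch (the function name `bubble_sort_swaps` is illustrative, not part of the editorial's solution) that counts the swaps bubble sort performs:

```python
def bubble_sort_swaps(a):
    # Run bubble sort and count how many adjacent swaps it makes.
    a = a[:]  # work on a copy
    n = len(a)
    swaps = 0
    for _ in range(n):
        for k in range(n - 1):
            if a[k] > a[k + 1]:
                a[k], a[k + 1] = a[k + 1], a[k]
                swaps += 1
    return swaps

print(bubble_sort_swaps([3, 1, 2]))  # 2, matching the inversion count
```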
Problem with the Naive Approach
If we check all pairs \((i, j)\) to see whether \(i < j\) and \(A_i > A_j\), the total number of pairs is \(\frac{N(N-1)}{2}\), resulting in \(O(N^2)\) time complexity. With the constraint \(N \leq 2 \times 10^5\), this would result in TLE (Time Limit Exceeded).
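For reference, the naive pair check can be sketched as follows; it is correct but far too slow for \(N \leq 2 \times 10^5\):

```python
def count_inversions_naive(a):
    # Examine every pair (i, j) with i < j: O(N^2) time.
    n = len(a)
    return sum(1 for i in range(n)
                 for j in range(i + 1, n)
                 if a[i] > a[j])

print(count_inversions_naive([3, 1, 2]))  # 2
```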
Solution
By using merge sort, we can count inversions in \(O(N \log N)\) time while sorting the array.
Algorithm
We count inversions during the “merge” step of merge sort.
In merge sort, we divide the array into left and right halves, sort each half, and then merge them. During the merge:
- When taking an element from the left half: no inversion occurs
- When taking an element from the right half: every element still remaining in the left half is larger than it (both halves are already sorted), so it forms an inversion pair with each of them
Example: Merging left = [2, 4], right = [1, 3]
- Take right[0]=1 → smaller than remaining left elements [2,4] → inversions +2
- Take left[0]=2 → inversions +0
- Take right[1]=3 → smaller than remaining left elements [4] → inversions +1
- Take left[1]=4 → inversions +0
Total: 3
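The merge step traced above can be sketched in isolation (a standalone helper for illustration, not the full solution):

```python
def merge_count(left, right):
    # Merge two sorted lists, counting inversions across the boundary.
    merged, inv = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            inv += len(left) - i  # all remaining left elements exceed right[j]
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv

print(merge_count([2, 4], [1, 3]))  # ([1, 2, 3, 4], 3)
```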
Complexity
Time Complexity: \(O(N \log N)\)
- Same as the complexity of merge sort
- The array is divided \(\log N\) times, with \(O(N)\) total processing at each level
Space Complexity: \(O(N)\)
- Due to the temporary array used during merging
Implementation Notes
- Base case for recursion: when the array length is 1 or less, return with inversion count 0.
- Counting during merge: when selecting an element from the right array, add the number of elements not yet merged from the left array (`len(left) - i`) to the inversion count.
- Stable sort condition: by using `left[i] <= right[j]` (with the equality sign), we take from the left side when values are equal, preventing spurious counts. In this problem the array is a permutation, so no duplicate values exist, but this matters for general implementations.
Source Code
```python
def merge_sort_count(arr):
    """Sort arr and return (sorted_arr, inversion_count)."""
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, left_inv = merge_sort_count(arr[:mid])
    right, right_inv = merge_sort_count(arr[mid:])

    merged = []
    inv_count = left_inv + right_inv
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            inv_count += len(left) - i  # remaining left elements are all larger
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv_count


def solve():
    N = int(input())
    A = list(map(int, input().split()))
    _, inversions = merge_sort_count(A)
    print(inversions)


solve()
```
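As an aside, the same \(O(N \log N)\) count can also be obtained iteratively with a Fenwick (Binary Indexed) tree, which avoids recursion entirely. A sketch under the assumption that the input is a permutation of distinct values (the name `count_inversions_bit` is illustrative, not from this editorial):

```python
def count_inversions_bit(a):
    # Coordinate-compress values to ranks 1..n.
    n = len(a)
    rank = {v: i + 1 for i, v in enumerate(sorted(a))}
    tree = [0] * (n + 1)

    def add(i):  # add 1 at position i
        while i <= n:
            tree[i] += 1
            i += i & -i

    def query(i):  # count of positions in 1..i already added
        s = 0
        while i > 0:
            s += tree[i]
            i -= i & -i
        return s

    inv = 0
    for idx, v in enumerate(a):
        r = rank[v]
        # idx elements seen so far; query(r) of them are <= v,
        # so the rest are larger and sit to the left of v: inversions.
        inv += idx - query(r)
        add(r)
    return inv

print(count_inversions_bit([3, 1, 2]))  # 2
```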
This editorial was generated by claude4.5opus.