Wouldn’t the merge over all tries be O(n^2) on a tree of the following form?

```
1--3--5--7 ...
|  |  |
2  4  6 ...
```

```
// base cases of merge: if either trie is empty,
// reuse the other subtree as-is in O(1) instead of copying it
if (a == NULL)
    return b;
if (b == NULL)
    return a;
```

This snippet of the *merge* function keeps the complexity at O(n) in the example you gave: whenever one side is empty, the whole other subtree is reused in O(1), so work is only done on nodes present in both tries.
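
For context, the snippet above is the base case of a destructive trie merge. A minimal sketch of the full function (the `Node` layout and `cnt` field are my own illustration, not the author's exact code):

```cpp
#include <cstddef>

// Minimal binary-trie node; `cnt` (number of values in the subtree) is illustrative.
struct Node {
    Node* child[2] = {nullptr, nullptr};
    int cnt = 0;
};

// Destructive merge: returns a trie holding the contents of both a and b.
// The base cases are why the caterpillar example stays O(n): when one side
// is empty, the other subtree is reused in O(1), so recursion only descends
// into nodes that exist in BOTH tries.
Node* merge(Node* a, Node* b) {
    if (a == nullptr) return b;
    if (b == nullptr) return a;
    a->cnt += b->cnt;
    a->child[0] = merge(a->child[0], b->child[0]);
    a->child[1] = merge(a->child[1], b->child[1]);
    return a;
}
```

The total cost of one merge is proportional to the number of nodes the two tries have in common, not to their total sizes.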

I think the worst-case time complexity may be of order O(n * n^(1/2)), in the case where the root node has n^(1/2) children and each child's subtree has n^(1/2) nodes…

Using a compressed trie might have balanced this.

Can someone explain how the author’s solution uses O(N * log N * log C) space for the persistent data structure? Also, for a node, why do we exclude its child with the largest subtree size when calling update? I’m guessing the two are linked… it would be really helpful if someone could explain this. Thanks

The author’s solution is to apply persistence to each node in the tree, which you can do while traversing the tree bottom-up. From the tries computed for the children, you build the trie for the parent. This requires merging, and the merging is done as explained here (the small-to-large trick).
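
A sketch of that bottom-up pass, with illustrative names and a plain (non-persistent) trie — the author's actual solution additionally makes the trie nodes persistent:

```cpp
#include <vector>

const int BITS = 4;  // trie depth; plays the role of log C

struct TrieNode {
    TrieNode* child[2] = {nullptr, nullptr};
    int cnt = 0;  // how many values pass through this trie node
};

// Insert value v (0 <= v < 2^BITS) into the trie rooted at t.
TrieNode* insertVal(TrieNode* t, int v, int bit = BITS - 1) {
    if (t == nullptr) t = new TrieNode();
    t->cnt++;
    if (bit >= 0) {
        int b = (v >> bit) & 1;
        t->child[b] = insertVal(t->child[b], v, bit - 1);
    }
    return t;
}

// Destructive merge, as in the snippet discussed above.
TrieNode* mergeTries(TrieNode* a, TrieNode* b) {
    if (a == nullptr) return b;
    if (b == nullptr) return a;
    a->cnt += b->cnt;
    a->child[0] = mergeTries(a->child[0], b->child[0]);
    a->child[1] = mergeTries(a->child[1], b->child[1]);
    return a;
}

// Bottom-up: the trie for vertex u holds u's own value plus the merged
// tries of all its children (adj holds the rooted tree's child lists).
TrieNode* buildSubtreeTrie(int u,
                           const std::vector<std::vector<int>>& adj,
                           const std::vector<int>& val) {
    TrieNode* t = insertVal(nullptr, val[u]);
    for (int v : adj[u])
        t = mergeTries(t, buildSubtreeTrie(v, adj, val));
    return t;
}
```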

Thus the merging part does O(N * log N) insertions in total (each value can be on the smaller side of a merge at most log N times), and since each insertion into a persistent trie allocates O(log C) new nodes, the overall space complexity is just

O(N * log N * log C).
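
The N * log N factor is the standard small-to-large bound: an element can be on the smaller side of a merge at most log2(N) times, because the group containing it at least doubles whenever it is moved. The toy counter below (my own illustration, using plain sets of elements rather than tries) demonstrates the count; in the trie version each moved value additionally costs O(log C) new nodes, which is where the extra log C factor in the space bound comes from.

```cpp
#include <vector>
#include <algorithm>
#include <utility>
#include <cstddef>

// Merge n singleton groups pairwise in rounds, always moving the smaller
// group into the larger one, and count how many elements get moved in total.
// Each element moves at most log2(n) times, so the count is O(n log n).
long long smallToLargeWork(int n) {
    std::vector<std::vector<int>> groups(n);
    for (int i = 0; i < n; i++) groups[i] = {i};
    long long moves = 0;
    while (groups.size() > 1) {
        std::vector<std::vector<int>> next;
        for (std::size_t i = 0; i + 1 < groups.size(); i += 2) {
            std::vector<int> a = std::move(groups[i]);
            std::vector<int> b = std::move(groups[i + 1]);
            if (a.size() < b.size()) std::swap(a, b);
            moves += b.size();  // cost of one merge = size of the smaller group
            a.insert(a.end(), b.begin(), b.end());
            next.push_back(std::move(a));
        }
        if (groups.size() % 2 == 1) next.push_back(std::move(groups.back()));
        groups = std::move(next);
    }
    return moves;
}
```

For n = 16 this performs (n/2) * log2(n) = 32 moves in total, matching the bound.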

Thanks for the reply… I had already gone through this post on Codeforces. For the proof of complexity they just give a reference to HLD/DSU, without details… I tried deriving it on my own, but I couldn’t… Is there a proof that clearly explains how merging takes N * log N time (in this case, space)?