An alternative trick to solve many HLD problems using basic dfs

You can solve such questions using the DFS-order property. The TAQTREE problem on CodeChef will help you understand it. The following link is the editorial of that question, where both HLD and the DFS-order property are explained in detail.
TAQTREE Editorial

Nice trick. But how would you decide the depth at which the ancestor storing will be applied? What would be the criterion for the appropriate depth?

What are vec[] and cost[] in your PSHTTR implementation? Please explain.

Brilliant decomposition trick! At first glance the best choice seems to be mod \sqrt{N} \approx 316 for O(Q\sqrt{N}), but interestingly, when I ran the algorithm, using mod 1000 was surprisingly faster in practice!

I'm interested in finding a method to analyze the average-case performance of this trick. The best modulus, as some sort of expected value depending on the tree size, would be ideal. I hope someone can make the time to come up with such a method, or better yet publish a paper; that would be elegant! :)


Nice trick! But how do you handle updates as well?

Amazing Trick!
Thanks for sharing

Yeah, I realized it is similar to square-root decomposition, but I just wanted to mention it here so that others could benefit from it, as it is a viable alternative to HLD in short contests.


For this, we can just run an initial DFS and find the value of the appropriate depth for which the memory complexity is within the limits. Maybe for your case 999 works just fine. Similarly, it can be proved that such a value always exists.


That's nice. With this addition you seem to get a widely applicable O(n·sqrt(n)) algorithm for trees.

I had mentioned this fact in Upd1 but I couldn't prove it; I stated it by intuition. Could you give me a formal proof so that I could add it?

Proof: let freq[i] store the number of nodes with i = depth % 1000. Note that i takes values from 0 to 999, and that these classes together cover all the nodes of the tree. If we take the minimum of these 1000 values of freq[i], then by pigeonhole the minimum is at most 10^5 / 1000 = 100 (approx.), since in the worst case the nodes are spread evenly with about 100 in each class. Hence that choice of i will be just perfect. Please correct me if I am wrong.


A simple modification would be to store values in a node only if its depth % sqrt(n) == 0 and its height is > sqrt(n). Now you'll need to go up at most 2·sqrt(n) nodes for each query, but this guarantees space efficiency, even in the case you mentioned.

@drajingo What about the space complexity if there are 10^5 − 10·sqrt(10^5) nodes at depth 10·sqrt(10^5)? It increases to about 3·10^8.

@abx_2109 I think I get your proof. Thank you very much. I will put it in this post.

You can do this by performing an initial DFS, counting the number of nodes having depth % 1000 = 1, 2, 3, …, and taking the residue with the least count as the depth class at which ancestor storing is done.
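A minimal sketch of this counting step (function name `pickResidue` is assumed; block size 1000 as in the post): one DFS gives every node's depth, freq[r] counts the nodes with depth % 1000 == r, and the residue with the smallest count is chosen as the storing depth class.

```cpp
#include <bits/stdc++.h>
using namespace std;

const int B = 1000;                     // block size used in the post

// Returns the residue r minimizing the number of nodes with depth % B == r.
int pickResidue(const vector<vector<int>>& adj, int root) {
    int n = adj.size();
    vector<int> depth(n, -1), freq(B, 0);
    vector<int> st = {root};            // iterative DFS (explicit stack)
    depth[root] = 0;
    while (!st.empty()) {
        int u = st.back(); st.pop_back();
        freq[depth[u] % B]++;
        for (int v : adj[u]) if (depth[v] == -1) {
            depth[v] = depth[u] + 1;
            st.push_back(v);
        }
    }
    // by pigeonhole the minimum count is at most n / B (about 100 for n = 1e5)
    return min_element(freq.begin(), freq.end()) - freq.begin();
}
```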

@abx_2109 I think you misunderstood me. By the height of a node I meant the maximum number of edges on a path from that node down to a leaf. For example, if the tree is a path, the leaf node has height 0 and the root node has height n.

cost[]:
Assume you have two nodes u and v with an edge of cost 'C' between them. If u is the parent of v then I assign cost[v] = 'C'. Using this I turn the edge-cost tree into a sort of node-cost tree, which I found easier to process. Since each node has exactly one parent (except the root), this technique works.

vec[]:
In my implementation, for every node whose depth % 1000 = 1, I store the costs of all its ancestors. I store this information in the vec[] vector.

Let me know if you have any other doubts.

Thanks, understood the idea behind it!

@hikarico Thanks. The choice of value can also depend on the given tree, so we can pick it per tree, as mentioned in Upd1.

For updates that add nodes, we can actually follow the same procedure: since we only store at nodes with depth % 1000 = x, adding a new node makes no difference to its ancestors (see my GPD solution).

For changing values I am thinking of a solution, and if I find one I will update the blog. If someone else has an idea, you could post it here and I will add it to the blog with your name/handle.