Can someone please help me figure out why I am getting a runtime error (segmentation fault) in my code?
MY SOLUTION, BRIEFLY:
I ran a DFS to find the depth and parent of each node. There are two arrays, 'Increasing' and 'Decreasing'. Increasing[v] stores the farthest ancestor of v up to which the values are strictly increasing; Decreasing does the same for the strictly decreasing case.
After that, I build a sparse table (binary lifting) for computing LCA(x, y) in O(log n).
While processing the queries, I find the LCA of the two nodes and then check the conditions that determine whether the answer is 1 or 0.