Atcoder - rooted tree problem help

I am trying to solve this problem.

Approach: I used a vector<vector<int>>, where the index of the outer vector represents a node and the inner vector stores the nodes it is connected to.
Using a basic DFS I first solved the problem with a map, but that gives TLE and WA on test 2. Using the same logic but with a vector, it gives a runtime error.

Please help me figure out why I’m getting this error in this solution.

I didn’t get your approach, but here is mine: run a DFS from the root and store the preorder traversal of the tree in a vector V (why? because in preorder, a node’s subtree occupies a contiguous range of V). Also, for each node, store the rightmost element of its subtree (because the rightmost element comes last in that contiguous range of V). After doing this, store the index of each element of V.
Now take an array of size n with all values 0. If you have multiple ranges to increment, you can just do array[start] += val and array[end + 1] -= val; after doing this for all queries, iterate over the array and do array[i] += array[i - 1], and array[i] will then hold the actual value at that index. To apply this to the original problem, we only need to find the start and end index of each query’s subtree range using the preprocessing above.
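Here is a rough sketch of what I mean, assuming the input format is N Q on the first line, then N - 1 edges, then Q queries "p x" where each query adds x to every node in the subtree of p. The names tin, tout, order and diff are just placeholders I picked:

#include <bits/stdc++.h>
using namespace std;

int main() {
    int n, q;
    cin >> n >> q;
    vector<vector<int>> adj(n + 1);
    for (int i = 0; i < n - 1; i++) {
        int a, b;
        cin >> a >> b;
        adj[a].push_back(b);
        adj[b].push_back(a);
    }

    // Iterative DFS from the root (node 1): build the preorder sequence
    // ("order" plays the role of V) and, for each node, the index of the
    // rightmost element of its subtree inside that sequence.
    vector<int> tin(n + 1), tout(n + 1), nxt(n + 1, 0), par(n + 1, 0);
    vector<int> order;                       // preorder sequence, i.e. V
    stack<int> st;
    st.push(1);
    order.push_back(1);                      // tin[1] is 0
    while (!st.empty()) {
        int v = st.top();
        if (nxt[v] < (int)adj[v].size()) {
            int u = adj[v][nxt[v]++];
            if (u == par[v]) continue;       // don't walk back to the parent
            par[u] = v;
            tin[u] = (int)order.size();      // first preorder index of u's subtree
            order.push_back(u);
            st.push(u);
        } else {
            tout[v] = (int)order.size() - 1; // rightmost preorder index of v's subtree
            st.pop();
        }
    }

    // Difference array over preorder positions: a query "p x" adds x to the
    // whole contiguous range [tin[p], tout[p]].
    vector<long long> diff(n + 1, 0);
    while (q--) {
        int p; long long x;
        cin >> p >> x;
        diff[tin[p]] += x;
        diff[tout[p] + 1] -= x;
    }

    // Prefix sums turn the difference array into actual values, then map
    // preorder positions back to node numbers.
    vector<long long> val(n + 1, 0);
    long long run = 0;
    for (int i = 0; i < n; i++) {
        run += diff[i];
        val[order[i]] = run;
    }
    for (int v = 1; v <= n; v++)
        cout << val[v] << (v == n ? '\n' : ' ');
}

The DFS is written iteratively only to avoid deep recursion on a path-like tree; a recursive version of the same traversal is equivalent.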

Seems to me a very straightforward problem. Simply apply a pre-order traversal as @arjun_9426 said. Also, you don’t need an Adjacency Matrix representation for this; simply using an Adjacency List representation would suffice.

That is quite a good approach; it would solve the TLE problem.
I’m trying to find out why I’m getting the error, so the same thing doesn’t happen again in tree problems.
I’ll explain my approach for this test case:

4 3
1 2
2 3
2 4
2 10
1 100
3 1

so the vector would look something like this (index -> nodes it points to):

1 -> 2
2 -> 3 4
3 -> (empty)
4 -> (empty)

I have initialized an array of size n + 1 with 0s (indices 1 to n are used); let it be arr.
So for each query, I pass the node (index) p and the value x. For the query 1 100: first it does arr[p] += x, so arr[1] = 100; then it checks whether this node points to other nodes (1 -> 2) and calls the solve function again with p = 2, so arr[2] += x; node 2 points to 3 and 4 (2 -> 3, 4), so arr[3] and arr[4] are incremented by x in the same way; since node 3 doesn’t point to any node, its vector is empty and the recursion stops there.

solve function

void solve(vector<vector<int>>& ma, vector<long long>& arr, int p, int& x)
{
    arr[p] += x;                     // add x to the current node

    if (!ma[p].empty())
    {
        for (auto c : ma[p])         // recurse into every node p points to
            solve(ma, arr, c, x);
    }
}

Here ma is the 2D vector (the adjacency list), arr is initialized with 0s, p is the node, and x is the value to add.
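For context, the surrounding setup is roughly like this (reconstructed from the walkthrough above, not necessarily my exact code): it assumes nodes are 1-indexed, ma and arr are both sized n + 1, and each input edge a b is stored only as a -> b, which is how the 1 -> 2, 2 -> 3, 2 -> 4 example reads.

#include <bits/stdc++.h>
using namespace std;

// the solve function from above
void solve(vector<vector<int>>& ma, vector<long long>& arr, int p, int& x)
{
    arr[p] += x;
    if (!ma[p].empty())
        for (auto c : ma[p])
            solve(ma, arr, c, x);
}

int main() {
    int n, q;
    cin >> n >> q;
    vector<vector<int>> ma(n + 1);   // adjacency: index = node, inner vector = nodes it points to
    vector<long long> arr(n + 1, 0); // per-node total, initialized with 0
    for (int i = 0; i < n - 1; i++) {
        int a, b;
        cin >> a >> b;
        ma[a].push_back(b);          // stored one way only, as in the 1 -> 2, 2 -> 3, 2 -> 4 example
    }
    while (q--) {
        int p, x;
        cin >> p >> x;
        solve(ma, arr, p, x);        // walk p's whole subtree, adding x to every node in it
    }
    for (int v = 1; v <= n; v++)
        cout << arr[v] << (v == n ? '\n' : ' ');
}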

Well, I’m just trying to find out why I’m getting a runtime error with the vector approach; everything else is working.