SCALSUM - Editorial

I see. Thanks

I have the same doubt; this method has O(N√N) complexity.

Is there any alternative to maps for storing the results in this approach?
I was looking at pairing functions to store the results in an array, but all the indices were too big.

Thanks for helping. It worked!

values[u][i] * values[v][i]
will overflow. Cast it to long long.
AC : CodeChef: Practical coding for everyone
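
For anyone hitting this, a minimal sketch of the fix, assuming `values` holds 32-bit ints and answers are taken modulo 10^9+7 (the modulus and names are illustrative):

```cpp
#include <cstdio>
#include <vector>
using namespace std;

const long long MOD = 1000000007LL; // assumed modulus

int main() {
    // Two values near the int limit: their int*int product wraps around.
    vector<vector<int>> values = {{2000000000}, {2000000000}};
    int u = 0, v = 1, i = 0;

    // Wrong: both operands are int, so the product overflows *before*
    // any assignment to a wider type:
    //   long long bad = values[u][i] * values[v][i];

    // Right: promote one operand to long long first, then reduce mod MOD.
    long long good = (long long)values[u][i] * values[v][i] % MOD;
    printf("%lld\n", good);
    return 0;
}
```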

I can’t think of one with O(1) complexity and a small constant.
But you can tune unordered_map’s parameters and get AC.
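
A hedged sketch of the kind of tuning meant here: pre-reserving buckets, lowering the load factor, and swapping in a splitmix64-based hash (a common trick from a well-known Codeforces blog) so adversarial keys don't degrade the map. The pair-to-key packing is illustrative:

```cpp
#include <chrono>
#include <cstdint>
#include <unordered_map>
using namespace std;

// splitmix64-based hash, randomized per run so fixed anti-hash
// tests can't force collisions in std::unordered_map.
struct custom_hash {
    static uint64_t splitmix64(uint64_t x) {
        x += 0x9e3779b97f4a7c15ULL;
        x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9ULL;
        x = (x ^ (x >> 27)) * 0x94d049bb133111ebULL;
        return x ^ (x >> 31);
    }
    size_t operator()(uint64_t x) const {
        static const uint64_t SEED =
            chrono::steady_clock::now().time_since_epoch().count();
        return splitmix64(x + SEED);
    }
};

int main() {
    unordered_map<uint64_t, long long, custom_hash> memo;
    memo.reserve(1 << 20);       // pre-allocate buckets: no rehash mid-run
    memo.max_load_factor(0.25);  // fewer collisions, at a memory cost

    int u = 3, v = 7;
    uint64_t key = ((uint64_t)u << 32) | (uint32_t)v; // pack the pair
    memo[key] = 42;
    return 0;
}
```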

Instead of using an unordered map, you can re-enumerate the vertices in each layer and make an array (vector) for each layer, then use direct addressing in the usual way (if the layer has K vertices, the index of the pair (i, j) is K*i + j).
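
If it helps, a minimal sketch of that direct-addressing idea, with illustrative names (localId, table); it relies on K being small enough that a K*K table per layer fits in memory:

```cpp
#include <vector>
using namespace std;

int main() {
    // Global ids of one layer's vertices, renumbered to 0..K-1.
    vector<int> layer = {7, 12, 42};
    int K = layer.size();

    vector<int> localId(100, -1);       // global id -> layer-local index
    for (int i = 0; i < K; i++) localId[layer[i]] = i;

    // One flat vector per layer; the slot for the ordered pair (i, j)
    // of layer-local indices is K*i + j.
    vector<long long> table(1LL * K * K, -1);

    int u = 12, v = 42;                 // a queried pair inside this layer
    table[1LL * localId[u] * K + localId[v]] = 123; // cache its answer
    return 0;
}
```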

This reduced the runtime of the code slightly: CodeChef: Practical coding for everyone
But it still TLEs in 2 cases… I guess only a solution without a hashmap will get AC :frowning_face:

Can you clarify: a block may contain only one layer, or more than one, and those layers may have fewer than K/√N vertices each, so how do we handle that case?

Sorting the queries depth-wise will give 60 pts: CodeChef: Practical coding for everyone
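
A minimal sketch of what depth-wise sorting could look like (the struct and field names are illustrative, not from the linked solution):

```cpp
#include <algorithm>
#include <vector>
using namespace std;

struct Query { int u, v, id; };

int main() {
    vector<int> depth = {0, 1, 1, 2, 2};        // depth of each vertex
    vector<Query> qs = {{3, 4, 0}, {1, 2, 1}};  // u and v share a depth

    // Answer queries in order of depth so cached products for one depth
    // band are reused before moving on; restore output order via id.
    sort(qs.begin(), qs.end(), [&](const Query& a, const Query& b) {
        return depth[a.u] < depth[b.u];
    });
    return 0;
}
```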

Modified your code a bit (just the map declaration part) after reading some more of the blog post given by @l_returns:
https://www.codechef.com/viewsolution/39658815
Got an AC, 100 pts. Will now implement the query sorting myself. Thanks @l_returns @bennyjoseph

I feel really bad for not trying that earlier :pensive:. Good to see an AC for my code now. Thanks for this!

Did you use any recursion here?

Can you please explain the add and remove steps in Mo’s algorithm?

How do you maintain the “scalar product”? Please explain. I was trying the same, but my update takes much more time than expected.

yeah thank u !!!

I agree :frowning:

I tried this approach and found it failed just one test case in each of the second and third blocks, giving me just 15 points. It appeared that caching such a large number of instances was too slow (using a C# Dictionary), so I changed it to cache only the products at depths that are a multiple of Floor(Sqrt(max depth)). With this change it passed the second block of tests, giving me more points, but failed more cases in the third block.
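
For concreteness, a hedged C++ sketch of that checkpoint-caching idea (memoize the running product only at depths that are multiples of B = Floor(Sqrt(max depth)); all names, the modulus, and the toy tree are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <map>
#include <vector>
using namespace std;

const long long MOD = 1000000007LL;        // assumed modulus
vector<int> par, dep;                      // parent (-1 at root) and depth
vector<long long> w;                       // vertex weights
map<pair<int, int>, long long> cache;      // memo, keyed by vertex pair
int B;                                     // checkpoint spacing

// Sum of w[a]*w[b] along the two root paths; a and b share a depth.
long long solve(int a, int b) {
    if (a < 0) return 0;                   // walked past the root
    bool cp = dep[a] % B == 0;             // only memoize at checkpoints
    if (cp) {
        auto it = cache.find({a, b});
        if (it != cache.end()) return it->second;
    }
    long long res = (w[a] % MOD * (w[b] % MOD) + solve(par[a], par[b])) % MOD;
    if (cp) cache[{a, b}] = res;
    return res;
}

int main() {
    // Toy chain 0 - 1 - 2 rooted at 0.
    par = {-1, 0, 1};
    dep = {0, 1, 2};
    w = {2, 3, 5};
    B = max(1, (int)sqrt(2.0));            // Floor(Sqrt(max depth))
    printf("%lld\n", solve(2, 2));         // 2*2 + 3*3 + 5*5 = 38
    return 0;
}
```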

Can you tell what type of map you are using? gp_hash_table? I am facing the same TLE problem.
Code link → CodeChef: Practical coding for everyone

Brother, the same thing happened to me too: the last subtask always gives one TLE :cold_sweat:

https://www.codechef.com/viewsolution/39617023
