I recently saw the Chef And Easy Xor Queries question in DEC17 and was unable to come up with a solution. I was thinking of using a Trie data structure, but could not make it work. I found this question very interesting and want to know all the different approaches to it. I would appreciate it if you would briefly explain your idea as well. I would also like to know whether this question can be solved with a segment tree or not.

I feel really guilty saying this, but I got 100 points despite an O(NQ) solution. It didn't pass the TL in C++, but barely made it through in Python with a bit of optimisation.

You're probably not looking for that solution, but CodeChef should have made stronger test cases for that problem.

I solved it using the square root decomposition technique (dividing the array into `sqrt(N)` blocks of `sqrt(N)` elements each). I stored the xor of each block in an array, so the update queries take `sqrt(N)` time (since we only need to update the block in which the updated index lies), and the xor query also takes `sqrt(N)` time, since we jump block by block until the block in which index i lies.

Have a look at my solution: https://www.codechef.com/viewsolution/16516986
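The linked solution is in C++; as a rough illustration of the block layout described above, here is a minimal Python sketch (class and method names are my own, not from the linked code) of block-wise xors supporting point updates and prefix-xor queries. One aside: because xor is its own inverse, the point update can even be done in O(1) rather than the O(sqrt(N)) block rebuild described in the post.

```python
import math


class BlockXor:
    """Block xors: O(1) point updates, O(sqrt(N)) prefix-xor queries."""

    def __init__(self, a):
        self.a = list(a)
        self.n = len(self.a)
        self.B = max(1, math.isqrt(self.n))          # block size ~ sqrt(N)
        nblocks = (self.n + self.B - 1) // self.B
        self.bx = [0] * nblocks                      # xor of each block
        for i, v in enumerate(self.a):
            self.bx[i // self.B] ^= v

    def update(self, i, x):
        # xor is self-inverse: remove the old value, fold in the new one
        self.bx[i // self.B] ^= self.a[i] ^ x
        self.a[i] = x

    def prefix_xor(self, i):
        """Xor of a[0..i]: jump whole blocks, then finish inside the last one."""
        b, res = i // self.B, 0
        for x in range(b):                           # full blocks before i
            res ^= self.bx[x]
        for j in range(b * self.B, i + 1):           # tail of the last block
            res ^= self.a[j]
        return res
```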

I solved it with sqrt decomposition. The idea is pretty simple: for each block, store the prefix xors of that block, and also keep a variable called `totxor` storing the total xor of all values before that block. An update is clearly O(sqrt(N)), as you just need to rebuild the block. A query is also O(sqrt(N)): first you process the start and end blocks of the query manually; for each remaining full block x, you need the number of elements equal to k ^ block[x].totxor in that block's prefix-xor array. Here is my code
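The poster's code itself isn't shown here; the following is a minimal Python sketch of the structure just described, written under my assumption of the CHEFEXQ query format (type 1: set a[i] = x; type 2: given i and k, count prefixes ending at j ≤ i whose xor is k). The class name `SqrtXor` and all identifiers are mine.

```python
import math
from collections import Counter


class SqrtXor:
    """Sqrt decomposition; each block keeps a Counter of its internal prefix xors."""

    def __init__(self, a):
        self.a = list(a)
        self.n = len(self.a)
        self.B = max(1, math.isqrt(self.n))
        self.nblocks = (self.n + self.B - 1) // self.B
        self.cnt = [Counter() for _ in range(self.nblocks)]
        self.blockxor = [0] * self.nblocks           # total xor of each block
        for b in range(self.nblocks):
            self._rebuild(b)

    def _rebuild(self, b):
        """Recompute the prefix-xor counter of block b in O(sqrt(N))."""
        lo, hi = b * self.B, min((b + 1) * self.B, self.n)
        c, px = Counter(), 0
        for i in range(lo, hi):
            px ^= self.a[i]
            c[px] += 1
        self.cnt[b] = c
        self.blockxor[b] = px

    def update(self, i, x):
        """Type 1: set a[i] = x, then rebuild only the affected block."""
        self.a[i] = x
        self._rebuild(i // self.B)

    def query(self, i, k):
        """Type 2: count prefixes ending at j <= i whose xor equals k."""
        res, tot = 0, 0      # tot = xor of everything before the current block
        b = i // self.B
        for x in range(b):                 # full blocks: one Counter lookup each
            res += self.cnt[x][k ^ tot]
            tot ^= self.blockxor[x]
        px = tot                           # last (partial) block: scan manually
        for j in range(b * self.B, i + 1):
            px ^= self.a[j]
            if px == k:
                res += 1
        return res
```

A block's counter holds xors relative to the block's start, which is why the lookup key is k ^ tot rather than k itself.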

Can be solved using hashing + square-root-size blocks.

Each query can be answered in O(sqrt(N)) or O(sqrt(N) * log N) if we preprocess using the square-root-size block concept…

https://www.codechef.com/viewsolution/16439024

I have solved this question using sqrt decomposition.

First break the array into sqrt(n) blocks, then build one hashmap per block maintaining a counter for each prefix-xor value in that block. For an update query, rebuild the block and its hashmap; this takes O(sqrt(n)) per query. For a type 2 query, start from the 1st block and, for each block, increase your result by the counter of A^x (by looking at the hashmap).

Here A is initially 0, and its value for a particular block is the xor of the array elements with indices less than or equal to the last index of the previous block.

Only for the last block do you have to do this work iteratively. This also takes O(sqrt(n)) per type 2 query, so the total time complexity is O(q * sqrt(n)).

Proof:

Given the equation A^b = x, you have to find the counter of b, but you only know the values of A (which can be computed) and x. So we can simply look up the counter of b = A^x, since A^(A^x) = x.
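The identity is quick to check in code; the values below are arbitrary and purely illustrative:

```python
# A ^ b == x  implies  b == A ^ x, because xor is its own inverse.
A = 0b1011   # xor of all elements before the current block
x = 0b0110   # target prefix-xor value asked for by the query
b = A ^ x    # the key we look up in the block's counter
assert A ^ b == x   # holds because A ^ (A ^ x) == x
```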

I made a video tutorial for this problem at my youtube channel. Watch it here https://www.youtube.com/watch?v=FqunpSmkzgE

I have used SquareRoot Decomposition to solve this.

Will you explain with an example what a block is storing?

Yes, surely. I am storing the count of prefix xors within the block. Suppose the array contents in the block are 2, 3, 2, 4, 5. First I do arr[2]++, as 2 is the starting element. Then 2^3 = 1, so I do arr[1]++. Then 1^2 = 3, so I do arr[3]++. And so on.

Edit: In my code this arr is b.
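The counting loop described above can be sketched in a few lines of Python (using `Counter` here in place of the poster's array `b`):

```python
from collections import Counter

block = [2, 3, 2, 4, 5]   # the example block from the post above
arr, px = Counter(), 0
for v in block:
    px ^= v        # running prefix xor inside the block
    arr[px] += 1   # arr[2]++, then arr[1]++, then arr[3]++, ...
# the prefix xors are 2, 1, 3, 7, 2
```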

Can you tell why O(q*sqrt(n)) = O(10^6 * sqrt(10^6)) = O(10^9) is accepted?

Q = 1e5, so 1e5 * 1000 = 1e8, which is permissible.

@skyhavoc Actually, I wasn't sure the memory limit would pass keeping sqrt(n) = 317. You see, it's already 1e8; had I kept 317, it would be 3e8. So, as a precaution, I traded time for space :). 1e8 passes the TL too.

I was trying to do it with segment trees… I didn't notice that even a Q*sqrt(N) complexity would be fine…

Can you explain how to do it with segment trees, if possible?

@vishesh_345 @asutoshrana I thought about it quite a bit too, but I couldn't come up with a segment tree solution, so I just settled for sqrt decomposition as well.