I was trying to solve this **problem**, which involves executing Q queries on a sequence of positive integers. Each query is of the form [L, R], and I need to find whether the bitwise AND of A[L\ldots R] is **EVEN** or **ODD**.

The solution I implemented gives **TLE**.

How can I optimise the solution to get an **AC**? Since this is a query problem over ranges, I expect the answer needs either some pre-computation so that each query can be answered in constant time, or a data structure that answers each query in less than O(n) time.

**Problem Link: https://www.codechef.com/PLIT2020/problems/MINIAND/**

I have already looked at some of the submissions, and all I can tell is that most of them use a prefix-sum method, so **please don't just give me source code; I would really appreciate it if someone could explain the logic in plain English or provide well-documented code.**
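For what it's worth, here is my guess at how the prefix-sum idea might work (this is only my reading of those submissions, not a verified solution): the AND of A[L..R] is odd exactly when every element in the range is odd, because a single even element clears the lowest bit. So a prefix count of even elements would answer each query in O(1).

```python
def preprocess(a):
    # evens[i] = number of even elements among a[0..i-1]
    evens = [0] * (len(a) + 1)
    for i, x in enumerate(a):
        evens[i + 1] = evens[i] + (x % 2 == 0)
    return evens

def answer(evens, l, r):
    # 1-based inclusive query [l, r]: the AND is odd iff
    # there are no even elements in the range
    return "ODD" if evens[r] - evens[l - 1] == 0 else "EVEN"

a = [3, 5, 7, 4, 9]
ev = preprocess(a)
print(answer(ev, 1, 3))  # all of 3, 5, 7 are odd -> ODD
print(answer(ev, 2, 4))  # range contains 4 -> EVEN
```

If this is indeed the intended logic, I'd still appreciate confirmation, since I may be misreading the submissions.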

Thanks for reading.

Peace