Not sure, dear.
Is there any tutorial for learning minimum vertex cover properly?
Could you please explain what your vertices and edges are?
I thought of this:
vertices: one vertex for each bit position i, plus one for each of the n array elements
edges: bit vertex i is connected to every array element whose ith bit is set
but for the example given above,
we have edges:
0th bit set: 1, 3, 5
1st bit set: 2, 3
2nd bit set: 4, 5
but here the minimum vertex cover is {0th bit set, 1st bit set, 2nd bit set}
I know I am missing something; could you please correct me?
Yes, you are right, sadly it doesn't work. I have an alternate solution, though. Let me think on it a bit before posting here because it might also be wrong xD
Sorry, this doesn't work. Trying to think of some alternate approach.
Is the question link active now?
Anyone found the solution?
The only solution that seems to fit in the time constraints is this - The smallest subset with maximum bitwise OR. - Codeforces
I came up with the following approach: the maximum bitwise OR over all subsets is simply the bitwise OR of all the numbers; let it be n. The binary representation of n has log(n) + 1 bits. We can create an array of size log(n) + 1 which will hold the number of elements of A with the ith bit set; let's call it cnt. So we traverse the array, and for each number we iterate through its bits; if the ith bit is set, we increase cnt[i] by 1. Finally, let's initialize our answer to the size of the array. We traverse the array again and check whether we can remove the ith element, that is, check whether removing the ith element still gives the maximum OR. This can be done by checking whether any set bit of arr[i] has a count that would become 0 if arr[i] were removed. If none would, we can remove arr[i]: decrement the counts of its set bits and decrease the answer by 1.
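A quick Python sketch of the greedy pass described above (the function names are mine), together with a brute force that checks every subset; the brute force is only feasible for small arrays, but it could be used to hunt for counterexamples:

```python
from itertools import combinations

def greedy_remove(arr, bits=30):
    """Greedy pass from the post above: drop an element if every one of
    its set bits is still covered by some other remaining element."""
    cnt = [0] * bits
    for x in arr:
        for b in range(bits):
            if (x >> b) & 1:
                cnt[b] += 1
    ans = len(arr)
    for x in arr:
        # x is removable only if none of its set bits would drop to 0
        if all(cnt[b] > 1 for b in range(bits) if (x >> b) & 1):
            for b in range(bits):
                if (x >> b) & 1:
                    cnt[b] -= 1
            ans -= 1
    return ans

def brute_force(arr):
    """Size of the smallest subset whose OR equals the OR of the array."""
    total = 0
    for x in arr:
        total |= x
    for k in range(1, len(arr) + 1):
        for sub in combinations(arr, k):
            acc = 0
            for x in sub:
                acc |= x
            if acc == total:
                return k
    return 0
```

Comparing `greedy_remove` and `brute_force` on small random arrays would either turn up a counterexample or build some confidence in the greedy.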
I’m not sure if the above approach is correct, and if it is not, can someone give me a counterexample for which it fails? Thanks!!