Worst time complexity for time limit 0.5 seconds

Can someone please tell me how the worst acceptable time complexity of an algorithm changes when the time limit is 0.5 sec instead of 1 sec, across all input sizes N (i.e. for 10 <= N <= 10^18)? For example: with a time limit of 1 sec and N <= 10^4, the worst time complexity that can pass might be O(N^2); if the time limit becomes 0.5 sec with N <= 10^4, what is the worst time complexity that can pass? Please explain how it changes for each range of N, i.e. N <= 10, N <= 100, N <= 10^4, N <= 10^6, and so on!

The time complexity of an algorithm isn't dependent on the time limit. The relationship usually runs the other way around: the problem setter has a good algorithm in mind and chooses some combination of time limit and input size according to what appears possible with that algorithm, hopefully with a little bit of leeway.

I can see that, as you approach a problem, you might hope to take a clue about the time complexity of the intended algorithm from the time limit given, but that's simply you expecting fair play from the setter. If I recall correctly, there was some expectation that a time limit of a second might allow on the order of a (few) billion (10^9) simple operations.
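As a rough illustration of that rule of thumb, here's a minimal back-of-envelope sketch (not a definitive rule: the 10^9 operations-per-second figure above is an assumption and varies a lot between judges and between "simple operations") that compares a few common growth rates against the operation budget implied by a 0.5 sec limit:

```cpp
// Rough sketch: given an assumed throughput and a time limit, check which
// growth rates stay within the implied operation budget for various N.
#include <cstdio>
#include <cmath>

int main() {
    const double OPS_PER_SEC = 1e9;  // assumed throughput; real judges vary a lot
    const double timeLimit   = 0.5;  // seconds
    const double budget      = OPS_PER_SEC * timeLimit;

    const double ns[] = {1e1, 1e2, 1e4, 1e6, 1e9, 1e18};
    for (double n : ns) {
        printf("N = %.0e\n", n);
        printf("  N^2     : %s\n", n * n            <= budget ? "fits" : "too slow");
        printf("  N log N : %s\n", n * std::log2(n) <= budget ? "fits" : "too slow");
        printf("  N       : %s\n", n                <= budget ? "fits" : "too slow");
        printf("  log N   : %s\n", std::log2(n)     <= budget ? "fits" : "too slow");
    }
    return 0;
}
```

Adjusting OPS_PER_SEC or timeLimit shows the point of the answer: halving the time limit halves the budget, but it doesn't change which complexity class an algorithm belongs to.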

An O(N^2) algorithm will take (approximately) a hundred times longer each time you increase N by a factor of ten. So in a second you might expect to handle N = 10,000 but not N = 100,000, if my recollection about operations per second is correct. O(N log N) algorithms scale in a less obvious way, but might increase running time by a factor of roughly 12-15 when N is multiplied by 10, in the relevant range.
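For a quick sanity check of those factors, a tiny sketch (pure arithmetic, no benchmarking) that computes how much the operation count grows when N goes from 10^4 to 10^5:

```cpp
// Compare the growth of N^2 and N log N when N is multiplied by 10.
#include <cstdio>
#include <cmath>

int main() {
    double n = 1e4, m = 1e5;  // N and 10*N
    double quad  = (m * m) / (n * n);                        // O(N^2) factor
    double nlogn = (m * std::log2(m)) / (n * std::log2(n));  // O(N log N) factor
    printf("N^2     grows by a factor of %.1f\n", quad);     // 100.0
    printf("N log N grows by a factor of %.1f\n", nlogn);    // ~12.5
    return 0;
}
```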