Algorithmic

Given n (<= 10^5) straight lines, numbered from 1 to n, each of the form y = a*x + b, where a and b are provided in the input. You are then given 10^5 queries; for each query x, find the line whose y is maximum at that x, and if there is more than one such line, output the one with the smallest index. How do I approach this type of problem? Any help would be greatly appreciated.

Thanks.


Please provide the link for the problem!

Not a solution, but it may help: this resembles a univariate regression problem. I studied a slight variation of it in machine learning, where there are n points and we need to estimate y at a given x as accurately as possible. That is done using gradient descent, where a hypothesis function and a cost function are first defined to compute y for a given x. If the problem becomes multivariate, a matrix method is used instead. This is still a deviation from the problem you asked about, but I think it might help others.
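To make that tangent concrete, here is a minimal gradient-descent sketch for fitting a single line y = a*x + b to sample points by minimizing a mean-squared-error cost. The data points, learning rate, and iteration count below are all illustrative assumptions, not from the original problem:

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Made-up sample points lying near y = 3x + 1.
    vector<double> xs = {0, 1, 2, 3, 4}, ys = {1.1, 3.9, 7.2, 9.8, 13.1};
    int n = xs.size();

    double a = 0, b = 0;   // parameters of the hypothesis h(x) = a*x + b
    double lr = 0.01;      // learning rate (assumed; needs tuning in general)

    for (int step = 0; step < 10000; step++) {
        // Gradients of the cost J(a, b) = (1/n) * sum_i (h(x_i) - y_i)^2
        double ga = 0, gb = 0;
        for (int i = 0; i < n; i++) {
            double err = a * xs[i] + b - ys[i];
            ga += 2 * err * xs[i] / n;
            gb += 2 * err / n;
        }
        a -= lr * ga;
        b -= lr * gb;
    }
    printf("fitted: y = %.3f * x + %.3f\n", a, b);  // roughly 3.0 * x + 1.0
}
```

Note that this fits one approximate line through noisy points; the original question instead asks for the exact maximum over many given lines, which is a different problem.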

Please mention constraints on variable x.

x is always less than 10^9

@jaythegenius why the downvote for the question? I think the question is great.

I think the question is concise. A problem link would be nice, but its absence is no reason for a downvote. It is a good question.

Actually, I was thinking about another problem and came across this one as a subproblem of my original problem. I feel that this problem can be solved efficiently, so I asked for help, as I can't think of any solution for it myself. I have no link for the problem. You can relax some constraints if you feel it's necessary, but remember that brute force should time out under any of your assumptions.

Thanks.
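For what it's worth, one standard technique for this kind of query (maximum of n lines at a given x, answered online) is the Li Chao tree: a segment tree over the range of x where each node stores the line that wins at the node's midpoint, giving O(log C) per insert and per query, where C is the size of the x range. Below is a minimal sketch in C++. The coefficient bounds (|a|, |b| up to about 10^9, so a*x + b fits in a 64-bit integer) and the symmetric x range are assumptions, since the thread only states x < 10^9; the smallest-index tie-break is folded directly into the comparison:

```cpp
#include <bits/stdc++.h>
using namespace std;

// A line y = a*x + b, carrying its 1-based input index for tie-breaking.
struct Line {
    long long a, b;
    int idx;
    long long eval(long long x) const { return a * x + b; }
};

// "p beats q at x": strictly larger value, or equal value with smaller index.
// For any fixed pair (p, q) this predicate flips at most once as x grows,
// which is the only property the Li Chao tree needs.
bool better(const Line& p, const Line& q, long long x) {
    long long vp = p.eval(x), vq = q.eval(x);
    return vp != vq ? vp > vq : p.idx < q.idx;
}

struct LiChaoTree {
    struct Node { Line ln; int l = -1, r = -1; };
    vector<Node> t;
    long long lo, hi;

    // The root starts with a sentinel line that loses to every real line.
    LiChaoTree(long long lo, long long hi) : lo(lo), hi(hi) {
        t.push_back({{0, LLONG_MIN / 4, INT_MAX}, -1, -1});
    }

    void insert(Line nw) { insert(nw, 0, lo, hi); }
    void insert(Line nw, int v, long long l, long long r) {
        long long m = l + (r - l) / 2;
        if (better(nw, t[v].ln, m)) swap(t[v].ln, nw);  // keep the winner at m
        if (l == r) return;
        if (better(nw, t[v].ln, l)) {                   // loser may still win on the left
            if (t[v].l == -1) { t[v].l = t.size(); t.push_back({nw, -1, -1}); }
            else insert(nw, t[v].l, l, m);
        } else {                                        // ... or on the right
            if (t[v].r == -1) { t[v].r = t.size(); t.push_back({nw, -1, -1}); }
            else insert(nw, t[v].r, m + 1, r);
        }
    }

    // The best line at x sits on the root-to-leaf path of x; take the
    // better()-maximum over that path, which also resolves index ties.
    Line query(long long x) const {
        int v = 0;
        long long l = lo, r = hi;
        Line best = t[v].ln;
        while (l < r) {
            long long m = l + (r - l) / 2;
            int c;
            if (x <= m) { c = t[v].l; r = m; } else { c = t[v].r; l = m + 1; }
            if (c == -1) break;
            v = c;
            if (better(t[v].ln, best, x)) best = t[v].ln;
        }
        return best;
    }
};

int main() {
    // Toy usage: y = x (line 1), y = -x + 4 (line 2), y = 2 (line 3).
    LiChaoTree tree(-1000000000LL, 1000000000LL);  // assumed range for x
    long long a[] = {1, -1, 0}, b[] = {0, 4, 2};
    for (int i = 0; i < 3; i++) tree.insert({a[i], b[i], i + 1});
    for (long long x : {-5LL, 2LL, 10LL}) {
        Line best = tree.query(x);
        cout << "x=" << x << " -> line " << best.idx
             << ", y=" << best.eval(x) << "\n";
    }
    // At x=2 all three lines give y=2, so line 1 wins the tie.
}
```

If the queries can be processed offline, an alternative is the convex hull trick: sort the lines by slope, build the upper envelope once, and binary-search each query on it. The Li Chao tree just avoids the envelope bookkeeping and the careful handling of equal slopes, which is why it is sketched here.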