When analyzing the efficiency of an algorithm we always consider the worst case … but why do we consider the worst case when the best case takes fewer steps?

I also have one more doubt … what are duplicate entries in an array? Can you give an example?

# Asymptotic Analysis

**damn_me**#2

Best-case analysis gives us a lower bound on an algorithm's running time. Worst-case analysis gives the upper bound, i.e. the maximum number of steps our code can take on *any* input of a given size. Since we usually want a guarantee that holds no matter what input arrives, we consider the worst case.
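A minimal sketch (not from the original posts) that makes the gap concrete: linear search finds its target in 1 step in the best case, but must scan all `n` elements when the target is absent. The worst case is the only bound that holds for every input.

```python
def linear_search(arr, target):
    """Return (index, steps taken); index is -1 if target is absent."""
    steps = 0
    for i, value in enumerate(arr):
        steps += 1
        if value == target:
            return i, steps
    return -1, steps

arr = [4, 8, 15, 16, 23, 42]
print(linear_search(arr, 4))    # best case: found at index 0 in 1 step
print(linear_search(arr, 99))   # worst case: scans all 6 elements
```

Quoting only the best case would promise "1 step", which is meaningless as a guarantee; the worst case promises "at most n steps", which is always true.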

To answer your next question, if I am understanding it correctly: duplicates in an array are entries that appear more than once, e.g. an array like `1 2 3 2 2 1`.

As you can see, some entries are repeating.
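For illustration (this snippet is my own sketch, not from the posts above), here is one common way to pick out which entries are duplicated, by counting occurrences:

```python
from collections import Counter

def find_duplicates(arr):
    # Count how often each value occurs; values with count > 1 are duplicates.
    counts = Counter(arr)
    return sorted(v for v, c in counts.items() if c > 1)

print(find_duplicates([1, 2, 3, 2, 2, 1]))  # → [1, 2]
```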

**betlista**#3

From a competitive programming point of view, the motivation is obvious: if your algorithm runs in O(n * log(n)) on average but O(n * n) in the worst case (like quicksort, for example), you have to pray that the worst case is not in the input data, but typically it is…
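To see this degradation in numbers, here is a sketch (my own, with an assumed first-element pivot rule) that counts comparisons made by a textbook quicksort. Already-sorted input is the adversarial case for this pivot choice:

```python
import random

def quicksort_count(arr):
    """Quicksort with first-element pivot; returns (sorted_list, comparisons)."""
    comparisons = 0

    def qs(a):
        nonlocal comparisons
        if len(a) <= 1:
            return a
        pivot, rest = a[0], a[1:]
        comparisons += len(rest)          # pivot is compared to every other element
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return qs(left) + [pivot] + qs(right)

    return qs(list(arr)), comparisons

n = 200
_, c_avg = quicksort_count(random.sample(range(n), n))
_, c_worst = quicksort_count(list(range(n)))   # sorted input: every split is 0 / n-1
print(c_avg, c_worst)   # worst case does exactly n*(n-1)/2 = 19900 comparisons
```

With random pivots (or a shuffle before sorting) the adversarial input disappears, which is exactly why contest setters include sorted test cases against naive pivot choices.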