Asymptotic notations are mathematical notations used to describe the efficiency of an algorithm: how its running time grows with the size of its input. Using asymptotic notation, we can characterize the best-case, average-case, and worst-case behavior of an algorithm.
Best Case − Minimum time for the execution.
Average Case − Average time for the execution.
Worst Case − Maximum time for the execution.
Asymptotic notations are mathematical tools to express the time complexity of algorithms for asymptotic analysis.
Note: if an algorithm takes no input, it is considered to run in constant time. All factors other than the input are treated as constant.
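To make these cases concrete, here is a minimal sketch (the function and test values are my own, not from the text) using linear search, whose best, average, and worst cases depend on where the target sits:

```python
def linear_search(items, target):
    """Scan the list left to right; return the index of target or -1."""
    for i, value in enumerate(items):
        if value == target:
            return i  # best case: target is first, one comparison
    return -1         # worst case: target absent, all n elements inspected

# Best case: target at index 0 -> a single comparison.
print(linear_search([7, 3, 5], 7))   # 0
# Worst case: target missing -> every element is inspected.
print(linear_search([7, 3, 5], 9))   # -1
```

On average, if the target is equally likely to be at any position, about half the list is scanned, which is still proportional to n.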
Three Main Asymptotic Notations
 Ο − Big Oh Notation
 Ω − Omega Notation
 θ − Theta Notation
Big O Asymptotic Notation, Ο
The asymptotic notation Ο represents the upper bound of an algorithm’s running time. It captures the worst-case time complexity: the maximum amount of time an algorithm can possibly take to complete. For example, O(log n) is an upper bound on the running time of binary search.
g(n) is an Asymptotic upper bound for f(n)
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
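As a sketch of the binary-search example mentioned above (the function name and test data are my own), each iteration halves the search range, so at most about log2(n) iterations run, giving the O(log n) upper bound:

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent.
    Each loop iteration halves [lo, hi], so the loop runs at most
    about log2(n) times: an O(log n) upper bound on the running time."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # discard the lower half
        else:
            hi = mid - 1   # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # 3
```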
Omega Asymptotic Notation, Ω
The omega notation Ω represents the lower bound of an algorithm’s running time. It captures the best-case time complexity: the minimum amount of time an algorithm can possibly take to complete. For example, bubble sort has a running time of Ω(n) because in the best case the list is already sorted and the algorithm terminates after the first pass.
g(n) is an asymptotic lower bound for f(n)
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
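The bubble-sort example can be sketched as follows (this early-exit variant is an assumption; the text does not give an implementation). The flag is what makes the Ω(n) best case possible:

```python
def bubble_sort(items):
    """Bubble sort with an early-exit flag.
    If a full pass makes no swaps, the list is sorted and we stop:
    on already-sorted input only one pass runs, giving the Omega(n)
    best case. The worst case remains O(n^2)."""
    items = list(items)  # work on a copy
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps: already sorted, terminate early
            break
    return items

print(bubble_sort([3, 1, 2]))   # [1, 2, 3]
```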
Theta Asymptotic Notation, θ
The theta notation θ represents both a lower bound and an upper bound on an algorithm’s running time. When we use big-Θ notation, we are saying that we have an asymptotically tight bound on the running time; it is often used when the best and worst cases grow at the same rate.
g(n) is an asymptotic tight bound for f(n)
The function f is said to be Θ(g), if there are constants c1, c2 > 0 and a natural number n0 such that c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0
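The definition can be checked numerically. In this sketch the function f(n) = 3n² + 2n and the witness constants c1 = 3, c2 = 4, n0 = 2 are my own illustrative choices, verifying that f(n) is Θ(n²):

```python
# Hypothetical running-time function: f(n) = 3n^2 + 2n.  Claim: f = Theta(n^2).
f = lambda n: 3 * n * n + 2 * n
g = lambda n: n * n

# Witness constants (assumed for this sketch): 3*g(n) <= f(n) always,
# and f(n) <= 4*g(n) once n >= 2 (since 2n <= n^2 for n >= 2).
c1, c2, n0 = 3, 4, 2

# Verify c1*g(n) <= f(n) <= c2*g(n) over a range of n >= n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print("f(n) = 3n^2 + 2n is Theta(n^2) over the tested range")
```

Such a check over a finite range is evidence, not a proof; the inequality here can also be verified algebraically.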
Properties of Asymptotic Notation:

General Properties:
If f(n) is O(g(n)) and k is a constant, then k·f(n) is also O(g(n)).
Transitive Properties:
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
Reflexive Properties:
Every function f(n) is O(f(n)), since f(n) is bounded above by itself (take c = 1).
Hence O relates every function to itself, so the relation is reflexive.
Symmetric Properties:
If f(n) is Θ(g(n)) then g(n) is Θ(f(n)). 
Transpose Symmetric Properties:
If f(n) is O(g(n)) then g(n) is Ω(f(n)).
Some More Properties:
 If f(n) = O(g(n)) and f(n) = Ω(g(n)) then f(n) = Θ(g(n))
 If f(n) = O(g(n)) and d(n)=O(e(n)) then f(n) + d(n) = O( max( g(n), e(n) ))
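The last property (the sum rule) can be sanity-checked numerically. The concrete functions f(n) = 5n, d(n) = n², and the constant c = 6 below are my own illustrative choices, not from the text:

```python
# Sum rule sketch: f(n) = O(g(n)) with f(n) = 5n, g(n) = n  (constant c = 5),
# and d(n) = O(e(n)) with d(n) = n^2, e(n) = n^2 (constant c = 1).
# Then f(n) + d(n) should be O(max(g(n), e(n))) = O(n^2).
f = lambda n: 5 * n
d = lambda n: n * n
combined_bound = lambda n: max(n, n * n)  # max(g(n), e(n))

# With c = 6 and n0 = 1: 5n + n^2 <= 6n^2 whenever n >= 1.
assert all(f(n) + d(n) <= 6 * combined_bound(n) for n in range(1, 1000))
print("sum rule holds for n in [1, 1000)")
```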
Conclusion:
Asymptotic notation is a way to describe the growth of functions in terms of simpler functions. The most commonly used notations are big O, big omega, and big theta. They allow us to compare the growth of different functions and to estimate the time and space complexity of algorithms. In short, asymptotic notation is a useful tool for analyzing and comparing the efficiency of algorithms and for understanding the performance of code.