Data Structures & Algorithms: Time Complexity vs. Space Complexity

@Harsh
3 min read · Feb 3, 2024


INTRODUCTION

In the world of algorithms and programming, understanding the efficiency of your code is paramount. Two essential concepts in this realm are Time Complexity and Space Complexity. Let’s embark on a journey to demystify these concepts and explore how they impact the performance of our code.

Time Complexity:

Time complexity measures the amount of time an algorithm takes to complete as a function of the input size. It accounts for the time taken by every statement of the algorithm, so it depends heavily on the size of the data being processed. To estimate time complexity, we consider the cost of each fundamental instruction and the number of times that instruction is executed.

We often express time complexity using Big O notation, which describes the upper bound of the growth rate.

WHAT IS ASYMPTOTIC ANALYSIS?

Asymptotic analysis of an algorithm refers to the mathematical framing of its run-time performance. Using asymptotic analysis, we can characterize the best-case, average-case, and worst-case behavior of an algorithm. In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size; we don’t measure the actual running time.

Exploring the Asymptotic Notation:

Big O, Omega, and Theta notations are commonly used to describe different aspects of an algorithm’s performance. Big O provides an upper bound on the growth rate and is typically used for worst-case analysis; Omega provides a lower bound; and Theta describes a tight bound, where the upper and lower bounds coincide.
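For readers who want the formal versions, the standard textbook definitions (added here for reference, in LaTeX notation) are:

f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))

For example, 3n + 5 = O(n), since 3n + 5 \le 4n for every n \ge 5.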

Common Time Complexities:

  • O(1) — Constant Time: Operations that take the same amount of time regardless of input size.
  • O(log n) — Logarithmic Time: Common in algorithms with divide-and-conquer strategies like binary search (see the sketch after this list).
  • O(n) — Linear Time: Execution time increases linearly with the input size.
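To make the logarithmic case concrete, here is a minimal binary search sketch in C++ (the binarySearch function and the sample array are illustrative, not part of the original examples). Each iteration halves the search range, so a sorted array of size n needs at most about log2(n) + 1 comparisons:

#include <iostream>
using namespace std;

// Returns the index of target in the sorted array arr[0..n-1],
// or -1 if it is not present.
int binarySearch(const int arr[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2; // avoids overflow of low + high
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            low = mid + 1;  // target can only be in the right half
        else
            high = mid - 1; // target can only be in the left half
    }
    return -1;
}

int main()
{
    int arr[] = {2, 5, 8, 12, 23, 38, 56, 72, 91};
    cout << binarySearch(arr, 9, 23) << endl; // prints 4
    return 0;
}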

FOR EXAMPLE :

// suppose we have the code:
#include <iostream>
using namespace std;

void integer()
{
    for (int i = 0; i <= 10; i++)
    {
        cout << "The number of given range is : " << i << endl;
    }
}

The function integer() above always performs the same fixed amount of work (the loop body executes 11 times, for i = 0 through 10) regardless of any input. Hence the time complexity of this algorithm is constant: O(1).

BUT, if we make certain changes in the code like this:

// suppose we have the code:
#include <iostream>
using namespace std;

void integer(int n)
{
    for (int i = 0; i <= n; i++)
    {
        cout << "The number of given range is : " << i << endl;
    }
}

The scenario changes after this modification. The function now depends on its input: the value of n is supplied by the caller and can be arbitrarily large, so the number of operations the CPU performs grows with n (the loop body executes n + 1 times). Hence the time complexity of this algorithm is linear: O(n).

Space Complexity:

Space complexity refers to the amount of memory an algorithm uses relative to the input size. It is crucial to consider, especially in resource-constrained environments. Similar to time complexity, space complexity is also expressed using Big O notation.

The space complexity of an algorithm quantifies the amount of memory it takes to run as a function of the length of the input. Because a program needs memory to store both its input data and temporary values while executing, space complexity is the sum of auxiliary space and input space. For example, reversing an array in place uses O(n) input space but only O(1) auxiliary space.

Optimizing for time complexity might lead to increased space usage and vice versa. Striking a balance depends on the specific requirements of the application and the available resources.

TYPES:

Similar to time complexity, there are different types of space complexity, depending on the memory consumed by each algorithm.

  1. Constant Space (O(1)):
  • Algorithms with constant space complexity use a fixed amount of memory that does not depend on the input size.

  2. Linear Space (O(n)):
  • Linear space complexity implies that the memory usage grows linearly with the size of the input (both cases are sketched below).
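A quick C++ sketch contrasting the two (the function names sumUpTo and firstNSquares are illustrative, not from the original post):

#include <iostream>
using namespace std;

// O(1) space: only a fixed handful of variables, no matter how large n is.
int sumUpTo(int n)
{
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum += i;
    return sum;
}

// O(n) space: allocates an array whose size grows with n.
int* firstNSquares(int n)
{
    int* squares = new int[n]; // n integers of extra memory
    for (int i = 0; i < n; i++)
        squares[i] = i * i;
    return squares; // caller must delete[] the result
}

int main()
{
    cout << sumUpTo(10) << endl; // 55, computed in constant extra space
    int* sq = firstNSquares(5);
    cout << sq[4] << endl;       // 16, memory grew with n
    delete[] sq;
    return 0;
}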

Conclusion:

In the dynamic world of coding, mastering time complexity and space complexity is essential for building efficient and scalable applications. By gaining insights into these complexities, developers can make informed decisions, crafting algorithms that deliver optimal performance while meeting the demands of their applications.
