# How can Product Managers assess the efficiency of an algorithm while scaling their products?

Once PMs have shipped the MVP, the next challenge is scaling it to a larger audience. This requires an understanding of how the algorithms behind the product behave once traffic grows 10x, 100x and beyond. Let’s understand **big O notation** in this context.

**Running time or big O notation**

Running time analysis, expressed in big O notation and sometimes called “asymptotic analysis”, looks at how the number of operations an algorithm performs grows as it processes a larger and larger set of data.

It describes the efficiency of an algorithm as the dataset grows. It doesn’t give an absolute time as an output; instead, it is a mathematical function that captures how quickly the number of operations grows once the dataset becomes very large.

It does this in terms of “**O**” and “**n**” (for example, *O(log n)*), where

**O** refers to the order of the function, i.e. its growth rate, and **n** is the size of the input, such as the length of the array to be sorted.

For instance, if the number of images in your gallery grows 4x, will you be able to search or scroll through them at the same speed? Will it take 4x the time, or maybe even more?

Big O notation is a way to track how quickly the runtime grows relative to the size of the input. Let’s go through the most common runtimes:

# O(1) → Constant Time

It means that the algorithm takes a constant amount of time to run, regardless of the size of the data.

*For instance, accessing the recently clicked photograph in your gallery. You can access that with the same speed even if you have 10 or 100 or 1000 images in your gallery.*
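As a minimal sketch of this idea, here the gallery is modeled as a plain Python list (a hypothetical example, not a real gallery API) — fetching the last item is a single step whether the list holds 10 or 1,000 photos:

```python
def most_recent_photo(gallery):
    """Return the last photo added: one step, no matter the gallery size -> O(1)."""
    return gallery[-1]

small_gallery = [f"img_{i}.jpg" for i in range(10)]
large_gallery = [f"img_{i}.jpg" for i in range(1000)]

# Both calls do the same constant amount of work.
print(most_recent_photo(small_gallery))  # img_9.jpg
print(most_recent_photo(large_gallery))  # img_999.jpg
```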

# O(n) → Linear Time

It means that the runtime increases at the same pace as the size of the data. So if your dataset grows 10x, the runtime grows roughly 10x too.

*For instance, selecting images and sending them through WhatsApp. As you increase the number of images, the time taken increases in the same proportion.*
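A minimal sketch of linear time, using a hypothetical upload scenario: summing the file sizes of the selected images touches each image exactly once, so the work grows in direct proportion to the number of images:

```python
def total_upload_size(image_sizes_kb):
    """Add up file sizes: one step per image -> O(n)."""
    total = 0
    for size in image_sizes_kb:
        total += size
    return total

# 3 images -> 3 additions; 300 images -> 300 additions.
print(total_upload_size([120, 340, 95]))  # 555
```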

# O(n²) → Quadratic Time

The runtime grows much faster than linear. If the dataset increases 10x, you might see a 100x increase in the runtime. That’s usually unacceptable for a product at scale.
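A common way quadratic time sneaks into a product is pairwise comparison. This hypothetical sketch finds duplicate photos by comparing every photo against every other one — with 10x more photos, there are roughly 100x more comparisons:

```python
def find_duplicate_photos(photos):
    """Compare every pair of photos -> O(n^2) comparisons."""
    duplicates = set()
    for i in range(len(photos)):
        for j in range(i + 1, len(photos)):  # nested loop: n * n growth
            if photos[i] == photos[j]:
                duplicates.add(photos[i])
    return duplicates

print(find_duplicate_photos(["a.jpg", "b.jpg", "a.jpg", "c.jpg", "b.jpg"]))
```

In practice this is often rescued by using a hash set (one pass, O(n) on average) — which is exactly the kind of fix engineers reach for when a quadratic feature stops scaling.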

# O(log n) → Logarithmic Time

It means that the running time grows in proportion to the logarithm of the dataset size, so the runtime barely increases even as the input grows enormously. A 10x increase in the dataset adds only a small, fixed amount of extra work — for example, just a few more steps in a binary search.
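The classic logarithmic-time example is binary search: each step halves the remaining data, so searching a million sorted items takes only about 20 steps. A minimal sketch:

```python
def binary_search(sorted_items, target):
    """Halve the search range each step -> O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid            # found: return its position
        elif sorted_items[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1                     # not present

print(binary_search(list(range(1_000_000)), 424_242))  # 424242
```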

# O(n log n) → n log n time

It means that the runtime is a little worse than linear. For instance, if the dataset grew 10x, the runtime might increase by roughly 11x or 12x. It is considered acceptable for many products today — good general-purpose sorting algorithms like merge sort run in O(n log n).
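Merge sort is the textbook O(n log n) algorithm: it splits the data in half log n times, and each level of splitting does n work to merge the halves back together. A self-contained sketch:

```python
def merge_sort(items):
    """Split in half (log n levels), merge each level (n work) -> O(n log n)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves back into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```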

Thanks for reading! If you’ve got ideas to contribute to this conversation please comment. If you like what you read and want to see more, clap me some love! Follow here, or connect with me on LinkedIn or Twitter.

*Do check out exclusive Product Management resources 👇*

**The Ultimate List of 100+ Product Management tools 🛠**
https://rohitverma.gumroad.com/l/PM-tools

**The Ultimate List of Product Management Frameworks ⚙️**
https://rohitverma.gumroad.com/l/PM-frameworks

**The most exhaustive OKRs notion template 📔**
https://rohitverma.gumroad.com/l/OKR-Template