Flexmonster Performance: A Million Rows in Seconds
You have probably noticed how swiftly Flexmonster displays thousands or even millions of rows of data.
But have you ever wondered how the pivot table component renders the rows of cells so fast each time you load a large data file?
Let’s shed some light on the reasons for this fast grid performance and develop an understanding of how Flexmonster Pivot Table & Charts works under the hood.
When it comes to reporting, performance is always a top priority. With this in mind, we do our utmost to provide you with a more reliable reporting experience and foster a data-driven culture in your business.
The secret of Flexmonster's excellent performance is a blend of state-of-the-art rendering optimization techniques and memory management algorithms that speed up your reporting in the browser.
We put a lot of work into optimization to make Flexmonster consume as little memory as possible. Only by keeping memory usage low can we guarantee that you can work with heavy volumes of data and summarize them fast. In the 2.7 major update, we achieved 10 times faster data loading and reduced browser memory usage by up to 50%. As a result, the maximum size of the uploaded data also increased. And we continue to improve!
With every update, we thoroughly test the pivot table's performance, for example, by using Chrome DevTools for memory measurements.
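Besides the DevTools Memory panel, heap usage can also be sampled from script. As a minimal sketch (not Flexmonster's test suite), the snippet below reads the non-standard, Chrome-only `performance.memory` API and guards for environments where it is unavailable:

```javascript
// Sample the JS heap, if the browser exposes it.
// `performance.memory` is non-standard and Chrome-only, so we guard for it.
function sampleHeapUsage() {
  const perf = typeof performance !== "undefined" ? performance : null;
  if (perf && perf.memory) {
    // usedJSHeapSize and totalJSHeapSize are reported in bytes
    return {
      usedMB: perf.memory.usedJSHeapSize / (1024 * 1024),
      totalMB: perf.memory.totalJSHeapSize / (1024 * 1024),
    };
  }
  return null; // not available in this environment
}
```

Calling `sampleHeapUsage()` before and after loading a data file gives a rough idea of how much memory the data set costs.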
Since Flexmonster Pivot Table & Charts is a client-side component, it depends entirely on the resources available to your browser, especially RAM.
Because the data you load into the component is stored in your computer’s RAM, connecting to large data files can come at the cost of high memory usage: the amount of RAM the browser consumes is directly proportional to the size of the data source.
Broadly speaking, a browser should never be blamed for using a lot of RAM: only this way can it ensure rapid loading of web pages and quick data access and processing. Nevertheless, memory must be managed smartly and efficiently to ensure a stable browsing experience without slowdowns or page crashes.
Along with memory optimization, it's important to handle the logic of rendering wisely to make your app performant.
Hence, we take care of rendering performance and minimize the number of running operations that are expensive in terms of computational cost.
With this in mind, we've bundled Flexmonster Pivot with a virtual grid (also known as the virtualization feature), which is enabled by default once the total number of cells, empty and filled, exceeds 500.
Let’s say you’ve loaded a data file into the pivot table. It contains thousands of rows and you are eager to scroll down to the last ones.
Each time you scroll, only the visible rows are rendered. Additionally, the underlying algorithm picks the most suitable time to render extra rows around the existing ones (e.g., while the user is idle). This is why rows appear so smoothly.
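The core idea can be sketched in a few lines. The function below is a simplified illustration of virtual-grid windowing math, not Flexmonster's actual implementation: it computes which rows intersect the viewport, plus a small "overscan" buffer of extra rows that an idle-time callback (e.g., `requestIdleCallback`) could pre-render:

```javascript
// Compute the range of rows a virtual grid needs to render:
// the rows intersecting the viewport plus an overscan buffer.
function getVisibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}

// With 1,000,000 rows of 25px each in a 500px viewport, only about
// 30 rows exist in the DOM at any moment, regardless of data size.
```

Scrolling anywhere in a million-row data set, the DOM holds only a viewport's worth of rows, which is why the cost of rendering stays constant as the data grows.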
Today we’ve taken a peek under the hood of Flexmonster and discovered that the performance of your favorite pivot table rests on two cornerstones:
- Smart optimization of the browser’s memory usage.
- Rendering performance optimization. Thanks to the virtual grid, scrolling and any other complex interactions with the component are silky smooth: no delays, no jank, no sticky scrolling. All this lets you use your data to the fullest and run reports lightning-fast.
As you see, there is no magic behind it at all — only the best algorithms. Smart algorithms are always better than supercomputers. This is what we believe in.
To see the results of the virtual grid’s work, get hands-on experience with the 1 million rows demo.
As a way to increase the data processing speed even further, you can delegate it to Flexmonster Data Server. It's a server-side tool that works on top of the custom data source API. The Data Server gets data from your data source, be it a database or a JSON/CSV file, processes it, and sends it to the pivot table for visualization. It's a perfect solution for working with huge data sets whose size exceeds 1 GB.
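To see why moving processing to the server pays off, consider this toy illustration (not the Data Server's actual code): the server aggregates the raw rows and ships only a compact summary to the pivot table, so the browser never has to hold the full data set in RAM.

```javascript
// Toy server-side aggregation: group rows by a field and sum a measure.
// The client receives a few summary rows instead of the raw data set.
function aggregate(rows, groupBy, measure) {
  const totals = new Map();
  for (const row of rows) {
    const key = row[groupBy];
    totals.set(key, (totals.get(key) ?? 0) + row[measure]);
  }
  return [...totals].map(([key, sum]) => ({ [groupBy]: key, sum }));
}

// A million raw sales records collapse into one row per country:
// aggregate(salesRows, "country", "sales")
```

The same principle scales: whether the source holds a thousand rows or a billion, the payload sent to the browser stays roughly the size of the report, not the size of the data.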
To maximize your performance and boost productivity, we recommend reading more blog posts on this topic.
You can also learn how to avoid performance bottlenecks in your web application from the developer’s guide created by Google: