
Memory use when exporting reports through puppeteer

Answered
Kim-Andre Kristiansen asked on December 15, 2020

We’re currently working on an application that allows users to export a large amount of data to xlsx files. We’re using your Puppeteer + Flexmonster example for this purpose.
Smaller reports work fine, even with big queries. But once we add a large number of rows, we run into memory issues. Please see the attached pictures. Here I’m running Puppeteer without headless mode to show memory usage alongside the export progress. Soon after, it crashes due to being out of memory. Is there any way to decrease the memory usage?
Update:
When running this in Chrome, under the same conditions and with the same data source (Flexmonster Data Server), it finishes in about 10 minutes and has no performance issues as far as I can see. Memory does not increase at the same rate. It must be a Chromium issue, but why?

Update 2:
I made some changes to Puppeteer to ensure performance is as fast as possible. After running Puppeteer for an hour and a half on a system with 16 GB of RAM, with 8 columns and 45,000 rows, I receive the following message:
RangeError: Invalid string length
at Array.join (<anonymous>)
at Function.pEa (https://cdn.flexmonster.com/flexmonster.js:718:303)
at Function.Pza (https://cdn.flexmonster.com/flexmonster.js:718:473)
at Je (https://cdn.flexmonster.com/flexmonster.js:717:138)
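For reference, the export flow described above follows roughly this shape. This is only a sketch of the Puppeteer + Flexmonster pattern; the page URL, file name, and event wiring are illustrative, not the actual project code:

```javascript
// Hypothetical sketch of a Puppeteer-driven Flexmonster export.
// Assumes a local page (exporter.html) that embeds a Flexmonster instance.
const puppeteer = require("puppeteer");

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Surface page-side logs and crashes so out-of-memory failures
  // are visible from the Node process driving the browser.
  page.on("console", (msg) => console.log("page:", msg.text()));
  page.on("error", (err) => console.error("page crashed:", err));

  await page.goto("http://localhost:3000/exporter.html");

  // Wait for Flexmonster to finish loading the report,
  // then trigger the Excel export and resolve when it completes.
  await page.evaluate(
    () =>
      new Promise((resolve) => {
        flexmonster.on("reportcomplete", () =>
          flexmonster.exportTo("excel", { filename: "report" }, resolve)
        );
      })
  );

  await browser.close();
})();
```

Watching the `error` event is how a crash like the one above shows up on the Node side instead of the export silently hanging.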

6 answers

Public
Illia Yatsyshyn (Flexmonster) December 15, 2020

Hello,
 
Thank you for reaching out to us.
 
Our team did not manage to reproduce the described behavior using a dataset of 30k records.
 
Therefore, we want to ask you for some details to understand the issue’s nature better.

  1. Does the problem appear only when using Puppeteer? Please try exporting the same amount of data using a Flexmonster instance outside the Puppeteer context. For example, you can use our demo page to upload your data and perform the export.
  2. Please send us the complete report (JSON configuration) and the dataset used in your case (use dummy data if needed). It would allow us to reproduce this behavior on our side and find possible causes of the problem.

 
Our team is looking forward to hearing from you.
 
Kind regards,
Illia

Public
Kim-Andre Kristiansen December 15, 2020

Hi Illia,

Thanks for the quick response.

1. Yes, I just checked, and I can reproduce it now in Chrome. Earlier I thought it was faster, but when I use exactly the same conditions I use to render the report in Puppeteer, I see the same slow performance: around one row handled per second, judging by the loading-screen counter.

2. I would like to send you this, but is there any way for me to export this data from Flexmonster itself? We’re using the Flexmonster Data Server to query the data, so when I use your built-in save method, the dataSource property in the returned file just reflects the Flexmonster Data Server connection.

I had another look at the report, and we experience this issue when we set the expandAll attribute. This results in almost 200,000 “columns”. I know these are not reflected in Excel, but I guess they take their toll on the processing.
Please see the attached picture for a better explanation.

Best regards
Kim-Andre

Public
Illia Yatsyshyn (Flexmonster) December 16, 2020

Hello, Kim-Andre,
 
Thank you for providing us with details.
It helped shed some light on the nature of the problem.
 
When exporting to Excel, Flexmonster analyzes each cell of the table, storing its content in the computer’s RAM.
With the expandAll property set to true, the table consists of more than 10 billion individual cells for your dataset. This amount of data is likely to exceed the memory limit of the machine used.
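A back-of-envelope check using the figures mentioned in this thread (roughly 200,000 leaf columns produced by expandAll against 45,000 rows) shows why this blows past a browser tab's memory; the bytes-per-cell figure is a deliberately conservative assumption, not a measured value:

```javascript
// Rough cell-count estimate based on the numbers mentioned in this thread.
const columns = 200_000; // leaf columns produced by expandAll
const rows = 45_000;     // rows in the dataset
const cells = columns * rows;

console.log(cells.toExponential(1)); // 9.0e+9 — on the order of 10 billion cells

// Even at a few bytes per cell, this dwarfs the ~4 GB available to a Chrome tab.
const bytesPerCell = 4; // conservative assumption for illustration
const gigabytes = (cells * bytesPerCell) / 1024 ** 3;
console.log(gigabytes.toFixed(0)); // ≈ 34 GB
```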
 
Therefore, we suggest either increasing the server’s RAM capacity or using smaller datasets.
Another option is to avoid the expandAll property altogether. Instead, you can predefine only the required tuples to expand using the expands property of the Slice Object.
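As an illustration, a report slice using explicit expands instead of expandAll might look like the sketch below. The data source URL, index name, hierarchy names, and tuple values are all hypothetical; see the Slice Object documentation for the exact shape expected by your Flexmonster version:

```javascript
// Hypothetical report: only the listed tuples are expanded,
// instead of materializing every tuple with expandAll.
const report = {
  dataSource: {
    type: "api",
    url: "http://localhost:9500", // Flexmonster Data Server (illustrative URL)
    index: "sales",               // hypothetical index name
  },
  slice: {
    rows: [{ uniqueName: "Country" }, { uniqueName: "City" }],
    columns: [{ uniqueName: "Category" }],
    measures: [{ uniqueName: "Price", aggregation: "sum" }],
    expands: {
      expandAll: false, // avoid expanding every tuple
      rows: [
        { tuple: ["country.[norway]"] }, // expand only the tuples that matter
      ],
    },
  },
};

module.exports = report;
```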
 
We hope this helps.
Please contact us in case any other questions arise.
 
Kind regards,
Illia

Public
Kim-Andre Kristiansen December 16, 2020

Hi Illia,
Thanks for the follow-up. I was under the impression that the browser itself limits how much RAM can be used? As I was running this locally on my PC, I didn’t see it come close to exhausting my available RAM.

I will try to run this in a container in a cloud service with an expanded RAM limit, to see what happens.
Using expands in the Slice Object is a good idea, but unfortunately we’re a bit limited in that regard. Since our back end needs to do heavy processing for the query, we can’t give the user a fast response. Therefore we only show a subset of the records in the browser, and as a result users don’t have the option to expand each tuple individually.
Thanks,
Kim-Andre

Public
Illia Yatsyshyn (Flexmonster) December 17, 2020

Hello, Kim-Andre,
 
From our observations, the memory limit of a single tab in Chrome is about 3.5–4 GB.
Currently, we don’t know of a way to increase this limit.
 
In case additional RAM does not resolve the problem, please provide us with the details about the final result you want to achieve. It could help to find possible workarounds for your case.
 
Kind regards,
Illia

Public
Kim-Andre Kristiansen January 28, 2021

Hi Illia,
Sorry for the late response. We’ve put this issue on hold for the time being and will get back to you if we need assistance in the future. Thanks a lot for the help so far.
Best regards
Kim-Andre
