We transitioned our project from plain JSON to OCSV, leveraging the compressor in Node. This works totally fine and loading time improved a lot, but memory consumption on the client side has now become a huge issue. Memory usage is at least 10x higher than before, in many scenarios even more than 2 GB, which is obviously killing Chrome.
What are we doing wrong, or is this a known issue and should OCSV only be used for small datasets?
Thank you for your question. Could you please specify which version you are currently using? Just press Ctrl + Alt + I to open the pop-up with the version information. Also, could you please clarify your data source size? Does your OCSV data source have more records than the JSON one? If you have started loading more data records than before, the browser will need to allocate more RAM to store the data.
Waiting to hear from you.
Thanks for your swift reply. We are using “Version 2.6.3 (build 09/10/2018 16:16:14)” with the latest version of Chrome. It happens on Windows and Mac, as well as in other browsers. The data we load is, of course, exactly the same in both JSON and OCSV. For example, the memory consumption in Safari for the biggest dataset is almost 4 GB, whereas with JSON it’s less than 400 MB. I would be more than happy to share some example files with you if necessary, but not publicly in the forum if possible.
Thank you for your feedback. Could you please provide us with the data source example where the issue is reproducible? That will definitely help us in our further investigation. You can send all the sensitive data to our e-mail.
Waiting to hear from you.
I’ve sent the examples to your email.
Looking forward to your reply
Thank you for sharing the examples with us.
We have checked both JSON and OCSV files and they seem to consume an expected amount of memory. The tests were performed on Windows, with 4 GB of RAM, Chrome version 69.0.3497.100. The files were downloaded and then specified in the dataSource section of the report.
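For reference, this is roughly how the downloaded files were plugged in. The container id and filenames below are placeholders, not values from your setup, and the exact dataSource options may differ between versions:

```javascript
// Hypothetical report setup for the test; "pivot-container" and
// "data.ocsv" are placeholder names.
new Flexmonster({
  container: "pivot-container",
  report: {
    dataSource: {
      type: "ocsv",          // switch to "json" to compare with the JSON file
      filename: "data.ocsv"  // the downloaded file, served statically
    }
  }
});
```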
Please share more details on how exactly you are using OCSV data, this may help us to reproduce the issue.
Waiting for your reply.
I was trying to reproduce your findings and downloaded the trial package of Flexmonster, serving the files I shared with you as the data source. Actually, I was not able to reproduce my problem. Trying the same in our environment led to the same conclusion. But as soon as the data is provided through the compressor, the problem occurs. Downloading the data URL in Chrome or through curl works perfectly fine (this is actually how I created the static example in the first place). But as soon as the Flexmonster component has to deal with this kind of streaming response, the browser runs out of memory. I’ve sent you an email with an updated example together with a heap dump from the Chrome session in question.
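To illustrate the distinction I mean: a static download arrives with a known Content-Length and can be handled as one buffer, while the compressor’s streamed (chunked) response has to be reassembled from pieces on the client. A simplified, framework-free simulation of that reassembly (the payload and chunk size are arbitrary stand-ins, not our real data):

```javascript
// Simulates chunked delivery: the body arrives in arbitrary-size pieces
// with no Content-Length, and the consumer reassembles them.
function* chunked(buffer, chunkSize) {
  for (let i = 0; i < buffer.length; i += chunkSize) {
    yield buffer.subarray(i, i + chunkSize);
  }
}

const payload = Buffer.from("id,name\n1,Alice\n2,Bob\n"); // stand-in for OCSV data
const pieces = [...chunked(payload, 5)];                   // 5-byte chunks, arbitrary size
const reassembled = Buffer.concat(pieces);
console.log(reassembled.equals(payload)); // true
```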
Looking forward to your reply.
Thank you for all the detailed information you provided.
We reproduced the issue on our side. It will be investigated further and we will do our best to include the fix in the minor release on October 8th.
We will keep you informed.
We are glad to inform you that the version with the fix for the memory consumption issue is already available on our website.
You are welcome to update to the latest version and try it.