I have 170,000 records in my query output, and I am trying to create a JSON file of those records on the server so it can be accessed through Flexmonster. The JSON file size is around 560–570 MB, and when I use Flexmonster's connectTo() method to load the JSON file, Flexmonster gets stuck loading at around 520–530 MB. Is there a limitation or a problem here?
Thanks and Regards,
Thank you for posting your question.
The thing with processing large data sets is that the browser is bound to run out of memory at some point.
Being a client-side component, Flexmonster relies on the resources available to the browser, which affects the loading time and the maximum size of the data that can be handled on each particular machine. This means the client machine's RAM determines how much data can be loaded at once, while CPU capabilities affect how much time is spent on data analysis. Some machines simply cannot handle loading a large dataset into the browser at once.
However, we are glad to announce that a more efficient approach to working with large datasets was introduced in the new major release of Flexmonster, version 2.8. We call it the Custom Data Source API: it greatly improves performance and provides full control over how the data is processed. Our team highly recommends considering the Custom Data Source API if large datasets are used in your application.
Please see the following guides for more information about this new approach:
1) What the Custom Data Source API has to offer.
2) Introduction to the Custom Data Source API.
3) A sample implementation of the Custom Data Source API approach with Node.js.
4) A sample implementation of the Custom Data Source API approach with .NET Core.
5) How to create your own implementation of the Custom Data Source API.
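On the client side, switching to the Custom Data Source API is mostly a configuration change: instead of pointing the component at a large JSON file, the report's data source points at a server endpoint that aggregates the data. The following is a minimal sketch; the URL and the index name are hypothetical placeholders, not values from this thread:

```javascript
// Minimal report configuration for the Custom Data Source API.
// "url" and "index" below are illustrative placeholders — your server
// endpoint and dataset name will differ.
const report = {
  dataSource: {
    type: "api",                          // use the Custom Data Source API
    url: "https://example.com/api/cube",  // hypothetical server endpoint
    index: "sales"                        // hypothetical dataset exposed by the server
  }
};

// In a real page, this object would be passed to the component, e.g.:
// new Flexmonster({ container: "#pivot", report: report, toolbar: true });
console.log(report.dataSource.type); // "api"
```

With this setup the heavy lifting (filtering and aggregation) happens on the server, so the browser only receives the already-aggregated results instead of the full 560+ MB dataset.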
We hope you find our answer helpful. Feel free to reach out if you have any further questions we can help you with.
Hope you’re doing well!
We are just checking in to ask whether you found our response helpful. Did you have a chance to check out the Custom Data Source API? Perhaps you have further questions we can help you with?
We would be happy to hear your thoughts.
Currently, I have implemented a workaround using a CSV file, and it is working fine. In the near future, I will switch to the Custom Data Source API. Thank you for your quick response.
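For readers landing on this thread: the CSV workaround mentioned above typically amounts to pointing the data source at the CSV file instead of the JSON one. A minimal sketch, with a hypothetical file path:

```javascript
// Data source configuration for the CSV workaround.
// The filename is a hypothetical path for illustration only.
const csvDataSource = {
  type: "csv",
  filename: "https://example.com/data/records.csv" // placeholder path
};

// In a real page, an existing Flexmonster instance would connect to it:
// pivot.connectTo(csvDataSource);
console.log(csvDataSource.type); // "csv"
```

CSV is usually noticeably smaller than the equivalent JSON for the same records, which is likely why this worked where the JSON file did not, but the browser-memory limits discussed earlier still apply as the dataset grows.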
This question is now closed