
Flexmonster Data Server maximum capacity

dongbeom Kim asked on July 30, 2020

I currently need to cache about 3200 large CSV files in the Flexmonster Data Server, and I have a few questions:

1. What is the maximum number of files per server?
2. What is the maximum size per file?
3. What exactly does DataRefreshTime do?
4. Can multiple servers be run (one per port)?
5. Does the CacheSizeLimit option mean the maximum number of files that are cached, and if so, what is the maximum value of CacheSizeLimit?

1 answer

Illia Yatsyshyn Flexmonster July 30, 2020

Thank you for contacting us.

  1. Flexmonster Data Server does not impose any limit on the number of indexes (in your case, CSV files). However, please note that the data from all specified indexes is loaded into the server’s RAM and kept there while the Data Server is running. As a result, a set of indexes consisting of 3200 large CSV files is likely to be too large for the server to load and process.
  2. The Data Server does not limit the size of the loaded data set either, but please keep the RAM considerations from the previous paragraph in mind.
  3. The DataRefreshTime property defines how often the data is reloaded from a file or a database. For example, if DataRefreshTime is set to 60, the data is reloaded from all specified files every sixty minutes.
  4. Only one Flexmonster Data Server can be launched on a single port.
  5. Concerning caching, we would like to note that all files specified in indexes are cached and stored in the server’s RAM by default.
    In turn, CacheSizeLimit defines the maximum number of cached server responses for every index. For example, if CacheSizeLimit is set to 10, the server stores the 10 most recent responses in JSON format. This reduces the time needed to process the data: when a response is cached, it is returned instantly.
    Flexmonster Data Server does not impose a limit on the number of cached responses. However, please note that these responses are stored in the server’s RAM as well.
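To illustrate, both options discussed above are set in the Data Server's configuration file. The sketch below is an assumed example; the exact property nesting and index names may differ depending on your Data Server version, so please check it against your own configuration:

```json
{
  "DataSources": [
    {
      "Type": "csv",
      "Indexes": {
        "sales-index": "./data/sales.csv"
      }
    }
  ],
  "DataStorageOptions": {
    "DataRefreshTime": 60,
    "CacheSizeLimit": 10
  }
}
```

With these values, the data for every index would be reloaded every sixty minutes, and up to ten recent responses per index would be kept in RAM.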

Finally, our team would like to note that the mentioned number of files is unlikely to be handled by the majority of servers. Therefore, we suggest developing your own implementation of the custom data source API that reads the corresponding CSV file every time the data needs to be displayed. This would significantly reduce the load on the server, since the processed files would no longer be stored in your server’s RAM.
Also, your own implementation of the custom data source API would allow composing dynamic queries, as discussed in your other forum thread.
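To give a rough idea of the on-demand approach, here is a minimal Python sketch of the core pattern: reading one CSV file per request and aggregating a single field, so that no data set is kept in RAM between requests. This is only an illustration of the idea, not the actual custom data source API protocol (which defines its own request and response formats); the function name and the index-to-path mapping are assumptions for the example:

```python
import csv
import io

# Hypothetical mapping from index names to CSV paths; a real custom data
# source API implementation would cover all of your files here.
INDEXES = {"sales": "./data/sales.csv"}

def sum_field(csv_source, field):
    """Stream one CSV and sum a single numeric field on demand.

    csv_source is a file path or an open text stream. Because the file is
    read per request and then discarded, the whole data set never stays
    in the server's RAM, unlike the Data Server's built-in caching.
    """
    opened_here = False
    if isinstance(csv_source, str):
        csv_source = open(csv_source, newline="")
        opened_here = True
    try:
        reader = csv.DictReader(csv_source)
        return sum(float(row[field]) for row in reader)
    finally:
        if opened_here:
            csv_source.close()
```

A request handler in your API would then resolve the index name through a mapping like `INDEXES` and call such a function only when the pivot actually asks for that data.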
We hope it helps.
Please contact us in case further questions arise.
Kind regards,
