Hey folks,
I have been working on some data analytics projects recently and wanted to tap into the wisdom of this community. When dealing with large volumes of data, it can get tricky to manage performance and ensure that everything runs smoothly. I am trying to refine my approach to handling searches, visualizations, and dashboarding.
What strategies do you all recommend for optimizing queries and making sure my dashboards load efficiently, even with a lot of data? While searching, I came across resources like The Complete Guide to Data Integration in Web App Builders | AppMaster, but I would like more detailed advice from community members.
Any advice on managing indexing or alerting to make things more streamlined would be awesome too!
Appreciate any insights you can share!
Thanks
Hey @lysander , welcome to our community!
Some general advice:
- Retrieve only the data you need from the DB: limit the number of rows and select only the columns you actually use.
- When designing data models, make sure to create indexes. AppMaster uses PostgreSQL as its primary DB, and with proper indexes it performs very well.
- Minimize the amount of data sent between the backend and frontend. With large payloads, the browser spends a lot of compute parsing JSON before the data is even available to your logic.
- Avoid an excessive number of requests per page load. Depending on your case, 2 to 5 requests is a reasonable maximum, but this needs to be tested either way.
- AppMaster-generated web applications use web workers to effectively offload computational logic from the main thread, but all UI operations are single-threaded. Batch UI updates where possible to get the best performance.
- We have additional settings for the number of web workers at application startup. By default there is only one, so the app starts with minimal delay, but we can increase it in your project to handle more complex logic.
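To illustrate the "batch UI updates" point above, here is a minimal TypeScript sketch of the general technique (the class and names are illustrative, not AppMaster APIs). Instead of writing to the UI on every incoming data change, you queue changes keyed by their target, so repeated updates to the same target collapse into one, and apply them all in a single flush:

```typescript
type Update = () => void;

class UpdateBatcher {
  // Keyed by target id, so later updates to the same target overwrite earlier ones.
  private pending = new Map<string, Update>();
  flushCount = 0; // how many real "render" passes actually happened

  queue(key: string, update: Update): void {
    this.pending.set(key, update);
  }

  // In a browser you would typically trigger this from requestAnimationFrame;
  // here it is manual so the sketch stays runnable anywhere.
  flush(): void {
    if (this.pending.size === 0) return;
    for (const update of this.pending.values()) update();
    this.pending.clear();
    this.flushCount++;
  }
}

// Usage: 1000 incoming data points, but only one write per target per flush.
const batcher = new UpdateBatcher();
let rendered = "";
for (let i = 0; i < 1000; i++) {
  batcher.queue("price-label", () => { rendered = `price: ${i}`; });
}
batcher.flush(); // one render pass instead of 1000 individual writes
console.log(rendered, batcher.flushCount); // "price: 999" 1
```

The same idea applies whether the "write" is a DOM mutation or a component state update: coalescing keeps the single-threaded UI work proportional to the number of visible targets, not the number of incoming events.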
Let us know if you have any additional questions.