
What is your page load target?

Comments

3 comments

  • Official comment
    Glyn McKenna

    Hi Blake,

    Thank you for this forum question; it's a good one for generating some discussion around best practices when developing reports and dashboards. Two seconds or less is a good target to aim for.

    There are a few things that can be, and often need to be, addressed when tuning your report definitions to ensure they are performing well.

    • Data; in general, data retrieval is the cause of most of the performance issues I encounter amongst our customers. I will always recommend that people get to know the debug tools in Logi Info. Ironically, the debugger itself slows performance down, but this is relative, and the graphical indicator and timing column on the right of the debugger page are a great way to find those initial bottlenecks in page load (see the first sketch after this list for a generic client-side check).
    • Push-down; data operations should be pushed down to the database where possible. The closer the data processing is to the data, the better the performance should be. Databases are dedicated platforms for this work, and although a lot of data processing and manipulation can be performed using Logi Elements, it is still better to translate these into the appropriate query language and push them down to the database where possible (see the push-down sketch after this list).
    • Data prep; data should be prepared with reporting in mind. This is sometimes missed when attaching to a transactional database, as these tend to be optimised for transactional updates rather than reporting and are often highly normalised. This can result in overly complex queries and expensive joins.
    • Database tuning; many people connect to RDBMS data stores for a number of different reasons. Some of these allow you to specify columnar data storage, which is generally self-tuning and provides better support for reporting, but not such great support for update transactions. Other big data sources can also be more performant. If that's not an option, it is always wise to have a seasoned DBA or database developer review the queries and apply indexes or implement materialised views/aggregate awareness.
    • Data set size; it is quite easy to just pull back large data sets in a query, but most of the time visualisations need very little data to be built. As report developers we sometimes bring back more records than required so that we can share that data amongst several visualisations or reports. This can be an efficient way to use the data and will reduce the load on the database server; however, the more you bring back, the more work the Logi Engine/App Server will need to do to serialise that data. If the query is not too expensive, you may find it more efficient to run separate queries for each visualisation and bring back just the two or three rows of aggregated data required to build it.
    • Mixing inputs with data tables; for small numbers of records this isn't a serious issue, but if left unchecked with the potential for hundreds of rows of data, it can add a huge amount of extra page markup, making the HTML payload sent from the server to the client browser very large. If you need to set up input forms for data write-back, I would recommend using a shared dialogue that can be loaded with the inputs for each row. Where the use case prevents this, I would suggest using some Ajax-based paging to ensure that the payload is broken down into more manageable chunks (see the paging sketch after this list).
    • Repeat elements; repeat elements are great for building dynamic content, but much like the inputs in data tables, left unchecked or used poorly they can quickly add megabytes of page markup - be aware of this when deciding on the data layer you're going to add to a repeat element!
    • Third-party JavaScript and CSS libraries; third-party libraries can be great at taking on the heavy lifting, but we often use only a very small number of features from each of them, meaning the rest just ends up as bloat. Fortunately, many of them are split into functional packages that can be downloaded separately, or provide methods to compile just the pieces you need (see the imports sketch after this list). It is always worthwhile looking into this, especially for your production deployments. In addition, you can also minify the JS and CSS that you've developed once the development work is complete.
    • Network latency; when hosting your server, whether in the cloud or within your corporate network, it is always good to understand where your users are in relation to your data centres, and where other server resources such as databases sit in relation to the Logi Application Server.
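
    As a generic complement to the Logi debugger's timing column (this is plain browser scripting, not a Logi Info feature), the standard Resource Timing API can list the slowest requests on a loaded page:

    ```typescript
    // List the ten slowest resource requests on the current page using the
    // browser's standard Resource Timing API (run in the console or a page script).
    const slowest = performance
      .getEntriesByType("resource")
      .sort((a, b) => b.duration - a.duration)
      .slice(0, 10);

    for (const entry of slowest) {
      console.log(`${entry.duration.toFixed(0)} ms  ${entry.name}`);
    }
    ```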
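
    To illustrate the push-down point, here is a minimal sketch contrasting aggregation in the application tier with aggregation in the database. The sales/region/amount names and the fetchRows helper are purely illustrative, not part of Logi Info:

    ```typescript
    // Hypothetical data-access helper standing in for whatever your connection layer does.
    declare function fetchRows(sql: string): Promise<Array<Record<string, any>>>;

    // Anti-pattern: pull back every detail row and aggregate in the application tier.
    async function totalsInApp(): Promise<Map<string, number>> {
      const rows = await fetchRows("SELECT region, amount FROM sales");
      const totals = new Map<string, number>();
      for (const r of rows) {
        totals.set(r.region, (totals.get(r.region) ?? 0) + Number(r.amount));
      }
      // Thousands of rows crossed the wire just to produce a handful of numbers.
      return totals;
    }

    // Pushed down: the database does the grouping and returns one row per region.
    async function totalsInDatabase(): Promise<Array<Record<string, any>>> {
      return fetchRows(
        "SELECT region, SUM(amount) AS total_amount FROM sales GROUP BY region"
      );
    }
    ```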
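
    For the Ajax paging suggestion, a rough sketch of the idea, assuming a hypothetical endpoint that accepts offset/limit parameters (the URL and helper names are made up for illustration):

    ```typescript
    // Hypothetical rendering helper; in a real report this would append <tr> elements.
    declare function appendRowsToTable(rows: unknown[]): void;

    // Fetch one page of rows at a time from an illustrative endpoint that
    // understands "offset" and "limit" query parameters.
    async function loadPage(offset: number, limit: number): Promise<unknown[]> {
      const response = await fetch(`/data/orders?offset=${offset}&limit=${limit}`);
      if (!response.ok) throw new Error(`Request failed: ${response.status}`);
      return response.json();
    }

    // Render the table in manageable chunks instead of one huge HTML payload.
    async function renderInChunks(pageSize = 50): Promise<void> {
      let offset = 0;
      while (true) {
        const rows = await loadPage(offset, pageSize);
        if (rows.length === 0) break; // no more data
        appendRowsToTable(rows);
        offset += rows.length;
      }
    }
    ```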
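
    And for the third-party library point, a small example of importing only what you need; lodash is used here purely as a familiar example of a library that publishes per-method modules:

    ```typescript
    // Import only the single utility you need rather than the whole library.
    import debounce from "lodash/debounce";
    // import _ from "lodash"; // the entire library, most of it unused

    // Recalculate layout at most every 250 ms while the window is being resized.
    const onResize = debounce(() => {
      console.log("layout recalculated");
    }, 250);

    window.addEventListener("resize", onResize);
    ```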

    There are a number of other things that can cause page slowdowns, so in addition to sharing your page load targets, I'd like to invite others to add to the list I started above with the things that you do to try to improve the load times of your dashboards and reports.

    Best regards

    Glyn

  • Blake

    It's also worth looking at @Local data layers. I used to use these a lot for KPIs, since a KPI is a single-value thing. I've now converted this to putting the KPI widget inside a table, although it was initially a pain getting the table just right so it didn't look like a table. Regardless, it made a significant difference in load times.

  • J. Reitter

    2 seconds? WOW! We wish!

    Our target is 1 minute or less... but we are typically dealing with millions of records in a "multi-tenant" environment (each customer has their own database).

    For cases where we have known very long load times (> 1 minute), we display a message that it will take a few moments to load the data due to the data size. 
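
    A minimal sketch of that "please wait" pattern, assuming a hypothetical endpoint, element id, and renderer (none of these are Logi Info APIs):

    ```typescript
    // Hypothetical renderer; stands in for whatever builds the report from the data.
    declare function renderReport(data: unknown): void;

    // Show a message while the long-running request is in flight, then clear it.
    async function loadLargeReport(): Promise<void> {
      const status = document.getElementById("status");
      if (status) {
        status.textContent = "Loading a large data set; this may take a minute or two...";
      }
      try {
        const response = await fetch("/reports/large-report-data");
        renderReport(await response.json());
      } finally {
        if (status) {
          status.textContent = "";
        }
      }
    }
    ```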

    Regarding @Local data... our answer is "it depends". In some cases @Local data is fast; in other cases (especially when we have a lot of selectable parameters in a report to filter data), getting the data "on the fly" works best.

