30 May, 2025

Load testing at Rosterfy - Assessing and demonstrating the platform’s capability to handle peak user demand

Background

Rosterfy is an end-to-end volunteer management software solution that helps organisations streamline their volunteer management and engagement.

The company was born out of Event Workforce (now known as Spark Event Group), a staffing company created to help connect university students with work opportunities across Australia. The need for an online solution platform to manage volunteer teams became clear as Event Workforce grew, and from this need, Rosterfy was born.

Today, Rosterfy works with over 3 million volunteers in 35 countries. Their market-leading technology helps create an engaging experience throughout the whole lifecycle of a volunteer journey. They work with a broad range of clients, big and small, to streamline volunteer management. In 2024, Rosterfy celebrated their inclusion in the AFR BOSS Most Innovative Companies list, ranking in the top 10 companies for technology.

The Engagement

At a customer's request, Rosterfy needed to assess and demonstrate their platform's capability to handle peak user demand.

While they conducted load testing annually through a separate third-party service, that reporting lacked the specificity this customer required.

As part of this project, a team from both Rosterfy and Midnyte City was pulled together to run the testing necessary to report on Rosterfy's capabilities. This included the implementation of AWS's Distributed Load Testing Solution template, the development of several JMeter test scripts emulating key user flows, and the mutual uplift of our understanding of load testing best practices.

From this collaboration, Rosterfy is now equipped with the tools and skills to measure their platform's capabilities, and to evaluate and develop load testing scenarios to meet future needs.

The Solution

Prior to the Midnyte City engagement, Rosterfy had a basic implementation of the AWS Distributed Load Testing Solution prepared, allowing the project to hit the ground running.

The first task the team tackled was the development of JMeter scripts to emulate the desired user flows for load testing. Though existing JMeter scripts were received from the third-party load testing provider, it proved easier to map out short user flows directly from the application itself: logging in with existing credentials and recording the important API calls the browser makes during a real user interaction.

This flags a key difference between types of load testing:

  • Simulating the full user experience including the emulation of a browser, or,

  • Emulating a user interacting with an application through the appropriate sequence of API calls, without the overhead of emulating a full browser.

As Rosterfy's priority was to assess the load capabilities of their backend rather than asserting the standard of the full user experience, the decision was made that the JMeter scripts would represent the necessary API calls a user makes throughout a journey, without emulating a full browser. This method of testing still provides some key indicators of poor user experience, such as error rates and HTTP response times.
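To illustrate the second approach, a user journey can be captured as an ordered sequence of API calls and replayed against the backend. The sketch below uses hypothetical placeholder endpoints, not Rosterfy's actual API; a JMeter script would express the same sequence as HTTP samplers inside a thread group.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Step:
    """One step in a user flow: HTTP method, path, and a label for reporting."""
    method: str
    path: str
    label: str

# Hypothetical "volunteer registers for a shift" flow. These endpoints are
# illustrative placeholders, not Rosterfy's real API.
SIGNUP_FLOW = [
    Step("POST", "/api/login", "login"),
    Step("GET", "/api/shifts?status=open", "list open shifts"),
    Step("POST", "/api/shifts/123/register", "register for shift"),
]

def run_flow(flow: List[Step], send: Callable[[str, str], int]) -> List[Tuple[str, int]]:
    """Replay each step through `send(method, path) -> status code`,
    recording (label, status) pairs the way a load tester records
    per-sampler results such as error rates and response codes."""
    return [(step.label, send(step.method, step.path)) for step in flow]

if __name__ == "__main__":
    # Stub transport standing in for a real HTTP client.
    always_ok = lambda method, path: 200
    for label, status in run_flow(SIGNUP_FLOW, always_ok):
        print(f"{label}: {status}")
```

A load test then runs many such flows concurrently and aggregates the per-step results; the point of the sketch is only that the flow is a fixed API sequence, with no browser in the loop.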

Throughout the development of these scripts, the team developed a shared understanding of several key elements of load testing and where Rosterfy were placed within them:

  • How to ensure we are load testing the right thing, e.g. exercising one user flow in isolation.

  • The constraints of the default AWS Distributed Load Testing Solution template, and the ways that Rosterfy may look to extend its capabilities in the future if required.

  • Load testing best practices, including data orchestration (setting up and tearing down the data necessary to perform load tests) and load test calibration (right-sizing distributed testing hosts and test scenarios for correct performance at the lowest cost).
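The data orchestration practice above can be sketched as a context manager that seeds the records a test scenario needs and removes them afterwards, so repeated runs start from a known state. The `create`/`delete` hooks here are hypothetical stand-ins for whatever seeding mechanism a platform provides (API calls, fixtures, or SQL).

```python
from contextlib import contextmanager
from typing import Callable, Iterator, List

@contextmanager
def seeded_test_data(create: Callable[[int], str],
                     delete: Callable[[str], None],
                     count: int) -> Iterator[List[str]]:
    """Seed `count` records before a load test and tear them down
    afterwards, even if the test run fails mid-way."""
    ids = [create(i) for i in range(count)]
    try:
        yield ids
    finally:
        for record_id in ids:
            delete(record_id)

if __name__ == "__main__":
    # In-memory stand-in for the platform's real data store.
    store = {}

    def create(i: int) -> str:
        record_id = f"volunteer-{i}"
        store[record_id] = {"name": f"Test Volunteer {i}"}
        return record_id

    def delete(record_id: str) -> None:
        del store[record_id]

    with seeded_test_data(create, delete, 3) as ids:
        print(f"{len(ids)} records available during the test")
    print(f"{len(store)} records remain after teardown")
```

The `finally` block is the important design choice: teardown runs whether the load test passes or aborts, which keeps the environment clean and keeps successive calibration runs comparable.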

The Result

An early win from bringing load testing in-house was the discovery of a bug triggered by concurrent user activity, which resulted in an error. Under normal operation of the application this bug was unlikely to surface, but finding it is an important step toward ensuring Rosterfy's application can support the user activity levels the customer expects.

By the end of the engagement, Rosterfy had two JMeter test scenarios emulating key user flows within the application ready for final calibration and load testing.

Through a detailed handover meeting and comprehensive documentation, the Rosterfy team was left with the confidence to continue iterating on the developed load testing scenarios and practices themselves, without relying on third parties to meet their needs.

Contact us

If you would like to speak to someone about similar challenges in your team or organisation, reach out below to schedule a time.
