Shutterfly, Inc. is the leading manufacturer and digital retailer of high-quality personalized products and services, offered through a family of lifestyle brands that enable consumers to share, print and preserve their memories using its technology-based platform and manufacturing processes.
The company went public in 2006 and subsequently acquired several other brands.
In 2015, Shutterfly reached over $1 billion in revenue for the first time and in 2019, upon celebrating 20 years in business, it was acquired by Apollo Global Management for $2.7 billion.
Redwood City, CA
$1.961 billion in revenue in 2018
In order to successfully carry out its continuous testing scheme, Shutterfly needed the help of a performance engineering team with experience in Gatling, Jenkins and performance analysis who would assist with three main tasks:
Execution of performance tests:
The team executes around 300 tests daily and needed someone to review the results, determine whether each failing test is a false negative (caused by problems with the test itself, the test infrastructure, the data, the build, etc.) or indicates a real problem, and, in the latter case, report it.
Consolidation of tests:
Because testing time is always limited and Shutterfly's different teams deliver new tests very often, it's necessary to consolidate tests. One important task it needed assistance with was identifying tests that cover similar functionality and combining them, or even executing them in parallel, provided that the performance levels the team targeted from the outset are maintained.
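The consolidation criterion above can be sketched as a simple check. This is a hypothetical illustration, not Shutterfly's actual tooling: the function, test names, and tolerance are all assumptions. The idea is that two similar tests may be merged into one parallel run only if each test's throughput in the combined run stays close to its standalone baseline.

```python
# Hypothetical sketch of the consolidation decision (names and the
# 5% tolerance are illustrative assumptions, not Shutterfly's process).

def can_consolidate(baseline, combined, tolerance=0.05):
    """baseline/combined: dicts mapping test name -> throughput (req/s).

    Returns True if every test keeps at least (1 - tolerance) of its
    baseline throughput when executed in parallel with the others,
    i.e. the performance levels targeted from the outset still hold.
    """
    return all(
        combined.get(name, 0.0) >= base * (1.0 - tolerance)
        for name, base in baseline.items()
    )

# Standalone throughput vs. throughput measured in a combined parallel run.
baseline = {"checkout_flow": 200.0, "cart_update": 180.0}
combined = {"checkout_flow": 195.0, "cart_update": 176.0}
print(can_consolidate(baseline, combined))  # -> True
```

If any test drops below the tolerance band in the combined run, the tests are kept separate rather than consolidated.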
Maintain test assertions and profile tests:
It was also imperative for someone to maintain all the tests' assertions. Periodically, they needed someone to check that the assertions were set at the right level so that they continue to report an error whenever there is performance degradation. This would be done by profiling the tests, identifying the breaking point of each test in the testing infrastructure (the number of threads at which throughput begins to degrade), and adjusting the tests so that they run with that number of concurrent threads, with the corresponding assertions kept as tight as possible.
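The breaking-point idea described above can be illustrated with a short sketch. This is an assumption-laden example, not Shutterfly's actual implementation: the function name and the sample profile data are invented. Given profiling results that map concurrent-thread counts to measured throughput, it returns the last thread count before throughput starts to degrade.

```python
# Hypothetical sketch (illustrative only): find the "breaking point" --
# the thread count at which throughput peaks, beyond which adding
# more concurrent threads no longer improves throughput.

def find_breaking_point(profile):
    """profile: list of (threads, throughput) pairs, ordered by threads.

    Returns the highest thread count at which throughput was still
    improving; tests would then be run at this concurrency level,
    with assertions tightened around the throughput measured there.
    """
    best_threads, best_throughput = profile[0]
    for threads, throughput in profile[1:]:
        if throughput <= best_throughput:
            # Throughput stopped scaling: the previous level is the breaking point.
            return best_threads
        best_threads, best_throughput = threads, throughput
    return best_threads  # never degraded within the measured range

# Invented example profile: throughput scales up to 40 threads, then degrades.
profile = [(10, 120.0), (20, 235.0), (40, 410.0), (80, 395.0), (160, 300.0)]
print(find_breaking_point(profile))  # -> 40
```

In a Gatling setup, the resulting concurrency level would shape the injection profile, and the throughput and response-time figures observed at that level would set the assertion thresholds.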
Over a period of five months, a highly skilled performance engineer from Abstracta aided the team with the three aforementioned tasks.
Our engineer collaborated seamlessly with the Shutterfly team, attending every daily meeting via teleconferencing, providing useful suggestions for improvement when appropriate, and consistently adding value to the team.
Abstracta quickly gained the necessary understanding of Shutterfly's unique continuous testing methodology and adapted to it.
The Abstracta engineer made sure to record all of the detected errors with as much information as possible and diligently reviewed each test as needed.
Daily video conferencing
Performance test execution