I have great news! I am really, REALLY excited about it! This year we are hosting WOPR (Workshop on Performance and Reliability) in Uruguay! It’s going to be December 6-8, 2022, in Montevideo.
I remember when I started in performance testing in 2005; it was my first job in tech. I learned by reading articles and books from Scott Barber and others in the field, and I remember reading about a conference where the most renowned people in performance testing get together to share their experiences, discuss, and learn from each other. Well, that was WOPR.
In 2018 I had the chance to attend the conference, where I discussed very interesting topics with great professionals and got to know them (Eric Proegler, Andi Grabner, Ben Simo, Henrik Rexed, and many more). Now, we have the honor of hosting the event and bringing several thought leaders to get to know Uruguay and the local community. I strongly believe that in Latin America we have much to share, and part of our goal in hosting the conference is to bring the discussion closer, make it more accessible, and reduce the barriers for fellow Latinx performance testers and SREs to participate in this type of event. We also want to show the expertise of our community to some of the most prominent thought leaders from the Northern Hemisphere.
How is WOPR different?
WOPR is not a conference where a group of speakers presents a topic and the audience listens and asks questions at the end. It is not about the audience learning while an expert presents. WOPR is a workshop for a group of 20-25 people, and participants are encouraged to discuss. The presentations are based on experience reports, and all participants ask questions and offer suggestions and contributions. You can learn more about experience reports on WOPR’s website. For most editions, Paul Holland has moderated the discussion with a specific dynamic he calls “K-cards” that keeps the conversation flowing and ensures that every voice is heard. Paul is great at it!
Every year WOPR is hosted in a different city by a different company. When I attended in 2018, it was in Marseille, France, hosted by Neotys (since acquired by Tricentis). Past hosts include Sun, Google, Microsoft, HP, Salesforce, eBay, Facebook, Dynatrace, and BlazeMeter, in various cities across the US, Canada, and Europe.
WOPR started in 2003, and the 29th edition will be the first hosted outside North America and Europe.
Call for proposals is open: Iterative Performance Testing
We want to encourage (and guarantee) a diverse group of speakers in this edition of WOPR. I want to share some more details about this year’s main topic and the link to the CFP. You can read more on the official site here.
WOPR29’s Content Owner is Andy Hohenner.
Performance Testing examines the experience of users of a system; Load Testing generates simulated traffic to examine the performance characteristics of a system. For both disciplines, many of their current practices were developed during a time of slower and more infrequent releases, when products and systems would release large, significant chunks of functionality all at once.
Since that time, deployment frequencies have accelerated to as often as multiple times per day in some contexts. Software projects in these contexts build and deploy through pipelines, moving code from Commit to Production with minimal human analysis, assessment, or intervention. There are also contexts beyond Silicon Valley-style Agile/CI/CD, where highly skilled people are creating and releasing important software less often, with great care in examining and validating each release. These contexts (and others not mentioned here) are subject to significant recent changes in how software is shipped, provisioned, deployed, supported, monitored, and maintained.
Great progress has been made in Observability, and we can automate rollback/forward decisions based on what it can tell us. Specific versions of code and systems are shorter-lived, and the frequency of code change has significantly increased. System complexity in service architecture and cloud container scaling has changed the model of what a “complete” system is altogether. These changes, separately and collectively, have greatly reduced both opportunity and value in multi-week (and month) load testing projects.
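To make the idea of automating rollback/forward decisions from observability data concrete, here is a minimal sketch. The metrics structure, function names, and SLO thresholds are all illustrative assumptions, not any particular platform’s API:

```python
# Hypothetical sketch: deciding rollback vs. roll-forward from
# observability data. Thresholds and names are illustrative only.
from dataclasses import dataclass

@dataclass
class ReleaseMetrics:
    error_rate: float      # fraction of failed requests, e.g. 0.02 == 2%
    p95_latency_ms: float  # 95th-percentile response time

# Example SLO thresholds a pipeline might enforce after each deploy
MAX_ERROR_RATE = 0.01
MAX_P95_LATENCY_MS = 500.0

def should_roll_back(metrics: ReleaseMetrics) -> bool:
    """Return True if the new release violates either SLO."""
    return (metrics.error_rate > MAX_ERROR_RATE
            or metrics.p95_latency_ms > MAX_P95_LATENCY_MS)

# A healthy canary stays; a slow or error-prone one is rolled back.
print(should_roll_back(ReleaseMetrics(0.002, 320.0)))  # False -> keep
print(should_roll_back(ReleaseMetrics(0.002, 900.0)))  # True  -> roll back
```

In a real pipeline, the metrics would come from your observability tooling and the decision would trigger the deployment system, but the shape of the check is this simple.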
But there are still aspects of system performance that are best explored with load. While observing Production performance can provide similar information, investigating scaling risks in a controlled fashion by injecting synthetic usage can still provide predictive benefits that are hard to get any other way. What do Performance and Load Testing look like for you these days?
At WOPR29, we want to hear about your experiences with Iterative Performance Testing.
We are looking for your real, recent experiences in using performance and load testing techniques in the context of modern software projects. Here are some prompts that may help you consider how your experiences apply to WOPR29’s Theme:
- How has your load testing changed recently?
- Has your definition of Performance Testing changed?
- What new techniques do you use to do your performance testing?
- Do you use iterative/incremental performance tests? What do they look like?
- Are you performance testing inside sprints?
- Are you conducting performance testing as part of CI and/or CD?
- How is Observability impacting your performance testing?
- Are you measuring performance through Synthetic Monitoring?
- How have DevOps approaches altered how you approach performance testing?
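As one illustration of what “performance testing as part of CI” can mean in practice, here is a minimal, hedged sketch of an iterative load test that fails the build when a latency budget is exceeded. The system under test is a stand-in local function; in a real pipeline you would replace `call_system_under_test` with a request to your service, and the concurrency, request count, and budget are illustrative assumptions:

```python
# Minimal sketch of a load test gate that could run on every CI build.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_system_under_test() -> float:
    """Stand-in for one request; returns latency in milliseconds."""
    start = time.perf_counter()
    sum(i * i for i in range(10_000))  # simulated work
    return (time.perf_counter() - start) * 1000

def run_load_test(concurrency: int = 10, requests: int = 100) -> float:
    """Fire `requests` calls with `concurrency` workers; return p95 latency."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: call_system_under_test(),
                                  range(requests)))
    return statistics.quantiles(latencies, n=100)[94]  # 95th percentile

P95_BUDGET_MS = 200.0  # the performance budget this build must meet

p95 = run_load_test()
print(f"p95 latency: {p95:.1f} ms")
assert p95 < P95_BUDGET_MS, "performance budget exceeded - fail the build"
```

Because it runs in minutes rather than weeks and asserts against an explicit budget, a check like this can ride along with every deploy instead of living in a separate load-testing project.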
So, all this only for 20 people?
Yes, WOPR is going to have only 20-25 people. But we also want to take advantage of bringing so many professionals and thought leaders to Uruguay to deliver something of value to the local community. That is how we came up with the idea of organizing another conference about software testing and quality the same week, on Friday, Dec 9th. This conference will be free for attendees, and we will ask WOPR participants to give a talk there. We will be looking for sponsors to help cover the costs. If you want to be part of a conference with lots of great speakers as a sponsor, please contact Federico or Guillermo at [email protected], mentioning that you want to sponsor the conference.
More information about this conference, including its call for proposals, is coming soon.
Good to know
As specified on its site, WOPR is not-for-profit. Participants are not asked for an expense-sharing fee for WOPR29; sponsorship will be used to cover expenses. In the past, participants were asked whether they had access to training/PDP budgets to help offset expenses, as their employers benefit greatly from what their employees learn at WOPR.
With our additional conference, we have an opportunity to gather more sponsorship to help bring together a WOPR that works in this new geography.
At Abstracta, we are planning to help subsidize travel expenses for WOPR participants. More information on this soon. If you are interested in attending WOPR, please do not let travel costs prevent you from applying – we hope to assist!
I want to invite you to send your proposal to apply to WOPR and come enjoy Uruguay with the great weather we typically have in December, along with other great thought leaders and an avid testing community. Apply here.
This small South American country of 3.4 million people is a global leader in software exports. With more than 1000 active software companies, progressive politics, and amazing opportunities, it has earned international recognition.
Uruguay is not only ranked as one of the best places to live in Latin America; it has also become one of the most trusted places for tech companies worldwide to do business.
The combination of high tech, education, and government policies that promote growth and high quality of life is turning Uruguay into a true Digital Hub in the region. You can read more about it here.
We co-create first-class software, generating opportunities for development in our communities to improve people’s quality of life.
Aligning our vision with our clients’, we assemble a dedicated team to assess and provide exactly what is needed, and more. Over 14 years of experience give us a solid track record for boosting any project to its full potential, ensuring success by providing the highest-quality services and constantly working to exceed expectations. We believe that passionate testing creates engaging software.
You can visit our site to learn more about us!