
JMeter DSL, the Story of Abstracta’s Latest Innovation in Software Testing

How was JMeter DSL born? What is its contribution to the IT industry? Why is this software testing innovation so important? What is expected from it? Find out all about the history of JMeter DSL, Abstracta’s latest software testing innovation, led by Roger Abelenda.

By Natalie Rodgers

Achieving innovations in software testing is a crucial part of Abstracta's path, both as a contribution to the IT industry and as a way to improve the quality of digital systems, with a direct influence on the lives of the global community.

In this context, performance testing takes on special prominence: it evaluates how an application behaves under different conditions, helping to ensure the best possible user experience, even in complex systems and at times of high demand.

According to Business Wire, 88% of Americans have negative feelings towards brands with poorly performing websites and mobile apps. These negative feelings are associated with annoyance, frustration, distrust, and anger. Thus, performance testing is becoming increasingly important in the creation of quality software.

JMeter DSL is the latest innovation in software testing developed by Abstracta, and it is undoubtedly a great ally for performance testing. Released in 2020, it is a library that facilitates the use of JMeter through code, adding new functionalities. 
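For readers unfamiliar with the library, here is a minimal sketch of what a JMeter DSL test looks like, based on the project's documented JUnit-style usage. The service URL, thread counts, and latency threshold below are placeholders, not recommendations:

```java
import static org.assertj.core.api.Assertions.assertThat;
import static us.abstracta.jmeter.javadsl.JmeterDsl.*;

import java.time.Duration;
import org.junit.jupiter.api.Test;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;

public class PerformanceTest {

  @Test
  public void testPerformance() throws Exception {
    // Run a test plan with 2 threads iterating 10 times each
    // against a sample endpoint (placeholder URL).
    TestPlanStats stats = testPlan(
        threadGroup(2, 10,
            httpSampler("http://my.service")
        )
    ).run();
    // Fail the build if the 99th percentile response time exceeds 5 seconds.
    assertThat(stats.overall().sampleTimePercentile99())
        .isLessThan(Duration.ofSeconds(5));
  }
}
```

Because the test plan is plain Java code running under JUnit, it can be versioned, reviewed, and executed in a CI pipeline like any other test.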

“JMeter DSL confirms Abstracta as an innovative company, with solid knowledge and experience in performance testing and in particular in JMeter, with the capacity to develop adequate solutions to improve existing processes,” emphasized Roger Abelenda, Chief Technology Officer of Abstracta and leader of JMeter DSL development.

With this in focus, we talked to Roger to understand in depth the relevance of this innovation for the IT industry.

How was the idea of creating JMeter DSL born?

It was several years ago, before joining Abstracta, when I was involved in the development of some tools that required frequent load and performance tests due to the criticality and volume of data they handled. Using JMeter was quite slow; it was difficult to modularize, maintain, and share test plans, and the way to run the tests in the development pipeline was not optimal. At that time, I thought that something had to be done about it without having to migrate to another language and/or environment, as Gatling required. Some solutions existed, such as Ruby-DSL and Taurus, among others, but they required incorporating a new language or relearning things I already knew how to do with JMeter.

How did you make room for this in Abstracta?

After a few years, already being part of Abstracta, I experienced the same needs again. We saw problems related to this in some Abstracta teams, so I decided to make a POC (proof of concept). I presented it to the CEO, and we decided to release it as open-source to see if the community would get hooked on the idea. The first version was released on August 13, 2020.

In what ways has Abstracta fostered this innovation in software testing?

The C-suite has helped with the vision, prioritization, and promotion as a way to foster innovation. The performance and development teams have been a big part of the development of the tool, helping in the design and prioritization of features, the introduction of best practices, writing blogs, etc. Currently, I dedicate part of my time to the design and implementation of functionalities, as well as attending to doubts and requests from the community. I also participate in webinars and write blogs to make the tool known. 

How has the testing community received the tool?

The truth is that the tool has had a very good reception. The community has been an important part of multiple processes, not only in spreading the word through blogs and videos, but also in the development itself. How? By asking for features and improvements, reporting bugs, generating discussions, helping in the design and prioritization of features, sharing content in the Discord channel, and sending pull requests with code to incorporate into the library. It is precisely because of this that we continue to invest in improving the DSL. Even so, we are still looking for more collaboration from the community, so that more people get involved and get to know the solution.

Why should more people get involved? How does this innovation help quality software development?

It makes it easier to integrate performance testing into existing development pipelines on a continuous basis, promoting shift-left testing. In addition, it facilitates the application of knowledge, engineering, and software development concepts to the development and maintenance of JMeter performance tests.

Why did you decide to pursue JMeter DSL as an open-source contribution?

Our main goal is to facilitate the implementation of performance testing and bring it closer to as many people as we can, in order to improve the quality of all the applications we develop and use on a daily basis. We seek the collaboration of the community, to co-create the best possible solution for everyone, and we think the best way to achieve this is through open source. Ultimately, we seek to give something back to the community, through code and the knowledge included in the code and its documentation, in exchange for all the open-source solutions we use day to day, which are the basis of so many projects we work on.

How often are new releases made and what do the changes consist of?

We don’t have a fixed release cycle; we release features and bug fixes as soon as they become available. On average, we make one release per week, but there are weeks with several releases and others with none. We focus on implementing features for which users demonstrate a need. We have many ideas in our backlog of things we think could be interesting to implement, but in most cases we prefer to wait for users to report their needs, and we base a lot of our releases on these requests.

Why do you prefer to wait instead of anticipating?

Because that way we avoid overloading the tool with functionalities that may not be used as much. We prioritize users’ needs first and foremost. This allows us not only to prioritize the work and effort invested, but also to document the raison d’être of each component and involve the community in the development of the tool.

By what means can users make contributions?

We always encourage users to visit the repository and ask for things they consider useful or necessary. We also pay special attention to the stars on GitHub. We consider them a good measure of how much interest there is from the community in the tool, as well as a way to promote the visibility of the project.

What are the expectations for the future with JMeter DSL?

We expect it to become the main alternative for performance testing from code, one of the fundamental components in the stack of tools associated with JMeter, and the main alternative to JMeter’s graphical interface (which we still consider a very good option for many users).

Does JMeter DSL provide Abstracta with any economic benefits? 

No. The main reason for Abstracta to invest in JMeter DSL, in conjunction with other existing initiatives, is to nurture its innovative nature, which goes back to its first product, GXTest, with more than 14 years in the market, and is also reflected in its spin-offs. This allows us to continue our commitment to sharing our knowledge and experience, in this case in performance testing and JMeter in particular, positioning ourselves as a reference and contributing to the community by helping improve the efficiency of existing processes and deliver better quality software.

In need of a performance testing partner? Abstracta is one of the most trusted companies in software quality engineering. 

Learn more about our solutions, and contact us to discuss how we can help you grow your business.

Follow us on LinkedIn & Twitter to be part of our community!
