Have you thought about making the switch to a commercial test automation framework? In this episode of Quality Sense, Katya Aronov shares how she managed to lead an organizational effort at her company, Trax, to involve devs in their quality processes, using Testim to create useful, stable, and reliable test automation.
Episode Highlights:
- What it takes to create useful automation that benefits the whole team
- How to make the best decision when choosing a tool
- Why open source tools aren't as free as we think
- How testers and devs can best collaborate to advocate for a culture of quality
Relevant Links:
- Connect with Katya on LinkedIn – https://www.linkedin.com/in/katya-aronov-88b24b4/
- Join the Testim Community – https://www.testim.io/community/
Listen Here:
- Listen on SoundCloud
- Listen on Spotify
- Listen on Apple Podcasts
Episode Transcript:
Federico:
Hello, Katya. How are you doing?
Katya:
Hello, Federico. I’m very good, thank you. How are you?
Federico:
Fine. Thank you. I’m very excited to have you here in the show. Thank you so much for joining.
Katya:
Thank you for having me.
Federico:
Something that I'd like to mention is how we met, because I'm always amazed by the opportunities we can get from networking, especially from all the communities out there. This time, we were introduced by Tristan Lombard, the community manager of Testim, and well, he's great. There's something special about the interactions he creates, and here we are.
Katya:
Absolutely. I agree with you, Tristan is doing a great job. And for me, the Testim community is like a Stack Overflow for the automation domain, I would say, because in the absence of other automation authorities in our company, I'm endlessly grateful for the opportunity to increase my level of expertise by exchanging experiences with people across the globe who deal with the same or similar challenges. So it's highly appreciated.
Federico:
So tell me Katya, for starters, how did you end up working in software testing?
Katya:
Well, I started my career as a software engineer back in 2000, and I went through various experiences, such as developing solutions and working with the end users of the company's products. I led the implementation and support team, and so on, until I eventually discovered the automation domain back in 2011, 10 years ago now, and I've been passionate about it ever since.
Federico:
Perfect. I'd like to start talking about our main topic today: how to establish effective test automation. When Tristan told me about you, your background, and your experience, I thought this was worth sharing, because we all want to get the most out of our test automation efforts, and it's not only about selecting the right tool or hiring someone to do the job, right? So what about starting by telling us about your team, your testing processes, and maybe how automation fits into your test strategy?
Katya:
Sure. So maybe I'll start with my company first. Trax is a world leader in computer vision solutions for retail, and as in many other software companies, with software updates being released on an ongoing basis, not only must the new features be verified each time the software is released, but we should also make sure that the existing functionality is not damaged, right?
Which means that the further we go, the more we have to test, and naturally, manual testing just can't sustain this fast-growing scale and allow frequent releases. So this is why I think automation has an essential role in performing those sorts of repetitive, time-consuming, and human-error-prone scenarios, and how it became the central part of our testing strategy.
Federico:
Perfect. So what do we need to create useful test automation?
Katya:
Well, first I think we should probably define what useful automation is, which is what they call effective automation. And I'd say the main pillars, in terms of the feedback automation should provide, are: first, it should be reliable. I heard your interview with Bas Dijkstra, who was talking about false negatives and false positives, and I totally agree with him. I mean, we can't afford false positives or false negatives. Our reports should be as reliable as possible, otherwise everyone will just lose trust in the automation and we'll miss our goal.
But other than not reporting false alarms, we should only report actionable items. I mean that if there is some temporary network error or something like that, or some bug in a test, and it doesn't really lead to any action items, then it shouldn't be reported. Either we should create a workaround in the test, or the bug should be fixed, and then it's an actionable item. But it should be as reliable as possible, otherwise there's no point in it.
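For illustration, the kind of workaround Katya describes for transient failures could be a simple retry around the flaky step. This is only a sketch, assuming a Python and Selenium setup, which the episode does not prescribe; the names here are hypothetical:

```python
import time

from selenium.common.exceptions import WebDriverException


def retry_on_transient_error(action, attempts=3, delay_seconds=2):
    """Re-run a step that may fail due to a temporary network issue,
    so only persistent, actionable failures end up in the report."""
    last_error = None
    for _ in range(attempts):
        try:
            return action()
        except WebDriverException as error:  # e.g. a dropped connection
            last_error = error
            time.sleep(delay_seconds)
    raise last_error  # still failing after the retries: this one is actionable


# Usage (hypothetical): wrap only the steps known to be network-sensitive.
# retry_on_transient_error(lambda: driver.get("https://example.com/dashboard"))
```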
The second point I'd like to mention: in order for everyone to be able to understand the reports, we should provide automation which is easy to understand. For the issues to be resolved as soon as possible, anyone should be able to understand the report the automation provides. So, for example, if a certain developer commits code and the automation runs and reports an error, and he needs me to be available to understand the report, it's not really that efficient. But if we provide tools which are easy to understand, which are readable, with screenshots and logs, and it's clear to everyone what has happened and how to fix it, then there's no QA bottleneck in the middle and the issue can be resolved faster.
That's the first thing. And second, the automation people are free to add more coverage in the meantime, so there are only benefits to this approach. So reliability, I would say, is the first thing, and ease of understanding is another. And definitely, automation should be stable. It should work on an ongoing basis and also over time, because sometimes it's quite easy to create several tests which run and everything is fine, but when you have thousands of them or even more, it's easy to lose control over all of these tests.
If we spend a lot of time maintaining our tests instead of adding more and more coverage, then it's not stable and it won't serve us over time. And definitely, as you've already mentioned, it should be helpful not only for testers and developers, but for all involved stakeholders; even product managers and team leaders should be able to understand what happened. The faster we detect a regression, the faster we can fix it.
Federico:
And related to communicating well, to making a report easy to understand, what I've found in my experience is that sometimes it's even about the way you name the test cases, right?
Katya:
Absolutely. Even with these small details, you can make a huge impact on how efficiently things are understood.
Our tests should be readable. This is one of the most important things in order to allow these reports to be self-explanatory. But there are also some more development concepts which should be followed in order to make automation efficient. In general, I believe that automation should be treated like any other software project: it should have qualified staff, or you should check whether you can afford to train them, and you should choose the best tool to assure a clear picture upon test failure and, as a result, lead to faster recovery.
Federico:
The best tool. That’s easier said than done, right? How do you know which one is the best?
Katya:
The million dollar question! I'd say it definitely depends on the nature of the products your company provides. I mean, if it's hardware, then you may need different tools, which might not be suitable for software testing. And even within software testing, your applications might be web applications, mobile applications, desktop applications, and so on, or maybe you want to test APIs.
So there are so many tools out there: some of them are coded and some are codeless, there are open source tools and commercial tools, some are external and other companies build in-house tools. There are lots of them, and I would say there is no one-size-fits-all approach.
I'd say that to find the most appropriate tool for your needs, first you should define your goals. Do you expect the results to be provided in the short run or in the long run? If you expect them in the short run, maybe you don't even need automation, because you can easily achieve it with manual testing and it will be cheaper. Second, you need to know the solutions available on the market.
There are lots of articles comparing the most popular tools; for example, if we talk about web applications, there are comparison tables for Selenium versus Playwright, Puppeteer, Cypress, and so on. And definitely be aware of the common pain points. Many companies face the same automation pain points, so if you know what they are, it will be easier to overcome them.
And finally, know your resources. Do you or your testers have the required development skills? Can you invest in training? And so on. I think answering these questions will help you find the best value for money, most appropriate to your particular needs.
Federico:
According to what you're saying, selecting the tool really depends on your test strategy, right? So first you have to define the whole strategy: how do you want to test your system? Then, according to that, you'll understand what you need in a testing tool, and you can choose it in a better way, right?
Katya:
Right. And also know your resources, because it's important: who is going to implement it? Can you actually implement this tool, after all?
Federico:
So which tool did your team decide to use?
Katya:
After using Selenium for several years, last year we switched to Testim, which is a commercial automation framework. And Testim allowed us to establish and maintain effective automation for our web applications. At a very high level, the effort is divided between Trax and Testim in a way that we focus on covering our business flows and Testim takes care of all the rest.
They deal with lots of indirect automation issues, such as automation framework configurations, Docker images, managing different code library versions, and so on, all the environment maintenance issues. And I think one of the main common automation pain points different companies face is that this part takes time. From my experience, when I had to deal with all these peripheral automation headaches, the time left for actually authoring tests, the part which could have brought real value to my company, was about 20% of my time.
So just think about it. So many companies deal with the same common infrastructure issues. Doesn't it make sense to take it out of their scope and just hand it over to some external experts? By the way, the cost is not necessarily higher if you consider the total cost of ownership, which means open source tools are not really free, as the name might imply. But the benefits-
Federico:
I’m sorry to interrupt you, but it could be considered as a hidden cost of the open source solutions, right?
Katya:
Absolutely.
Federico:
And also related to Testim, I know there is a difference between coded and code-less automation tools. Can you tell us a little bit about that?
Katya:
Yeah. So there are different solutions providing so-called codeless automation testing, promising to create automation in the blink of an eye. This approach is usually based on the record-and-replay technique.
Federico:
You just click and play and that's it, right? You have all the information already.
Katya:
Absolutely. Whatever you do in your browser is recorded, and then you can replay the scenario, so it's like your automation is ready in a second. At least, that's how it sounds.
Federico:
Is it really like this?
Katya:
So I'm not a big believer in such tools. From my experience, they don't really prove themselves in the long run in terms of return on investment. Recording can only be a helpful starting point; it must then be followed by enhancing the test to become more generic, built from reusable components, in order to allow going fast even after having hundreds of tests.
For example, if we use reusable components and something changes in our application, we don't need to adjust all those tests. We just fix one shared component and all the tests are adjusted. It's minimal effort, but you need to invest in development at the beginning in order to save on maintenance.
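To make the reusable-component idea concrete, here is a minimal sketch in the page-object style, assuming Python with Selenium (which the team used before Testim); the team's actual components are not shown in the episode, and these locators and names are purely illustrative:

```python
from selenium.webdriver.common.by import By


class LoginComponent:
    """A shared component reused by every test that needs a logged-in user.
    If the login form changes, only these locators and steps are updated,
    and every test that reuses the component is adjusted automatically."""

    USERNAME = (By.ID, "username")  # hypothetical locators
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


# Each test calls LoginComponent(driver).login(...) instead of repeating the steps.
```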
Maybe I can just add, to your question about the open source tools that we were discussing, that I'm not against open source tools in general. Some of them are great and we use them as well. For example, we use SoapUI for our API testing, because I think some tools might be overkill for certain needs. But when it comes to automation testing of web applications, from my experience, you have to do it the right way.
I truly believe that to bring value to my company, it was most effective to hand over all the additional effort to Testim, for them to take care of it, and to stay really focused on whatever is related to our business flows.
Federico:
So we're talking about the cost of the software testing tool, but thinking about automation overall, do you really think that test automation is bringing you savings? Do you see a return on the investment in your test automation efforts, considering all the costs you're paying today?
Katya:
Well, yeah. Not only does automation reduce testing costs, but I think it also increases testing efficiency, because automation removes the human bottleneck, so the manual testing effort can concentrate on flows that require human intelligence, where people definitely do better than machines.
But having said that, from my experience with different automation tools in the past, I wholeheartedly believe that automation is not magic. As we said, you can't just record a scenario and have thousands of effective tests. It just won't work. To be efficient, it requires investment. There is no way to avoid it, as we discussed, even with an open source tool.
So the return on investment in automation comes in the long run. It can only be beneficial when it is considered seriously and all the software project concepts apply. So you have the staff who can do it, who have the technical background to do it. And all the rules apply to automation as to any other software project. But when approached correctly, I do believe that automation testing makes a significant impact on delivering higher quality software in a shorter time.
Federico:
I totally agree with you. Is there something we should take into account when selecting a tool for test automation?
Katya:
So we were talking about self-explanatory reporting, with some helpful framework features such as screenshots taken upon failures. And by the way, sometimes it's very helpful when screenshots are taken not only for the failed steps, but also for the succeeded steps, because for troubleshooting it's very helpful to go back several steps and see where the flow went in an unexpected way.
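As a sketch of that kind of per-step evidence, assuming a pytest-and-Selenium setup (the episode doesn't show any configuration, and the fixture name is hypothetical), a conftest.py hook could save a screenshot after every test, whether it passed or failed:

```python
# conftest.py: save a screenshot after every test, passed or failed,
# so a regression can be traced back through the steps that preceded it.
import os

import pytest


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    if report.when == "call":
        driver = item.funcargs.get("driver")  # assumes a 'driver' fixture exists
        if driver is not None:
            os.makedirs("screenshots", exist_ok=True)
            status = "passed" if report.passed else "failed"
            driver.save_screenshot(f"screenshots/{item.name}_{status}.png")
```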
So reporting is key. If you can get some browser logs to give developers more information to understand the issue, or some troubleshooting tools, that will definitely be very helpful. But other than the reporting, I would say the learning curve is also very important with regard to handover to other people, because I think it's quite important to make sure that automation doesn't die after your automation developer leaves the company.
And I've seen so many cases where that happens, and it's so sad because so much was invested in it. So the tool should also be easy to use. If you want to involve developers in your testing process, you can't give them Selenium. It just won't work. They want something that will be efficient and fast. So if you intend to make automation everyone's asset, think about the ease of use of the tool you're choosing.
And definitely, low maintenance is also very important. When we talk about low maintenance, it's not just the tool that can take care of it; it's also the best practices that should be followed by the people who create the automation. And what else? Maybe integration with external tools like CI/CD; in our case, we use Jenkins. So our automation runs based on different triggers that we have defined. Some pipelines run on a nightly basis, some of them run before committing the code or deploying to an integration or production environment, or even on demand if you want to test something right now.
And it's important to revise these triggers in terms of frequency and contents, to assure the best potential coverage at the lowest cost. So just make sure that whatever runs is the right contents … I mean, don't run all the hundreds of tests upon each commit, because it will take time and it will delay committing the code, so we create some smaller [inaudible 00:21:58] suites which run in this case.
So each trigger should run the relevant tests at the relevant frequency and on the relevant environment. I would say this is yet another way to make automation efficient. At Trax, for example, we have our daily dashboards, which are based on the automation pipeline status, and we have a short summary of areas where regressions were detected, followed by a list of open issues.
So this, for example, is the final stage of our automation life cycle: from scenario recording, through all those test enhancements, executions, and regression detection, to action items. A regression was detected, so let's prioritize it and fix it.
Federico:
What are your criteria to divide the different test cases and decide when to run each of them in your pipelines?
Katya:
So, let's say upon committing a certain change, only the relevant tests related to that particular area should run. In addition, once a day, actually at night, we run all our tests. So in the morning, with a maximum delay of a day, we can see all the regressions that were detected as part of yesterday's commits, and define the affected areas and what can be impacted. This way, the most relevant contents are applied wherever they make sense.
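The episode doesn't show how those suites are configured in Testim and Jenkins, but a minimal sketch of the same split using pytest markers (hypothetical test names) could look like this, with each CI trigger choosing its subset, for example running pytest -m smoke on every commit and a plain pytest run for the nightly full suite:

```python
# Markers split the suites; register them in pytest.ini to avoid warnings.
import pytest


@pytest.mark.smoke
def test_login_page_loads():
    assert True  # placeholder; a real test would drive the browser


@pytest.mark.regression
def test_monthly_report_export():
    assert True  # placeholder; runs only in the full nightly suite
```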
Federico:
Cool.
Katya:
And I would add another thing: sharing knowledge is so important to prevent double work. When you develop different reusable components, share them; tell people that this is what you did, so others can use it. It prevents double work and it increases everyone's professional level. Conduct code reviews with people. No matter which tool you eventually choose, I think communication is a crucial part of success.
Federico:
These types of practices, like code reviews, are things that I really believe automators should also take into account, because they're a way to improve many of the things you just mentioned: making tests easy to understand, easy to read, more maintainable, and so many other benefits. This is key.
Katya:
And learn from one another.
Federico:
Exactly. So another thing that you mentioned at the beginning that I find really interesting is that you need qualified people to do test automation, right?
Katya:
Yes.
Federico:
So can you tell me which skills are required for this type of activity?
Katya:
So as I said, it's important to understand that having a technical background is a must to achieve success in automation projects. You don't have to be an expert in front-end or back-end software engineering, but in order to be scalable, following basic programming concepts is essential, so that you can always extend the coverage rather than spending time on test maintenance whenever the application changes.
So I would mention a few examples of those basic concepts. As I've already said, grouping several steps into reusable components, or functions if we're talking about code, so that the further you go, the less you need to record: you just pull your existing functions from your repository and you're ready to go, and the later tests take even less time to create. Conditions and loops: you should know what these are, because you'll eventually need to use them.
Defining parameters to make our functions more dynamic. And, as we already discussed, readability and simplicity: we should make our reports easier to follow, which is one of our most important goals, and new tests faster to create. So the main goal is to have to go back and fix tests as little as possible. That's how I would put it. Having said that, it can be learned.
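As a small illustration of those basics together, assuming Python with Selenium and a hypothetical results table (none of this comes from the episode), a reusable helper might combine parameters, a loop, and a condition like this:

```python
import time

from selenium.webdriver.common.by import By


def wait_for_rows(driver, expected_count, timeout_seconds=10):
    """A reusable helper: parameters keep it generic, a loop polls the page,
    and a condition decides when to stop waiting."""
    deadline = time.time() + timeout_seconds
    rows = []
    while time.time() < deadline:  # loop
        rows = driver.find_elements(By.CSS_SELECTOR, "table#results tr")
        if len(rows) == expected_count:  # condition
            return rows
        time.sleep(0.5)
    raise AssertionError(f"expected {expected_count} rows, found {len(rows)}")
```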
In our case, I can share that our initial intention was to involve our manual QA engineers in the automation effort, so we were looking for a more intuitive solution for non-coding experts. We held remote training for our QA engineers in Sri Lanka, and I must admit the results are remarkable. So it can be learned. It's not that you have to find those experts; if you can afford training, if you have someone who can teach them and you have the time to invest in it, it can bring you value.
Other than a technical background, they probably need to have testing expertise, of course, such as being able to prioritize correctly, and customer orientation, to understand customer needs. They should be able to work with various interfaces: with the product, to understand product definitions, with developers, with testers, with everyone around.
You should also know that not everything is worth automating, so you should always keep in mind: what would be most efficient? Is it worth the investment, or is it easier to do manually? And definitely, availability is one of the most important things, because I've seen so many times when manual testers try to do automation in their spare time: let's do manual testing because it's urgent now, because we have a release; we don't have time now to invest in authoring tests.
And when we have time, let's just add another test or two. It doesn't work, because manual testing will always be the most important, the most urgent, the easiest thing to do right now. To create automation, you should preferably do it as 100% of your position. And passion, definitely: you should be passionate about automation, curious to learn new things, to improve, to think about what's worth the effort and what's not.
Federico:
There are so many things.
Katya:
Yeah. And there are more.
Federico:
Something that I typically discuss with colleagues is: what is easier, to teach testing to a good programmer, or to teach test automation to a tester who already loves and is passionate about software testing? Because there are things in the mindset of a tester that are really hard to teach to people who love programming and coding.
Katya:
That's right. Sometimes this is also perceived as a downgrade from programming to testing, which I don't think is right. But it's a matter of philosophy, I would say. I think that if you have a person who is a testing expert, who has the basic technical background and has the passion, it will work. And we proved it in our case: we have automation people who started as QA engineers, and today they are adding automation coverage at a very high level, which I'm absolutely proud of.
Federico:
Amazing. And I also think this is something great about codeless solutions, because it's an easier way to start with test automation when you are a tester, maybe without a degree in computer science. In our experience, what we typically do is have an engineer review the test scripts of the more junior test automators, because it's a way we've found very efficient to teach and train by doing and by experimenting, and also to foster and apply best practices across all our test automation work.
Katya:
I can tell you that in our company, developers have started to use Testim for adding automation as well. So it definitely can be good for both: you can have a software engineering background or a testing background, and as long as you're curious about this field, about quality, and you have the tools which allow you to do it efficiently and quickly, it will work. So currently we have both QA engineers and developers adding automation, which makes it much more beneficial for the company overall.
Federico:
Collaboration and learning from each other.
Katya:
And, I believe that quality is everyone’s responsibility.
Federico:
Totally. So, do you have any other considerations to make automation efforts more beneficial?
Katya:
Well, I strongly believe that an automation developer's responsibility is not testing. It might sound weird, but I think it is rather providing tools. I call it the "automation for all" approach. So, as I said, today our developers use Testim as well, and this way increasing the quality of our deliverables becomes everyone's asset. And as I said, quality is everyone's responsibility, and I think automation people are there to make it accessible.
Release the analysis bottleneck, invest wisely in more coverage and in more solid infrastructure, build those tools, let people use them, make them efficient. That's the main idea of being an automation developer, the way I see it.
Federico:
Cool. Really nice. I have a couple of final questions for you, because I think we could continue talking about test automation for hours. One of my final questions is: do you have any book to suggest that listeners read?
Katya:
Not related to high technology?
Federico:
Yeah. It could be any kind of book.
Katya:
Well, for me, The Little Prince is the best book of all time. Every time I come back to this book, I discover some new idea, wise in its simplicity.
Federico:
Cool.
Katya:
Read it again and you’ll find something new. It’s wonderful.
Federico:
What about habits? Do you have any habit that you think improves your productivity?
Katya:
Well, life is intense. I would say know yourself and find your work-life balance. I can share that, for me, working from home revealed opportunities to invest in hobbies; one of them is yoga, which I started practicing. And I think even when we don't have much time, starting the day with five or ten minutes of morning stretches helps set intentions for the day, and maybe even gradually build the balance to perceive challenges as opportunities.
Federico:
Nice. And can you identify some way that you improve your sense of quality?
Katya:
I believe that continuous learning and curiosity are key. Maybe one of my practical suggestions would be: find your relatively quiet times. We're all busy, but since we in Israel start our work week on Sunday, I've found that Sundays are the quietest times for me. So devote those days, or do it bi-weekly or monthly, whatever works for you, to learning something new that you find interesting.
For me personally, realizing that what I do makes a positive impact is what motivates me to strive for efficiency. But having said that, I have to admit that one of the biggest lessons I learned, and I would say am still learning, is that perfectionism is not a good friend in this dynamic world full of changes, requirements, and urgencies that we all live in, especially in the last year.
When you start enjoying the journey rather than struggling to achieve milestones, you find out that you love what you do and you do what you love. Isn't that real happiness?
Federico:
Wow, Katya. I love it!
Katya:
Really, just think about it. It applies to work, it applies to life, to anything.
Federico:
Totally. Many times we're so worried about doing things perfectly that we get paralyzed, and so many bad things come out of that.
Katya:
Start practicing yoga. It makes a difference.
Federico:
Thank you. Thank you so much, Katya. Amazing interview. I really enjoyed talking with you. Is there anything you would like to invite our listeners to do?
Katya:
Yes. I encourage everyone to join the Testim community that we mentioned at the beginning. You don't have to be a Testim customer to learn and share. There are various automation expert panels held on an ongoing basis, which you can join to gain some expertise. It's a really highly recommended resource for professional growth. And feel free to reach out to me on LinkedIn, should you have any further questions or need any advice on automation best practices; I'd be happy to share.
Federico:
Amazing. Thank you. Thank you again. And I hope to talk to you soon again.
Katya:
Thank you very much, Federico. Stay safe.
Federico:
You too. Bye, Katya.
Did you enjoy this episode of Quality Sense? Explore similar episodes here!
Recommended for You
The Complete Beginner’s Guide to Functional Test Automation
Quality Sense Podcast: Bas Dijkstra – False Positives and Negatives in Test Automation