
Validating Modified Data in Test Automation

The importance of validating modified data in automation

In test automation of any kind, we end up automatically simulating (with a tool or with a fragment of code) the actions that a user would execute on the system (in the broadest sense of the word "user"). Therefore, it is very important to make sure that our automation, that is, our testing artifact, actually does what it is supposed to do when simulating the user's actions. We have to test it!

One way to check that our automation is working properly is to validate modified data.

Something that has proven quite important in one of our recent performance testing projects relates to this: we must validate test scripts at the data level, more specifically, at the database level. This is particularly important when we don't execute at the graphical user interface level.

When we execute a test at the graphical user interface level of the system under test, for example with GXtest, Selenium or WatiN, we can be fairly certain that we are doing the same thing a user would do: clicking buttons, entering data into inputs, and so on. This is quite different in performance tests, where, for example, we attack the application at the protocol level. In the case of a web system, we will be sending invocations at the HTTP protocol level. Here it is most important to verify that a script does what it is expected to do and that there are no differences from what is done manually.
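To make the contrast concrete, here is a minimal, purely illustrative sketch in Python of the same "create invoice" action driven once through the browser and once at the HTTP level. The URL, element IDs and field names are assumptions, not the actual application.

```python
# Illustrative only: URL, element IDs and field names are assumptions.
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

# 1) GUI level: the browser is driven just as a user would drive it.
driver = webdriver.Chrome()
driver.get("https://example.com/invoices/new")
driver.find_element(By.ID, "customer").send_keys("CUST-042")
driver.find_element(By.ID, "product").send_keys("P-1001")
driver.find_element(By.ID, "save").click()
driver.quit()

# 2) Protocol level: only the HTTP request is reproduced, so nothing
#    guarantees by itself that it matches what the browser really sends.
requests.post(
    "https://example.com/invoices",
    data={"customer": "CUST-042", "product": "P-1001"},
    timeout=10,
)
```

At the protocol level there is no browser doing the work for us, which is exactly why the resulting data has to be checked somewhere else, typically in the database.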

Validating modified data is quite easy to do with the aid of a developer. Suppose we show an expert on the application that we are executing a script that uses certain data, for example: a specific user enters an invoice with certain products issued to a specific customer. The developer will easily know which tables to check in the database to verify that the data is consistent and correctly updated, so that there is no noticeable difference between that data being processed by an actual user and by a test script. It is also important to validate that the different variables and attributes used in each event have been parameterized. Having the system's KB, or requesting this information from the developer, can be helpful for that.
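As a minimal sketch of that check, assuming hypothetical invoice and invoice_line tables (the real table and column names would come from the developer or the KB), a quick query right after the script runs can confirm that it wrote the data it was parameterized with:

```python
# Hypothetical sketch: table and column names are assumptions.
import sqlite3  # any DB driver works; sqlite3 keeps the sketch self-contained

expected = ("CUST-042", "P-1001", 3)  # values the script was parameterized with

conn = sqlite3.connect("app.db")
row = conn.execute(
    """
    SELECT i.customer_id, l.product_id, l.quantity
    FROM invoice i
    JOIN invoice_line l ON l.invoice_id = i.id
    ORDER BY i.id DESC
    LIMIT 1
    """
).fetchone()
conn.close()

# The last invoice written must match what the script claimed to send.
assert row == expected, f"script wrote unexpected data: {row}"
```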

What’s important here is that, if the script is parameterized incorrectly and we send wrong data, we could be generating erroneous situations, and the simulation of reality would not be the right one. Imagine that we record a script that creates an invoice. This will surely require that we parameterize the customer and the products. But what happens if an error was made in parameterizing the script and the product was not parameterized, so the same product is used every time an invoice is processed? At first we might think it is not such a big deal: in simulating hundreds of users processing invoices, we will just end up with numerous invoices for the same product. However, this could have negative consequences for the test: if the system must verify the existence of stock and update it after every purchase, we will be generating more locks at the database level than there would be in reality. So, we would not be simulating the user behavior as we planned, but rather a situation that is much worse.
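As a hedged illustration of that difference, the sketch below draws the product from a data pool instead of hard-coding it; the CSV file, column name and URL are assumptions, and in most load testing tools this would be done with the tool's own data set configuration rather than hand-written code:

```python
# Hypothetical sketch of per-iteration parameterization in a load script.
import csv
import random
import requests

# Data pool read once before the test; one product id per CSV row (assumed column name).
with open("products.csv", newline="") as f:
    products = [row["product_id"] for row in csv.DictReader(f)]

def create_invoice(customer_id: str) -> None:
    # Wrong: product_id = "P-1001"  -> every invoice hits the same stock row.
    product_id = random.choice(products)  # right: vary the data per iteration
    requests.post(
        "https://example.com/invoices",
        data={"customer": customer_id, "product": product_id},
        timeout=10,
    )
```

With the product varied per iteration, the stock checks and updates spread across many rows, which is much closer to what real users would generate.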

It is clear that, if we’re not careful with validating modified data, we could arrive at incorrect conclusions, which shows the importance of tests, and of testing our testing artifacts as well!

 


