Most people would agree that software quality is important. We have seen the results of buggy software in all kinds of situations: from Mars probes malfunctioning and radiation therapy machines administering lethal doses of radiation, to telecommunications systems suffering cascade failures. It would seem logical to assume that software testers would be much-valued members of a development team. Yet sadly, that is not always the case.
Some software developers, product owners, and managers assume that quality assurance (QA) engineers are people who wanted to be developers and lacked the necessary skill or grit to succeed. Unfortunately, there are a few testers who fit this description, but most testers are people who genuinely care about the quality of the product they are testing.
Why then have testers gotten such a bad reputation? Usually, it’s because of bad habits they have developed over the course of their careers.
This article outlines seven habits that QA engineers should actively avoid, along with the good habits to replace them with, so that QA engineers can do high-quality work and earn the respect of their peers.
Bad Habit #1: Testing Things You Don’t Understand
We’ve all been there: There’s some obscure story on the JIRA board that involves some legacy back-end code, and no one is entirely sure what the code does or how to change it. The developer tasked with the story has done enough research to fix the code but hasn’t put any detail in the story about how it works or what change has been made. The developer says to you, “Just run this request on this server, and if you get this response, then it’s fixed.”
Here’s the problem with this scenario: How do you know the developer is right? If the issue is not fixed, and there is a failure in production, your manager will come back to you with questions. Do you really want to have your only response be “The Dev told me to do this, and it worked, so I moved the story to Done”?
The Good Habit: Ask questions. Ask your developer to explain to you how the feature works and what changes were made to it.
Keep on asking clarifying questions until you really understand what is happening. In doing this, you may bring up points the developer hadn’t thought of, sending them back to improve their work.
Bad Habit #2: Testing Only What the Story Tells You to Test
Our development stories often contain acceptance criteria (AC), which outline exactly how the new feature or fix should behave. These are often written by the product owner, and sometimes by the developer. The AC are helpful and are certainly better than having no AC at all, but they often contain only “Happy Path” scenarios.
Even when the developer writes the AC, they may not include test scenarios where bugs could be hiding, not because they are trying to be duplicitous, but because the scenarios might not have occurred to them. Testers will often assume that the developer knows best and will test only the AC. This means that there may be critical areas that are left untested and bugs left undetected.
The Good Habit: Think outside the box. One of our skills as QA engineers is being able to think about what might go wrong; we need to use this skill with every story we test.
Before you sign off on the AC, ask yourself, “Can I think of anything else to test here? Is there anything I’ve missed?” This will often help you find bugs in areas that no one else thought of.
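As a concrete sketch of what "thinking outside the box" can look like, imagine a story whose AC say only "a valid username is saved." Everything below is invented for illustration (the hypothetical normalize_username rule is not from any real product), but it shows how a few extra inputs beyond the happy path catch what the AC never mention:

```python
def normalize_username(raw: str) -> str:
    """Hypothetical validation under test: trim whitespace, lowercase,
    reject empty or overlong names."""
    cleaned = raw.strip().lower()
    if not cleaned:
        raise ValueError("username must not be empty")
    if len(cleaned) > 30:
        raise ValueError("username must be 30 characters or fewer")
    return cleaned

# The happy-path check, straight from the AC.
assert normalize_username("Alice") == "alice"

# The outside-the-box checks the AC never mentioned.
assert normalize_username("  bob  ") == "bob"  # leading/trailing whitespace
for bad_input in ["", "   ", "x" * 31]:  # empty, whitespace-only, too long
    try:
        normalize_username(bad_input)
        raise AssertionError(f"expected rejection of {bad_input!r}")
    except ValueError:
        pass  # rejected, as a careful tester would hope
```

A few minutes spent listing inputs like these, before signing off, is often all it takes to find the bug no one else thought of.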
Bad Habit #3: Assuming That Odd Behavior Is Correct Behavior
Often, when we are testing a new feature, we run across behavior that doesn’t make sense. Perhaps it’s an odd page refresh or a navigation to a place we weren’t expecting. Or perhaps a button appears where we weren’t expecting one.
It’s easy when we are testing on a deadline to focus so much on the AC of the story that odd behavior gets pushed to the back of our mind. We might tell ourselves, “I’ll ask the Dev about that when this story is done,” (and then we forget), or we say, “Well, I’m sure she knows what she’s doing; it’s probably supposed to do that.”
The Good Habit: Listen to your instincts. If the behavior is odd, there’s a very high probability that end users are going to find it odd as well; they may even find it so frustrating that they stop using the application.
We need to remember that our end users are our customers. We are the last line of defense in making sure that they have a good experience with our application. If your instinct is telling you that something isn’t quite right, document your testing and speak up about what you are seeing.
Bad Habit #4: Chasing Things Down the Rabbit Hole
This is the opposite of Bad Habit #3; sometimes QA engineers are so focused on finding every single thing wrong with an application, no matter how tiny, that they wind up in “analysis paralysis” and bring their team’s progress to a halt.
I remember asking a fellow QA engineer what her favorite bug was that she had found in the course of her career. She excitedly told me about a bug that involved clicking a button several times, navigating forward and backward through a pair of pages, and then scrolling quickly, all in one specific browser.
While I’m sure this bug was fun to chase down, it involved behaviors that a user would never, ever do, and the bug itself wasn’t particularly harmful. I wondered how many other real issues she could have found while she was trying to reproduce this one obscure issue.
The Good Habit: Focus on real-world use cases. Always remember that our focus should be on making sure that our software works well for our users and that our software is well-protected from malicious users. We are not merely finding bugs for the joy of the hunt.
If you find yourself going down the rabbit hole, ask yourself if your time could be better spent testing more realistic use cases.
Bad Habit #5: Automating Tests for the Sake of Doing Automation
QA engineers who have learned how to write automation discover that automating things is fun. There is a certain rush that comes with solving a technical challenge and watching your test run automatically.
But automation is not always the answer. When we have a new feature to test, it’s important to take time and get to know the feature as an end user would by actually using the feature. When we jump into automation before we’ve done this, we can wind up automating tests that don’t exercise the feature well.
We can also miss key bugs. For example, if we had a new search feature that searches by a date range, an automation engineer might spend all their time figuring out how to pick dates with Selenium and never notice that it was possible to enter a start date that was after the end date.
The Good Habit: Take the time to do manual, exploratory testing to get to know a feature. Ask questions about how the feature will be used. Think about what your end users will do. Find as many bugs as you can. Then, start to think about how you should automate it.
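To make the date-range example concrete, here is a minimal sketch in Python (the validate_date_range function and its rules are hypothetical, not from any real product) of the kind of check that exploratory testing reveals the need for, and that is then easy to automate:

```python
from datetime import date

def validate_date_range(start: date, end: date) -> list[str]:
    """Hypothetical validation for a search-by-date-range feature:
    return a list of error messages, empty if the range is acceptable."""
    errors = []
    if start > end:
        errors.append("start date must not be after end date")
    return errors

# The bug found by simply *using* the feature: start after end.
assert validate_date_range(date(2024, 5, 1), date(2024, 4, 1)) == [
    "start date must not be after end date"
]

# The happy path that an automation-first approach would have covered anyway.
assert validate_date_range(date(2024, 4, 1), date(2024, 5, 1)) == []
```

The point is the order of operations: exploring the feature by hand surfaces the reversed-range scenario first, and only then does it become a cheap, permanent automated check.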
Bad Habit #6: Creating Complicated and Flaky Tests
When I first learned how to automate user interface (UI) tests with Selenium, I automated them like they were manual tests. My tests had lots of steps and relied on implicit waits. The more steps a test has, the more likely it is that some step will fail, causing the entire test to fail. Implicit waits are unreliable because they only wait for an element to be present in the DOM; they offer no guarantee that the element has become visible and clickable by the time the test interacts with it.
Consequently, my tests were extremely flaky.
Every morning when I arrived at work, I checked to see which tests had failed and reran all of the failures. Then I would tinker with the tests that had failed a second time to see if I could get them to work correctly. This was a tremendous waste of my time.
The Good Habit: Remember that the point of automation is to make your work easier, freeing you up to do more exploratory testing.
Automated tests should be simple, with each test checking only one thing. Take a look at your UI tests and see if they could be replaced with application programming interface (API) tests instead. API tests are faster and more reliable than UI tests because they don't depend on the browser rendering and responding. When a UI test is needed, be sure to use explicit waits rather than implicit waits to reduce flakiness.
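The reason explicit waits reduce flakiness is that they poll for a specific condition and proceed the moment it is met, instead of hoping a fixed or global timeout was long enough. In Selenium this is what WebDriverWait with expected_conditions does; the sketch below is a hypothetical stand-alone helper (not Selenium's actual implementation) that shows the underlying pattern:

```python
import time

def wait_until(condition, timeout=5.0, poll=0.1):
    """Explicit wait: poll until `condition` returns a truthy value, then
    return it immediately; fail with a clear error if the timeout expires.
    Selenium's equivalent is roughly:
        WebDriverWait(driver, 5).until(EC.element_to_be_clickable(locator))
    """
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result  # proceed as soon as the condition is met
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within timeout")
        time.sleep(poll)

# Simulate an "element" that becomes ready after a short delay: the wait
# ends as soon as it is ready, not after a fixed number of seconds.
ready_at = time.monotonic() + 0.3
assert wait_until(lambda: time.monotonic() >= ready_at) is True
```

Two properties make this pattern robust: a passing test never sleeps longer than necessary, and a failing test stops with an informative timeout error instead of stumbling on to a misleading failure further down the script.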
Bad Habit #7: Accepting a Poor User Experience
Sometimes, when we are working on a deadline and have many stories to test, we look only at the functionality of a feature. If the feature works and has no bugs, we call it done and move on.
But it’s important to remember the end users. If a user doesn’t understand what to do on the page or finds that they have to click several times in order to get something done, they will be frustrated and won’t want to use the product.
I saw an example of this recently when I was asked to fill out a survey. The questions I was asked required long answers, but the survey fields were so small that I could only see one line at a time, making it difficult to type and proofread my entry. I'm sure that the QA engineers who tested the product verified that the field could be typed in and that the entry was saved, but they didn't consider how difficult it would be to use.
The Good Habit: Always think of your end users when testing your application. Find out from your product owner what the expected workflows are and run through those workflows. Ask yourself what you would think of the product’s behavior if you were the end user rather than the tester. If the behavior would frustrate you, advocate for a change in the behavior.
Remember Why You Are Testing
In our daily work as testers, it’s easy to get distracted by deadlines and technical challenges. We are great at focusing on the minutiae of software, which is why we are good at finding bugs. But we must never lose sight of why our company exists: to create software that people will use.
The end result of all of our tasks must be the assurance that a user will be able to use our product intuitively, safely, and easily. When we consistently focus on the quality of the products our team is delivering, we earn the reputation of being effective QA engineers as well as the respect and trust of our developers, product owners, and leaders.