“Should testers be technical?” This question has come up frequently, especially in the last decade, as more and more organizations have adopted the Agile approach and people agree that testers have to do more than before.
But should they be technical?
Recently, a discussion took place within the Thoughtworks testing group about a blog post titled "Should testers be technical?" written by Sreedevi Vedula. Here are some great viewpoints shared by Thoughtworkers.
Yes, coding is definitely a kind of technical skill. At the beginning of this century, the industry needed many testers for black-box and exploratory testing to verify functionality, since system behaviors had become much more complicated than ever. Some of those testers never learned coding, or knew only a little about it.
“I agree with you that testers need to understand the system. But there is a big jump to being able to code and talking technical on how a system works. I reckon it's more about changing your mentality: build vs test quality. For that you don't have to be able to code. You have a dev that can sit next to you to help with that.”, Torsten Leibrich says.
I couldn’t agree more, because automation does repetitive work for us so that we can spend more time on deep or complicated tests. Moreover, we can think and talk in depth with developers and ask them to help us check the software’s internal quality. That’s why I like this quote from Sir Tony Hoare (I changed the word “testing” to “testers”):
"The real value of tester is not that they detect bugs in the code but that they detect inadequacies in the methods, concentration, and skills of those who design and produce the code." I think that's where Tester standing on in this domain.
“Perhaps there is a happy medium between the "full-on manual tester" and the "uber technical tester who can code". White box testing is still a great skill to learn (or upskill). Instead of learning how to code, testers can start to learn how to read and understand how the code works (or just the flow). Not only is that a nice introduction to programming, it will also help them debug and find the root cause of a bug instead of offloading it to the developer. Furthermore, they can also understand what a stack trace is and look at the line in the code that throws the exception, which might indicate why it failed.” - Thien-An Mac
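To make the stack-trace point concrete, here is a minimal sketch in Python. The function name, the seeded bug, and the scenario are all hypothetical; the point is only that the last frame of a traceback names the file, line, and function where the exception was thrown, which is exactly what a tester can put in a bug report.

```python
import traceback

def apply_discount(price, rate):
    # Hypothetical seeded bug: dividing by the rate instead of
    # multiplying by it, so a rate of 0 raises ZeroDivisionError.
    return price / rate

try:
    apply_discount(100.0, 0)
except ZeroDivisionError:
    trace = traceback.format_exc()
    print(trace)
```

The bottom frame of the printed traceback reads `in apply_discount` with the offending line, so the report can say "the discount calculation divides by the rate" rather than just "the discount page crashed".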
Technical skills are good, but there is more to testing than that.
“Isn't there a home for the more business analytical testers? IMHO most enterprise systems are complex enough that it's a challenge to keep all the elements 'in your head', but many of the strong analytical testers *are* the ones who can tell me most about a system. They're the ones who say "If you do that, then X system downstream will do Y" -- they understand how a change will ripple through the system.” - Tim Brown
If we're doing a good job of separating our test concerns -- test definition, implementation, and driver (h/t to Twist) -- then the more analytical types I'm referring to might very well define the user journeys & system flows, while the more technical testers might do the automation implementation, and the most technical handle the driver layer.
Darrell Grainger has worked with many testers who do not program or have any idea how test automation works. “However I would hire them in a heartbeat over many technical testers. Just on my current project the QA department is actually separated into manual testers and automated testers. The manual testers really understood the business. During inception we found that the application we were asked to write (1/6 of all the software this department uses) was incredibly complex. The SMEs are constantly re-explaining things to me because it is so hard to keep all the details straight.”
“One of the automation testers was explaining the automation to a manual tester because she felt she really needed to learn automation or she would be obsolete. As he explained a test case he was automating she immediately pointed out the flaw in test and how it completely under-estimated the complexity of the software it was testing. Essentially the test wasn't sufficient. When she explained why the test was insufficient I explained back what I thought the actual test should be to see if I got it. She then pointed out that my test was too complex and there were equivalences which could make the test, literally, 10 times less complex than my test but still be 5 times more complex than the original test. Knowing how to manually test the correct things is FAR more valuable than knowing how to automate the wrong test. They have a dozen highly skilled automation testers. They are having a hard time finding people who understand the business well enough to know how to test it. She is one of three manual testers. She is what Tim is talking about.”
“That is one example of why a manual tester adds value. I'm not going to say she is better than the automation tester. She knows WHAT to test (manually). The other guy knows HOW to test it over and over again without burning out (automation). Neither is better. They are just different,” said Darrell Grainger.
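The equivalence-class idea in the story above is what testers call equivalence partitioning: inputs that exercise the same behavior belong to one class, so one representative per class (plus the boundaries) covers what hundreds of exhaustive cases would. A minimal sketch, with a made-up shipping-fee rule standing in for the real system:

```python
def shipping_fee(weight_kg):
    """Hypothetical rule: the fee depends only on the weight band."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 5
    if weight_kg <= 10:
        return 9
    return 20

# One representative per equivalence class, plus the boundary values,
# instead of testing hundreds of individual weights.
cases = {0.5: 5, 1: 5, 1.01: 9, 10: 9, 10.01: 20}
for weight, expected in cases.items():
    assert shipping_fee(weight) == expected
print("all classes covered")
```

Knowing where the class boundaries really lie is exactly the business knowledge the manual tester in the story contributed; the automation only has value once the right classes are chosen.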
As software development methodology evolves, we need to evolve too. With Agile and incremental releases, the team is under tremendous pressure to live up to the “definition of done”. Does that mean manual testing / manual testers are out of the picture now?
“As pointed out by various people in this topic, I don’t see any disagreement on the need for manual testing. However, manual testing itself may be a misnomer - now more than before!” Anand Bagmar says.
“To me, manual testing means - someone, at some point in time, has identified test cases and written them down in Excel / QC / or any other tool. Then the same person, or another set of people, over a period of time go and manually execute the same test cases (with or without following all the steps listed) and report success / failure. That is the kind of testing / testers that I definitely do not see adding much value on teams (in most cases).”
HOWEVER, the manual testers that most are referring to in this post and thread are more evolved beings. They are the SMEs of the product. They understand what needs to be done, and why, from the domain as well as the product perspective, to ensure it works fine for the end-users of the system. I strongly believe that every team needs people with this mindset. They may not be separate people / roles, but someone who thinks constantly from this perspective. I think it is high time we stop diluting their worth and value by simply calling them “manual testers” who do “manual testing”. Maybe we call these folks “exploratory testers”, or “system testers”, or “domain testers / validators” … or something more fancy - but we need to break out of this mental model of what “manual testers” represent and imply - which means different things in different parts of the world. Another advantage this will potentially bring is setting the correct expectations for these roles.
Maybe it’s a good opportunity to look back at something: what software testing is, and what skills software testers (hereafter, testers) need - in the last decade and now.
From Wikipedia’s definition of software testing, you can see that software testing requires many skills and involves many technologies.
So my opinion is that testers should be technical, but being technical is not just about coding - there are many more technologies involved in testing activities.
Finally, I will repost this comment from Anand Bagmar:
I have been seeing people tend to go from one extreme stance of - QAs should do Test Automation, to the other side, QAs should NOT do Test Automation. In my opinion, there are good reasons why we need this variety of QA profiles - mainly because we have different types of projects, different duration, different distribution, different domains, etc.
My friends, what would you say and what are you going to do?
Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.