I attended CITCON Sydney 2013 last weekend. Let me start by saying that if you haven't attended one, you don't know what you're missing out on! These conferences are held all around the world and they are a great forum for exchanging ideas. If there is one happening near you, I strongly suggest you attend!
The creators describe CITCON as:
The Continuous Integration and Testing Conference, a world-wide series of free Open Spaces events for developer-testers, tester-developers and anyone else with an interest in Continuous Integration and the type of Testing that goes along with it.
In a nutshell, you show up on a Friday night, and after a brief introduction to how CITCON works (by the way, it is pronounced 'kit-con'!), everyone present has the opportunity to suggest topics. Once everyone interested has suggested topics, we all vote for the ones we'd like to attend. Hence the agenda for the conference is set by the attendees themselves. What a genius idea!
Should testers learn to code?
In one of the sessions I attended, the topic of programmers learning to be better testers came up. As far as I could tell, the room had more testers than developers, I'd say at a 2:1 ratio. So when this subject came up, no objections were raised and almost everyone agreed that programmers could be better at what they do if they became better testers (or learned to think a little more like testers). However, when the reverse argument popped up, it was a different story altogether.
It was interesting to see how testers reacted to the comment that we could become better testers by learning to code. There was a lively debate among the testers present: some were in favor, others were not.
As a tester who knows a bit about coding, I find that after I learned to program I was able to find different types of bugs, communicate in more meaningful ways with developers, and automate a few repetitive tests for myself and my team. In my opinion, the time I spent learning how to code was time well spent.
During that CITCON workshop, someone raised a very valid point: we are all in this together, and we (meaning testers and developers) should have the bigger picture in mind (i.e. the product we are delivering, the customer and the users). Therefore it's not only important for developers to learn better testing skills, but it is just as important for testers to learn coding skills.
It is intriguing to me that the developers in the room seemed OK with the fact that they should learn testing skills, yet the same was not true for all the testers. I don't want to generalise here; the sample population from both disciplines in the room was extremely small. Nonetheless, this is not the first time I have heard this argument, and I believe there are good arguments going both ways.
Some good testers cannot code
I once asked Elisabeth Hendrickson this very question and she said that she does not believe every tester should know how to code. Elisabeth wrote an interesting post about this topic, and in it she expressed her views that there are very good testers out there who cannot code:
Testers who specialize in exploratory testing bring a different and extremely valuable set of skills to the party. Good testers have critical thinking, analytical, and investigative skills. They understand risk and have a deep understanding where bugs tend to hide. They have excellent communication skills. Most good testers have some measure of technical skill such as system administration, databases, networks, etc. that lends itself to gray box testing. But some of the very best testers I’ve worked with could not have coded their way out of a For Loop
So, like anything else in context-driven testing, it depends!
Not all testers should HAVE to code
Now, let me clarify a very important point. I am not a developer, I have never been one, nor do I want to be. I know how to code well enough to help me be a better tester, but that does not make me a developer by any stretch of the imagination. There is an important distinction between testers having to code and testers knowing how to code. Clearly, not all testers should have to code.
However, in my experience, knowing a programming language gives a tester a better perspective when testing. It helps facilitate dialogue between testers and developers, and it comes in handy more often than not. Michael Bolton's post At Least Three Good Reasons for Testers to Learn to Program names those reasons as tooling, insight and humility; it makes compelling arguments and is well worth reading.
Not all testers enjoy coding
I am a tester, and I enjoy the challenge and the craft of testing. I learned a bit of Java and C# at uni, and I have since decided to brush up on those skills; I'm currently working towards a Java certification. I have been able to use the skills I'm learning to simplify testing tasks, such as writing Selenium scripts for data creation and regression checking.
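To give a flavour of what I mean by that, here is a minimal sketch of the kind of Selenium WebDriver regression check I'm describing, written in Java. The URL, element locators and expected text are invented for illustration; they don't come from any real project:

```java
// A minimal sketch of a Selenium regression check.
// The URL, element IDs and expected heading are hypothetical placeholders.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginRegressionCheck {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            // Drive the (hypothetical) application under test through a login.
            driver.get("https://example.com/login");
            driver.findElement(By.id("username")).sendKeys("testuser");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("submit")).click();

            // A basic check: did we land on the dashboard?
            String heading = driver.findElement(By.tagName("h1")).getText();
            if (!heading.contains("Dashboard")) {
                throw new AssertionError("Expected dashboard after login, got: " + heading);
            }
            System.out.println("Login regression check passed.");
        } finally {
            driver.quit();
        }
    }
}
```

Nothing fancy, but a script like this, run before every build, frees me from repeating the same click-through by hand and lets me spend that time on exploratory testing instead.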
I, however, enjoy coding. An argument I have heard is that not all testers enjoy it, and that is OK. Testing skills and programming skills are very different. But the question here is: just as programmers who learn better testing skills become better programmers and better team members, wouldn't the same be true for testers who know how to code?
Of course, as with anything in testing, it depends. It depends on the person, on the type of testing they are doing, on the application they are testing.
Increased employability
Finally, as Elisabeth Hendrickson explains in her post, 79% of testing jobs advertised in the USA require some form of programming skill, from basic to advanced, and up to 90% of testing jobs in agile environments require the same. Elisabeth concludes that, at a minimum, professional testers should know SQL.
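To illustrate the level of SQL she has in mind, here is a small sketch of a data-integrity query a tester might run, wrapped in plain JDBC. The connection string, credentials, tables and columns are all invented for the example:

```java
// A sketch of a simple data-integrity check a tester might run.
// The connection details and schema (orders, customers) are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OrphanedOrdersCheck {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost/shop", "tester", "secret");
             Statement stmt = conn.createStatement();
             // Orders whose customer no longer exists: a classic integrity bug.
             ResultSet rs = stmt.executeQuery(
                 "SELECT COUNT(*) FROM orders o " +
                 "LEFT JOIN customers c ON o.customer_id = c.id " +
                 "WHERE c.id IS NULL")) {
            rs.next();
            System.out.println("Orphaned orders found: " + rs.getInt(1));
        }
    }
}
```

A tester who can write a join like that can verify what the application actually stored, rather than trusting what the UI displays.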
Inevitably, testers who wish to remain current and competitive in the international job market will need to learn at least the basics of a programming language, whether they like it or not.
Look at race car drivers, for example. They are not required to know mechanics or engineering, yet you'd be hard pressed to find a race car driver who does not know the basics of mechanics or who is not interested in how their car works. They do not need to be fully qualified mechanics, but they know they are the ones racing the car, and sooner or later they'll need to troubleshoot issues they encounter on the racetrack. The more they know about engines, the earlier they can detect issues and relay them to the mechanics team. In the same way, testers do not need to know how to code, and there are great testers out there who don't. However, software is our craft; it is our race car's engine, and if we don't know the first thing about how it runs, we could be missing an opportunity to become even better at what we do. And if there are good testers who don't know how to code, imagine how much better they could be if they did.
I love this topic… I'm wondering how long I can keep up the battle for the 'we don't need to code' team. Not long, I suspect.
As you point out from Elisabeth's post, the majority of testing-related job adverts seek coding skills, even here in Australia. So I may have to wave the white flag soon.
Having said all of that… I would love to know how to code, but I’m yet to have a job where it has been required. This leaves my ‘spare time’, of which there isn’t much. Now, given the choice of learning to code, or learning something else about testing in my spare time… the something else always wins.
So, you’ve mentioned… “In my opinion the time I spent learning how to code was time well spent.” What if you spent that time learning about critical thinking? Time well spent as well? What if developers spent their time learning how to code better, and testers spent their time learning how to test better? Then again, you could argue that by them learning the other discipline, they are in fact becoming better at their own. I’m now confusing myself!
So, give me a job where I need it… please! Because I fear that will be the only thing that pushes me hard enough to do it.
Great post Ale.
Hi David,
Thanks for your comments!
You make a very valid argument about time management. Time is such a limited resource that we have to think carefully about where and how to spend it.
Testers learning how to test better and developers learning how to code better is certainly the way to go. But what if, by learning to code, testers are learning to test better, and by learning to test, developers are learning to code better?
The dilemma of using time well is not specific to learning how to code. Time spent learning about psychology has helped me be a better tester too! This applies to many other disciplines, so I guess whatever makes you a better tester (or developer, or BA…) is time well spent!
Good luck holding the fort – someone’s got to do it 😉
As a designer obsessed with code, I tend to agree, Ale. I’d add that the question of whether time was ‘well spent’ in a product development context is ultimately measured in terms of the benefit to the product and user. Isn’t it?
Pin a medal on whoever can devise a formula to quantify that!
In the abstract, I subscribe to the belief that nothing is ever 100% irrelevant. If you don’t often venture beyond the boundaries of your discipline, then you are denying yourself countless serendipitous breakthroughs that might enrich your work and value. This goes for anything: psychology, art, code, geopolitics, or Battlestar Galactica (a really nice combo of the other four), etc. etc.
When it comes to learning to code, specifically, I have two cautions:
First, learning to code should never be the end in itself, but a means to better collaboration with developers, better understanding of underlying systems, and better product. I think you’ve outlined this well.
Second, a major danger inherent to learning code is that it brings you closer to the ‘implementation model’ of a system. The implementation model can be very, very different from your users’ mental model of how things work or how things are supposed to work. By learning code, one can become too sympathetic with developer priorities and lose sight of the goals of the business, the product, and the user.
For instance, edge cases – where developers spend most of their time beating their head against a wall – can be blown totally out of proportion. When implementing a new feature, a developer might spend 80% of their time devising a clever workaround or shim to fix a problem that affects less than 1% of the user population. It’s an unavoidable circumstance. It’s best for non-developers to keep their distance and a cautious, constant eye on the base rate probabilities that point to the REAL risks.
Alan Cooper beats this horse to glue in “The Inmates are Running the Asylum,” which I highly recommend.
This is one of my favorite topics. Frans Johansson writes brilliantly about it in "The Medici Effect": how innovation usually happens at the intersection of ideas, fields and expertise, and how knowledge of disciplines that sound completely unrelated can have a profound effect on one another.
I also agree with your second point above: sometimes independence of duties can bring a 'fresh set of eyes' to problems, which raises the question of how much to learn, and how much to consciously not learn, about any given subject.
Thanks for the thoughts!