Integrating QA Into Agile Web Development – 352 Noodles & Doodles Episode 16

Is there a way to shave years off of the trial and error of implementing Agile?
Find out now.

This may be hard to hear, but if you’re working in agile development, there’s a chance that you’re implementing quality assurance completely wrong. 352 director of quality assurance Chris Manning shows you the right way to integrate your QA team into an agile development process.

Transcript below:
My name is Chris Manning; I’m director of quality assurance at 352. Now, what if I were to tell you that everything you knew about how QA should work in an agile environment was wrong? Well, hold on, let me explain.

In the current QA environment, QA is off to the side. You schedule a day for testing and test the backlog toward the end of the sprint. QA is its own team, and it’s not accounted for during the effort-pointing process. There is little to no client interaction.

Well, how should it work? You should be integrated with the teams. You should be testing the backlog as work is completed and sitting in on phone calls to hear the client’s expectations. You also need to be a part of the effort-pointing process. Right now it works a little like this: developers complete their tasks, they mark them ready for QA, and the project manager gets with you toward the end of the sprint to schedule testing. What could go wrong? You have everything you need and can get everything done in a short, concise period of time. It sounds pretty good so far.

Well, you weren’t a part of the effort-pointing process. This means you can get low-balled on time, leaving you little to no time to test, and you’re scrambling at the end trying to get everything tested, meaning work can get released without being fully tested. Also, you weren’t a part of any of the client interaction, so you don’t know the scope of the website. You could be testing for 5,000 users in your load test when the client only wants 5-10 users to be sustained on the website. This leads to a lot of wasted effort on your part, and you could be submitting bugs for performance issues that just don’t matter.

Well, how should it work? It should work like this: be a part of client phone calls, and be a part of sprint planning. This puts the scope of the website, the demographic, the goals, and the potential load of the website all in perspective. Now you’re testing leaner and more efficiently, because you know the scope and you know what the client wants out of the website.
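The point about scoping load tests to what the client actually expects can be sketched in a few lines. This is a minimal, hypothetical example (not 352’s tooling): `run_load_test` simulates a handful of concurrent users, and the request function passed in is a stub standing in for a real HTTP call to the site under test.

```python
# Hedged sketch: size the load test to the client's actual expected
# concurrency (here, ~10 users, not 5,000). The request function is a
# hypothetical stand-in for a real HTTP call against the site.
from concurrent.futures import ThreadPoolExecutor
import time

def run_load_test(request_fn, concurrent_users=10, requests_per_user=5):
    """Fire requests_per_user calls from each simulated user and
    return all observed latencies in seconds."""
    def one_user(_):
        latencies = []
        for _ in range(requests_per_user):
            start = time.perf_counter()
            request_fn()
            latencies.append(time.perf_counter() - start)
        return latencies

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = pool.map(one_user, range(concurrent_users))
    # Flatten per-user latency lists into one list.
    return [lat for user in results for lat in user]

# Example with a stub in place of a real request:
latencies = run_load_test(lambda: time.sleep(0.001), concurrent_users=10)
print(f"{len(latencies)} requests, max latency {max(latencies):.3f}s")
```

Knowing from the client calls that 5-10 concurrent users is the real target, a test like this takes seconds and avoids filing performance bugs that only appear at loads the site will never see.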

Next, be a part of sprint planning, and the effort-pointing process in particular. That’s important. Now you can talk to the team directly about how hard you think a task is going to be to test, so that too much work isn’t taken on in the sprint, leaving you no time to test at the end. Next, be a part of daily stand-ups. You have daily communication with the team, and you can tell them what you’re waiting on. This creates accountability for the developers: now they know you’re waiting on their work, so they can complete it on time and as needed.

Working as a team closes the loop. Right now it’s set up like this: you have your sprint level, you have your story level, and then you have your tasks. If you wait until the end of the sprint to test, you’ll have on average maybe 12 stories per sprint; say you fail 5 of them. Now the entire team gets five tickets kicked all the way back up to the sprint level at once. This is bad, because now they’re stuck in a loop, going from the sprint all the way back down to the task level.

What should happen instead is that you test as work is completed: as a task is finished, you test it; as a story is finished, you test it. This keeps the loop small. If a task fails, you go back to the task level; if a story fails, you go back to the story level. Now you’re testing far more efficiently, and it leaves time open for you to do other things like exploratory testing and regression testing, instead of wasting your time on solely verification and validation testing.
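The small-loop idea above can be sketched as a toy workflow. This is an illustration, not a real tracker integration; the task names and the `passes` flag are made up. Each task is verified as it lands, so a failure reopens only that task instead of piling up as sprint-level tickets at the end.

```python
# Toy sketch of testing work as it is completed: a failure loops back
# to the individual task, not to the whole sprint. Task names and the
# "passes" flag are hypothetical stand-ins for real QA verification.

def verify(task):
    """Stand-in for QA verification of a single completed task."""
    return task.get("passes", True)

def process_sprint(tasks):
    """Test each task as it lands; a failed task is reopened at the
    task level instead of accumulating until sprint-end testing."""
    done, reopened = [], []
    for task in tasks:
        if verify(task):
            done.append(task["name"])
        else:
            reopened.append(task["name"])  # loops back to task level only
    return done, reopened

done, reopened = process_sprint([
    {"name": "login form", "passes": True},
    {"name": "search results", "passes": False},
    {"name": "checkout flow", "passes": True},
])
print("done:", done, "| reopened:", reopened)
```

With batch testing at sprint end, all three items would be judged at once and any failure would bounce the whole batch; here only "search results" goes back, while the rest of the sprint stays closed.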

Hopefully this has been helpful to you. My name is Chris Manning, feel free to check back for more videos later.
