One way to blur the dev/test line: Testers doing Code Review

[Image: Crucible icon, by Atlassian via Flickr]

Today I participated in a webinar with Alan Page to learn about testers doing code reviews.  Since Atlassian makes the awesome Crucible code review tool (which has a $10 starter license), we’ve got code reviews all over the place.  Part of my check-in process for automated tests is to have them code reviewed.  I’ve never, however, considered doing code reviews for the devs.  Thus, I tuned in to see how it had worked for Alan and his team.  These are some notes that I made.  Update: Here is a recording of the webinar.

When testers review code, it helps them ask, “In what ways may this code fail?”

The “official” type of code review is the Fagan inspection: everyone preps for the meeting, then a moderator and reviewers go through the code line by line. It’s very slow and boring, but massively successful; the trade-off is that inspections work extremely well but take a very long time. (So I’d guess that would be good for mission-critical 911/medical-type stuff.)

Alan’s approach was less formal: hold a kickoff meeting and ask for volunteers. The expectation is that code review happens every day; as with guitar practice, 15 minutes a day is better than 2 hours crammed into one day a week. His star group was able to review 500 lines of code per week, which works out to roughly 100 lines per 15-minute sitting.

Checklists play a major role in the review:  “I wouldn’t do a code review without one.”

The Checklist Manifesto: Atul Gawande’s book about how checklists were used to make surgery much safer.

The Invisible Gorilla: a book about how the brain works. The authors are responsible for the YouTube video that teaches about the effects of inattentional blindness (if you’re looking for one thing, you’ll miss another).

Checklists should be modified over time.  Stuff that becomes less relevant should be removed.

It’s better to use a checklist of error types and make a pass through the code looking for one specific type of error, such as off-by-one or assignment errors, rather than reading sequentially through the code. (Or to have a checklist based on usage patterns.)
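To make that concrete, here’s a minimal, hypothetical sketch of what two single-error-type passes would be hunting for (the functions are made up for illustration):

```python
# Pass 1: scan only for off-by-one errors.
def last_n(items, n):
    # Off-by-one: this keeps n + 1 items; should be items[len(items) - n:]
    return items[len(items) - n - 1:]


# Pass 2: scan only for assignment errors (assigning where a copy was intended).
def snapshot(settings):
    # Assigns a reference, not a copy: any later mutation of `settings`
    # silently changes the "snapshot" too. Should be dict(settings).
    return settings
```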

Alan thinks most of his checklists have 8 or 9 items, which makes a lot of sense: in psychology, I’ve heard about humans being able to hold about 7 ± 2 items in working memory at a time.
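For illustration only, here’s roughly what a checklist at that size might look like (these items are my guesses, not Alan’s actual list):

```python
# A hypothetical error-type checklist, capped at 8 items per the 7 +/- 2 guideline.
REVIEW_CHECKLIST = [
    "Off-by-one errors in loops and slices",
    "Assignment where a comparison (or a copy) was intended",
    "Unchecked return values and swallowed errors",
    "Resource leaks: files, sockets, locks not released",
    "Null dereferences on unhappy paths",
    "Boundary conditions: empty, maximum, negative inputs",
    "Shared state mutated without synchronization",
    "Comments that no longer match the code",
]
```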

What will you find:

Code where lots of bugs have already been found probably has more of them.

Bugs are likely to be found wherever there is churn.

Greater complexity (measured by any of some 20 different complexity metrics) indicates bugs and can guide where to start the review.
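Churn, at least, is cheap to measure. Here’s a rough sketch (assuming a git repository; the time window is arbitrary) that ranks files by how often they’ve changed, as one way to pick where to start:

```python
import subprocess
from collections import Counter

def churn_by_file(since="6 months ago", top=10):
    """Rank files by number of commits touching them; high churn
    suggests where to start reviewing."""
    out = subprocess.run(
        ["git", "log", f"--since={since}", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    changed = [line for line in out.splitlines() if line.strip()]
    return Counter(changed).most_common(top)

if __name__ == "__main__":
    for path, commits in churn_by_file():
        print(f"{commits:4d}  {path}")
```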

So what happened:

They found bugs

They learned much more about product code

They found areas where tests needed to be written

Stuff they found:

Instead of using a code review tool, they used Word. Lots of the comments were along the lines of, “Should we be doing it this way?”

Devs get sensitive about their code, so it’s important to remember that code reviews are about the code, not the person. One way to alleviate this is to refer to all problems with “we”: “This is our problem. This is something we should fix.” The dialog improved the relationship between dev and test. The big wins were the intangibles: understanding the code and developing a better relationship with the devs. It was also good for teaching non-coding testers, much the way it’s easier to read a language than to write it.

One challenge was to find a way to keep comments and code in sync.

Inspection SWAT:  Get the good reviewers together, find a risky part of the code and go at it for an hour or so.

Code Review of a Test

I asked about doing code review for tests, since this is what I’ve been doing lately. Alan suggested that I use a different code review checklist for automated tests, and offered a few items (a sketch pulling them together follows the list):

Are there hardcoded paths?

Will the test fail if the app is taken away?

Consistency and reliability: what causes the test to vary from run to run?

What happens when the test fails? Is the information useful?

The review should include a copy of the log file that the test produces; the key bits of collateral from an automated test are its results and its log file.

What would the mean time to diagnosis be in case of failure?
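Pulling those together, here’s a minimal sketch of a test written with that checklist in mind (the app_client fixture, paths, and status codes are all hypothetical):

```python
import logging
import os
import pathlib

log = logging.getLogger("bad_password_test")

# Checklist: no hardcoded paths. Resolve the data directory from the
# environment, with a relative default, rather than e.g. C:\tests\data.
DATA_DIR = pathlib.Path(os.environ.get("TEST_DATA_DIR", "testdata"))

def test_login_rejects_bad_password(app_client):  # app_client: hypothetical fixture
    creds = (DATA_DIR / "bad_creds.json").read_text()

    # Checklist: the log file is key collateral; record enough that a
    # failure can be diagnosed without re-running the test.
    log.info("submitting credentials from %s", DATA_DIR / "bad_creds.json")

    response = app_client.login(creds)

    # Checklist: will this fail if the app is taken away? Yes: the login
    # call itself errors out rather than silently passing.
    # Checklist: when it fails, is the information useful? The message
    # carries the actual status and body, shortening time to diagnosis.
    assert response.status == 401, (
        f"expected 401 for bad credentials, got {response.status}; "
        f"body={response.body!r}"
    )
```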

I also asked how much of the production code the testers knew before they started doing code reviews.  Alan said that his team knew bits and pieces, but did not have a very thorough understanding.  Doing the code reviews changed this.

The main benefits:

Testers learned much more about the production code

The relationship between testers and devs improved

They found bugs

Alan also thinks that code reviews are a good way to show developers new to the project what’s been going on with the code.

One of Alan’s stand-out concluding remarks was, “I couldn’t not do them now.”


1 thought on “One way to blur the dev/test line: Testers doing Code Review”

  1. This is an excellent tip for working on the relationship with the developers. One of the keys to success is being able to work well together.

    Another is pairing programmer and tester when the tester is focusing on the programmer’s specific features.

    The developer and tester analysing the coverage of the unit tests compared to that of the higher-level tests can also be fruitful.
