There are no ‘requirements’ in agile!!

It surprises me to hear staunch agile followers still using the word “requirement”.  When questioned, they will respond, “…a story is a requirement.”  It isn’t.  Early in my reading of agile I came across this analysis of the word “requirements”:

“Software development has been steered wrong by the word “requirement”, defined in the dictionary as “something mandatory or obligatory.”  The word carries a connotation of absolutism and permanence, inhibitors to embracing change.  And the word “requirements” is just plain wrong.  Out of one thousand pages of “requirements”, if you deploy a system with the right 20% or 10% or even 5%, you will likely realize all of the business benefit envisioned for the whole system.  So what were the other 80%?  Not “requirements”; they weren’t really mandatory or obligatory.”

(I am leaving out the reference to this quote so there is no bias.  If you don’t recognize the quote or the style of writing it is worth your time trying to find the source.)

In agile, instead of requirements you have user stories.  It takes quite some time to make the transition from requirements to stories.  To me, after reading the previous quote, Mike Cohn’s statement about stories was essential: “A story is a reminder to have a conversation.”  Of course it is difficult to create good stories; Mike Cohn has written an entire book about that.

Alas, “requirements” are even more entrenched in the software testing world.  Interestingly, back in 1999, a great tester wrote a wonderful article about requirements, in which he wrote:

There are at least four alleged truisms about testing a product against requirements. Most of the testing textbooks on my bookshelf promote these principles, and each principle reflects some truth about the dynamics of testing.

  1. Without stated requirements, no testing is possible.
  2. A software product must satisfy its stated requirements.
  3. All test cases should be traceable to one or more stated requirements, and vice versa.
  4. Requirements must be stated in testable terms.

When we think in terms of risk, however, I believe a richer set of ideas emerges.

(I am again leaving out the source to avoid bias.  You can easily search for the pdf article. )

If you haven’t realized it by now, keep in mind there are no “requirements” in waterfall either!!

In the “waterfall” world (pre-agile), there was some good work on “requirements”.  Making the transition from that world to a world of “user stories” leaves many questions open.  I still believe that, with a good agile coach and a good testing coach, teams are better off making that transition.  However, writing stories and trying to determine what might go wrong can require a lot of work.

How I learned agile (far from done)

I started on agile late (yet again!!).   When trying to understand agile, it is difficult to search for conceptual information on the web or in books; you can only search for standard terms such as TDD or pair programming.  Below I’ve listed the important sources that helped me understand agile.

James Shore has a great blog post on how he chooses different types of tests.  What is interesting is the sentiment that developers want to prevent defects (no matter what).  James provides his blueprint for how he plans to do that. (Sarcastic comments about developers won’t be appreciated here.)  James uses exploratory testers (I think he means manual testers) to check whether his process is broken.  I have seen many XP developers make such claims.  Having never worked with such developers, I won’t judge what they say, but I think just having such a mindset is great (it doesn’t matter if good testers still find critical defects).

After sulking for months because developers on agile teams weren’t like James Shore, I was advised by Dave Nicolette on LinkedIn that he would ‘let developers make that choice’ – what their mission is and how they will accomplish it.  You can’t hold a gun to a developer’s head and insist on a mission statement like James Shore’s.  I’m not sure I completely agree, and I wish someone would convince me, but I know Dave is correct.

Brian Marick has a nice series of blog posts on agile testing.  Brian understands testing/exploratory testing and is generally a super sharp guy.  Circa 1995, Brian described how he would staff an agile test team – a few exploratory testers along with testers and developers (not his exact words).  From all I have seen (and I’ve read it all), this perfectly summarizes how testing should be done in agile teams (and I am not merely appealing to authority).

Somewhere along the line, I happened upon the context-driven testing school and the related Yahoo newsgroup.  This group of testers exemplifies smart.  I also realized that “Lessons Learned in Software Testing” (Kaner, Bach, Pettichord) is scripture on testing.

Skill matters.  I figured this out from the responses on the Yahoo newsgroups on scrum and extreme programming, and I was also gently reminded of it by Ron Jeffries wielding a sledgehammer on the scrum newsgroup.  (As an aside, both newsgroups are great sources of information.  The scrum newsgroup is probably more forgiving of silly questions (I know).)

The one contribution that stands out from James Lyndsay’s excellent paper on agile is that, in testing, skill matters.  Of course, today many smart testers know this.  Process is not a proxy for skill. (All that remains is to define what you mean by testing skills – see Chapter 2 of the only book on testing you will ever need to read.)

“Lessons Learned in Software Testing” explains the difference between iterative and waterfall:
Lesson 165: Waterfall lifecycles pit reliability against time.
Lesson 166: Evolutionary lifecycles pit features against time.
If you have a task with a fixed deadline and fixed scope, you shouldn’t use an evolutionary lifecycle!!

Can you use scrum for non-development activity?  This was an important conceptual question for me.  Jeff Sutherland has some articles on using scrum outside the development environment.  My take: if you are doing software development, you shouldn’t bother with those.  If you want to learn agile, start with a meaty development project.

The term exploratory testing is used frequently among agile bloggers/writers.  What is exploratory testing?  Smart testers always (well, mostly) explore; i.e., testing is synonymous with exploratory testing.  (Does this mean you aren’t smart if you don’t explore?  Not necessarily.  If you tell me what you do, I can comment on whether you are a smart tester or not; you may be doing exploratory testing under the hood.)  To understand exploratory testing, watch Cem Kaner’s video (download it if it doesn’t load in the browser) and note that he has been doing exploratory testing for decades – it’s not new.  (Aside: if you find yourself asking questions such as, “…do you create test plans in agile…”, you first need to figure out what you do/should do before agile/in waterfall.  You might have been wrong about testing all along.)

Test-driven development is an extremely compelling concept in agile/software development.  If you are a tester (or developer), you should understand TDD, and the only way to understand TDD is by actually trying to use it.  Use Brian Marick’s book to get a feel for it (if you are a developer, there are plenty of other choices).  Also note that TDD is not about testing; it is about design.
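The rhythm of TDD is easy to sketch.  Here is a toy example of my own in Python (not from Marick’s book or anyone else’s) showing the red–green cycle: write a failing test, write just enough code to pass it, repeat, then refactor.

```python
# A minimal red-green TDD sketch.  The function and test names are my
# own invention for illustration.

def test_leading_zeros_are_stripped():
    # Step 1 (red): this test was written before strip_leading_zeros
    # existed, so at first it failed to even run.
    assert strip_leading_zeros("007") == "7"

def test_all_zeros_leaves_a_single_zero():
    # Step 3 (red again): a new failing test drives the next behaviour.
    assert strip_leading_zeros("000") == "0"

def strip_leading_zeros(s):
    # Steps 2 and 4 (green): the simplest code that passes all tests
    # written so far.
    stripped = s.lstrip("0")
    return stripped if stripped else "0"

# Run the tests; in real TDD you would rerun them after every change.
test_leading_zeros_are_stripped()
test_all_zeros_leaves_a_single_zero()
```

The point of the exercise is not the trivial function; it is that each test exists before the code it exercises, so the tests end up driving the design – which is why TDD is about design, not testing.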

In James Shore’s book, Elisabeth Hendrickson gives a nice account of what an exploratory testing session looks like.

I love Kent Beck’s writing, especially “Extreme Programming Explained”.  If you do anything related to software development, you should own this book.  It gave me a great perspective on agile.

Mike Cohn is another amazing writer/thinker on agile.  He has two excellent books, and I have read a lot of his writing and his website.