A common challenge within IT departments right now is to understand the implications of
disciplined agile
software development. A critical step toward understanding agile development is to observe an actual Agile team
in action and see the results for yourself. Unfortunately there are no official criteria for determining whether
or not a team is Agile, and worse yet there are many "code & fix" teams claiming to be agile. I often run into
traditional IT professionals who are confused about Agile because they mistake code & fix approaches for
Agile ones. Worse yet, they then propagate their misunderstandings to others, compounding the problem.
At the time of this writing there is no official definition of what Agile is, and many people want to use the
values and
principles of the
Agile Manifesto as the definition.
Although I respect this opinion, and wish it were that simple, my observation is that the values and principles
are too vague for determining whether or not a team is taking an agile approach. When I "evaluate" a
team, here's what I look for:
Is the team producing value to their stakeholders on a regular basis? In the Scrum methodology they talk
about delivering "potentially shippable software" every sprint/iteration, and that's a good start. In the
Disciplined Agile (DA) tool kit we have a more robust view in that we believe that a disciplined agile team
will produce a consumable solution every iteration, bringing in the idea that the team is producing far more
than just working software (they're also producing supporting documentation, changing the business process,
hopefully delivering usable software, ...). When I'm assessing a team I expect them to minimally be able to show
me the working software that they've built to date as well as the source code. The code should be consistent
because it will have been written to conform to a common set of guidelines and the team will
refactor whenever appropriate
to ensure that it remains of high quality. If it is early in the initiative, say the first week or so, then I
wouldn't expect to see much. If they're at least a few weeks into the effort then I would definitely expect to
see some software. If the team is several months into the initiative then I would also expect to see evidence of a
track record of producing working software.
Figure 1, which summarizes results from the
2013 How Agile Are You? Survey, depicts the adoption rates of various strategies for potentially providing
value to stakeholders.
Figure 1. Agile criterion: Value.
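To make the refactoring point concrete, here is a small, hypothetical sketch of the kind of cleanup I'd expect to see happening routinely in a healthy code base (the function names and the tax rate are invented purely for illustration):

```python
# Before: a magic number buried in the calculation -- the kind of
# smell a disciplined team refactors away as it appears.
def invoice_total(items):
    total = 0
    for price, qty in items:
        total += price * qty
    return total * 1.13  # what is 1.13?

# After: the rate is named and the calculation is split out, so the
# code conforms to a common guideline ("no magic numbers") and the
# subtotal logic can be reused and tested on its own.
SALES_TAX_RATE = 0.13  # hypothetical rate, named for readability

def subtotal(items):
    return sum(price * qty for price, qty in items)

def invoice_total_refactored(items):
    return subtotal(items) * (1 + SALES_TAX_RATE)
```

The behavior is unchanged; what improves is consistency with the team's shared guidelines, which is exactly what I look for when reading the source code.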

Is the team doing developer regression testing, or better yet taking a test-driven approach to
development? Minimally, developers should be testing their code, to the best of their ability, at least daily in
a regression manner. With
test-driven development (TDD) you write a
single test before writing just enough code to fulfill that test. This can be done at the requirements
specification level as well as at the design level. I will ask a team to show me their test suite. I want to see it
run, by anyone on the team, and I want to see the actual source code. I typically look for a roughly 50-50
split between testing code and production code, although 60-40 or 40-60 is also reasonable depending on the
situation. If I see a 20-80 split then I know there's a problem. There are no hard and fast rules, of course.
The development team validating their own work is only one part of your overall
agile testing strategy.
Figure 2
depicts the adoption rate of various validation strategies employed by agile teams.
Figure 2. Agile criterion: Validation.
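To illustrate the test-first rhythm described above, here is a minimal, hypothetical TDD step using Python's standard unittest module (the leap-year requirement and all names are invented for illustration, not taken from any particular team):

```python
import unittest

# Step 1: write a single test that captures the requirement,
# before any production code exists. Run it and watch it fail.
class LeapYearTest(unittest.TestCase):
    def test_century_years_leap_only_if_divisible_by_400(self):
        self.assertFalse(is_leap_year(1900))
        self.assertTrue(is_leap_year(2000))
        self.assertTrue(is_leap_year(2024))

# Step 2: write just enough production code to make that test pass,
# then refactor if needed and move on to the next test.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

Anyone on the team should be able to run the suite on demand, e.g. with `python -m unittest`, which is precisely what I ask them to do during an assessment.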

Are stakeholders active participants in development? Your stakeholders can and should be directly involved in
the development process (I promote the practice of
active stakeholder participation). Minimally, stakeholders should be involved on a daily basis, providing
information and making decisions in a timely manner. When assessing teams, I will ask
the team to introduce me to their stakeholder(s) and they should do so right there on the spot, or at least
respond with something along the lines of "that would be Beverley, and she's in a meeting right now and can't
meet with you until 11am."
Figure 3
depicts the adoption rate of common strategies for involving stakeholders.
Figure 3. Agile criterion: Stakeholder involvement.

Is the team working in a highly collaborative, self-organizing manner? More importantly, disciplined agile
teams will be doing so within an effective governance framework. Self-organization means that the team members
themselves are planning and estimating their own work (granted, perhaps with some facilitation from the team
lead/Scrum master). When I'm assessing teams I typically look to see if the team is doing some sort of iteration
planning at the start of the iteration, or in the case of lean teams on an as needed basis, as well as daily
stand-up meetings (also called Scrum meetings, sigh) to coordinate their efforts. But just because they're
self-organizing doesn't mean that they're out of control doing their own thing. Disciplined agile teams will
perform their work within the context of an effective
governance framework
which guides and monitors their efforts, including working towards a common infrastructure (typically driven
by your
agile enterprise architecture efforts) and common
programming guidelines. To determine if agile teams are being governed effectively, which is unfortunately
rare in my experience, I look for evidence that the team follows common guidelines and works to a common
infrastructure, for effective metrics reporting (hopefully automated) to management, and better yet for evidence
that management actually acts on that information to help the team succeed. Sadly, the
Agility at Scale survey
found that only 11% of respondents indicated that their agile teams found their governance strategies effective.
Figure 4
summarizes results from the
2013 How Agile Are You? Survey, depicting the adoption rates of various organizational strategies employed
by agile teams. Note that having work assigned to you by your team lead or project manager, which appears to be
occurring on many "agile teams", isn't considered to be self-organizing. Figure 4 also covers various governance
strategies, sadly indicating that a large percentage of teams are manually reporting metrics and some are
producing regular status reports for senior management (regardless of all the rhetoric in the agile community
about the evils of doing so).
Figure 4. Agile criterion: Self organization and appropriate governance.

Is the team improving their process on a regular basis? A common practice on agile teams is to hold a
retrospective at the end of each iteration to
identify potential ways to
improve
their software process. More importantly they act on one or more of their issues the next iteration,
improving their approach throughout the initiative. Really disciplined teams make the effort to track their
progress over time and share their ideas with other teams.
Figure 5
shows adoption rates of relevant improvement strategies employed by agile teams.
Figure 5. Agile criterion: Continuous improvement.

You might find my articles
Examining the Agile Manifesto
and
Agile System Development Lifecycle (SDLC) to be interesting introductions to agile software development.
Also, my
various agile surveys provide some insight into how Agile is being adopted within organizations.