On occasion, I see dev shops complaining of work taking too long to finish, of deliverables being carried over from Sprint to Sprint, and the root cause turns out to be that the developers don’t know what it means for their deliverables to be “done” until they are done. They start work, having a “fairly good idea” of what the customer is asking for, hoping that they’ll finish the work in time to demo it at the end of the iteration. And they find out the hard way that “fairly good” often just doesn’t cut it.
So how do you get from “I think I have an idea of what they want” to “we are on the same page about exactly what needs to be done”? It’s a question with a potentially very long answer, but it ultimately comes down to creating quality requirements.
The quality of software requirements has a direct influence on the quality of deliverables. You need a sufficiently clear understanding of the business domain, of the users involved and their needs, of the scope of the work to be done, and of any edge cases in the logic that needs implementing.
Eliciting all of that requires a certain baseline of understanding and commitment on the part of the product owner (or customer) and the team gathering and implementing these requirements. Further, when it comes to building the actual requirements, there are many kinds of documents to consider, from a specification of required product features, to user personas and how they work day-to-day, to a list of business challenges that currently have no solution. Some of these can be captured in free-form documents, others in bulleted lists, and others in specialized formats. Product features, broken down into small deliverables, can be recorded as User Stories. Today I will take a very narrow focus and look at User Stories in particular.
A key idea behind User Stories is that they are “atomic”: a single User Story cannot be broken down further into requirements that are themselves good User Stories. (If it can be, then it should be, and the original User Story may become an Epic or some other grouping construct that logically ties together the resulting, smaller User Stories.)
A good User Story should be independent (provide its value on its own), negotiable (its details can be worked out in conversation rather than fixed up front), valuable (for the business), estimable (understood well enough that its complexity can be determined within an acceptable range), small (ideally, so small that it cannot be broken down further into independent, valuable, testable User Stories), and testable.
The I.N.V.E.S.T. mnemonic is sometimes used to consider how “good” a User Story is.
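The mnemonic can double as a lightweight review checklist during backlog refinement. Here is a minimal sketch in Python; the `InvestCheck` name and the review flow are made up for illustration, not part of any standard tooling:

```python
from dataclasses import dataclass, fields

@dataclass
class InvestCheck:
    """One flag per I.N.V.E.S.T. criterion for a User Story under review."""
    independent: bool
    negotiable: bool
    valuable: bool
    estimable: bool
    small: bool
    testable: bool

def failed_criteria(check: InvestCheck) -> list[str]:
    """Return the names of the criteria the story does not yet meet."""
    return [f.name for f in fields(check) if not getattr(check, f.name)]

# A story that is well understood except for an unresolved unknown,
# which makes it impossible to estimate yet:
story_check = InvestCheck(independent=True, negotiable=True, valuable=True,
                          estimable=False, small=True, testable=True)
print(failed_criteria(story_check))  # → ['estimable']
```

The point is not the code itself but the ritual: walking each story past all six criteria surfaces problems before implementation starts.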
The typical format for the opening statement of a User Story can capture the “valuable” criterion.
As a ~
I want ~
So that ~
For example: As the site owner, I want users to authenticate prior to browsing, so that I can create a small barrier to entry, personalize user experience, and track user behavior.
Each part of the 3-phrase story contains useful information. The “As a ~” will describe the role from whose perspective the story is written. This can imply all sorts of things. “As an administrator” may imply that only those with administrator-level privileges are able to experience the described behavior and others should not. The “I want ~” briefly describes the desired behavior, and “So that ~” describes the value that this behavior provides. If a product owner cannot articulate the “So that ~” phrase — that is, if he or she cannot think of a good reason to desire the behavior — then it is likely that the work does not need to be done or in any case requires further conversation.
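That structural check (all three phrases present, with a missing “So that ~” as the loudest red flag) is simple enough to automate as a lint over a story backlog. A hedged sketch; the function name and the phrase-matching heuristic are invented here for illustration and would be too naive for real use:

```python
def missing_parts(statement: str) -> list[str]:
    """Flag any of the three phrases absent from a story's opening statement.

    A missing "so that" is the biggest red flag: the product owner
    cannot articulate the value, so the story needs more conversation.
    """
    lowered = statement.lower()
    return [part for part in ("as a", "i want", "so that") if part not in lowered]

stmt = "As a site owner, I want users to authenticate prior to browsing"
print(missing_parts(stmt))  # → ['so that']
```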
The rest is down to the details. Is the story independent or does it not actually contribute value on its own? Is it small or can it be further broken up into two or more user stories? Is it testable?
Most of the time, the “As a ~, I want ~, so that ~” statement is not enough to answer those questions. And, importantly, it is not enough to tell you the scope of the work. It is not clear how much work you have to do in order for the story to be considered done.
No User Story is complete without a clear definition of “done.”
One way to think about what “done” means is that if the product owner has accepted the work as complete, then there is no more work to be done. In other words, the User Story has a set of “Acceptance Criteria” that must be met. These should be written out so that they are clear to everyone, including the product owner, those implementing the Story, and those testing the Story.
Acceptance Criteria should certainly include everything relevant from the user’s perspective. They may also call out other requirements that concern the development team more than the product owner.
An example with our authentication story:
As the site owner
I want users to authenticate prior to browsing
So that I can create a small barrier to entry, personalize user experience, and track user behavior
- The user should be shown a login form upon visiting the site
- The login form should appear over site content and disappear upon authentication (that is, the login form does not live on a separate page)
- The user should not be directed to a different URL upon successful authentication
- The user should be shown an error message if incorrect credentials are entered
If there are any unknowns in this story that prevent it from being estimable, those should be brought up as soon as possible so that the User Story is ready to be implemented once it is high-enough priority. One crucial detail is missing in the story above — the credentials themselves. What are the accepted user credentials? Is it a username and password? Is it an email and password?
The missing criterion, then, is: user credentials are an email and password. This story now has enough clarity that the work described can be estimated in terms of complexity.
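Criteria written this crisply map almost one-to-one onto automated checks. Below is a toy model, purely illustrative: the `LoginSession` class, its attribute names, and the hard-coded credentials are all hypothetical stand-ins for whatever the real implementation would be.

```python
class LoginSession:
    """Toy model of the login flow described by the acceptance criteria."""

    def __init__(self) -> None:
        self.url = "/home"            # the page the user originally visited
        self.form_visible = True      # criterion: form shown upon visiting
        self.error = None
        self.authenticated = False

    def log_in(self, email: str, password: str) -> None:
        if (email, password) == ("user@example.com", "hunter2"):
            self.authenticated = True
            self.form_visible = False  # criterion: form disappears on auth
            # criterion: self.url is deliberately left unchanged -- no redirect
        else:
            self.error = "Incorrect email or password."  # criterion: error shown

session = LoginSession()
assert session.form_visible                  # shown a login form upon visiting

session.log_in("user@example.com", "wrong")
assert session.error is not None             # error on incorrect credentials

session.log_in("user@example.com", "hunter2")
assert session.authenticated and not session.form_visible
assert session.url == "/home"                # not directed to a different URL
```

Each assertion traces back to one bullet in the Acceptance Criteria, which is exactly what makes the story testable.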
There are other behaviors related to user authentication that this story does not touch on. They may be worth asking about as part of requirement elicitation, for example:
- Can the user link his or her Facebook or Google account instead of using a site-specific one?
- What happens when a user tries to log in with the wrong password several times in a row? Should the site lock that user out to prevent brute-force account hacks?
- What if the user has forgotten his or her password? Should there be a “forgot password” flow?
Mind you, even if the answer to one of these questions is yes, that does not add acceptance criteria to the story described above. These behaviors are independent enough that the story above, done as-is, will provide business value on its own. Features such as password reset, account locking, and so forth would then be described in separate User Stories that follow up on this one.
Perhaps this is the first time that the notion of user account management has even come up, which can spawn a whole separate discussion around security and how credentials are collected and stored. And shouldn’t there be user registration since there is user authentication?
Note that some things are not mentioned in the Acceptance Criteria that nevertheless become a decision point for the implementer and tester. For example, does the user submit credentials by clicking a “Log in” button? Or does the user press the Enter key on the keyboard? Or is it both? These details are specific to the solution rather than the problem and therefore the product owner may not care which direction is taken. Nevertheless, it is usually worth a quick conversation as work progresses so that there is less need for rework down the road.
I attended a talk by Ken Schwaber (a founder of Scrum) at a conference, where he succinctly explained that a story is done when “there is no more work to be done.” In other words, nothing is hidden, such as database migration scripts, or creation of user accounts, or whatever else may not be called out in a user story and therefore left until later but nevertheless must happen before that work can be live in production.
So Acceptance Criteria may not fully capture the definition of “done” for a User Story. In fact, many “nonfunctional requirements,” such as stability and scalability, are typically called out somewhere else. Does the fact that the site must support 10,000 concurrent users logging in mean that this should be called out in the Acceptance Criteria for user authentication? Almost certainly not. Things such as performance, usability, accessibility, running in multiple browsers (Chrome, Edge, Safari, etc.), and architectural conformance are probably considerations for every single bit of work done, and are therefore universally implied, provided they are called out somewhere and with sufficient precision (e.g. “must support 100 transactions per second for a sustained (1 hour+) period of time without experiencing statistically significant performance degradation”).
As you collaborate with the product owner and elicit requirements that are broken down and organized into User Stories with accompanying Acceptance Criteria, always keep in mind what “done” means and seek to clarify that definition before you commit to implementation.