The (written) unwritten guide to creating a definition of done

Leone Perdigão
5 min read · Dec 4, 2019


Scrum joke from Scrum.org

The Definition of Done is a shared understanding of what it means for work to be complete. In other words, it is a checklist of activities (for example, writing code, commenting code, unit testing, integration testing, writing release notes and design documents) that add value to the product.

The DoD is a good reporting mechanism for team members. It helps them say that “this work is done”. Using the DoD as a reference for that conversation, a team member can effectively update the other team members and the product owner.
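To make the checklist idea concrete, here is a minimal sketch in code. It is an illustration, not a prescribed tool: the `DoDItem`/`WorkItem` classes and the item names are made up for this example. The point it demonstrates is that a work item reports “done” only when every criterion on the shared checklist is met, not just the coding.

```python
from dataclasses import dataclass, field

@dataclass
class DoDItem:
    """One criterion on the Definition of Done checklist."""
    description: str
    done: bool = False

@dataclass
class WorkItem:
    """A piece of work that is only 'done' when every DoD criterion is met."""
    name: str
    checklist: list = field(default_factory=list)

    def is_done(self) -> bool:
        # "Done" means the whole shared checklist is satisfied.
        return all(item.done for item in self.checklist)

    def report(self) -> str:
        # A simple status line a team member could share with the PO.
        remaining = [i.description for i in self.checklist if not i.done]
        if not remaining:
            return f"{self.name}: DONE"
        return f"{self.name}: not done, remaining: {', '.join(remaining)}"

story = WorkItem("Login page", [
    DoDItem("code written", done=True),
    DoDItem("unit tests pass", done=True),
    DoDItem("release notes updated"),  # still open
])
print(story.report())  # → Login page: not done, remaining: release notes updated
```

Coding and unit tests being finished is not enough; the open release-notes item keeps the story out of the “Done” column.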

Also known as

Software developers have a reputation for being careless when answering the question “are you done with this?”. It can mean “done coding” and this is usually what a developer will have in mind when answering. However, the meaning of interest is generally “are you done programming, reviewing, testing, ensuring it is deployable, documenting…?”.

Proverbially, to get an answer to that, the question to ask is, “I know that you are done, but are you DONE-done?”.

An etymological note for the terminally curious: this doubling of a word to call attention to something is known as contrastive focus reduplication.

Roles

A Scrum team is made up of a Scrum Master, a Product Owner and a Development Team. In Kanban or Lean, however, no set roles are defined. It is also important to say that these roles are not required to be cross-functional.

Scrum joke from Scrum.org

In other words, these roles can change based on the way of working that suits the team best, and the team should use common sense as they proceed. How? For example, in the absence of the PO to approve a story, the Scrum Master, or even the most experienced business-level person on the team, may have the last word. It is also fair to say that a self-organizing team with the right level of organizational and functional maturity can decide together. To be clear: the team owns the DoD. The team.

Deliverables

A deliverable is a tangible or intangible good or service produced as a result of a project that is intended to be delivered to a customer. A deliverable could be a report, a document, a software product, a server upgrade or any other building block of an overall project. DoD may differ depending on what is being delivered:

  1. Code

     a. Task
     b. User story
     c. Feature
     d. Release

  2. Documents and drawings

1. Code

a. Task

In this case, done means coded to standards, tested, integrated and documented to meet the story goal.

b. User story

In this case, done means coded to standards, tested, integrated and documented.

If you can’t forget about a task when it reaches the “Done” column of your board, it is not done.

  • Product Owner accepts user story

c. Feature

Done at this level may mean the feature qualifies to be added to a release. Not all of its user stories need to be completed; rather, the feature may be sufficient to satisfy the need.

  • Acceptance criteria met
  • Automated tests pass
  • Pipeline and SonarQube metrics passed
  • Promoted to higher level environment (e.g. tst and acc)
  • Feature level functional tests passed
  • Non-functional requirements met
  • Meets compliance requirements
  • Functionality documented in necessary user documentation
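Several of these criteria can be automated as a gate in the pipeline. Below is a rough sketch of such a gate in the spirit of a SonarQube-style quality check; the metric names and threshold values are illustrative assumptions, not SonarQube's actual API.

```python
# Hypothetical quality gate: metric names and thresholds are made up
# for illustration, not taken from a real SonarQube configuration.
GATE = {
    "coverage": 80.0,         # minimum % line coverage
    "duplicated_lines": 3.0,  # maximum % duplicated lines
    "blocker_issues": 0,      # maximum number of blocker issues
}

def gate_passes(metrics: dict) -> tuple[bool, list[str]]:
    """Return (passed, reasons-for-failure) for one set of metrics."""
    failures = []
    if metrics.get("coverage", 0.0) < GATE["coverage"]:
        failures.append("coverage below threshold")
    if metrics.get("duplicated_lines", 100.0) > GATE["duplicated_lines"]:
        failures.append("too many duplicated lines")
    if metrics.get("blocker_issues", 1) > GATE["blocker_issues"]:
        failures.append("blocker issues present")
    return (not failures, failures)

ok, why = gate_passes({"coverage": 85.2, "duplicated_lines": 1.4,
                       "blocker_issues": 0})
print(ok, why)  # → True []
```

The value of wiring checks like this into the pipeline is that “metrics passed” stops being a matter of opinion: the feature either clears the gate or it does not.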

d. Release

Done at this level may refer to an organizational strategic priority, a portfolio plan item, or some other collection of features that satisfies a customer need. Not all user stories or features need to be completed; rather, the release may be sufficient to satisfy the need.

  • Code shippable to Production
  • End-to-end integration completed
  • Code in Production with monitoring in place
  • Code passes smoke test
  • Meets defined market or customer expectations
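As an illustration of the “code passes smoke test” item, here is a minimal sketch of a post-deployment check. The health-endpoint response shape is an assumption for the example; in a real pipeline the body would come from an HTTP call to the deployed service, but a canned response keeps the sketch runnable.

```python
import json

def check_health(raw_body: str) -> bool:
    """Return True if the service's hypothetical /health response looks sane."""
    try:
        body = json.loads(raw_body)
    except json.JSONDecodeError:
        return False
    # Assumed response shape: {"status": "ok", "version": "..."}
    return body.get("status") == "ok" and "version" in body

# Canned responses stand in for the real HTTP call.
assert check_health('{"status": "ok", "version": "1.4.2"}')
assert not check_health('{"status": "degraded"}')
print("smoke test passed")
```

A smoke test like this does not prove the release works; it proves the release is alive enough to be worth testing further, which is exactly the bar this DoD item sets.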

2. Documents

Documents here are all non-code artifacts that can be linked to a business definition, solution or feature. Speaking of documents, it is important to recall that the purpose of agile documentation is to give support and operations staff concise, easy-to-understand information. So there is no point in documenting things that will never be used. See a checklist below:

  • The document's purpose is clear
  • The document follows a single communication standard
  • The content has visual elements ready for publishing
  • The content has passed a peer review

Documentation that facilitates knowledge transfer is only possible when there is effective communication with all project stakeholders throughout the project.

Expected benefits

  • Avoids beginning work on features that do not have clearly defined completion criteria, which usually translates into costly back-and-forth discussion or rework
  • Provides the team with an explicit agreement allowing it to “push back” on accepting ill-defined features to work on
  • Any leftover work significantly increases the risk that a bug will be found, a problem will emerge and the product cannot be delivered to customers as promised. The DoD provides a checklist that helps reduce that risk.
  • If something outside of the “done” needs to be prepared, some members of the team will have to do it, which means they will not be able to focus on new tasks, slowing development and introducing delays. The DoD helps team members reduce delays by preventing them from being blocked by leftover work.

Common pitfalls

  • Obsessing over the list of criteria can be counter-productive; the list needs to define the minimum work generally required to get a product increment to the “done” state
  • Individual features or user stories may have specific “done” criteria in addition to the ones that apply to work in general
  • If the definition of done is not shared somewhere visible, for example spelled out and displayed on a wall, it may lose much of its effectiveness; a good part of its value lies in being an explicit contract known to all members of the team

Related articles

[1] Core Practices for Agile/Lean Documentation

[2] The Definition of Done: What does “done” actually mean? by Danny Smith

[3] Maturity Model by Martin Fowler

[4] Automated Integration Testing

[5] Misuse of code coverage

[6] Pithy commentary of Testivus
