Is it time to redefine TDD?

Everybody talks about it, hardly anybody does it... should we redefine what Test Driven Development means?

Everybody asks for TDD but nobody uses it.

We’ve all interviewed at companies for Software Developer roles where Test Driven Development (TDD) is pushed as common practice and stipulated in the role spec. We know the answer they are looking for… the definition of TDD… write tests first, do only enough to get the simplest test passing, and follow red, green, refactor.

We wing it in the interview… we know TDD and understand the concept, even though we haven’t worked anywhere that actually practices it. We can explain it, and we would love to work somewhere that applies it… this could be the place! Then you get the job, start, and find that, although there may be suites of automated unit tests, nobody writes tests first and practicing TDD is frowned upon.

Why do companies ask for it?

Companies like to hire people who know best practice, and TDD is one of those things. Because companies believe that good software developers know best practices, even companies that don’t use TDD say they do, in the hope of attracting good developers who know their stuff. Admitting it isn’t used would lower the company’s credibility and appeal.

Why don’t companies use it?

There are a number of reasons, listed below. They have all been written about a lot, mostly in defense of TDD and claiming these objections are myths. There is a reality though, and in my 20 years of experience working in software development I’ve found them to be true at every company I’ve worked at that has tried TDD.

  • There are deadlines to hit.
  • It takes too long to write tests first and follow red, green, refactor.
  • You end up with too many tests, which makes updating the code slower because the tests have to be updated too.
  • Rigidly following TDD means a lot of rework: you hardcode things to get one test to pass, then change the code to get the next test to pass… then change it again to get the test after that to pass. All to arrive at the design you knew you would reach when you started writing the code.
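To make that rework concrete, here is a hypothetical red/green/refactor sequence for a trivial leap-year check (the function names and steps are illustrative, not from any particular codebase). Each step only does enough to pass the latest test, even though the final rule is obvious from the start:

```python
# Hypothetical red/green/refactor steps for a leap-year check,
# showing the incremental rework the strict cycle can require.

# Step 1 (red -> green): the simplest test, so hardcode the answer.
def is_leap_v1(year):
    return True  # just enough to pass: is_leap_v1(2024) == True

# Step 2: a second test (a non-leap year) forces a real condition.
def is_leap_v2(year):
    return year % 4 == 0  # handles 2024 and 2023, but not 1900

# Step 3: century-year tests force the full rule you knew all along.
def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The tests that drove each step:
assert is_leap_v1(2024) is True
assert is_leap_v2(2024) is True and is_leap_v2(2023) is False
assert is_leap(1900) is False and is_leap(2000) is True
```

A developer writing the final version directly, with the same three tests added afterwards, ends up with identical code and identical coverage in one pass.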

Shouldn’t we always follow TDD as we know it’s a best practice?

There are a lot of best practices and they all take time. To actually deliver code we need to pragmatically select and apply them to get the best balance of short term delivery and long term maintenance. Those two things are both important and conflict with each other so getting the balance right is critical.

Perfect code that is never used is worthless; bad code that is usable now but can’t be updated in the future is worth little more.

Claimed benefits of TDD.

There are a number of benefits associated with TDD, but the reality is if you write automated unit tests after you have written the code, you still get all of these benefits:

  • Testable code — You don’t need to write tests first in order to write testable code. Following other best practices, like the SOLID principles, will give you testable code. TDD and testable code aren’t cause and effect; deliberately writing testable code is what gives you testable code.
  • Automated suite of tests — It doesn’t matter if the tests are written before the code or after the code, an automated suite of tests is an automated suite of tests. If a developer can’t write good tests, it doesn’t matter if they write bad tests before or after they write the code.
  • Good code coverage — Again, good coverage depends on developer ability, not on whether the tests are written first or red, green, refactor is followed. Good coverage isn’t a percentage of lines of code run; it’s the number of business-critical scenarios that are tested. Kent Beck himself said: “I get paid for code that works, not for tests, so my philosophy is to test as little as possible to reach a given level of confidence.”
  • Confidence in making and deploying future changes — If you have an automated suite of tests with good coverage, the process used to write them doesn’t matter.
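A small sketch of the first point, with hypothetical class and parameter names: testability comes from the design (here, injecting a dependency rather than hard-coding it), and the tests below are written after the code yet exercise it just as fully as tests written first would:

```python
# Hypothetical example: tests written *after* the code still cover it,
# provided the code was designed to be testable (dependency injection
# instead of a hard-coded collaborator).

class PriceService:
    def __init__(self, get_rate):
        # Injecting the rate lookup is what makes this class testable --
        # not the order in which the tests were written.
        self._get_rate = get_rate

    def total(self, amount, currency):
        return round(amount * self._get_rate(currency), 2)

# Tests added after the fact, using a stub instead of a live rate feed.
service = PriceService(get_rate=lambda currency: 1.25)
assert service.total(100, "USD") == 125.0
assert service.total(19.99, "USD") == 24.99
```

Had `PriceService` fetched rates from a hard-coded HTTP client internally, no amount of test-first discipline would make it easy to test; the design choice does the work.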

How should TDD really be defined?

To align with what most companies both want and actually do, should we really redefine TDD to be a suite of automated tests that pass with good coverage of testable code at the point a PR is merged, rather than test first or red, green, refactor?

Further reading

If you haven't read "Test Driven Development" by Kent Beck, I would recommend giving it a read. Or you might prefer something more modern using your language of choice; for .NET developers, maybe something like "C# and .NET Core Test Driven Development: Dive into TDD to create flexible, maintainable, and production-ready .NET Core applications" by Ayobami Adewole.

Knowing how to write unit test suites and testable code is definitely a useful skill, and having some automated unit tests really helps. I just think that, like a lot of best practices in software development, TDD doesn't need to be applied dogmatically to every line of code written. If you do, it grinds your delivery to a snail's pace and turns your organisation against TDD.