So, I’ve been out of work for about a month. During my time back in the hunt I learned just how big test-driven development, or TDD, has become. Quite literally every place I interviewed (for Java and Rails jobs alike) asked about my experience with it, my philosophy, and so on.
And in each case I had to answer truthfully. I get the idea of TDD. I’m a fan. But I’ve never been anywhere that’s truly adopted it. The usual legacy excuses get in the way: nobody wants to write tests for code that already exists, so there’s no framework in place to build new tests on, and setting one up feels like too much effort when it’s easier to just keep writing code the way they always have. Not a good excuse, just a popular one.
Well, I’ve got a new job now, and all of the above is still true. However, I think I’ve found a loophole in that logic that’s going to let me force-feed TDD into the mix. I’m a new hire. There’s an existing code base. I can read the code all day long, and I can think I understand it, but how do I really know? Should my first real interaction with the code be making changes that could introduce bugs? Into production? Not such a great idea.
So instead, as my first project, I integrate JUnit (it’s a Java shop). Then I start writing unit tests. I read an existing function on an existing object, I think I understand what it’s supposed to do, so I write a test for it. One of two things happens: either the test passes, in which case my understanding of the code was correct, or it fails, in which case either my understanding of the code was wrong, or the code itself is.
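Here’s a minimal sketch of what one of those tests looks like. The class and method (`formatCustomerId`) are invented for illustration; the real target would be some function already in the code base. I’ve shown the checks as plain assertions in a `main` method to keep the example self-contained, but in practice each `check` would be a JUnit `@Test` method using `assertEquals`.

```java
// "Test driven learning" sketch against a hypothetical legacy method.
public class TdlSketch {

    // Stand-in for an existing function I *think* I understand:
    // reading it, it appears to left-pad a numeric customer id
    // out to 8 digits.
    static String formatCustomerId(int id) {
        return String.format("%08d", id);
    }

    // Encode my reading of the code as checks. If they pass, my
    // understanding was right; if one fails, either I misread the
    // code or the code itself is wrong.
    public static void main(String[] args) {
        check("00000042", formatCustomerId(42));
        check("12345678", formatCustomerId(12345678));
        System.out.println("all checks passed");
    }

    static void check(String expected, String actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError(
                "expected " + expected + " but got " + actual);
        }
    }
}
```

Either outcome is useful: a green test documents the behavior for everyone, and a red one is a conversation to have before I ever ship a change.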
The end result is that a) I know the code does what I think it does, b) I potentially uncover bugs that would otherwise have surfaced the hard way, and c) unit tests now exist for the next guy. Win, win, win.
I call it “test driven learning”, and I dub it TDL. Pass it on.