Enhancing The Processes Of Test Driven Development

Continuous Delivery
UXDX EMEA 2017
  • Understanding the processes of test driven development
  • Collaborating with teams in QA and development
  • Helping customers understand and communicate what they want
  • Testing and scaling to improve development
Nix Crabtree, Lead Principal Software Engineer, ASOS

**Nix Crabtree** 00:00
So we are going to be talking about enhancing the processes of test driven development. We're going to do three things. We're going to have a quick run through the TDD (test driven development) cycle, which you may or may not know. Actually, let's have a little show of hands. Who understands the TDD cycle? Okay, good, a fair amount. Another quick show of hands: who here is a developer or software engineer? Okay, and who is on the product management, product owner side? Okay, and who is on the BA or quality assurance side? Okay, cool. So we'll run through the TDD cycle quickly, then we'll talk about why ATDD (acceptance test driven development), and then we'll go through the ATDD cycle and how it fits with TDD. I'll whip through the first bit, because I'd like to get onto the ATDD part, and I don't want to keep you any longer than necessary. So TDD: red, green, refactor. Hopefully we all know this, from the show of hands. We write a test that exercises the smallest amount of visible behavior. The unit test fails, it's red, so we know that our code base doesn't contain that functionality. We make the simplest code change possible to implement that new behavior, the smallest unit that we can. We run the test, it's green, brilliant. Then we might need to go back and change the structure of the code to make it more maintainable, but not the behavior. So it's a refactor, not a rewrite, and the unit test should still pass, because we haven't actually changed anything other than the way the code is structured.
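As an illustration, the red-green-refactor loop above might look like this in miniature. The `add_vat` function and the 20% rate are my own example, not something from the talk:

```python
# A miniature red-green-refactor pass. The add_vat function and the
# 20% VAT rate are illustrative examples, not from the talk.

VAT_RATE = 0.20  # refactor step: the rate extracted to a named constant


def add_vat(net_price):
    """Return the gross price including VAT.

    Green step: the simplest code change that makes the test below pass.
    """
    return round(net_price * (1 + VAT_RATE), 2)


def test_add_vat():
    # Red step: this test is written first, and fails while add_vat
    # doesn't exist; it goes green once the behavior is implemented.
    assert add_vat(10.00) == 12.00
    assert add_vat(0.00) == 0.00


test_add_vat()
```

The refactor step (extracting `VAT_RATE`) changes structure, not behavior, so the same test keeps passing.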

**Nix Crabtree** 01:53
So let's talk a little bit about why ATDD. What is the benefit of moving from that well known, trusted cycle to something broader? It really comes down to the fact that TDD is not a testing practice; you end up with unit tests as a byproduct of it. TDD is a developer-centric process whereby you specify what code you're going to write, or at least what behavior you're going to write code for, and then write the code that fulfils that behavior. So there are a number of problems that ATDD can solve, and hopefully you'll recognise most of these. Well, it's a shame if you recognise most of these, I guess. The flow of requirements is sometimes disconnected: the path from the business requirement down to the code that's implemented is not always a continuous, joined-up process. Sometimes you're unable to take stories into a sprint. This is based largely around Scrum ways of working, but obviously it applies in other models as well. If the requirements are poorly defined, you get to the point where you can't take a story into the sprint, because you're not entirely sure what it is you need to do, or you take it into the sprint anyway and hope you can work it out in two weeks or whatever and implement it. Rework is required when what you've implemented turns out not to be what was expected, because the requirements weren't quite clear.

**Nix Crabtree** 03:29
Acceptance criteria are of variable quality and come in different formats. It might be that implementation details are creeping into the requirements; it might be that they're too vague; it might be that they actually contain a number of different technical implementations. Ultimately, collaboration between the three amigos is inconsistent. It's absolutely foundational to me that everyone talks and understands each other, not that the things passed between them are open to interpretation. And you end up with this mini-waterfall testing thing, where your devs go, yeah, I've written some code, now you go and test it, and then they go off and do something else, and then maybe there's a problem, and they have to get back into the context they were in, and ultimately it slows everything down. You end up with an increasing gap between what you wanted and what you got. And quality assurance, quality engineers, are often second-class citizens. In fact those people, who are absolutely vital in any team, might be different people or might actually be the same people who are writing the code, but the mindset is what's really important: quality is a first-class citizen in any development team. So we're talking about three things in ATDD: shared understanding, shared ownership, and a test-first approach. These are the three key cornerstones of ATDD.

**Nix Crabtree** 05:02
What they mean is that we end up with high quality software, and high trust, high performing teams. It sounds like a magic bullet, but it genuinely works. I've done it at more than one company, and these are the things that we see consistently. So let's race through this a little bit so we can get onto the juicy bits. Shared ownership: it's a three-legged race, not a relay. For anyone who doesn't know what a three-legged race is, if you never did one at school: you tie one leg to someone else's, and you have to run in sync as a pair, otherwise you both fall flat on your faces. I'm going to talk a little bit about Scrum and Agile development practices again, because they fit very well with this model. Your mileage may vary, you may have other models, but the foundation of ATDD still works. So the Scrum Guide tells us that there are no titles for development team members other than developer, and it tells us that there are no sub-teams. You don't have a test team inside your development team, because you end up with that handoff, and you lose a lot of context. The individual development team members will obviously have specialised skills, things that they gravitate towards and like to focus on, but the accountability for delivering a piece of high quality software lies with the team. There should be no point at which someone can point at someone else and say, that was him or her.

**Nix Crabtree** 06:35
There are no heroes. I'm sure we've all seen the hero developer: you know, stand back, I can do this, you go and do something else. Hold my pint, I'm going in. Nobody likes those people. The development team is responsible, and I'm going to labour this point a little bit: the development team is responsible for delivering this stuff, not the individuals on the team. The development team, and this is really important, should feel a collective sense of pride and achievement in getting stuff done. The whole team, not just the person who worked late and told everyone else they don't need to worry. Everyone should feel they've contributed something to the organisation. There is no "dev complete". A PBI (product backlog item), a piece of functionality, a feature, is done when it's proven to meet all of the acceptance criteria, not when someone says, yeah, I've done the code. There are no handoffs in a high performing team; every time you hand off, you lose context and you take yourself out of the process of completing that feature. And there are no second-class citizens in a high trust team. In a high trust team, everyone is trusted to deliver high quality software, whatever their role might be, and everyone should feel empowered to do that and be proud of it. So, shared understanding. P, C, Q: anyone know what this is? Good. It's a Hoare triple. You have a precondition assertion, a command, and a postcondition assertion.

**Nix Crabtree** 08:21
If the precondition is met, then executing the command establishes the postcondition. Let's put it another way, which maybe you'll start to recognise: given the precondition is met, when we execute the command, then we have the postcondition that we are expecting. So a Hoare triple is a logical function: we provide some input, we invoke an action, and we examine the output. What does that mean? Well, isn't that what a unit test is? We specify something that we're going to pass in, we do something, and we expect a known result every time. Cool, but then isn't that also what an acceptance test is? I realise it's an overloaded term, but an acceptance test is exactly that. And if an acceptance test is that, well, surely that's also a functional requirement. If we have a system, we're going to pass something in, we're going to do something in our customer journey, and then we're going to establish a particular condition. It's exactly the same thing. So what it means is that turning functional requirements into acceptance tests, into unit tests, into working software can and should be done from exactly the same set of criteria. Exactly the same. Everyone involved is singing from the same hymn sheet: no one gets the words wrong, no one sings out of tune. That's the plan. This is the glorious utopia. Test-first development: there are a few things that it's not. It's not a waste of valuable delivery time; I've heard that so many times.
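To make the mapping concrete, here is the same Hoare triple {P} C {Q} read as a given-when-then unit test. The `Basket` class is a hypothetical example, purely to illustrate the shape:

```python
# One Hoare triple {P} C {Q} expressed as a given-when-then test.
# The Basket example is illustrative, not from the talk.


class Basket:
    def __init__(self):
        self.items = []

    def add(self, sku):
        self.items.append(sku)


def test_adding_an_item_grows_the_basket():
    # Given (precondition P): an empty basket
    basket = Basket()
    assert basket.items == []
    # When (command C): we add an item
    basket.add("SKU-123")
    # Then (postcondition Q): the basket contains that item
    assert basket.items == ["SKU-123"]


test_adding_an_item_grows_the_basket()
```

Read at unit scope this is a unit test; read at system scope, with the same given-when-then wording, it is an acceptance test and a functional requirement.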

**Nix Crabtree** 09:59
It's not all about code coverage either, and it's not an opportunity to start a holy war, which still to this day it can be. It's not just for the software engineers, and it's definitely not boring. I love it. I love writing software that way, because it means I know that exactly what I'm writing is exactly what was required. What it is, is a lean approach to development. We avoid the gold plating: you know, I'm going to need to store something, so what I need is a repository pattern and I need to abstract everything away. Well, maybe we will one day, but not right now. It's a way of being confident that your software meets the requirements from the very first line of code that you write, because you've already written the test that says so. And it's all about understanding what you're doing and why. It's not diving into the code and trying to figure it out as you go, and then going down rabbit holes; it's about knowing exactly what you need to do, being able to do it, and then moving on to something else, adding more value as you go. Acceptance criteria, which are foundational to all of this, are defined and agreed between the development team and the product owner, or whatever terminology you use for that role. They're written in a business-readable, domain-specific language, so everyone should be able to understand them by reading the words, not by having to read the code. Gherkin is a particularly good example of this. They're written from an outside perspective: as a user of the system, or as a consumer of the system, I'm going to do a thing and something's going to happen. And they are definitely an expectation of behavior, not implementation. Implementation comes later; you can change the whole implementation, and the behavior remains the same, so you're still meeting the acceptance criteria.

**Nix Crabtree** 11:52
So, on to the juicy bits: the ATDD cycle. In TDD we had red, green, refactor; in ATDD, we have discuss, distill, develop, and demo. Let's go through those one by one. Discuss: the team meets and asks questions. This is typically when your product owner, your BA, whoever is looking after that role, says "we need to do a thing", but they say it to the development team, and the development team sits with them and has a chat. It might not be the whole development team; it might just be representatives from the software engineering specialisation in the team, the quality engineering specialisation, and the data engineering specialisation. It might result in breaking down that feature: considering how you might do it may reveal that there are actually two different sets of features in there, and maybe one just goes onto the backlog, and you focus on the one you're going to do right now, the one with the highest business priority and business value. The team might add other new features to the backlog, spark ideas, things to develop in the future, or things for this particular feature. And ideally, you establish a shared understanding, so everyone knows what is required at the time you're discussing it. Everyone knows it's coming down the line and what needs to be done.

**Nix Crabtree** 13:18
Then we distill: you formalise that acceptance criteria. We're still not writing any code, and we're not writing any automated tests at this point. Use a format like Gherkin; there are other formats which you might prefer, but I like Gherkin. Use the ubiquitous language: something that everybody understands but that has specific meaning. You might use personas; you might say Jenny is a particular customer, and she has a particular order history, and so on. When you say "Jenny", that ubiquitous term is something everyone understands as a more concrete object, if you like. The criteria don't include implementation details: the implementation can change, and the behavior would, in theory, remain the same. And at this point, the story should be ready for a sprint. Then we develop, by turning the acceptance criteria into automated tests. We transform those ubiquitous terms into domain objects: we take Jenny and turn her into an actual customer in our tests, a customer object with a related order history or whatever it might be. In the development team you might add technical scenarios, you might add NFRs (non functional requirements), you might add various other things, and you can do that at this stage, when you start the implementation. What you end up with is a set of executable requirements. The requirements that you took from the business, in the same language and the same terms, become executable and repeatable, so that with every change you make, you are confident it still does what you set out to do. And then you demo. That might not be an actual demo; it might be that you gamify it, and you already know it meets the requirements because of your executable requirements.
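As a sketch of the distill and develop steps, here is a hypothetical Gherkin scenario alongside the domain object that the "Jenny" persona might be transformed into. The scenario wording, the `Customer` class, and the order counts are all my own illustration, not ASOS code:

```python
# Distilling the "Jenny" persona into a domain object. The Gherkin
# scenario and the Customer class are invented for illustration.

SCENARIO = """
Scenario: A returning customer sees her order history
  Given Jenny is a customer with 3 previous orders
  When Jenny opens her account page
  Then she sees 3 orders in her history
"""


class Customer:
    """The concrete domain object the ubiquitous term 'Jenny' maps to."""

    def __init__(self, name, previous_orders):
        self.name = name
        self.order_history = list(previous_orders)


def jenny():
    # One shared factory, so every test means the same thing by "Jenny".
    return Customer("Jenny", ["order-1", "order-2", "order-3"])


def test_jenny_sees_her_order_history():
    customer = jenny()                  # Given: the persona, made concrete
    history = customer.order_history    # When: she opens her account page
    assert len(history) == 3            # Then: three orders are visible


test_jenny_sees_her_order_history()
```

Because "Jenny" is built in one place, the business, the quality engineers, and the software engineers all mean the same concrete thing when they use the term.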
So you might do some exploratory testing. You might gamify it by handing it off to someone else in the team: I've developed this feature, see if you can break it. Or you might have two teams working side by side, and you go, okay, we've done all this stuff, you go and play with it, and then everyone gets involved and starts trying to break it, passing 2 billion characters into your string field or whatever else it is. And then there's your classic sprint review, where you demo it back to your product owner or your stakeholders, whoever it might be.

**Nix Crabtree** 15:35
The TDD cycle sits in the middle of this. When you get to the develop stage and you have a set of automated tests, of executable requirements, at that point you can start using TDD, from an outside-in perspective to start with, and then classic TDD once you've identified your collaborators, to implement the code. But you're still using exactly the same criteria: you can take those given-when-thens and turn them into your arrange-act-assert, or however you choose to do it; you can even use a BDD-style framework. And at that point your automated acceptance tests are green, your unit tests are green, and it's ready to go. Everything inside the TDD box is white-box testing, so your tests understand the code that you've written; but everything outside the box, the ATDD stuff, is black-box testing. And that comes back to the point I made earlier, which is that you can change the entire implementation, but your acceptance tests, your executable requirements, should still pass. You could take the whole thing and rewrite it from .NET into Java, or you could turn it into Go, and ultimately, if you pass in the same thing, and you perform an action, and you get the same thing back out, it makes no difference. And then everyone understands, right from specifying the business requirements, through your quality engineers and your software engineers, what they're implementing, and everyone has the confidence that any changes they make to that software will result in a high quality, bug-free release. In theory, okay. Whistlestop tour. So, we have time for questions. We'll do maybe three or four minutes of questions, but I'm going to be around for the whole afternoon, so come and grab me and have a chat. I'm always happy to talk about this stuff at great length.
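The black-box point above can be sketched like this: one acceptance check run unchanged against two interchangeable implementations. Both discount functions and their thresholds are invented for illustration:

```python
# One black-box acceptance check run against two interchangeable
# implementations: the check only sees inputs and outputs, so swapping
# the implementation changes nothing. Both discount functions and the
# thresholds are invented examples.


def discount_lookup(total):
    # Implementation A: table-driven
    for threshold, rate in [(100, 0.10), (50, 0.05)]:
        if total >= threshold:
            return round(total * (1 - rate), 2)
    return total


def discount_branching(total):
    # Implementation B: explicit branches, same observable behavior
    if total >= 100:
        return round(total * 0.90, 2)
    if total >= 50:
        return round(total * 0.95, 2)
    return total


def acceptance_check(apply_discount):
    # Arrange / act / assert derived from a given-when-then
    assert apply_discount(40) == 40       # below any threshold: no discount
    assert apply_discount(50) == 47.50    # 5% off
    assert apply_discount(100) == 90.00   # 10% off


# The same executable requirement passes for either implementation.
for implementation in (discount_lookup, discount_branching):
    acceptance_check(implementation)
```

Rewriting implementation A as implementation B (or, at a larger scale, .NET as Java) leaves the acceptance check untouched and still green.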