In this episode of Codurance Talks, Javier Martínez Alcántara, Data Engineer and Software Craftsman at Codurance, Alasdair Smith, Software Craftsman at Codurance, and Jose Enrique Rodríguez Huerta, Managing Director at Codurance Spain, software craftsman, and organisational transformation consultant, talk about TDD.
They talk about why testing is important when launching a product, when the ideal time is to work on this process, and how responsibility for the testing process is managed in a team. They also talk about data science in this area.
- Should testing responsibility be handled collaboratively, or is it better to have an independent, specialised test team?
- Is the testing team involved early in the process?
- How can we create a testing culture?
José Enrique Rodríguez Huerta:
Hello, hello, hello, and welcome to another episode of Codurance Talks, the podcast where we talk about all things technology, software development, and craftsmanship. I'll be your host for today's session, and I'm joined today by two of our senior craftspeople, Alasdair Smith and Javier Martinez. Alasdair is from the Manchester office, and Javier is a data engineer who recently joined our Barcelona office. So guys, welcome!
Alasdair Smith: Nice to meet you. Yeah, I guess I'll do introductions. So my name is Alasdair, I work for Codurance Manchester and I'm a senior craftsperson. I've been working with Codurance for two years now, and I'm loving the craftsmanship approach to development.
Javier Martínez Alcántara: Hello! Thanks José, thanks Alasdair. I'm Javier, I'm a "pythonista" with seven years of experience, lately working with ETLs and data engineering projects, dealing with Python and Kotlin. So I'm really ready for the topic of today.
JRH: It's a very good segue into our topic, because we're going to be talking about a massive topic, which is very prominent, not only in the software industry, but also very dear to any craftsperson in general. And that is testing.
In particular, today, we're going to be talking about creating a testing culture, what the challenges around that are, and a lot of other topics around it. And again, it's a massive thing; actually, when we were looking at topics to cover, the list was so huge that we thought we may have to do a two-part kind of thing. So let's get into it, and let's start with a really good question: Why testing? Why is this important? And why would someone want to listen to, you know, a couple of software developers talking about testing for an hour?
JM: Yeah, I think that's a good question, and I really do want to go first. I think my answer is that it's a controversial topic. Each of us tries to develop in a better way, in the best way we can, and testing is among the best practices. But there are many details in real life that are linked to testing, to providing value, and to developing, so it's not always possible to achieve the quality that we look for in the literature. So we are going to go through many topics related to this, and probably provide some insight into what we have found in real experience: in the trenches, under fire.
AS: Yeah, I agree with Javier on this one, I think he hit the nail on the head, and I'm not going to add too much to it. For me, I like to think of testing as a tool which helps you get the job done. And if you didn't know anything about it, you would be worse off, but that doesn't mean to say you have to use it, per se. So I'm sort of sitting on the fence with this one. But I think as a developer, it's your responsibility to at least know about it, even if you don't use it, for sure.
JRH: That's a good point. Because you mentioned, well, it's a tool and you're not, you know, you're not forced to use it. But is it even possible to do software development without testing at all? Personally, I think it's impossible to do that. Because you are testing, right? Whether you're doing it manually or not, you somehow need to figure out if what you're doing is correct, or is behaving the way that you expect, no? So what do you think?
AS: I completely agree. I think that at some level or another, we all do testing as developers; how we actually choose to do that, formally or informally, is a different question. But I mean, I don't think I've ever heard of a person who's gone and written a piece of computer code in one sitting, never running it, and being like, yes, perfect, it's good, ship it. Do you know what I mean? You have to run it at some point, and you probably run it multiple times before you deliver it. So that is, in its own right, a form of testing, you could argue, for sure.
JM: Yeah, exactly. As you remarked, I think it's impossible to develop without it. For me, the mental flow is like, "Okay, I have this goal, I have this specification I have to achieve." Many times, developing a feature has to do with exploratory coding, and during that phase I always test. Maybe you have to work with a new library, a new package, so you have to test it first; otherwise you can't figure out how the feature will be, how the code will be. I personally test the code in the command line, in the interpreter, so that I can get fast results and reach the goal I set.
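As an illustration of the workflow Javier describes, quick checks typed into the interpreter can later be kept as doctests, so the exploratory session becomes a repeatable test. This is a minimal sketch; `slugify` is a hypothetical example, not code from the episode:

```python
def slugify(title):
    """Turn a title into a URL slug.

    The examples below were first tried interactively in the
    interpreter, then kept as doctests so they keep running:

    >>> slugify("Hello World")
    'hello-world'
    >>> slugify("  TDD  rocks ")
    'tdd-rocks'
    """
    # lower-case, split on any whitespace, rejoin with hyphens
    return "-".join(title.lower().split())

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # re-runs the REPL session on every change
```

Running the module (or `python -m doctest` on the file) replays the original interpreter experiments as assertions.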
JRH: Yeah, so there is inherent value in it, as you mentioned. There is this exploratory side, like trying to figure out what it is that you're trying to build to some extent, and there is the specification side of it, and whether it is working the way that it's supposed to. But I would also add, and it comes to mind, you know, it's like a drug, right? If you buy something that hasn't gone through proper testing, that product doesn't have the value it would have if it had gone through proper testing, right? Even if it gets you the same result, the confidence once you go through that process of verifying, to some extent, that things actually work, already increases the value of what you're using, and it happens the same way with software.
AS: I agree with that, I think the effort put into doing testing is valuable in its own right; it shows care has been taken. I think we'll go into it, but different methodologies, or different mental flows, will allow you to provide better quality software. For instance, we'll probably get into this soon, but if you develop with a battery of tests a priori, you will go through all the possibilities, all the branches that you have in your code, and then see all the weak points, the points of failure, and correct them before you launch into production, or even into staging. So before QA.
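The "battery of tests that walks every branch" described here might look something like this. A minimal sketch with a hypothetical `shipping_cost` function (all names and rates are invented); the point is one test per branch, written before the code reaches staging or QA:

```python
def shipping_cost(weight_kg, express=False):
    """Compute a shipping cost; each branch is a potential weak point."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    # flat rate up to 1 kg, then a per-kg rate on top
    base = 5.0 if weight_kg <= 1 else 5.0 + (weight_kg - 1) * 2.0
    return base * 1.5 if express else base

# One test per branch, runnable with pytest or plain python.
def test_rejects_non_positive_weight():
    try:
        shipping_cost(0)
        assert False, "expected ValueError"
    except ValueError:
        pass

def test_small_parcel_flat_rate():
    assert shipping_cost(0.5) == 5.0

def test_heavy_parcel_per_kg_rate():
    assert shipping_cost(3) == 9.0   # 5.0 + 2 kg * 2.0

def test_express_surcharge():
    assert shipping_cost(0.5, express=True) == 7.5
```

Each failing branch shows up in the test run long before a user ever hits it.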
JRH: Oh, that's another very good point, because actually doing the testing will reduce the cost, basically, of finding defects and things like this, right? Especially if you're using automated testing. The more you can tighten that feedback loop of checking whether things are going the way you want, and the faster you can do that, the better, right? If you only find defects all the way out in production, that is a costly mistake, right? You can definitely reduce that by using testing. And you also mentioned another point that I think is very interesting, which is, you know, does it lead to good quality code, and so on? What do you think about that?
AS: So I think that quality is an interesting question in its own right; I'm not going to go too deep into the concept of quality. But I think part of the craftsmanship mentality is taking pride in your work and taking care to do it right. And for me, doing testing, at least being mindful of testing and knowing that it's something which lends value, is part of being a craftsman, in terms of taking that pride. Actually, I think the drug metaphor you were referring to before is quite useful in that respect, because if you take pride in providing something for other people's benefit, you wouldn't just want to rush it out and ship it out the door as fast as possible without knowing that it's actually going to benefit the people you want to sell it to. It's much the same in software. I think testing provides quality insofar as you're taking the time to ensure you're giving the customer the product they want. So yeah, good testing has a very positive impact on quality and ensuring quality; bad testing maybe not, but that's a different conversation, maybe for later. Yeah.
JM: That's a really interesting thing, I mean, how to define good and bad testing. The first thing that comes to my mind is what we said: you can test in the command line, really quick and dirty, and you know that it's working. But this, for me, is bad testing. I mean, obviously you test your solution, you make sure that it's working, or you're partially sure. But when you develop a battery of automatic tests, you assure yourself, and you also assure the rest of the team and the people that come after you. So one key aspect of testing is documenting the functionality. That's why, for instance, we try to use really verbose names for the tests, specifying what the goal of each test is, which functionality we want to put under test. This provides a lot of value. This, for me at least, is good testing, even though both are testing.
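The verbose, documenting test names Javier mentions could look like this. A hypothetical example (the `apply_discount` domain is invented); note how each test name states a behaviour, so the suite reads as living documentation of the functionality:

```python
import unittest

def apply_discount(price, customer_is_member):
    """Members get 10% off; prices never go below zero."""
    discounted = price * 0.9 if customer_is_member else price
    return max(discounted, 0.0)

class ApplyDiscountShould(unittest.TestCase):
    # Each name describes the behaviour under test, not the mechanics,
    # so a failing test tells you which business rule broke.
    def test_gives_members_a_ten_percent_discount(self):
        self.assertAlmostEqual(apply_discount(100.0, customer_is_member=True), 90.0)

    def test_charges_non_members_the_full_price(self):
        self.assertEqual(apply_discount(100.0, customer_is_member=False), 100.0)

    def test_never_returns_a_negative_price(self):
        self.assertEqual(apply_discount(-5.0, customer_is_member=True), 0.0)
```

Run with `python -m unittest`; the test report then doubles as a specification of the discount rules.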
JRH: And it also gives you that safety net, no? Personally, as a developer, I suck. I'm a very bad developer. And this is why I like testing so much, basically automated tests, because it allows me to, you know, check myself. I know that I'm not missing stuff, and if I am, I can catch it really quickly. And that provides me with the mental space to actually focus on other areas that are more important in that sense, right? So I think there is a mental benefit of stability, and then safety, in using tests in that sense.
AS: In fact, I really love that you brought that up, because of a demonstration I once did for a client about why testing is useful. Did you guys ever watch the movie "Free Solo"? About the guy who climbed El Capitan in Yosemite with no harness or anything. For me, test driven development is kind of like climbing with all the gear, right? And, this is quite opinionated, but not doing TDD is like having no gear and having to climb this huge mountain. At least if you're doing TDD and you fall, it usually gives you some indication. Kind of like, if you fell off a mountain but you had a rope attached, you're going to stop somewhere on the way down, you're not going to fall all the way to the bottom. Yeah, I love that idea of testing allowing you to free yourself from the fear of the task at hand. It lets you get your head out of the fear of "oh my God, what if" and just get on with the work. So I like that metaphor: TDD is the climbing gear, right?
JRH: And you can explore, right? You can try things out; if it doesn't work, you know, you can revert, whatever. And you can be sure that you didn't screw up at least the stuff that you're testing, no? So that's a good point.
JM: You especially notice that when you don't have it. I remember once when I arrived at a project in which there were no tests at all. One day we found that there was a mismatch in a number, like a random error. What is it called? A flaky error, a flaky test, something like that. It took like one week, because you had to dig into the code to see in which part the error was, and in the end it was a sum that was not properly done, because someone forgot to add one parameter. And the amount of time and money that cost the whole company... it's incredible. And it's really interesting: I have found that sometimes there is some competition between the product team and the rest of the project, and the developers know that they want to follow these good practices. So it's an interesting topic. Have you also observed these situations in which there is a lot of pressure, where they say don't test too much, or...?
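A toy reconstruction of the kind of bug Javier describes: a sum that was wrong because a parameter was forgotten at one call site. All names and figures here are invented; the point is that a single regression test pins the behaviour down in seconds instead of a week of digging:

```python
def invoice_total(net, vat_rate, shipping=0.0):
    """Total = net + VAT + shipping."""
    return net + net * vat_rate + shipping

def monthly_report(orders):
    # The original bug: this call omitted the shipping argument, so the
    # silent default of 0.0 was used and totals mismatched on some
    # invoices only, looking like a random error. Fixed by passing it:
    return sum(invoice_total(o["net"], o["vat_rate"], o.get("shipping", 0.0))
               for o in orders)

def test_report_includes_shipping():
    orders = [{"net": 100.0, "vat_rate": 0.2, "shipping": 4.99}]
    # 100 net + 20 VAT + 4.99 shipping
    assert abs(monthly_report(orders) - 124.99) < 1e-9
```

With this test in place, dropping the `shipping` argument again would fail the build immediately rather than surfacing as a mysterious number mismatch in production.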
AS: Yeah, absolutely. I have seen this, before I was working for Codurance, in a few companies I worked for, where that was frequently brought up. I worked for these companies which didn't necessarily see the value in testing, and they certainly didn't test drive; they tested in retrospect. And there was a phrase which was bandied about, which was, "why are you spending so much time on the testing, because the code already works", or something like that. And the arguments you have to push back with... it makes you feel awful, honestly. Again, this is all my opinion, but it makes you feel terrible when you have to just go along with someone telling you not to test because of time pressure, because you're kind of letting yourself down in a way. You know that there's a better way of doing it, and you know you're not being allowed to do that. It's not a pleasant feeling, for sure. There have been a few times.
JRH: There is this, you know, Corey Haines, one of the biggest promoters of, you know, the Global Day of Code Retreat and all this stuff, right? I remember watching one of his intros to a code retreat, and he said, you know, the difference between how you do things when you have all the time in the world and how you do things when you're under pressure, right? It is a measure of how much you suck, right? Because this is another argument: a lot of people say, "Oh, well, you're writing twice the amount of code; you're writing the code that you needed to write, and then you're writing more code." So that's taking away from it. But the reality is that a lot of the time it doesn't really make that much of a difference, right? If you really are proficient with it, the benefits definitely outweigh, you know, any cons that might be in there. And that's another aspect, right? And I think, going into the topic of how to build a culture of testing, now that we know all of the benefits, we talked about all the things, you know? Those arguments, that resistance to actually testing, and in many cases not just TDD but testing in general, it's one of those things that makes it sometimes harder to have that conversation, if you don't have the right arguments, or you're not proficient enough, and you're the one trying to push forward one of these changes. So, what do you think? Where would someone start when trying to introduce something like this, right, like to shape the culture towards a culture of testing?
JM: Well, I feel the culture should come from an agreement within the team, if possible, if the team has the independence to take on this issue; if not, it should come from the CTO within the organisation. In the end, I think it is related to seeing the testing as part of the code that needs to be developed. So the same way you need the form on the web page to look a certain way, or to have these fields, the product owner needs to ask for the tests that cover the functionality, the same way that he would ask for this button, or a new column in the database, or whatever. This should be asked for within the definition of the task.
JRH: Again, going back to the quality aspect of this, I think that should be owned by the team, no? The product owner definitely wants the product to behave the way that it should behave, right? Whether that happens with a manual test, or with an automated test, or whatever, that's a different thing. So I think, to that point, the product side of things, or the delivery side of things, should drive the main need, because a lot of the time it's again about speed, it's about delivery, right? But it's something that needs to be owned by the team itself; as you said, it needs to be an agreement internally within the team, as a way to do their job better. I don't know, what do you think?
...to be continued...