Lisa Crispin, Co-Author of Agile Testing Condensed
A Leanpub Frontmatter Podcast Interview with Lisa Crispin, Co-Author of Agile Testing Condensed
Lisa Crispin is co-author of the Leanpub book Agile Testing Condensed. In this interview, Leanpub co-founder Len Epp talks with Lisa about her background, donkeys, how she got into programming and software testing, the importance of psychological safety for developing high-quality software, her books, the difference between conventional publishing and self-publishing, arranging translations of a self-published book into multiple languages, and at the end, they talk a little bit about her experience as a self-published author.
This interview was recorded on August 31, 2021.
The full audio for the interview is here: https://s3.amazonaws.com/leanpub_podcasts/FM188-Lisa-Crispin-2021-08-31.mp3. You can subscribe to the Frontmatter podcast in iTunes here: https://itunes.apple.com/ca/podcast/leanpub-podcast/id517117137.
This interview has been edited for conciseness and clarity.
Transcript
Len: Hi I'm Len Epp from Leanpub, and in this episode of the Frontmatter podcast I'll be interviewing Lisa Crispin.
Based in Vermont, Lisa has worked in software testing for two decades, on projects for companies ranging from startups to big, established organizations, and she's currently working in the role of Senior Software Engineer in Test at OppLoans.
Along with her colleague Janet Gregory, Lisa is also co-founder of Agile Testing Fellowship, and you can check out their Donkeys & Dragons podcast on YouTube.
You can follow Lisa on Twitter @lisacrispin, check out her website at lisacrispin.com, and read her blog at agiletester.ca/blog.
Also along with her colleague Janet, Lisa is co-author of the Leanpub book Agile Testing Condensed. In the book, Lisa and Janet provide an overview of how to build a quality Agile software testing culture, including how to fit testing activities into the cycle of Agile software development, and how to get everyone on the team engaged, and much more.
In this interview, we're going to talk about Lisa's background and career, professional interests, her book, and at the end we'll talk about her experience in self-publishing.
So, thank you Lisa for being on the Leanpub Frontmatter Podcast.
Lisa: Thank you for inviting me. It's quite an honor to be on it, and you've covered such a wide range of topics already in your podcast. I can't wait to go listen to them all.
Len: It's an honor to have you here.
I always like to start these interviews by asking people for their origin story. So, I was wondering if you could talk a little bit about where you grew up, and how you found your way into a career in software development and testing?
Lisa: Well, I am originally from Texas, and grew up in Houston. I majored in animal science with a focus in beef cattle production in college. This was back before anybody was going to hire women to run ranches or beef cattle facilities.
So then I got an MBA in organization development, and I worked in research for a couple of years for the Engineering Extension Service. Then I got laid off, and it was like, "Well, dang. I need to be employed, and I'd like to move to Austin."
I wandered around Austin job hunting, and wandered into the University of Texas employment office and saw a big sign that said, "Programmer trainees wanted. No experience necessary." I said, "That's me." I think I had one computer course in college. This was really a long time ago, so -
Anyway, they had a great training program, and that's where I got into programming. Fast-forward more than ten years - a little more than ten years probably - and I got into testing, because I was working for a software company and our customers were getting really annoyed. This was when people called you directly on the phone. They're like, "How could you not have known that this release had this terrible bug in it?" That was really embarrassing.
So we asked the developers, "Could you give it to us a little before you give it to the customers?" We were in customer and tech support. "We'd like to try it out ourselves and see if we find any bugs." And so we started doing that.
Then we could at least be working on a patch by the time the first angry customer called. Our managers were like, "Huh, testing? That's an interesting idea." So we started a release and testing department, and I put my hand up and joined. I never looked back. I enjoyed being a programmer, but I don't know - for some reason, I enjoy the testing side of things more.
Len: That would probably have been back in the days when you first started in testing - when typically people did waterfall, rather than agile development. Is that correct?
Lisa: Well, the place I was working - I don't know what you could call it, just mostly chaos. But I have done waterfall. And actually, waterfall gets a bad name. I had a testing job in the early 90s with a database company, and we did a great job of waterfall. Our quality was very high. Because, guess what? We had huge - like 90-something percent - unit test coverage, and we had automation test coverage at the UI level as well. Testers and developers were involved in each project from the beginning, from the beginning phase of analysis and requirements. There wasn't a hand-off mentality. Everybody still worked together.
We had continuous integration, we had automated deployments. These things are not new. There just weren't very many people doing them back in the 90s. So when people say waterfall and really mean chaos - that's not fair. There's nothing inherently wrong with the waterfall process. It's just that you cannot compete in today's world moving that slowly. We were releasing every six months to one year. That doesn't happen anymore.
Len: We went ahead and introduced technical terms, without really setting the stage. So just for anyone listening - we will explain what these things mean, as we go on in the interview. But yeah, in the software development world - there's this sort of generally understood contrast between waterfall, which is - basically you can think of it as like there's somebody who sets out every step of a software development process. Which sounds like a good idea, until you realize that often that makes it difficult to adapt to changes when there's a problem with the plan. The person who's making the plan might not be the person who's had experience recently with actually doing software development, so they might not know.
The part of it that fascinates me the most - and I don't come from a programming background myself, but I've had some experience with clients and things like that - is, once you've got an instruction manual like that for what to do, it actually means you're going to have a command and control management structure, right?
There's going to be someone who commissions someone to make this plan. There's going to be someone who makes the plan. They're going to hand it off to people who are then responsible for making sure parts of the plan are delivered. Then they're going to issue like more and more finely detailed commands to the people beneath them. This is a very unwieldy and often very unpleasant structure to work in.
One of the reasons I just wanted to go into detail explaining it that way, is because I think a lot of people - both their first experience with software testing, if they've ever done it - but their understanding of what it might be, would be something like this. You're even further down the ladder, right, in that kind of thing.
And what happens is actually, I think - I forget who it was, it might have been David Greenlees who I interviewed years ago for the podcast?
But their first job was like just this terrible experience - as a tester, a terrible experience of like - you're in a cubicle, you're at the bottom of the ladder - and you get literally a checkbox list of like, "Go through this set of actions and report the result. And go through this set of actions and report the result." You're basically like a thing carrying out a discrete set of tasks that they haven't been able to automate yet. But testing doesn't have to be that way. And I'm sure we'll get into that.
Lisa: Well, the other side to that too, is - not only were you given all these instructions, but you were at the very end. And so there was always a hard deadline. You thought you were going to have a three-week testing phase, and instead it's like, "No, we're going to release tomorrow, can you just test today?"
Len: I think it's really important to say, the pressure - after all that work has been done, and all that money and time has been invested - it's really difficult to be the person to go, actually like you said, "How can you release this with a bug?" Which is a common sort of ordinary, everyday person's experience with any software that's broken or doesn't work. Like, "How could you possibly do it?" And it's like, well, there was this huge tsunami of pressure behind the person who found the bug who just maybe went, "I don't want to -"
Because one of the things that will happen too, and I'm talking way too much - but I once had the experience of being someone who was using software to do my job, but the software was still a work in progress. I clicked a button in a UI, doing exactly what I was supposed to do, and shut down the whole operation, right?
Lisa: Oh, God.
Len: I still remember there was the commotion. Everybody, the whole office, was shut down. Nobody could work anymore. I remember these two guys who I kind of knew came up to me and they were like, "The boss, Mike - is in the office shouting and angry at what happened here. And the person he's mad at is you, because you broke the system." It was my first real job, I was like 24 years old - and I just turned around and gave a profanity-ridden rant that said like, "I know what's happening here. There's no way that someone clicking a button should have been able to break it. It's your f-ing fault, and you can go fuck yourselves."
And off they went. I kept my job, and no one was ever blamed for that again.
But in any case - just setting aside all the colorful stories there - this is one of the things I wanted to ask you about.
I'm sure you get asked all the time, when people find out what you do - how is it that in today's day and age, there can be things like the Boeing problem? Which I talked about in a previous interview with Lena Wiberg, and which she talks about in her talks. When you're sitting around a dinner table and someone asks you that, what's your answer for how these huge projects can go so terribly awry?
Lisa: There are so many ways that they can go wrong. I really think in the case of things like Boeing and NASA, a lot of times it boils down to psychological safety. People didn't feel safe to bring up the problem, and it just got covered up.
In Modern Agile, which Joshua Kerievsky has promoted and educated people about, one of the prerequisites is psychological safety. People have to feel safe to point out the problem and ask the questions. I mean, that's one of the most important things - you can't feel like you're going to get in trouble if you, quote unquote, "break something," or you found something that's broken.
It usually boils down to culture and communication. Most software bugs are communication problems, and it's about not having that shared understanding upfront of, "Here's what we need to build, here's how we're going to solve a customer problem. Here's the software that we need to deliver in order to do that." While agile development is all about, "Oh, we don't want to do big upfront design, we don't want to have these big analysis phases at the front."
Still, we have got to have that shared understanding that we're all seeing the same vision of what we're going to create - paring that down to the absolute minimum to start with, so that we can get something out and start getting feedback on it. But that takes a lot of discipline, and it takes a lot of understanding of why it matters to invest in quality. Everybody wants quality, but they don't want to necessarily make the big investment in it. Those are usually cultural problems and leadership problems.
Len: Speaking of culture - we introduced the idea of waterfall, and explained it a little bit. But the other term that came up was agile, of course. I was wondering if you could maybe, for people who might not have heard about it - contrasting it with my cartoonish explanation of waterfall - what's agile?
Lisa: Agile boils down to delivering value to customers frequently - I think Elisabeth Hendrickson said this, I'm kind of paraphrasing her - without accumulating technical debt, and without making a big mess as we go that we have to clean up later, or that will drag us down. To do that, we have to use the good technical practices like - the original -
One of the early flavors of agile development was extreme programming. Poorly named. But there were a lot of practices that went with it. And, again - like I say, these weren't new practices. We were doing them in the early 90s. But test-driven development - we use our tests at the unit level to guide our development. We write a little tiny test for a little piece of code - then we write the code to make that test pass, and we build on it. There's also guiding development with business-facing tests - that gets into more of a tester job. The TDD is more the coder's job.
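To make that test-first rhythm concrete, here is a minimal sketch in Python (pytest-style); the cart function and test are hypothetical illustrations, not examples from the interview or the book.

```python
# A tiny test-first sketch (hypothetical names). In real TDD you would write
# test_adding_an_item_increases_the_total first, watch it fail, then write
# just enough of add_item to make it pass, and repeat for the next behavior.

def add_item(cart: dict, name: str, price: int) -> dict:
    """Just enough code to make the test below pass."""
    return {
        "items": cart["items"] + [{"name": name, "price": price}],
        "total": cart["total"] + price,
    }


def test_adding_an_item_increases_the_total():
    cart = {"items": [], "total": 0}
    cart = add_item(cart, name="book", price=25)
    assert cart["total"] == 25  # run with: pytest this_file.py
```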
Now we're getting more to the tester's job, to get people together. "Let's get a product person, let's get some developers, let's get designers - whoever we need together - and talk about this feature we want to deliver. How will we know when we're finished? Let's write some acceptance tests to guide us - so that we know when we have written enough code, to deliver that value to the customer. Try to break that into the smallest increments that we can, to lower the risk."
If we make a small change and something happens in production - we know exactly what we did to cause that problem, so we don't waste time trying to analyze what happened. So, that moves into continuous delivery. That's part of agile development - every time we make a change, a small change - we test it, we're happy with it, we think it solves a customer problem, "Let's get it out in production and let's get the feedback from customers using it." If it doesn't work for them, we turn it off. So yeah, I guess that's kind of it in a nutshell. The small, incremental, iterative delivery of value.
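One common mechanism behind "we turn it off" is a feature flag. Here is a minimal sketch in Python; the flag name, the in-memory flag store, and the checkout functions are all hypothetical (production teams typically use a config service or a product like LaunchDarkly rather than a dict).

```python
# Hypothetical feature-flag sketch: ship a small change, watch it in
# production, and flip the flag off if customers have problems with it.

FLAGS = {"new_checkout_flow": True}  # set to False to "turn it off"

def is_enabled(flag: str) -> bool:
    return FLAGS.get(flag, False)

def legacy_checkout(prices: list[float]) -> float:
    return sum(prices)

def new_checkout(prices: list[float]) -> float:
    # The small change we just released; still reversible via the flag.
    return round(sum(prices), 2)

def checkout(prices: list[float]) -> float:
    if is_enabled("new_checkout_flow"):
        return new_checkout(prices)
    return legacy_checkout(prices)  # safe fallback while we gather feedback

print(checkout([19.99, 5.00]))  # 24.99 with the flag on
```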
Len: It's so fascinating. I mean, the changes in the industry over time, right? At the beginning of software, basically - I mean, when we're thinking about the personal use of software, right? You went to the store, and you bought something, like a disc, and you put it into your computer. So, releases happened rarely. Because you have to print or create - like if it's a disc, a CD or whatever - you have to basically create this product. You have to ship it out. Stores have to put it on the shelves. You're paying for where to put it on the shelf. You've paid to put your logo on the box, and what have you. If something's broken, it's terrible.
But with continuous development - and this started happening years ago, and a lot of people, of course, would take it for granted now - you can just make a change, put it into production - as they say - and make it live for everybody all around the world using it. Whether it's an update, "Update me now," or it's actually updates just happening in the background. But in order to operate that way, you can't be having to go through a really big, structured, heavy kind of process in order to adapt.
Lisa: Right.
Len: That's where the metaphor of agility partly comes in.
Lisa: Exactly, yeah. It's keeping everything very lightweight. A lot of it is about transparency and visibility. Anybody in the business can understand at a glance, "What's going on? What are the development teams working on? What's the status?" Just that visibility is really important too. So that - unlike the bad old days, where we've been working on something for six months - and then we're supposed to release tomorrow, but now we have to go and tell the CEO, "Oops, sorry. We're not ready." If we can tell them, "Yeah, we hope to release it next week," and then tomorrow I discover, "Oh, we've got a problem that's going to take us a little more time" - now they've got more time to deal with that. There are no unpleasant surprises at the very end.
Len: Actually this reminds me of - it's a bit of a sideways movement in discussion, but we're talking about doing something - dedicating a lot of effort to something for a long period of time, and then going out into the world and realizing, "Uh oh, maybe it doesn't quite match what I expected it to be," or, "The world's changed since then."
One of the questions that often comes up in various formulations on this podcast, is - depending on a person's background - and the way it'll go for you is - so, you didn't do a four-year Computer Science degree, but you ended up in a career in software development and engineering. If you were starting out now with the intention of having a career like the one you've had, would you want to do a four year Computer Science degree?
Lisa: I personally would not. I don't enjoy coding that much. I can drive a car without knowing how the engine works, and I don't really want to know how the engine works. It's kind of interesting, but - a big component of any successful business - and this has been proven with data - is diversity. When we have a diverse group with lots of perspectives, we have a better chance of solving our customer problems.
Lots of people can be good at testing. But I feel like as testers, we usually have a fairly unique viewpoint. There are things we bring to the party that not everybody brings: asking those "what if?" questions that other people didn't think to ask, thinking of those nightmare headlines and things like that. "What don't we want to happen? What don't we want to see on Twitter right after we release this to production?" I mean, it's -
I personally - if I were assembling a team - I'd want some liberal arts graduates, I would want some music graduates. In fact, when I started out as a programmer at the University of Texas at Austin data processing division, I think four of the people on the team of about 30 were PhDs in Music. It was really interesting. Or people who majored in languages and stuff. There's just something about the pathways in their brain that are a little different than the rest of us. We need all those perspectives, and we need people from all kinds of cultural backgrounds and educational backgrounds and socio-economic backgrounds - whatever it is, the more diversity you have, the better.
Len: I could not agree with you more. My background is in English Literature.
Lisa: There you go.
Len: I ended up in investment banking right after that.
Lisa: Wow.
Len: And, well - there were lots of people in my team who were - their background was in chemistry or bridge-and-tunnel engineering, or mathematics, or anything like that. The idea that - I mean, I don't want to go down this path. But basically, conflating education with job training, is a huge mistake. They're not the same thing. Someone who's worked hard, whether it's in a formal institution to get a degree or not - someone who's worked really hard to understand something complex, has a lot of skills and abilities that are going to apply to a wide range of areas. It is one of the reasons I would say that diversity is actually so, so important, right? Because if everyone on your team has been funneled through a very specific kind of grinder -
Lisa: Yeah.
Len: You just really aren't going to have the broad range of insight and experience and knowledge that you need to tackle broad problems.
Lisa: It's really important to not get into a groupthink mentality. If you have a really homogeneous team, you start having blind spots. We all have cognitive biases, and that's a big problem. My hope is that we all have different ones. I'm not sure? I don't have any science to say that that's true. But because diversity is proven to be helpful, and companies with diversity make more money, etc., I think it must be true that when we're all working together, you're able to see something I'm not able to see - and then when you point that out, I'm able to see something you weren't able to see. So, I think it's really important.
Again, going back to communication - like I say, most software bugs are miscommunication in some way. We didn't get the right requirements. Or we didn't even really understand what the customer wanted, because we didn't ask the questions the right way.
That's why liberal arts degrees are so important. People who are good at writing, people who are good at speaking, people who are good at listening - we need all those skills. They're really important. Companies send their programmers to training in all these coding things, but who sends them to training in communication? I don't know, not very many.
Len: I could talk about this forever. One thing that's really interesting, from our everyday language - is the metaphor of reading a room, for example, or reading a situation. It's like actually like thinking hard about reading and what it is, makes a huge difference to your assessment of any particular situation and what to do next, right?
Actually, there's something very specific that you talk about in the book - you and Janet - about having a shared definition of "valuable" and a definition of "done." Which we'll get to talking about in a little bit. But that idea of actually explicitly defining terms, right? The thing you learn to do at the beginning of any debate, in debate club or whatever - it's this crucially important step for any kind of group activity, that in ordinary everyday interactions we just pass over, as though it can't really be that important to get our terms right. It's, "Oh, you're being pedantic," or something like that.
It's like, "No, nothing could be more important for the long term health and success of any project than actually getting your terms defined."
But before we actually go on to the next part of the interview and talk about the book, I wanted to ask you about something specific. Which is, I mentioned that your and Janet's podcast is called "Donkeys and Dragons." The reason there's donkeys in there, is that you have some donkeys yourself.
Lisa: Yes, we do. We have a little 31-acre farm here in Vermont, and we own three donkeys: two miniature donkeys, and a standard-size donkey. But we also have a permanent guest miniature donkey who lives with us, so there are four donkeys frolicking around the field. Our donkeys all do work - we drive them in carts and wagons, they haul brush and get logs and do work around the farm. They love to work. I've learned a lot about agility from them, because donkeys completely work off trust.
I grew up riding horses, I had horses all my life. With the first donkey I tried to train, the techniques I used with horses were not working. I finally met somebody who had decades of experience training donkeys, and he's like, "The first thing you have to know is that the donkey thinks you have to love him more than anything else." I'm like, "That sounds weird coming from this cowboy who's this horse trainer." But it's really true. If they trust you, they will do absolutely anything for you. If they don't, that's why they get the reputation for being stubborn. You can bully them, you can bribe them - if they think that what you want them to do is not in their best interest, they will not do it.
Len: How do you establish trust with a donkey?
Lisa: Very carefully. They start out with trust - but if somebody's abused them or whatever, they lose trust. Part of it is through their stomach - feeding them. Spending time with them, showing them that you're going to keep them safe. If they're scared of anything - get them away from the thing they're scared of, or get that thing away from them.
Over time, they do learn to trust you. Then, it's really important to keep the trust. Like when I drive my donkeys, I need to make sure that their harness and collar are fitting correctly. If it rubs a sore, then they're going to say, "Hmm, that didn't feel good, and I don't know if I want to work with you anymore." So I have to be really careful what I do. Because it's really hard to build it back up if you break it.
Len: Sorry for the very specific question, but I love animals myself -
Lisa: Oh, good.
Len: I don't have much experience with donkeys, but do they get along with other animals too, like dogs and cats and stuff like that?
Lisa: Oh, yeah, they do. My donkeys have lived with goats and llamas and horses, of course. I mean, I've seen birds stand around on their backs. I've known cats that get on donkeys' backs. My donkeys do love dogs, but one of my donkeys is very playful - and so he'll chase a dog or a cat. He's only trying to play, but they don't necessarily know that. Something bad could happen, it could end in tears. So we keep the dogs and cats away. Of course, big dogs or coyotes or predators - that's not a good thing. That's why I have a standard-size donkey, because the little ones can't defend themselves. So she keeps all the coyotes away, the big one.
Len: My next question before we go on to talk about your books, is - one thing I started doing much longer ago than I wish it were, is introducing a little section where we ask people about their experience of the pandemic.
Lisa: Oh, okay.
Len: So we've got a little archive of people's experience from different industries and professions and levels of career, and all around the world. Somebody was on the last flight out of London to Poland when they were allowed to go if they weren't a citizen kind of thing, and people who've had COVID.
Lisa: Oh, wow.
Len: People who thought they did. Some people live in amongst other people. I remember interviewing someone from London who was like, "I can't get to Hyde Park without going down these narrow streets."
Lisa: Wow.
Len: Being concerned about that. So, you live on this beautiful farm. I was just wondering if you could talk a little bit about what your experience has been like?
Lisa: I feel extremely lucky to have been in Vermont during the pandemic, for lots of reasons. We are in a rural location, it's pretty easy to self-isolate. But Vermont is very - we have, I think, 635,000 people in the whole state. So that helps. But what also helps is - it's just a culture of people are generally very nice, and willing to follow rules that our lawmakers set out, or that businesses establish. "Don't come in here without a face mask." "Okay." We're just people who get along. So there were not the issues of people saying, "I'm not going to wear a mask. I'm not going to get vaccinated." We were the first state to get to 80% vaccination rate. So yeah, I think a lot of hippies and draft dodgers moved here in the 60s and 70s, and it just changed the culture.
Also, we're surrounded by farms. All these farms are organic, sustainable agriculture, humane treatment of animals. We didn't have to go to the grocery store to get food. We could get all the food we need - and still do, from the farms around us.
It was interesting how they all - it used to be you'd go to the farm and just talk to the farmer. Or you'd go to their farm shop, which is on the honor system. You took what you wanted and left the money, or they had your credit card on file. Well, with the pandemic rules, they had to get a little more sophisticated. It was shocking how quickly they got Squarespace or whatever, made a website. So you could order ahead, and they could have it all ready for you.
They pulled together. The farms that were bigger and more established with their farm stores, helped the other farmers who didn't have that, and then they put all that stuff on their website and you could come and buy it. It really changed how they were able to sell. I think it was actually really good for the small farmers here. So yeah, I mean, it's been - we've been very lucky.
Also with the donkeys, it was a social opportunity. Because people would want, "I'd like to come walk your donkeys with you," because we take them for walks. Friends would come over, we're all wearing a mask, we can stay socially distant - and walk four donkeys down the road, and talk as we walk and have a good time. It was just a nice outlet for people.
Len: Thanks very much for sharing that. You reminded me of a friend of mine in Montreal, who told me that they heard that there were farmers in the surrounding area, I think, who were actually renting out time with their cows.
Lisa: Aww.
Len: So you could drive out and hang out with the cow. For like 50 bucks, you could hang out with a cow for an hour. Which, I mean -
Lisa: That is really nice.
Len: Especially when people are lonely, and maybe haven't seen their family for such a long time - being able to just get out of your house, go somewhere safe, because it's outside - and then just interact with a friendly animal.
Lisa: I love that.
Len: A wonderful thing. Actually, just on that - specifically though, I imagine, you were probably already working remotely in all your positions before that?
Lisa: Oh yeah, workwise - I was already working remotely. When we moved to Vermont, it was with the idea of, "We're going to live in this rural location where it's easy to have the donkeys." Because commuting to a metropolitan area and affording acreage to have donkeys on was not compatible. So, I was already working remotely.
But it did open up a lot more job opportunities for me. Because now - the company I'm working for now, they didn't have remote people before. So it definitely opened up a lot more opportunities, especially in places where maybe you're the only remote person, or there are a couple of remote people - that's very difficult to deal with, because it takes the people in the office a lot of discipline to remember to include you and have a Zoom call. Or, then, you have all these audiovisual problems - where we're in a conference room and they've got their camera and microphone on, but I can't hear them, or they can't hear me. That's a nightmare. When everybody's on Zoom, the playing field is level.
Places I've worked before had already come to realize that if everybody wasn't co-located, it was better just to have everybody get on Zoom, and level the playing field. But now, lots and lots of companies have discovered that. So, lots of silver linings to the pandemic.
Plus being able to go to all these meetups that are now online, and listen to talks - and lots of people did webinars, so I got to learn a lot as well.
Len: One of the many fascinating things to see over the last - I mean, 18 months or whatever it is now - has been, as you say, so many people in so many different industries and walks of life innovating and adapting, even in very conservative industries like law or medicine. I actually interviewed a Leanpub author a while ago who works for an American defense-related company. She said that it's getting easier to say, "We might want to have remote workers, rather than everybody having to come to the building every day." That way, the opportunities aren't just, as it were, for the people getting the jobs - it's also for the people who are looking for people to fill them.
Lisa: Yeah, yeah.
Len: Their opportunities as well. Now, all of a sudden, the best candidate in the world is available to you, because you've changed a - and it's not just as simple as changing a policy - especially in something like defense or medicine, or whatever -
Lisa: Right.
Len: You're going to have to change practices, too, around security and things like that, it's very important.
Lisa: Exactly.
Len: But it has been amazing to see the way so many things have changed.
Lisa: Yeah, and it's pushed the technology forward as well. I mean, there've been so many improvements to accommodate all that.
Len: Oh, yeah. The one that fascinates me the most - I've always been one of those people who thought commuting was absolutely crazy. To see a shift in that attitude - when something is purely a convention, but people regard it as something like a natural law - to see the tectonic shift towards what, once you've seen through the veil, is just common sense, is really fascinating.
Lisa: Yeah.
Len: Just moving on to the next part of the interview. We're going to talk about your book Agile Testing Condensed.
The first question I'd like to ask you about that, is - you've had this collaboration with Janet for quite some time now. Could you tell us a little bit about how -? Also, I don't want to talk about your book without - and your other work without talking about Janet. How did you two get together?
Lisa: Well, back in 2000, I joined my first extreme programming team. Extreme programming was originally a process for small teams to develop software. The idea was that your customers, your business or product people - sat with your small co-located team, and developed the software in these small increments and short iterations. The original publications about extreme programming were all about testing and quality and people, but they didn't mention testers.
When I read Kent Beck's Extreme Programming Explained, I was like, "We've been trying to make this work as a waterfall, but this is really what we need to do." It's all about people and it's all about quality, and I was keen to do it. I got some friends who'd started a start-up and were going to do it to hire me as a tester. Then we're like, "Okay, what's the tester going to do? Because we're doing test-driven development. That seems like everybody's doing testing now."
So, Brian Marick, who's somebody in the testing world, who was also getting into the extreme programming and agile world - he knew Janet, so as a mutual friend, he introduced me to Janet. He's like, "Oh, you all are kind of doing the same thing, and you should talk." She was doing a similar thing up in - she's based in Calgary.
So at that point, I thought, "We need a book for testers on extreme programming teams, and somebody should write this." So, I ended up writing it with Tip House. And Janet was our tester.
We would write chapters of the book and send them to Janet - "Okay, what do you think of this?" She'd say, "Let me try that with my team." Then she'd report back. She tested out the techniques, processes, and practices we were recommending. That's how we got started collaborating.
We ended up starting to do some writing together, and some presenting at conferences together. When I was asked to write a more general book on agile testing, not just extreme programming - Tip didn't want to do another book, so I said, "Janet, how would you like to write a book?" I had to work on her, really talk her through it. But it's been great ever since, and it worked out well.
Len: I think in a little bit, we're going to talk about your experience actually writing those two books for a conventional publishing company and things like that. But the book Agile Testing Condensed that you've got on Leanpub - who is the audience for that book?
Lisa: It's anybody interested in testing and quality on an agile team. The audience is everybody involved in delivering software. Because we really have seen, over the past 20-plus years, that it takes the whole team getting engaged in testing, and thinking about how to build quality in, for it to work.
It doesn't work to have just a separate tester or testing department trying to cover all the testing. There's an old saying in the testing world, "You can't test quality in." So, we have to be thinking about quality from the get go, starting with that shared understanding I talked about.
Testing feature ideas - are we solving the customer problem? Testers are important at that stage too. But getting everybody - asking the questions - getting everybody talking, having conversations. Using various models. There are lots of different models. Janet and I really have found our agile testing quadrants useful, originally developed by Brian Marick - and he let us adapt them and use them.
What's the business facing quality? What's the technology facing quality? How are we going to guide our development? How are we going to evaluate whether our product is meeting customer needs? Thinking about all those aspects of it, and planning our testing around that. We found those things really helpful.
Len: I imagine the book would also be for executives and people like that too, who want some visibility into the way things are working underneath them.
Lisa: We wanted to write a really short book, because we really hoped managers and executives would take time to read a short book about testing and quality. Our first couple of books were 550 pages each, and this one was 100. So, fingers crossed - but we've had a lot of good feedback from people in all kinds of different roles who've read it and found it very, very helpful.
Len: It's really interesting to - to people listening, it might seem common sense that you'd want to have a whole-team approach, and get everybody together when you're developing a product. But a lot of people find that really counterintuitive, right? There should be someone at the top who's issuing commands to the people beneath them, who's issuing commands to the people beneath them. Then you can particularize the responsibility that everyone has all the way along the chain of making things, and deciding how to proceed, and stuff like that.
If there were some executive out there listening to this episode of the podcast, going, like, "Well, what? It all sounds a bit kind of like -" I don't know? "Warm and fuzzy, to talk about getting everybody to talk. Isn't that just going to end up in a lot of wasted meetings and no responsibility lying anywhere?" If there were a skeptic listening to this, what would you say to them?
Lisa: We can save a lot of time by having these sessions together, and improving our communications. Janet and I both promote using a lot of visuals - virtual whiteboards, virtual sticky notes, and visual models - to help guide our conversation, so that we don't waste time. Because if you just start talking around, waving your hands, you can get bogged down. But if you're drawing a flow chart, or you're brainstorming with sticky notes and affinity diagramming those sticky notes - "Oh, here's the area we want to focus on."
Or if you're doing story mapping. To find the - "How are our users going to use our feature, and what's our minimum viable product for that feature? What can we get out first, to get information back and learn?" Having all the visuals to enhance the communication is really, really important. It means we're going to have less rework, less wasted effort. Our time - from thinking about what feature you wanted, to getting that feature in production - is going to be much reduced.
Len: I think that's a really great answer. "We're going to save you time."
Lisa: Yeah.
Len: "This is going to mean we're going to make more money." There's the answer.
I'm drawn to the idea of having shared material, like shared visualization. Things like that. It's not - the point of that is not to give each individual their own sense, it's to give you a shared understanding of like, "This is what we've committed to. This is how we've agreed to depict what we're doing. This is how we've agreed to talk about what we're doing." I actually think that that's one of the things - having everybody do everything on Zoom, has actually probably improved a lot of productivity in a lot of places.
Because it means - a lot of the stuff that's inevitable that happens in meetings, you can't get away with - to put it in a kind of adversarial way, right? Like, "What are we really doing here?" If you're standing in your living room, it's a lot harder to do the theatre of like, "Oh, some work just got done." But also being in disparate places, and not being able to have the watercooler, "Let me catch you up on what happened," kind of thing, means you actually have to have a shared document. I think at Amazon, famously, you can't arrange a meeting without writing a memo that everybody has to read before they go.
Lisa: Right.
Len: Having shared visualizations, these shared documents, these shared definitions - things like that - can be really important.
On that note, with respect to definitions, I wanted to take the opportunity to ask you about some specific things that people who work in software might hear from their testers, but don't really know what they really mean by it.
Lisa: Okay.
Len: We've got Lisa Crispin here, why not ask? What is a regression test?
Lisa: A regression test is done to make sure that as we change our code or change our product - or maybe the configuration, or as we add new features - we don't break existing features. We don't unintentionally break the behavior the users are already relying on.
Regression testing just makes sure that all those things work as they did before. It doesn't typically find new bugs. If you're manually regression testing, it may be that you notice something that's new. But automated regression tests, generally - all they tell you is, "Yeah, you didn't break anything that was already there. There might be bugs in what you did, we don't know that. But the stuff that's already there still behaves the way you expected."
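For readers who haven't seen one, here is a hedged sketch of what a small automated regression test might look like in Python with pytest; the discount function and its rules are hypothetical.

```python
# Hypothetical automated regression tests: they don't hunt for new bugs,
# they just re-verify that existing behavior survived the latest change.

def apply_discount(total: float, code: str) -> float:
    """Existing production behavior we don't want to break."""
    return round(total * 0.9, 2) if code == "SAVE10" else total


def test_discount_code_still_works():
    assert apply_discount(100.0, "SAVE10") == 90.0


def test_unknown_code_still_changes_nothing():
    assert apply_discount(100.0, "BOGUS") == 100.0
```

Run on every change - say, in a CI pipeline - a green result only says the old behavior is intact, which is exactly the limited but valuable signal Lisa describes.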
Len: Okay, so make sure your product doesn't basically go back in functionality?
Lisa: Right, right.
Len: What's exploratory testing?
Lisa: Ooh, that's a hard one to define. It's learning about your product or your feature by exploring it. Elisabeth Hendrickson explains it the best - it's like you're an explorer, like you're Lewis and Clark, and you're going out to discover the west. You have some materials, you have some resources - you've got your compasses, you've got your maps, you've got your botany guide, you've got your Native American interpreter - or whatever. You've got some tools. You have a charter. You've been told, "Go out and catalogue all the flora and fauna. Find a way to -" whatever they were trying to find.
You've got some missions. With exploratory testing, you take your resources and think about - what is it you want to learn, what do you want to discover?
The way I like to do it, it's just a time box, "Okay, I'm going to focus just on this part of the feature, and I want to learn about this specific area." As we're doing that, you're trying to keep an open mind. This is something that I find best done in pairs or larger groups, so that you have more people observing.
Just keep an open mind: "Oh, there's something I didn't expect." We're looking for the unknown unknowns that we couldn't think of as we were trying to get the shared understanding of our feature - there were just things we missed, we didn't think of those things. We need to do more. We need to add some user stories, to add some things to that feature. It's a kind of learning. It's somewhat structured, but it's not writing down a bunch of test cases and going through them.
Len: It's super interesting, you mentioned working with someone else. Just to give people an image of what this looks like - is this two people sitting in front of a screen with a keyboard and a mouse, and kind of clicking away on something - trying to find a path to a product? Or is it perhaps, that - and in addition to that - looking at maybe diagrams of how something's supposed to behave, and then analyzing those?
Lisa: At this stage, usually, we've already verified that the feature meets its requirements. As far as we can define in advance, it does those things. This is more about looking for the unexpected, or the things we didn't think of.
The way I like to do it is - with two people, to do what's called "strong-style pairing." If we were doing this in person, we'd each have our own keyboard, but we'd probably have shared monitors that are mirroring each other.
It's really easy to do remotely. You have a driver and a navigator. The driver just shares their screen, they've got their hands on the keyboard - and the navigator tells them what to do.
The idea behind it is - the navigator shouldn't type, that everything has to go through the driver. That frees the navigator up. It's like riding shotgun in a car. If you're driving, you can't pay attention to everything around you. But if somebody else is driving, you can look all over and tell that person, "Slow down, I need to look at something more." Or, "Turn here." Then, switching that pretty often, every few minutes. Switching those roles.
It really works well in a group, as well. Where I work, we do ensemble programming, or, what is also known as "mob programming." We do the strong-style pairing, where we have a driver, a navigator. But we switch those roles around the whole group, and that gives you a chance to have a diverse group of people. Maybe you've got somebody from customer support, maybe you've got a product person, maybe you've got a designer, an operations person? You can have all these different people in the room - and they're all looking for something different, based on their interest and their knowledge. It's super, super effective - and you get a lot done in a really short time.
Len: So in this example, specifically what's on the screen is the code?
Lisa: Well, no. It's like - well, I test web apps - so you'd be testing your web application or whatever.
Len: Okay.
Lisa: I mean, you can code that way as well - but I'm talking specifically about testing.
Len: It's so interesting. I keep invoking the somewhat patronizing figure of the ordinary person who doesn't know how it works, right? But to go ahead and do it again - a lot of people think that what programming means is, you're alone in a cubicle or something like that.
Lisa: Yeah.
Len: But actually, it's a lot of group work.
Lisa: Well, it works better that way. A lot of people go into programming, because they think they can hide in a corner and never come out. That works in some organizations, and maybe that works for some people? But it's really interesting right now where I work, we've been measuring the cycle time of - it's the time from when you start working on a new piece of a feature, to when you get it in production - based on the size of the group working on it. Anywhere from two people up to eight people.
Right now, our current data shows that having a group of five people is maybe twice as fast as a group of two or three people, which is really counterintuitive. But that's what our data is showing so far. It's not what we expected to happen. But part of the reason is, you're just reducing the rework. Because if there is a problem - if there is a bug in what you're doing - somebody's going to notice it right then. As opposed to handing it off to a tester - and a couple of days later, they come back and say, "I found this bug." "Oh, I don't even remember what that was. Let me get back into this code and see."

Len: It's so interesting how these subtleties get baked into the language. Like the concept of rework - as opposed to fixing, for example. You know what I mean? If it were me, I'd be much more motivated to avoid rework than to avoid fixing. Putting it to me like, "You're going to reduce rework," is like, "Yeah, of course I want to do that." Having testers and people who understand the impact on different aspects of the business and the user and stuff like that - I mean, the whole-team or holistic approach just makes a lot of sense.
Lisa: Well, on the other hand - a lot of times, when I've worked on high-performing teams, there has been an issue where - wow, we really thought we were killing it, and we cranked through and delivered a whole bunch of stuff to our product people. And they're like, "No, that's not what I wanted." Again, that's where I think testers come in a lot - making sure that we understand what they want. Because it's really painful when you think you've done a really good job of something - there are no bugs in it, it's totally solid - but it isn't what they wanted, and you've got to redo it.
Len: That reminded me of a line from the book. I think it's that "testers are the glue on the team."
Lisa: Oh, that's nice, yeah.
Len: Was that all right, to make that connection?
Lisa: Yeah, I think that is a really good way to look at it. I think one of our superpowers is, we get the right people together when we need to.
Len: Right. I would imagine too that, partly - maybe as a tester or a group of testers, they see all the moving parts, or at least know where the dependencies are. I think you talk about visualizing dependencies in a recent blog post. Even if you don't know exactly what's at the end of the line, you know that there's some dependency down there that someone else knows about - and you can see the connections between two paths, or something like that.
Lisa: Right, yeah. When you're programming, you've got to really be down, narrowly focused on the code you're working on. But as a tester, you have the freedom to back up and see the bigger picture, and see the application or the product more as the customer will see it.
Len: My last question, before we move on to the next part of the interview - I guess there's a bit of a pun here. But there's a definition of "done" that you talk about in the book, and coming to an agreed definition of done. Before we're done with this part of the interview, I wanted to ask you - what is a definition of done? Can you give us an example, if you were working on a team, of what a definition of done would be?
Lisa: Part of our definition of "done" might be, "Well, we want automated regression tests at the unit level, down at the basic code level; at the API or service level, kind of in the middle, for the components of the code; and through the user interface, if we have a user interface - whatever the user interface is." We've got some automated regression tests, we've done exploratory testing, we've done accessibility testing, we've done security testing.
The product people have seen it, the designers have seen it. A checklist of things like that, to make sure we've thought of doing all those things. Depending on what you're building, it might be important to do performance testing or load testing. There are so many different kinds of testing activities. Does it work on all the devices where it should work? It's highly dependent on what you're working on. But just make a list, and make sure it's not going to come winging back because it needed to work in a certain way on a certain device, and it doesn't.
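As a toy illustration only - not something from the book - a team's agreed checklist could even be made executable, so nothing ships until every item is checked off; every item name and the check below are hypothetical.

```python
# Hypothetical executable "definition of done" for one story.
DEFINITION_OF_DONE = {
    "unit-level regression tests automated",
    "API/service-level regression tests automated",
    "UI-level regression tests automated",
    "exploratory testing session held",
    "accessibility testing done",
    "security testing done",
    "product person and designer have reviewed",
}

def is_done(completed: set[str]) -> bool:
    """A story counts as 'done' only when every agreed item is checked off."""
    return DEFINITION_OF_DONE <= completed  # subset check

print(is_done({"security testing done"}))  # False: most items are still open
```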
Len: It seems like - just having a definition of done, would probably also serve the role of - I mean - there's no end to the potential testing you can do for anything, right?
Lisa: Right.
Len: You sparked me thinking about that when you talked about security, right? Having a shared definition of when we're done and it's okay to put it out into the world, is also a way of saying like, "If something did go wrong, it's like we're all jointly responsible for it."
Lisa: Yeah, that's a good way to put it.
Len: "We all agreed that this is what 'done' meant. Maybe we need to change our definition of what done meant?" But even whoever made the mistake or missed something, right? It's like, "No, but we, as a group, we decided we were ready to go, so we're all responsible for that."
Lisa: Right, yeah. I think - isn't there some saying that "perfection is the enemy of done" or something, I don't know?
Len: Yeah, something -
Lisa: It happens when you're writing a book too. Because people always say, "Well, how long did it take you to write the book?" Like, "Oh, well. Fifteen months or whatever." It's like, "Could you have made it better?" Like, "Of course. But at some point, we had to get it out to the audience."
Len: You've just given me the perfect podcast guest segue -
Lisa: Oh good.
Len: Into the next part of the interview. We're going to talk about your experience writing and publishing, and working on books.
Maybe the best way into this discussion would be to say, you mentioned you wrote a couple of long books. They were for a conventional publisher, with a brand name recognizable to people who read programming books and stuff like that. But then for this book, you and Janet decided to self-publish. I was just wondering if you could talk a little bit about your experience of the difference between those two ways of writing and publishing? What your experience was like?
Lisa: Well, I think it was actually helpful to have worked with traditional publishers before. Because we had the experience of working with really good technical editors and copyeditors, and learned what is needed to put together a quality book. We were able to know the pros and cons of those different ways of publishing. For the second book Janet and I did, we considered self-publishing - and had a long talk with our publisher about it.
The reason that we ended up going with a traditional publisher for that book, was we had a big audience in China and it's - I have not discovered yet how to get a self-published book into China. Very few publishers even can do it. But the publisher we were working with is able to sell books in China. If that's important to you -?
But conversely, when you go with a publisher, they get to decide when the book gets translated into other languages. We have audiences in countries all over the world that were always like, "Well, can't you translate your book into Spanish or French or German?" "Well, no because our publisher won't do it." But this gave us the freedom to do that ourselves, or to very easily have people we trusted to do it, and to share the royalties with us.
Then there's just the timeframe - traditional publishers are slow. Maybe they're faster now, I don't know? But we just wanted to get 'er done. We knew we could hire our own copyeditor, and we knew how to do it. The platform made it really easy.
Len: Thank you for bringing that up. Actually, the concept of translation, and regions, and selling into different regions and stuff like that - I think people who haven't come through the traditional publishing process might find it totally surprising that "I can't sell into a region of the world without selling the rights there" would be the view that a publisher might have.
Lisa: Yeah.
Len: There's this whole web of legal constructs that the publishing world kind of is, that you become a part of when you do traditional publishing. The sort of, like - I always use the metaphor of like cowboy. Self-publishing is just, "What do you mean? I just put it on the web, anybody around the world can buy it. Why would I have somebody I need to negotiate with to sell it in one part of the world over another? This doesn't make any sense."
But to the - I call it the "old way of doing things" - it absolutely makes sense, and there are lots of deep reasons for that.
You mentioned translations, and how translation is important to you. One thing we've been really interested in watching on the Leanpub side of things is how these translations of your book have progressively come out - in Korean and Japanese and Chinese and French and -
Lisa: Brazilian Portuguese?
Len: Brazilian Portuguese, yeah. How have you and Janet managed that process? Did you put out a call for translators?
Lisa: No. People came to us, actually. And by the way, the Thai version is coming out next month.
Len: Oh, fantastic.
Lisa: German and Spanish are not far behind. Janet and I were so lucky - we've spent twenty years going to conferences all over the world. Janet's a consultant, so she's consulted and done training all over the world. So we've made all these friends and people that we work with closely, and that we can really trust. They've come to us and said, "Oh, we really want this book in Japan, can we translate it into Japanese for you?"
It was so easy to do with Leanpub, because they just become an author on the translation, and then we can determine how to split the royalties. Of course, I say it's painless, but Janet does all the heavy lifting on that. Even she says it's just not that hard, and she's been able to help all of our translators get going on Leanpub.
You do have to trust the people doing it. The only translation I could read really well was the French one. But we've had other people volunteer: "Let me review it for you" - somebody else who's a native speaker and an expert in that area.
It's been wonderful. I think the Japanese and the Brazilian Portuguese versions might even be outselling the English one, but I'm not sure.
Len: Thanks very much for sharing all that, it's so fascinating. I'll link to an article in the online transcription of this interview, where we talk about how to set up a translation of your Leanpub book, if you've already published a book on Leanpub and someone approaches you to do a translation. This is, by the way, why we save this conversation for the end of the podcast - because we get into the weeds. But for anyone who's listening who's interested in doing it, here's how you do it.
You create a new book on Leanpub. In the "About" section of the book's settings, there's a dropdown where you set the language the book is written in.
One of the reasons you do it in that order - rather than the translator setting up the book themselves and adding you as the co-author - is that if you create the book, you become what we call in our Terms of Service the "Primary Author." Which basically means you own it: you're in control, and it's up to you what the royalty split is.
So typically, you just make an arrangement with your translator about what percentage they're going to get.
Actually, that leads me to a question I haven't really talked to many Leanpub authors about - how they've handled translations. Did you enter into any kind of legal agreement? You mentioned you can trust people?
Lisa: Yeah, we sign contracts with all of them. We agree on a date and sign a contract, and that way, if they're not done by that date, we can say, "Do you think you really can do this?" If they say no, or it's clear they're not going to be able to get it done, then we can get somebody else. Usually other people have already offered.
We've definitely had that happen - some translations have stalled out because the person, for whatever reason, hasn't found the time and hasn't been able to do it. Having a contract with a deadline in it frees us up to go and work with somebody else if we want to.
Len: It's interesting. Obviously when money's involved, things start to feel - I don't know - like a prenup or something like that, right? But -
Lisa: Yeah.
Len: You probably do want to do that when money's involved, partly because you want to protect everybody on every side - which is what contracts are for, right? You don't want to end up in a situation as an author where the translator says, "Hey, you said I was going to get X%, but you're only giving me X minus something percent of the royalties."
Just having everything spelled out gives everybody what measure of comfort you can have in a world like this. It is important to get that nailed down. If you're looking for examples of contracts, one thing you can do is search online - the self-publishing world is full of blogs and bloggers who talk about things like that.
Lisa: Yeah, and I'd be happy to share ours with people too.
Len: Or [contact Lisa and Janet](http://help.leanpub.com/en/articles/3852119-how-can-i-contact-a-leanpub-author), and ask them how they did it.
Lisa: Yeah.
Len: The last question I always like to ask on this podcast, if the guest is a Leanpub author, is: if there was one thing that really bugged you about how Leanpub works, or one magical feature we could build for you, can you think of anything you would ask us to do?
Lisa: Make it possible for people in China to buy our book. But I know that's not within your control, so -
Len: Well, that's actually kind of news to me. I mean, I didn't know people in China couldn't buy Leanpub books.
Lisa: They don't - China doesn't give people access to Leanpub.
Len: I did not know that.
Lisa: Our translator is -
Len: I guess I always took it for granted that it's one of those things we would have heard about from people, but I've never had anyone contact us saying, "We can't buy books from China."
*[Editor's Note: If you have any information about Leanpub purchases from China, please email us at hello@leanpub.com - we'd really appreciate it. PayPal does operate in China.]*
Lisa: Interesting. Well, maybe they know something we don't know, but our translator - the person who did our translation - has been trying for months to find a way to do it. I'm not sure where she is with that, but -
Len: One of the reasons I feel bad about not knowing this, and not having a 100% certain answer, is that there was one country - not China - that turned off Leanpub, at least for a while. We found out pretty quickly, because people were asking, "What's going on?"
Lisa: Oh, interesting.
Len: That's something we're definitely going to have to look into and try to find out about, because that's a pretty important country. Obviously, we want to reach as many Chinese readers as we possibly can for our authors, and for Leanpub generally, so that's definitely something I'll look into.
Lisa: It was really exciting to get it translated into Chinese, although our previous books had also been translated into Chinese. I don't know - there's just something about having somebody you know well, who's also an agile testing expert, translate your book. There's something very satisfying about that.
Len: Well, Lisa, thank you very much for being on the podcast, and for using Leanpub as a platform for your book, which - again, for anyone listening - is "Agile Testing Condensed." Please go buy it. Especially if you're a CEO - you need this book, you need to understand what's happening. Thank you very much for being a part of the -
Lisa: Well, thank you. I've got a huge library of books I've bought on Leanpub that have been so helpful to me - it's just such a great resource.
Len: Thanks very much.
And as always, thanks to all of you for listening to this episode of the Frontmatter podcast. If you like what you heard, please rate and review it wherever you found it. And if you'd like to be a Leanpub author, please visit our website at leanpub.com.
