
The Leanpub Podcast

General Interest Interviews With Book Authors, Hosted By Leanpub Co-Founder Len Epp


Jared Knowles, Co-Author of Education Data Done Right

A Leanpub Frontmatter Podcast Interview with Jared Knowles, Co-Author of Education Data Done Right: Lessons from the Trenches of Applied Data Science

Episode #216 | Runtime: 01:10:45


Jared Knowles is co-author of the Leanpub book Education Data Done Right: Lessons from the Trenches of Applied Data Science. In this interview, Leanpub co-founder Len Epp talks with Jared about his background and career, his early interest in education policy, founding Civilytics, writing a Ph.D. and working full time in the public sector simultaneously, early warning systems for student outcomes and recent changes in education technology and practices, standardized testing, his books, and at the end, they talk a little bit about his experience self-publishing, and why he and his co-authors chose that path.

This interview was recorded on November 18, 2021.

The full audio for the interview is here: https://s3.amazonaws.com/leanpub_podcasts/FM194-Jared-Knowles-2021-11-18.mp3. You can subscribe to the Frontmatter podcast in iTunes here https://itunes.apple.com/ca/podcast/leanpub-podcast/id517117137 or add the podcast URL directly here: https://itunes.apple.com/ca/podcast/leanpub-podcast/id517117137.

This interview has been edited for conciseness and clarity.

Transcript

Education Data Done Right: Lessons from the Trenches of Applied Data Science by Co-Author Jared Knowles

Len: Hi, I'm Len Epp from Leanpub, and in this episode of the Frontmatter podcast I'll be interviewing Jared Knowles.

Based in Watertown, Massachusetts, Jared is a statistical programmer and social scientist, and founder and President of Civilytics Consulting, a data science consulting company that provides analysis and data science software solutions in the criminal justice, public finance, and education sectors.

You can follow him on Twitter @jknowles and check out his website at jaredknowles.com, and you can learn more about Civilytics at civilytics.com and @civilytics on Twitter.

Jared is co-author of two books that have been published on Leanpub, Education Data Done Right: Lessons from the Trenches of Applied Data Science and Education Data Done Right: Volume II: Building on Each Others' Work.

In these books, Jared and his colleagues write about "the missing elements that are critical to success in building data capacity in education agencies", addressing the important work being done by data scientists and others in the education sector across the United States, helping schools and leaders in education to shape everything from policy to law to strategic planning and managing enrollments.

In this interview, we’re going to talk about Jared's background and career, professional interests, his books, and at the end we'll talk about his experience as a writer.

So, thank you Jared for being on the Leanpub Frontmatter Podcast.

Jared: Thanks so much for having me, it's great to be here.

Len: I always like to start these interviews by asking people for their origin story. So, I was wondering if you could talk a little bit about where you grew up, and how you found yourself on the path to a career in social science, data science, and political science?

Jared: I think geography is a really big part of what I do now, and also my origin story. I was born and raised in Montana, in the United States - which is a pretty rural part of the country, very large and very sparsely populated. I grew up in a small town there, and I moved to a smaller town to go to high school.

What I saw when I changed towns was how different the school system was. As a kid, you're very plugged into the school system, and so you can get a real sense of the differences between how adults in the school system are acting when you move between the school systems. And I thought, "We're still in the same state, we're still generally the same kinds of people or the same economy. Yet these two towns had very different school systems." It got me really thinking about politics and school boards.

I wanted to get out of Montana as fast as I could when I graduated. So, I went to college in Oregon, and I went further west. I went to college outside of Portland. There, I really studied school boards in college, and that was what I got really interested in, and what I wanted to do my graduate work in - really to learn about this phenomenon in the United States, about how we run schools with these very local, elected officials.

A couple of hops and a jump, and I went to graduate school in Wisconsin. So I kind of stayed along the northern part of the United States. I guess I said I didn't like winter, but I stay in cold places.

While I was there, I realized that in graduate school - I wanted to do something more hands-on, and so I started working for the state government in Wisconsin. I worked there for six really pleasant years.

I learned a lot about working in state government, and the way state government works and interacts with school systems. I was working in the Education Department. Then I moved to Watertown, outside of Boston, Massachusetts - when my spouse got a job here.

That's when I decided to start Civilytics - because I had identified while I was working for state government, some things that could be better served if I was outside of government, and wanted to see if that could work. We just had our five-year anniversary this year, and the book's a big part of that. So, that's kind of how we got to where we are now, in a nutshell.

Len: Thanks for sharing that really great story. I'm from Saskatchewan originally myself, in the middle of Canada. I know what it's like to come from cold, sparsely populated places. And actually, I went to six different schools from kindergarten to grade 12, so I know what it's like to notice differences like that. Do you remember the first difference that struck you as a kid, between one school system and another?

Jared: I think the biggest difference that was really immediate to me, was one school was very academic - it's still a public high school, but it was very academically-focused. Or at least there was a place to be very academically-focused and challenged.

In the school I moved to, not to say anything too unkind - but I felt immediately that there was not a set of adults who were really caring about academically challenging the students. That wasn't the focus of the school system. And you could just - you felt it in every class, and in the expectations and in the policies of the school. I think that was one of the big differences.

Len: That's really interesting. I had a couple of experiences like that myself actually. It's - I think of it as one of my own - I didn't know this before this interview, but that we shared this experience of noticing, when you're young, that there's policy, and that there's people that are deciding what you're being taught - and that it can change.

For me, the experience wasn't across different schools - it was from year to year. At one school I was going to I realized - like one year, literally it was the worst thing you could do, to swear on the playground during recess. There would be teachers out there wagging their finger at you, or punishing you in some way for swearing. And then the next year, swearing was basically encouraged - as an expression of freedom, and wild imagination and stuff like that. And I remember thinking, "Oh wow," - I didn't have the word in my mind, right? But, "This is arbitrary."

Jared: Yeah.

Len: And I knew it didn't just apply to the things I noticed changing, it applied to the things that would be the same - they could have been different, they could change at any time.

There's people behind it, and those people are authorities in fact on this, in the sense that one would naively understand an authority to act. They're just deciding, and they're deciding based on whatever.

I remember not actually resenting it - but just being kind of like, "Huh. Well, I guess that's what school is." It's very curious to notice that thing about the world.

And so in university, you said you were at - I can see here from LinkedIn you were at the Pacific University.

Jared: Yeah.

Len: You studied politics and government, and then you went and you did a graduate degree in Political Science. And then, as you mentioned, you became a public servant in Wisconsin. What was that change like, from being a graduate student to being in government?

Jared: I did something that I would never recommend anyone listening do - which was that I did them at the same time.

Len: Oh, wow.

Jared: I was actually in graduate school, and I got the bug for statistical programming. I had a statistics course that was required, that I wasn't that interested in. But it was required. They forced us to use this open source tool - at the time it was pretty rough and ready, called "R."

It's now a very popular statistical programming tool, but at the time it was still a little new and experimental. There was no instruction given on how to do it, we just had to do all our homework using it.

But I took to that, and enjoyed it. I was given a choice that I could either work as an intern in the governor's office, or I could work on a big experiment that was going on, that the university was running at schools across the country. The experiment project didn't really strike my fancy, but this idea of getting to work in the governor's office - that sounded pretty interesting for me as a political scientist, a chance to see what it really looks like.

I was helping implement an application for federal funds. I was just helping a team write an application. But I made a few graphs for the application, and they really liked that. tTey were like, "Oh, wow. These graphs are useful." Looking back now, I could have done them in Excel or something - it wasn't anything really fancy. But they liked that, and they said, "Why don't you come do an internship at the state education agency?" I said, "Well, I'm not going to stop doing my degree." They said, "Oh, we could probably work it out."

So, I started off working part-time, and then I started working full-time and doing my graduate degree part-time, and just transitioned. But I always had my foot in both of those worlds - which was a really interesting experience, and really did shape a lot of the way I think about what Civilytics does now - thinking about the importance of being a bridge between those two different worlds, and how they can help each other, and the ways they can lose track of communicating with each other.

Len: Were you jealous of your fellow PhD candidates, who didn't have full-time jobs?

Jared: Well, if you know anything in the US about compensation, it's much better to work for the government, than it is to work at the university. You get a much fairer salary and benefits package, so I did it - that was a motivating factor for me, very much.

Len: I guess this is a bit of a detail, but, did you teach yourself programming in R? You said you took a course on statistics - so they must have given you some introduction?

Jared: There was some introduction, and then I did pursue several courses in my methods sequence - and then went beyond it, to keep taking courses that sometimes the instructors were good and were interested in teaching us R. Other times they were just giving us material, and throwing us the goal and saying, "Get to the goal," and then giving you the space to learn it.

I took things in many different branches so that I could learn how to use R to do different things, like geospatial statistics, and multi-level models, and all kinds of different tools, that just allowed me that space to learn.

Len: Were there online courses and stuff available to you at the time that you were aware of, or -?

Jared: I made an online course, so it was pretty early days.

Len: Oh, there you go.

Jared: I made this "R for Education Researchers" because I was trying to help other government agencies adopt R - because the license fees for statistical software can be a real barrier to departments being able to adopt those tools. That's less of a problem now, I think. I mostly used books, some key books - and a lot of hands-on time at work - luckily my supervisors gave me the space to really solve a problem well, and use R to do it, and then learn from that experience.

Len: You mentioned you made some charts that people really liked. Preparing for this interview, I watched an interview or a talk that you gave a few years ago. It's on YouTube - I'll put a link to it in the transcription - where you show a number of charts that you do.

You explain, I think very well, the role of the - I guess - let's say, public servant in the education sector, and the way that what you're primarily doing is trying to advise people who make decisions. So if you show them say a chart that you're working on, you need to show them a finished thing that makes a point.

You have this really great chart where you show - basically, it's something along the lines of - you tested a bunch of different models to see how well they'd work, and the idea was, we should probably choose one of these models going forward, for predicting something like graduation rates, or something like that.

If you had just shown the line showing which one was better at predicting outcomes - that's the line that's higher on the chart than the lines that are lower than it - that would have been something.

But what you did, was you plotted it against how good at predicting outcomes the other systems across the country were. You showed that the best model that you'd come up with intersected with the best performing one across the other states. So, when you show that chart to somebody, now they can go, "Which one do you want to choose?" "Well, I want to choose the one, where I can tell people we're choosing this one, because it's as good as the best in the country."

I was just wondering if you could talk - did that awareness, that that was what your job was, come to you initially, or did you come to that over time?

Jared: Definitely over time. That's definitely a hard-fought, hard-won lesson of making - I talk, I do a lot of training for education data analysts, and that's where a lot of the ideas in the book come from, too.

I think a big part of it for me was like, I did get this big initial win. People were interested in my charts. Like no one was interested in what I had to say before that. I was like - I came into the office, and was like, "Well, I can make charts all day, let's have interesting conversations."

What I realized is not everyone is interested in the charts, first of all. Second of all, it takes a lot of people to agree to make a decision in the end, especially a consequential decision that's going to shape education policy.

So then it's about - well, all those different folks need a way to make a decision. You have to listen carefully to what they are interested in, what their perspective is - and try to really understand their position. Then, you can come up with that chart, and say, "I think this solves the questions that you have, in a way that gives you the ability to now make a decision."

If we've done our work - that means that it looks very simple, the chart. But it is the tail end of a lot of work, of listening and identifying what is going to be the piece that makes the decision straightforward enough, that everyone can feel comfortable about it, feel like it was a legitimate decision, and feel like their perspective was considered in it.

Len: You make a joke in the talk about how their decision in the end was inevitable, given the way you presented and structured the chart. Which on one level sounds manipulative - but when you're clear that like, "No, this is the end result of a lot of work and careful consideration in the context for what the best outcome is" - so, what you're actually presenting is an argument.

Jared: Yeah, and I think it's clear - it's good to be explicit, that - it almost always is. If you're doing data analysis for a purpose, it is an argument. We're not just doing it to speculate, or to just amuse ourselves. We are going to choose to do something different.

So being explicit about that, that it is an argument - and that you should show people that they had their voice in it through the whole process. I think that's a big part of what I had to learn. Was that everyone's going to -

If you show that chart, and you didn't do the eight hours of listening with all the different stakeholders in the room, and the different conversations with them - they're going to look at that chart very differently than if they'd had a conversation with you, they trust you, they know you heard their concerns and you showed them how you heard their concerns. Then you present the chart, and they're going to say, "Yeah, great. This confirms all of our best hopes, and we feel comfortable with this - and we're going to support this decision moving forward."

Len: It's interesting how granular the situational awareness you need to have in situations like that is, particularly - even with respect to individual people - as you're saying, having listened to them.

I have a story I like to tell, from back in my investment banking days - when I was putting together a chart showing some projections for some revenues, or something like that, one of my colleagues said, "That chart's too toothy, you can't show that to Martin." I was like, "What do you mean? That's the chart, that's those numbers." He's like, "No, no, no. Change the scale on the Y-axis so it's not so spiky." I said, "No, this is the scale that we use for projections of this kind." I remember flicking to the next slide in the meeting, and this guy goes, "Why is it so toothy?"

Jared: I love that word, "too toothy."

Len: Toothy, yeah.

Jared: It's so great.

Len: I learned - of course, like you say - you would say, "This is a different scale than usual, blah, blah, blah." It wasn't that there was any manipulation I would say going on. It's just that people have their quirks, and actually those - you're doing data science, you're doing social science - but you're also doing people, when you're doing something like that, and you need to keep those things in mind.

But that leads me to ask a question. One thing I've found, is that often when you show people numbers, and when you show them charts - they think what you're showing them is reality. No matter how explicit you are about, "This is the result of a lot of analysis and conjecture and assumptions," there's a certain person that just thinks, "No, you've shown me reality, I'm going to go out and tell people, 'This is how many - this is the proportion of kids that are going to graduate in five years.'" What do you do in your experience - what can you do in your experience, that you've learned to help mitigate that problem?

Jared: I actually think there's - it's a two sided problem because - I spent most of my early career worried about the flipside of the coin, which is people who, no matter how much data and evidence you show them, they're going to go with their gut.

I think both attitudes toward data can be dangerous in an organization. Either, "We're not going to listen to data, because we have a lot of good intuition or gut or historical knowledge - and we're going to stick with what's always worked." Or, "We're going to do whatever the data says. The next step is always going to be determined by whether the bar graph went up or down, or whatever our KPI is - and we're not going to step back and be critical."

So, when we talk with clients about data strategy - that's one of the most important things I talk to them about, is - it feels good as a data analyst to talk to the people who are like, "Well, thank you for that. We are going to do X because you recommended X." You can get caught up in that, and you might not actually be doing the service that you want to be doing.

Because actually - as a data analyst, you know all of the reasons why there are asterisks after those figures, and what the threats to the validity of the analysis are. You need to have some clear responsibility about how to communicate that part of it as well. We don't really have a bullet graph that's really cool, to show people how uncertain to be, about whether or not we're even counting the thing we want to measure correctly.

As a data scientist, thinking about your - you holding the knowledge of the whole process of how the data gets collected - to the analysis, to the reporting, being able to communicate and weigh all of what you've learned through that whole process as a holistic result, instead of saying, "The KPI is seven, and we already predetermined that if it was seven, we would do X and if it was six, we would do Y." That sounds efficient, but unless everything else is right and tight, you don't want to be making decisions that way.

Len: Your dissertation topic was school boards and the democratic promise. I think you finished it in 2015. School boards and the democratic promise, I suppose, are very topical these days. I was just wondering if you could talk a little bit about what your thesis was about?

Jared: Sure. I have to reach back into my mind. I don't talk about it very much actually anymore.

There's like a fundamental tension in political science about, how do we measure what is a democratic government, right? If you go into civics in the United States, you know - oh, the United States isn't a democracy, it's a republic. Which is one step back from a democracy - because we don't vote on the laws ourselves. Political scientists have a lot of different ways of categorizing and thinking about how democratic an institution or government of any kind is.

One of the things that you learn, is that the amount that people participate really is important. If you have free and fair elections, but only 10% of people vote, then people might say, "It's in theory a democracy. But it's actually not a democracy, because only 10% of people are making the decision."

This is an academic debate. And, we - the United States doesn't have great voter participation across the board. We have particularly abysmal voter participation for local elections. And in Wisconsin at the time, our school board elections were off-cycle. They were not on the standard presidential, governor cycle - they were in the spring. So they were even lower turn-out.

But there was an event that happened while I was in Wisconsin, where the governor changed the power that school boards had, and gave them a lot more power over their workers - basically made them able to overrule any union bargaining agreements with teachers. Now, school boards were really important, because if you put in a school board that was in favor of the teachers at that moment, you could protect all of the workers' rights that teachers had won. If you put in a school board that was in favor of the new law, you could drastically reshape the way teachers were compensated, and really enact a lot of reforms that were not possible under collective bargaining before.

So there was this moment where the school board was really critical, but it was still off-cycle. My theory was to test whether or not voters were aware of that.

We had a special governor election, to make them aware that this was an issue. They were aware of it. Then, did that affect how people behaved in the school board election? Did more people run for the school board? Did more people lose their seats in school board, did they turn over more board members than before? Was there higher participation?

The answer is that - yeah, there was. We did see a spike. The idea is that, perhaps - normally school boards aren't that democratic. But maybe when things are really important to the voters, they still have - there's no barriers to them really participating, so they can express their democratic will.

I wouldn't be comfortable saying it was a conclusive study, and I wouldn't believe that it was - I don't think that that's necessarily true. But I think it did show that a lot of times in political science, we think of school boards as not very democratic. But I think they're maybe more democratic - and there's more ways to think about it, than mainstream political science at the time had really been thinking about them. That was a way to test the electoral nature of school board politics.

Len: I'm really curious, did this change in the law give the school boards more opportunity to basically tell teachers what to teach? You mentioned specifically it was compensation and things like that, but did that give them -? That's obviously a powerful lever. Was that a concern that people had, or was that a thing that - a power that those school boards then actually had?

Jared: Not in the specific context of Wisconsin. School boards do have that power, but the power was really focused on compensation, leave, professional development time - things that were costing money. Because the school districts were in a fiscal crisis across the state, because of the great recession.

At the time, the new powers were really - everyone was really focused on their - the reason to put them into place, was to save the budgets of all the school districts. That was the frame that it was in. Although you can of course imagine, school districts do have - in some states, and like Wisconsin - wide latitude on the curriculum that they teach. So that is, I would imagine, another place where we would see this issue come to the forefront.

Len: It's so interesting talking about things like this for me personally, coming from Canada and having lived in the UK. American administrations break all my instincts, right? Because it's just unthinkable, generally speaking in Canada, to think about like the local school board having wide latitude over the curriculum, or people being - I mean, I'm not actually exactly sure how it works - but you can't elect sheriffs in Canada. You can't elect judges.

I spoke to a friend in Michigan once. She was working on a campaign for someone to become district attorney, or something like that. She told me that someone running to be a judge can actually ask for campaign contributions from lawyers that might appear before them as a judge. Do you think that school boards -? I mean - you don't have to tell me your opinion on this if you don't want to, because I know it's your space. But in an ideal world, do you think that positions like that should be democratic at all?

Jared: I think it's really tricky. I mean, when the chips are down, when - you can find ways that the idea of federalism, which is what we're talking about here - doesn't work. Because you can find very compelling examples, where the local decision-making has gone off the rails, and it's become tyrannical.

That's what I experienced as a high school student. The high school I went to, I was in a small town in Montana - and they drug searched our lockers with drug dogs like every three weeks. It's like - if you want to do drugs in Montana, the last place you do it is at school. You've got hundreds of miles of empty space in every direction. It was just a show of force by the school board. It was a decision they wanted to send a message to the students for whatever reason, right?

There's much worse things that happen, where school boards are making really bad decisions like that, that you get that raised up to the news, right? It's easy to say, when that happens, "Yeah, this is not a great idea," right? But then you take the flipside, right? You have a president like Donald Trump. Not to be too political - but if you don't agree with what he's doing, and he's a powerful leader - you have your locally elected school board which can resist. Because they have their own legitimate authority.

So it's not a great system on either end of the scale. But it is designed to be that relief valve, when you disagree with what the other levels of government are doing. It's much easier to get on the school board, to participate in a school board race, than it is to get to the state legislature, or get to the national legislature.

I think that is why the fundamental tension really just is fascinating to me. Because I feel, in myself, if the education leaders were going to make the decisions I'm going to make, then I don't want them to be elected. Let's just have them be unelected. But if I'm in a position where they're not making the decisions I would make, an election's a great way. It's not too difficult to get my friends and I organized, and to try to change things by getting people on the school board. So I don't have a good answer, I think it's one of the great questions posed by the American system.

Len: Thank you for that really great answer. I mean, you captured the paradox - that inescapable paradox of tyranny versus agency, very well.

It's one of the things that really preoccupies me. I mean, that thing that you're describing, for example, where there are people who are nominally grown adults, who are exercising this totally arbitrary power for its own sake, to intimidate children and get the rush of feeling like an authority over them.

That's the agency that really bothers me. It's like - this is a very high-level observation, right? But if you have an organization that's nominally for - exists for one purpose, but it has a certain ethos, that ethos is going to attract people who are attracted to that ethos, not whatever the institution actually exists to do.

For example, if you have a very hierarchical top-down management structure in your company, it doesn't matter if you're making socks, or if you're making guns - you're going to attract the person who likes being in a top-down hierarchical organization, whether that means being at the top or at the bottom. You're going to attract the person who just enjoys having underlings to order around.

If you have a situation where people understand that, "I was elected to exercise my personal will for me, I was elected, not appointed as part of an administration or an organization, but actually it's me." You look around, and you have this, the gaze of power upon the people around you. That's just something that really bothers me. But again - the other side of it, is like - there's people who think, "Well, that's what everyone in a position of authority is doing."

Jared: Yes.

Len: The very concept that you could just be a like featureless person who's simply there as a representative of an organization - some people just don't believe that that's a thing.

Jared: One way that we do try to square the circle in America, is we do - in education specifically, we do have state education agencies which are charged with setting parameters within which school districts can be free. State constitutions, for the most part, give the state the decision about how to organize schools, and they defer a lot of that to districts.

But there's a lot of power struggles between districts and states, and they do different things differently. That relationship ebbs and flows, as it did while I was in Wisconsin, and continues to do.

I think that part of it does also make it - the promise there is that - well, if things get too far out of hand and you are stuck in a community that you just can't get enough people on the school board to change things, you can make a legal case that your rights are being violated, and you can be protected that way from a higher level of government. Or you can get the state to pass laws to change or set boundaries on what the school district's allowed to do.

That, I think, is why we're always thinking we're going to do a school education reform - and then we did it, and now we're going to see how it worked. But actually it's a continuous cycle of balancing the power between the different levels, especially between states and districts - and I think it's very tricky to keep an eye on.

Len: Since we're only asking easy questions -

Jared: Yeah.

Len: - in this interview, I thought I'd ask another easy one, which is - there's obviously a big education controversy going on in the United States right now, particularly with primary and secondary schools.

Jared: Yes.

Len: There's one actually in the province of Alberta in Canada right now as well, where the provincial government is proposing a province-wide curriculum that would have to be taught everywhere. The teachers and lots of parents are pushing back on various features of it. There's the high politics, and then there's the ground-level stuff - like, "Should math be rote memorization in grade one?" I guess the question I want to ask is - why isn't it a sufficient solution for some people to just tell their kids, don't believe everything you hear in school?

Because for me, like I was saying in my story - this is a big part of my own origin story. Not in a negative or a reactionary way, I just learned, "Oh, school's an arbitrary place." Why is it that some people don't want that to be the answer - that you can just tell your kids, "Just don't believe everything you hear in school?"

Jared: Yeah. I don't know. But I mean, that is how most people are acting. I feel like a lot of the education debate is about intentionally trying to charge up issues, and raise them to a higher volume than they deserve. The real issues in education don't get this much attention - the real core issues of funding and access to extracurriculars, and decisions like, "Should we be teaching kids Computer Science instead of trigonometry for a year?" Like, "Are our schools sites of economic training, or are they sites of cultural training? What's the balance?" Those are core issues that most people really do think about, and have really strong opinions about for their schools.

Then these other issues become ways, I think, for partisan politics to get pushed down onto something that people care about - everyone cares about education, because everyone went through it. So it's a really good way to connect emotionally with people as a political strategy. To say, "Hey, schools aren't like how they were when you were a kid." You either really like that, or you really don't. "And I'm going to try to put a partisan spin on that, so that you'll vote for me for a different level of government that's not really deeply involved."

There's a lot of really interesting research about schools resisting - like schoolteachers' resistance when these partisan things start to get pushed. When a curriculum is mandated, the teachers, as professionals, say, "This curriculum isn't really that good. I'm going to work around it or hide it," right? There's a long history of that in the profession of teaching, and studies of it. I think that's how these things actually wind up playing out.

I think the mindset you're talking about, is both a mindset that I think school officials might take, "Don't believe everything the legislature tells us is required." Wink, nudge, "Let's move on." I think that parents often - yeah, with their kids - are probably saying, "Yeah, you've got to just go along with this one with the school." I think that does a disservice, right? Because of what you said about the ethos and the culture. What is that, how does that play out? But I think it's how it does play out in reality in a lot of cases. But that's not as good of a news story.

Len: I'm going to try and do a segue from 30,000 feet to the real, like, in-the-mud stuff. But have you ever read Anti-intellectualism in American Life by Richard Hofstadter?

Jared: I think I've probably read an excerpt from it in a political theory class, but it's been a while.

Len: The book is quite old, as you know, from the 1960s. But I read it a little while ago, and a lot of it read like it had been written the other day. He's got a really, really great chapter in there about the history of education in the United States. I just remember when I read it - it was such a bucket of cold water, the fact that mass education - which, as you said, everybody has to go through - is a recent thing in human history.

Jared: Yes.

Len: Very recent. In fact, mass education, particularly in the United States, really only started in the very early 20th century. At that time, it was tackling very particular problems in certain places where the big administrations started to appear - like in New York, where there were a lot of immigrants, for example. You had to teach people not only Shakespeare, but Home Economics. Just how to live - how do you live in this urban environment, in this new life? With new technologies like sewerage and electricity and things like that. There were all sorts of really practical matters.

But essentially, we're still in the beginning of the experiment. We're now in the stage where we have technologies and things like that, where we can gather and process and visualize data in a way that we couldn't before. We're still at the beginning, but we're at - maybe at the end of the beginning? Or something like that, if you want to be romantic about it - of this great experiment.

So - and you've been a part of that. You did work on something called an "Early Warning System," as part of your work for the state government, I believe? You've got a chapter about it in your second book. I was wondering if you could talk a little bit about what an early warning system is, and the one that you worked on - and how one can build an early warning system in an education department?

Jared: Yeah. I think that's a pretty good segue. Hats off to you. Because an early warning system is at the other end, right? It's about solving administrative challenges, and making the education system more - I tell people it's a safety net for the education system. A lot of people want to think of it as very prescriptive. But what I learned in doing it, I think, is that it's more of a tool to augment the existing work - in a way we use a lot of other tools to augment work.

Just really briefly, in the Wisconsin example - which is what I built the chapter in the book around, though I advise other people on developing similar systems as part of Civilytics now, in education and elsewhere - the idea of an early warning system in education is that we are generating data from students; students are giving us information. We have data systems to collect that information. What we want to do is get a sense of whether a student is, quote unquote, "on track" to meet some objective or goal that we have as an education system. The easiest one to think of is just graduating with a diploma in high school.

Students who are less likely to graduate start to exhibit behaviors that take them off that track - early. Schools typically have many ways to identify this. But an early warning system is another way to do that at scale - by using a limited set of information that we have about every student, and building a statistical model that determines whether or not that student is likely to complete on time, and then communicating that information to someone who can help change that outcome - from a potentially negative outcome, to a positive outcome of them completing on time.
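[To make that concrete, here is a minimal sketch of the kind of scoring such a system performs. The feature names, weights, and threshold below are hypothetical illustrations, not the actual DEWS model, which was fit to Wisconsin's historical student records.]

```python
import math

# Hypothetical weights - a real system would estimate these from years
# of historical student records (e.g. with logistic regression),
# not hand-pick them.
WEIGHTS = {"attendance_rate": 4.0, "gpa": 0.8, "discipline_events": -0.5}
INTERCEPT = -4.0

def on_track_probability(student):
    """Estimate the probability that a student completes on time."""
    z = INTERCEPT + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic link: maps score to (0, 1)

def flag_for_review(students, threshold=0.5):
    """Return students whose predicted on-track probability is low -
    a list for educators to review, supporting rather than replacing
    their professional judgement."""
    return [s["name"] for s in students
            if on_track_probability(s) < threshold]

students = [
    {"name": "A", "attendance_rate": 0.97, "gpa": 3.5, "discipline_events": 0},
    {"name": "B", "attendance_rate": 0.72, "gpa": 1.8, "discipline_events": 4},
]
print(flag_for_review(students))  # prints ['B']
```

[In practice, the flagged list would go to school staff as a starting point for review at the beginning of the year, as Jared describes below, not as a verdict on any student.]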

We know from research that it's much harder to take a 17-year-old who's about to not graduate, and move them to being a graduate. That's an intensive - it takes a lot of intensive intervention. School systems - some of them are okay at this, others really struggle with it. It's much easier to gently improve the way the student interacts with the school system, from say grade five on, and try to improve the school system to better serve those types of students at the same time. It gives us a longer chance to improve things.

There's risks associated with early warning systems. That's one of the things where the context it's built into really matters. You need to be very careful about how people are going to use it, and who has access to it, and how it is kept up to date and kept current - which is a lot of what the chapter in the second book covers. But the idea is that schools make these judgements all the time; they have to, to do their jobs.

This is a way to support that judgement, to say, "Just in case, just quickly check this list. See if you don't have a plan or a strategy for any of these students - who you might have missed, because the pattern of behavior they're showing is a little subtle, or you just had too much on your plate." It's a safety net for professional educators to feel like, "Okay, we also can check this resource, to make sure we're serving all the students we need."

Len: It's just such a difficult job. I mean, I only know it from the outside, having been a student. But I can - when you mentioned risks, one might think, "Oh what's the risk of an early warning system?" It's like, "Well, a self-fulfilling prophecy."

Jared: Yes.

Len: That's one. Another is angry parents.

Jared: Yes.

Len: "My kid's smart, what are you saying? My kid's dumb?" Now all of a sudden, that might make the outcome that you're trying to avoid more likely, in another version of the self-fulfilling prophecy.

I guess one question I have is - so, when we talk about an early warning system - I live in a tsunami zone. To me, an early warning system is like - my phone goes off, and it's like, "Oh no" - knock on wood. When we talk about this in, like, 2021, and the early warning system you talk about in your chapter in the book - is this like, a teacher gets an email saying, "Boom, our system says that Joe is now at stage one of difficulty?" Or do you provide them with a series of metrics that they can check student performance against, so the teacher themselves says, "Uh oh, it looks like this student might be falling into a bad pattern?"

Jared: It's really interesting where we are right now. DEWS, the Wisconsin dropout early warning system - I started working on it in 2013, and I think we, as a team, pushed it out around 2015. It's hard to believe, but the technology then is very different than the technology now - which is just wild to think about. But I think people are really innovating on these. There are school systems using tools where they are trying to give more real-time alerts. There are places where they're just giving, like, a report.

In the case of DEWS - this will get into the mud a little bit here - Wisconsin, the state, got its data very slowly. School districts would report everything about their students to us twice a year. We had two time points a year at which we got new information about students.

We had a number of mechanisms to report information back to schools, but not to the teacher level. We could report something to a principal, but we couldn't report something to a specific teacher at the time. But what we learned - we did a bunch of user interviewing and presenting about this idea at conferences with educators. We're like, "We're thinking of doing this. I think we can do it pretty accurately. But we don't know how to communicate it to you, or like what would be useful."

They said, "Oh, at the beginning of the year, we often sit down and think about how we're going to help different students. We try to do it for all of our students. Sometimes we are looking through that list, and we look at their past performance. We look at how they did last year, what their grades were. If you could give us a list at the beginning for that meeting - we could bring you out to that meeting, and we could incorporate that information into it." We wrote a guide on how to interpret the information. We did a pilot with several schools to show them that information, and got their feedback about what that would be.

Then, what was unfortunate for me - again, getting into the mud - is that the first data was preliminary data. I had to build a secondary early warning system, to make the prediction on time, to be useful for them in that process. Then, we would send them an update later in the year, to say, "These are the students who changed after we got the final data. Maybe that information is helpful?" But they told us, "If you don't tell us at the beginning of the year, it's going to go into an inbox, and we're not going to be able to put it into this process that we have." That was really eye-opening for us, to think about that. I think there's now a lot of innovation about how to do that. I think there's a real risk of over -

With the early warning system, if your phone went off about the tsunami every week - when the tsunami finally came, you might not run, right? So you have to be really careful about what that behavior winds up being. I think we're just at the beginning of starting to think about that. Like human/computer interaction. Interacting with our professional job, especially a professional job like being a teacher. How does that cumulatively wind up? Is it going to be like the notifications on my phone, where I just stop paying attention after a certain point - because I can't get them to go away? We want to be thinking about that, I think that's what the field is going to be trying to work on in the future.

Len: I hadn't thought about that before. I think you've written somewhere about having too much data, and how that can be a problem. This is a problem that data scientists semi-backhandedly complain about: "I've got too much data, dammit."

But providing teachers with too much data - and it's the same thing with the people who are making the policy decisions too - it needs to be informative and thoughtful about the context, at every touch point.

Jared: Yeah. It comes back to what I had to learn - and really why the DEWS project will always be special to me - which is really spending time thinking about the professional skill that the principals and the counsellors and the teachers bring to their job.

My algorithm is adding something, but it's adding a lot less than I thought it was when I started. Because they hold in their heads - I would tell them this - a very sophisticated processing system for identifying when students are in trouble, and how to reach out to them. So it's really about - how do we fit in a system that augments that, or improves it, or protects them against a mistake - or helps them double-check and feel more effective? To maximize the really powerful part of the education system, which is the educators interacting with students.

One way that really came home for me is - one of the people on our design team, who worked with me on designing it, once showed me a spreadsheet she used as a principal. She made a spreadsheet that was like our early warning system. She scored the students, and she ranked them. She would spend hours typing the data into the spreadsheet, and then getting her results. She was like, "If all you did was replace the spreadsheet so I had those hours back, that's going to be a really big step forward for us. Because I felt like I needed to do this spreadsheet to be sure, to feel comfortable that we were doing everything we can. But I also was the one who had to do it." When she told me that, I was like, "That is really where it slots in."

It's maybe not as meaningful as I think some people want to imagine it is, but it can also be a very powerful part of the system. And once it's built - like any data science product - it scales really well. In Wisconsin, they're making hundreds of thousands of predictions a year for school systems that have no capacity to set up a system like that for themselves.

Len: My next easy question is about standardized testing. I remember having an experience in school once, where I had no idea what was coming, and all of a sudden I was given a standardized test. I had to figure out, "Oh, you have to fill out circles, okay." I had to figure out how to do the test before I could even get to answering the questions. In the end, the results came back, and I was told I should be an accountant. I was in, like, grade eight. I didn't have the words - but it was like, "Fuck you," to the test and whoever was telling me that. No offence to accountants out there meant at all. But basically, I think it was because I finished it so fast. Again, I didn't have the word for it, but it was such an arbitrary conclusion to come to, right? "You're fast at filling out forms, so you should be an accountant." There's the real insult to accountants, right? It's actually a really complex job that involves a lot of creativity and things like that.

I guess, just my general question - if you were sitting in a pub, and if your new brother-in-law found out what you do, and he asked you, "What do you think of standardized testing?" What would you say?

Jared: I think they serve a function, and they are used in ways that are far from the function they serve today. That's one of the things I grapple with very much. I think standardized tests are overused. Kids are tested way too much, and way too much is put on the stuff that can be tested, compared to the stuff that cannot be tested. I really empathize with education officials who just feel like they're under a lot of pressure to make this magic box produce the numbers that need to be produced for them to justify what they're doing. I do think that that is a challenge, a very big challenge.

What I worry about is - could we go back to using them more justly or more fairly? I don't know. We got the power, and we used it wrong. It just has devolved. What's to stop us from - okay, we reset the way we use standardized testing. We try to go back to using it for informational purposes, for measuring long-term performance, and for validating other things that we know about schools. Well, won't we just creep back into the same patterns, and start using tests to test how we're going to do on the test?

I guess I feel particularly conflicted, because the standardized test in Wisconsin was a very, very useful predictor for that early warning system. It does a very good job of helping us find out whether students are on track to graduate. Not in the way you'd think - you don't have to do very well on the test to be on track to graduate. But if you did very poorly on the test, that is a very clear signal that the school system is not doing a good job of finding out how to help students in that situation find a path to success. The problem is that we maybe put that on the student and their family, as, "Well, they should do better on the test." Instead of thinking about what the test is actually supposed to be telling us - which is that, as a school system, we're not finding a way to connect and to make this experience meaningful and productive.

I think a big part of that comes down to resources. A big problem with standardized testing is that it's seen more as a powerful tool to chip away at the resources schools have, instead of a tool to help us say, "Where do we need to put more resources into the school system?" If we're going to have a subtractive mentality about it, I wouldn't want to keep using standardized tests. But I do think there is probably a way they could be used that would be okay.

Len: You've brought up money and funding a couple of times in this interview, and it is fascinating how contentious that can be, particularly in education. Not just kindergarten to grade 12, but in higher education as well. I remember - I think it was in the early aughts in the UK - a system was brought in to assess professors. I forget what it was called now, but it was because people wanted to know, "Where's our money going? You need to show me some number in the end," right?

I say this aggressively and angrily, because it was - it's a bullshit mentality. All of a sudden, profs had to start tabulating their mentions - how many times is the paper cited? How many papers have they published? How many papers have they published, multiplied by the rank assigned to the journal they're published in - and then how many times have you been cited? The "Research Assessment Exercise," I think it was called? Something like that.

All of a sudden, profs were spending - you mentioned before the teacher who was spending all that time doing the spreadsheet, doing analysis that hopefully someone else, whose job is analysis, could do instead. Now all of a sudden you had a bunch of professors doing all this analysis. Again - a completely arbitrary system - basically so a politician could say, "At the end of the day, we should be paying profs less, because they're not getting cited enough, and they're not publishing enough - those lazy jerks."

Maybe this is just one of those other paradoxes, right? If a public school system is funded from taxation, and the money is being given to teachers to do a job - there's going to be someone somewhere who's like, "Show me the results."

Jared: I think public performance management is something I'm very fascinated by. Because you could poll a hundred people in your town, and they'll give you a hundred different answers about what the school system's supposed to do. It's not like a publicly-traded company, where we're supposed to return value to the shareholders - at least there, we can all agree that's what we're here to do. But when we walk into a school building - we're here to make kids feel safe and supported. We're here to help them express themselves. We're here to make sure they're prepared for the economy. How do you measure all of those things? You can't measure all of those things.

Len: Actually, I think that's a really great analogy - or comparison, right? Because I mean, when you think about some of the most successful companies in the world right now, Amazon and Tesla - both famously didn't make any money for a really long time. There was the ordinary stock analyst, who's like, "Why would anyone ever invest in this business? It's never turned a profit." It's like well, now, Jeff Bezos and Elon Musk are worth hundreds of billions of dollars, because they could keep the long term in mind.

The reason I bring up that specific example is that there's a case - I think it might be in the province of Ontario here in Canada - where the provincial government was suggesting that they wanted a policy where public funding to universities would be partially based on employment outcomes, right? The idea being - borrowing from a discourse that I think is authentic and good, that's critical of basically scam universities, where basically you pay a lot of money or borrow a lot of money to go there, and then you're unemployable at the end. Borrowing from that legitimate discourse, they want to take that and apply it illegitimately to legitimate universities and say, "Well now, you guys who are teaching, like, English literature, you're scamming people, right? We're going to try and catch you, so we can cut your funding - by asking, 'What proportion of your students actually have jobs within four years of graduating - and specifically, how much money are they making?'" It's like - well, if you judged the performance of a university by, let's say, graduation rates - well, Harvard really scammed Mark Zuckerberg.

Bill Gates, wherever he went, and - or didn't go. This idea of assessment - this is my public service announcement to anyone listening, be very wary when people start trying to impose assessments of education outcomes based on things that seem straightforward, like employment and how much money you make. I mean, I studied English literature, and I ended up in investment banking.

The assessment that I'm talking about would ascribe failure to the University of Oxford, for having trained me in English and taken all my money, when I didn't get a job in English in the end. Anyway, that's my rant. Sorry, I'm sure you get those all the time.

Jared: Yeah, my favorite thing like that - there's a book by Deborah Stone called Policy Paradox, and I think it's probably the greatest political science book ever written. I really hope she hears me say that someday. Because you can open it up to any chapter, and it talks about all of these issues - well, when we want to try to measure performance, there's a political purpose to that, and there are people who are going to win and lose from that strategy. There are people who have legitimate complaints, and people whose complaints are made up - that are false, who are crying crocodile tears.

She just does a lot - she looks at these problems from many different lenses, and shows you how and why they are so difficult to disentangle, and why it does seem like - what starts out as a well-meaning policy, which I think 90% of people would agree - like scam universities are a problem and we don't want to have them - can be shaped and used to make real universities start to function in ways that we would not find optimal or useful or productive to society. I think that's also one of the big fundamental paradoxes of making decisions about public goods.

Education Data Done Right: Volume II by Co-Author Jared Knowles

Len: Just moving on - we've been talking for about an hour now - to the next part of the interview, where we talk about the writing and production of your books.

There's two books in the Education Data Done Right series now, and there's a website that we'll link to in the transcription about this project. What was the origin story of this project? Was it you and a bunch of colleagues sitting around saying, "We've got all this knowledge that we can share, but no one is publishing at the meta-level about what we do?"

Jared: Yeah. Wendy, DJ and I are the authors on the first book, and the editors and authors on the second volume. We met at a conference. We're all trained as social scientists, in this very academic tradition, but found ourselves drawn to the very practical work of working in government.

We were at a meeting for government data analysts to talk and share ideas about data systems, and how we do our work. But the meetings mostly tended to focus on really IT stuff - which is critical, and we have a love letter to IT professionals in the first book, a chapter on why they're important and what's really great about them. But there wasn't space to talk about how to be an analyst, including making friends with the IT department.

So we were coming out of this academic training, where the concept of data is a dataset that has been carefully collected and manicured and perfectly laid out like a bonsai tree, that you would then analyze. The data we were dealing with is just data coming off of these systems that are trying to work every day, and they have different, conflicting definitions. So we were like, "We weren't really trained for this."

A lot of people who do what we do want to talk about what we're doing - because administrative data was a very popular academic topic at the time in education research. But they've never sat inside these rooms where we're arguing about how, in this year, all of the data in this one column is a little dodgy, because we had a mistake on the form we collected it in. What does that mean for the rest of the analysis?

So these kinds of issues - which to us all seemed very fundamental - affect the downstream work much more than people want to think. We're like, "Well, there's got to be some book about this or some ideas about how to do this job." There's some stuff in other fields that we found inspiring, but we found there's a lot of books about education data science that start with, "You now have your dataset, here's how to run a neural network on it." We're like, "Well, there's a lot of things you might want to check before you run that neural network - including like who made the dataset, and what was their agenda?"

So we put the first book together. It really focused, I think, on data preparation, uncovering errors - and being really thoughtful about documenting that.

For example, a really great statistical agency like the US Census Bureau is very thoughtful about that - they have very good procedures, but they are also not super public about everything they do, because they have to protect the confidentiality of every person in the country. So, I know they have really good procedures. We need to bring that professionalism and standard to the education data work happening in state education agencies and school districts, as they're building these more and more complicated data systems - and so that's really the origin story.

Len: The first book I believe was published in 2019?

Jared: Yes.

Len: You collectively chose to self-publish it, I believe, and it was on Amazon at least, and presumably elsewhere as well. Was there a reason that you and your co-authors decided to self-publish, as opposed to, say, approaching an education publisher or something like that?

Jared: I think there's two reasons. One is we wanted to go quickly at the time. In 2019, we all had this moment of like, "I think I can work on this right now. We want to move really quickly."

I was really energized by the idea of self-publishing, because I was like, "Well, we actually know the people we're writing this book for - a lot of them personally. We can really find them, and put this in their hands on our own. We don't need to deal with the timeline of a publisher."

And I'm also not interested in trying to fit into some publisher's narrative, or fit in their portfolio of books. I knew what I knew - and I think DJ and Wendy did too - we knew what chapters would be really helpful, and we were like, "Let's just put them together, bundle it up."

Our longer-term vision was, we want other people to write chapters, and then we wanted to be the ones to put their words out and to share. Because we knew a lot of our colleagues were super smart, three of them joined us for the next book - and had really great ideas, and it would benefit the field to hear from them, for them to get the space to really show what they do.

Because unlike academics, who have to get their reference scores up to make sure they meet their metrics - there's not a lot of professional cachet in publishing for education data researchers in education agencies. But some of them still really want to do that, and they feel a sense of community, and of contributing to the community.

Self-publishing seemed like it would allow us - if we were lucky, and the book resonated - to let people contribute to it as well, and feel more part of the process than if we had gone through a big publisher.

Len: This is the part of the podcast - we save it for the end, where we get into the weeds about writing and the process of putting the books together and stuff. I was curious what tools you decided to use, to actually make the ebook files in the end that you published?

Jared: I think this is also where co-authoring is tricky, right? I've already worn it on my sleeve in this interview that I'm an R guy. I could write my stuff in a plain text file, and then I would use the R Bookdown package to package up the ebook files.

In the R Bookdown community, they are big fans of Leanpub, and there are many Leanpub books that come out of the R Bookdown package - and people have really lifted up Leanpub as a great place to put that, and that's how I found out about it. So I was like, "Well, let's do it this way." Then DJ and Wendy were like, "R? I just want to write a book."

We wanted to write it together. We wrote most of the chapters as manuscripts in Google Drive, in shared documents. But then we typeset it in R - I typeset it using R Bookdown, so that I could apply some formatting. And I wrote my dissertation in LaTeX, so I was able to do a little more than the basic formatting. Not too much, but a little bit.

We got a designer to put the cover on it, and then I learned a lot about MOBI, EPUB, and PDF files from that process. It wasn't quite as plug-and-play as the Bookdown manual had led me to believe, but in the end it was a really good process, and Volume Two ran pretty smoothly using the same one.
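(For readers curious about the workflow Jared describes: a bookdown project typically declares its output formats in an `_output.yml` configuration file, which `bookdown::render_book()` then uses to build the ebook files. The fragment below is purely an illustrative sketch - the file names and options are assumptions, not the EDDR project's actual configuration.)

```yaml
# Hypothetical _output.yml for a bookdown project (illustrative only).
# Each top-level key names an output format that bookdown::render_book()
# can produce from the book's index.Rmd and chapter .Rmd files.
bookdown::epub_book: default
bookdown::pdf_book:
  latex_engine: xelatex   # LaTeX experience helps with customization here
bookdown::gitbook:
  css: style.css
```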

Len: Did you hire editors? Did you ask people to volunteer to read the chapters before they were put into the book?

Jared: We found colleagues who wanted to be part of the book, but who weren't comfortable being, or didn't have the time to be, full-on authors, and we said, "Well, could you be a reviewer?" For every chapter, we asked a colleague in a similar role somewhere to be a content reviewer - and we thanked them in the book. Like, "Are we on the right path? Does this seem like it would help you in your job?"

Then for Volume One, we hired an editor who was a friend of Wendy's. For Volume Two, I had a research assistant who served as our editor for our chapters - and then we each also got our own chapter edited. Because there were six authors this time, the process was a bit more distributed.

Len: I was very intrigued to see in the second book, I believe - you mention in the introduction that it's a living document. Which, for Leanpub - that's our favorite kind of book. We love all books. But I was wondering if you could talk a little bit about what you mean by that, and what your plan is for the book?

Jared: Well, what we hope is - one, if people catch typos - that we can quickly fix those and submit a new file and make sure we're putting our best foot forward. I find that's actually really great. That's an underrated value of self-publishing. Once you put it out with a publisher, they're not going to want to do another run and fix your typos. But this way you can fix it, it can get better - and it's a great way to get engagement with the readers, right? People say, "Hey, thanks" - we can make a section that says, "Thanks for these edits, thanks to this person who caught the typos." People can be part of the book, even if they don't feel like they have maybe a whole chapter's worth of something to say. I think that's really powerful.

I think the second thing is - whether it's in a Volume Three, or if we add chapters to Volume Two over time - I think we're still thinking about that. But we do hope that people say, "This really resonated with me. I'd like to share my experience with it." Then we can just keep adding that, so that people get the new content as it comes out. Because, in our case, like I said - I can picture the room of people this book belongs to, which I think would make it a hard pitch to a publisher. The audience is small - I would guess there are 5,000 people in the country for whom this would be really up their alley. But if we reach them all, that's great. If we don't, that's totally fine.

But what I think is great is if I can get some of them to share what they know, and create something that makes them feel more connected to each other - that's really the goal. Because the real epiphany behind the book that Wendy, DJ, and I had was - actually, we wanted to write a book together because we were all sitting in a room and we were like, "We are not alone. We all have these same thoughts and the same issues with our jobs." We wanted to give that feeling back to folks. So this just seems like a way to show that we're really open to that ongoing conversation.

Len: Thanks very much for sharing that story. That's so great. I mean, that's one of the reasons Leanpub exists - to provide a venue for very good, highly-specialized books that maybe have a smaller audience than an ordinary publisher would settle for. And for people who are motivated by their mission in a way that means slotting into a publisher's marketing schedule and publication timeline just doesn't fit. There's nothing wrong with those things, it just doesn't fit.

But I would say, having gone through the books, that if you're in an organization and you're trying to bring in data, these books are really great - whatever your organization is, whether it's in government or in a company. I've interviewed a lot of people on this podcast who have very similar kinds of problems to address in companies, and these books are good case studies for the kinds of problems you can encounter in any organization when you're trying to bring in data science and manage it.

I just want to - I can't help myself when typos get mentioned. One of the reasons people often think Leanpub is just for programming books - and it is very popular with programmers - is that if you're writing a program and you've got a typo, the whole thing blows up. So the idea of books that could be instantly changed - you just click a button and it's changed for everybody - really appeals to programmers.

If you're reading a line in a book and it says, "It was the best of times, it was the blurst of times," you're like, "Oh, he meant worst. That's fine, moving on." But if you're showing a code sample in a book and the code sample's wrong, you really need to be able to fix it.

The last question I always save for the end of this podcast, if the guest is someone who's published a book on Leanpub, is: if there was any magical feature we could build for you and your team, or if there was anything you found incredibly frustrating or even broken about Leanpub - what would you ask us to do for you?

Jared: I'm not sure that I have any problems, exactly - but the thing that got me both times, and I wrote myself a big note in case we do Volume Three, is that there are a lot more metadata fields to fill out when you list a book than I think about. That's my job, I'm the author responsible for that. But in this case, when I had five co-authors, I realized I didn't actually agree with them on what we were going to use as the short description, or as the full description.

I didn't have a checklist of things to run by all of them, so I was like, "I'm going to list the book tonight. It's going to go live." Then I was like, "Oh, follow-up email. I need you all to be sure that you're comfortable with this information that's going to be on the listing page." For that listing process, just having a checklist or something for authors to be aware of, in case they're working with co-authors - if you're working on it yourself, I'm sure it wouldn't be a problem. But coming up with it on the fly with co-authors was a little tricky.

Len: Oh, that's really interesting. We've never had someone suggest that before. But we do have various guides in our help center for our various writing modes, and then things like, what if you're translating a book?

Actually having one for co-authoring - just, what are some of the things to think about when co-authoring? We do have an article about how to set up co-authors on a Leanpub book - but that's more of a technical "how to" thing, rather than a kind of, "What are the challenges of co-authoring a book?" I've certainly heard a lot from people on this podcast about those kinds of challenges, including the thing you're talking about. I'll definitely put that on our list of things to do, because it could be really helpful for people to not have to reinvent the wheel every time.

Well, Jared, thank you very much for taking the time to talk to everybody today, and thank you very much for using Leanpub as one of the platforms for publishing your books.

Jared: Well, thanks so much for having me, Len. Thanks for making Leanpub. It made it possible for us to communicate this topic and share it with our audience - and I don't think we would have thought of writing a book if we didn't know there was a platform on which we could do that, one that didn't require us to go the route of a publisher. I think it is really serving that purpose. It's been strange to become an author through that process, and go through all of that. From all of us on the EDDR team, we're very grateful for the opportunity to do that, and to build our community around these texts.

Len: Oh, well, thank you very much. We really appreciate it, and I'll make sure to tell everybody on the team about that.

Jared: Great.

Len: Thanks.

And as always, thanks to all of you for listening to this episode of the Frontmatter podcast. If you like what you heard, please rate and review it wherever you found it, and if you'd like to be a Leanpub author, please visit our website at leanpub.com.