An interview with Don Tapscott
  • November 28th, 2018

Don Tapscott, Towards a New Social Contract for the Digital Age

58 MIN
In this Episode

In this special episode of the Leanpub Frontmatter podcast, Leanpub co-founder Len Epp interviews Don Tapscott, Chancellor of Trent University, for the Chancellor's Lecture series.

The interview was recorded before a live audience on November 16, 2018.

Don Tapscott is one of the leading authorities in the world on the way technology can impact business and society. He has been writing prescient books on technology and economics since the dawn of the age of personal computing, including the first bestselling book on the impact the internet might have on business, The Digital Economy: Promise and Peril in the Age of Networked Intelligence.

Don has published many other books since then, including Growing Up Digital and Wikinomics, and in 2015 Don published a new version of The Digital Economy. It's a fascinating project, where Don republished all the original chapters of the book, along with commentary from twenty years later.

In recent years, in partnership with his son Alex, Don co-founded The Blockchain Research Institute, which is doing groundbreaking work on the impact that blockchain technology might have on all aspects of the way we live our lives and organize ourselves in the modern digital world.

In 2016, Don and Alex published the first major bestselling book on blockchain technology, Blockchain Revolution, and in June of this year, they published a new paperback edition of the book, with a substantial new preface and a straightforwardly political afterword called "A New Social Contract for the Digital Age," which, along with Don's paper "A Declaration of Interdependence: Toward a New Social Contract for the Digital Economy", forms the basis of this interview.

The full audio for the interview is here. You can subscribe to the Frontmatter podcast in iTunes or add the podcast URL directly.

This interview has been edited for conciseness and clarity.

Transcript

Towards a New Social Contract for the Digital Age: Chancellor's Lecture

Blockchain Revolution: How the Technology Behind Bitcoin Is Changing Money, Business, and the World by Don Tapscott

Professor Stephen Brown: If we're ready, I'll take a moment, first of all to welcome you, to thank you for coming out on a day when you might preferably have been making snowmen or laying down the foundation for a hockey rink, or doing something else typically Canadian.

I would like to think that this discussion is also typically Canadian, in that Canada is now, and in some ways always has been, a leader in innovative thinking. Our ability to adapt the conventions that we have brought from Europe, and those that we have developed since, is part of our hallmark. One of those adaptations is the way we're going ahead with this second media revolution, adapting those things we've inherited, like social contracts, to the new media - and that is our topic today.

If you let me quickly introduce our two guests, we will allow them to engage in the conversation they've come prepared for.

On my far left is Mr Don Tapscott. We can call him Doctor Tapscott if we like, he's got three honorary degrees.

He's also a great rock star here - I haven't heard the band - and he is the author of 16 books on a wide range of topics, including one of the most important books on the current media, and that's Wikinomics, which has been translated into over 25 languages and has been a bestseller for over a decade now. And recently, Blockchain Revolution.

He also has a TED talk, like a real TED talk, not those fake TED talks that you can find all over the place - the one I looked at has almost three and a half million hits. What's the other one got?

Don: I don't know.

Professor Stephen Brown: You don't know?

Don: A lot.

Professor Stephen Brown: And it's wonderful to know that one of Trent's early graduates, one of our real foundational alum, has that kind of world outreach.

But we have another person who's kind of related to Trent, in that his twin brother has been wandering our halls now for a decade and more - Len Epp, who has a DPhil in English from Balliol College, Oxford, and was for a while an investment banker, and decided he would try his hand at innovative publishing.

He is a co-founder of Leanpub, an extraordinarily innovative way of bringing texts to a wider public. An author can register with Leanpub, knowing they can write their book with their audience engaging with them. Their readers are there for the whole ride.

And when it comes time to sell it, 80% of the purchase price goes to the author, which is in itself an extraordinary achievement.

If you want to, you can go online and you can read Len's satirical novel in which he takes [four] Canadian academics - a poet, a professor of English, a graduate student, a researcher - the usual gang - and they go across Canada looking for the Canadian identity. It is an extraordinary piece of work, and a very fine example of what the current approach to marketing fiction is.

So with that, I give you Len Epp and Don Tapscott, in what will be, I'm sure, an extraordinary conversation. Thank you for coming.

Len: I'd like to begin by saying that Don is typically represented as he ought to be - as an expert and world leader in thinking about technology and business and the economy, but he also writes about politics. And so I'd like to talk to him today as Don Tapscott, the political theorist.

My first question is, could you talk a little bit about the crisis of legitimacy that you see us having in our democracies today?

Don: Okay, let's get into it.

Len: Easy one to start with.

Don: Let me just say a couple of things about the introduction. Len, 80% goes to the author. Do you know that as a bestselling author, I get 14%?

Len: That's pretty high.

Don: Not the gross sales, the net sales. So buy my book, because I make a buck ten every time you buy one.

And yes, I do have a band. We used to do a concert every year, whether our public demanded it or not, but we kind of got good. And we do charity events. So if you have a charity, our price is right - zero dollars. We've probably raised about three million bucks for good causes.

So I'm delighted to be here, I really mean that - and I'm very excited about the opening of the library today.

The topic on the table is a big one. What's happening to democracy, I think, is a good place to start.

I think we have a crisis of legitimacy of our democratic institutions. This is a problem that exists all across the democratic world. But exhibit A, and sort of the mothership for this crisis, is the United States.

This is reflected in a number of different ways. Young people are not voting. Many agree with the bumper sticker, "Don't vote, it only encourages them." And that's kind of funny, except it isn't really. Because democracy to me is the best political system that we have. It has a lot of flaws, it needs reinvention. But the alternatives are not great.

Politicians are beholden to special interests. This begins with Congress. 95% of Americans think there should be background checks for firearm purchases. But Congress - both Senate and House - can't pass a law reflecting the will of 95% of the people.

Government of the people, by the people, for the people - this is reasonable. But it just doesn't exist.

And then we have a President of the United States now, the Executive office, who says that democracy doesn't work. That it's fixed, voting is rigged, there's fraud everywhere - that he actually won the popular vote. So, millions of fraudulent votes were cast.

Now the truth is that in a big recent study, they found six fraudulent votes that had been cast in the election. And so the legitimacy of the Presidency is in question.

Now we have the Supreme Court - where for decades at least half of the American population will not think that that's a legitimate institution.

Then there's the free press, the foundation of democracy, which, according to the Executive office and the President, is fake. The New York Times is fake news.

My son wrote an article last week for The New York Times - he spent a day doing fact checking and error correcting and editing on that. It was such a rigorous process to get that right.

So, where's all of this going to go? Part of the problem - I'm going to resist the temptation to spend the whole hour answering this question, but I'll just say one thing. Part of the problem, this is humbling for me to say, is that new technology is enabling this.

Now, I've been the biggest champion of the internet of anybody probably - I've probably sold more books about the digital age than anybody.

But I used to think, and I wrote this in a 1994 book called The Digital Economy, that the new media is going to be amazing. It's unlike the old media, which was centralized, one way, one size fits all, controlled by powerful forces. Recipients were passive.

The new media's the antithesis of that. It's interactive, it's collaborative. People are active. It has this awesome neutrality. It will be what we want it to be. And surely there are more good people than there are bad people?

One of the things I said is, I think the internet will bring us together, because we'll all have access to the truth.

I also said, it could go the other way. We could end up following our own point of view and our own set of facts. And if we did that, we'd be in these little self-reinforcing echo chambers, where the purpose of information is probably not to inform us, but to give us comfort for our pre-conceived point of view.

And there's absolutely been a fragmentation of public discourse in the United States. So, the President of the United States can tweet that his predecessor Barack Obama wiretapped him. Which is not only impossible, it's preposterous. And 30% of the population agrees with that.

Len: On that note, one question I have for you is that, people in this debate often focus on the technology and the problems with, say, Facebook or Twitter not filtering things out. But you've been writing and talking for years about how people need their own bullshit filters - long before social media became a real thing. And particularly, that digital natives - people who grew up with the web, tend to have better BS filters than people who didn't.

I'm curious to know if you think that there might be a sort of generational divide along technological lines that's partly propelling the fake news phenomenon. Because there is real fake news out there, real propaganda.

Don: This whole generational thing - I don't know if any of you know, but in the mid-1990s, I became interested in studying children, because I noticed how my own kids were effortlessly able to use all this sophisticated technology. At first I thought, "My children are prodigies." And then I noticed that all their friends were like them, so that was a bad theory.

So I started working with 300 kids, and I wrote this book, Growing Up Digital, in 1997. It defined the net generation. I said, "There is no generation gap like there was when I went to Trent." There are big differences between kids and parents - of values, lifestyle, ideology and so on. That doesn't exist today. Kids and parents get along pretty well. What we have is what I call the "generation lap," where kids are lapping their parents on the digital track.

To your point, this is the first time in human history when young people are an authority about something really important. I was an authority on model trains when I was 11. Today the 11 year old at the breakfast table is an authority on this technology that's changing every institution in society. So I wondered, what will this do to the way that they think, the way they access information?

Overall, I'm kind of positive about that. Sure - you go to a restaurant, you see the family next door, where nobody's talking, everybody's looking at their mobile device. But to me, that's not a problem with technology - that's a problem with parenting.

In fact, my daughter, Nicole - who inspired this whole digital native net generation thing - has a one year old, and that kid is not allowed to go anywhere near a screen. She can't even be in a room where there's a television on.

It's not that she's going to grow up unconnected. Of course, she'll get access. But Nicky says, "You know what a one year old needs? It doesn't need an interactive game. A one year old needs a lot of hugs, a lot of kisses. Needs to be read to. One year old needs to jump up and down and play and dance to music and know what a dog feels like. And run around outside, and all the rest of that." So that was a very interesting thing.

To the point. For the sequel to that book, we interviewed or surveyed 11,000 young people in 10 countries. I did come up with this thing, that I'm pretty confident they have good BS detectors. Because they've grown up with so much BS online - and the technology's sort of like the air. It's like if you lived in a world that was full of BS, where every time somebody talked to you, chances are it was BS - then you would start to develop the ability to scrutinize too.

I think that that's so important. Because, how are we going to inform ourselves in a world where the old ways of doing that are collapsing? I guarantee that your newspaper is not going to be made of atoms delivered to your doorstep in a decade. They're all going to be gone.

And there are all these other ways that we need to inform ourselves. But when you read something online that says, "Vaccinations will cause your baby to grow a second head," or something, we need to develop ways of scrutinizing information.

I think, back to the social contract. Every kid starting in kindergarten, every year, should have a media literacy class. And every year you develop it, you work on it, you practice. You learn how to detect BS, but also how to manage this torrent of information that we all get.

Len: What about people who aren't digital natives? How can we educate them?

Don: There is no hope.

Len: It's interesting, when it comes to things like policy, I was just reminded of Orrin Hatch, in the United States, who was asking Mark Zuckerberg a question in testimony. And he said, "How is it that you can make money with a business where you don't charge people to use it?" And Zuckerberg was kind of taken aback, and he said, "Well Senator, we sell ads."

If we want to develop policy and a new social contract to deal with some of the problems that technology is bringing to us, but the very people who are in charge of us, are themselves the ones most in need of education, what can we do?

Don: Alternatively, we will need a generational change. The problem with that is there are a lot of very urgent issues right now that can't wait.

Most people in this room will know where I'm going on this: climate change.

I was with the CEO of MaRS, the big incubator in Toronto. It's like two million square feet. It's the biggest incubator in the world. We were talking about climate change and the role of technology in solving this problem, and he reminded me of the statement, which I think originally came from Bill Clinton, that if we reduce carbon by 80% by the year 2050 - not by 6%, by 80% - it'll still take 1,000 years for the planet to cool down.

In the meantime, some bad things are going to happen. You can expect a billion to a billion and a half people to lose most of their water supply in the next 15 years.

You think we have a problem of migration today? Of terrorism? Travel conflicts? Wars? It doesn't matter what we do right now. That's going to happen.

So if we leave this until a new generation, your generation, takes power, what's that? Another 20 years or something, 15 years? I think that we're going to be in deep, deep trouble.

I don't just talk to young people. As a Chancellor, I speak at the convocations. And I have several times said that, "My generation is leaving your generation with a bit of a mess. Sorry about that. You're going to have to fix this." But the truth of the matter is that my generation's got a lot of stuff to do now, and we have a lot of responsibilities as well. Which is why I'm becoming a very small-p political person.

Len: That was something very striking I found in reading your work, from the beginning to the end, preparing for this interview. It was in the afterword to the latest edition of Blockchain Revolution that you have this - you just "go there" and talk about Trump and Brexit and things like that. It was a very interesting turn.

On that note, I wanted to ask you - you talked about things that have been done wrong in the past. In particular, you talk about how the post-World War II social contract was broken, and one of the origins of that breaking was political changes in the early 1980s. I was wondering if you could talk a little bit about that?

Don: Well, the idea of a social contract goes back a long way, to some of the old philosophers - think Hobbes.

It's kind of like a deal in society, that's the way we think of it today, between the main pillars. The three pillars are the private sector, government, and civil society - which I guess could include unions, or what's left of them.

When we moved from the agrarian age to the industrial age, over time, it sort of culminated in Bretton Woods after the Second World War. We created a social contract. We figured out a lot of stuff. We figured out people are going to need to be literate.

So we created the public education system. Created a law, that you have to go to school. It's against the law not to go to school. That was new.

We figured people are going to live in the city, they're going to need a social safety net. We need a way of funding that. We're going to have to tax income.

We figured out you can't have one oil company owning all the oil. So we created anti-monopoly legislation.

In 1934 in the United States, we figured out that if you're going to have a publicly traded company on the stock market, maybe they should tell their shareholders something once a year? Prior to 1934, if you had a public company you didn't ever have to tell your shareholders anything.

So those are four of dozens of decisions. And after the Second World War, we got together - "we" being the winners, a bunch of countries, and there were some corporations at the table as well. There wasn't really civil society or NGOs, because there weren't any - unbelievably, in 1946, civil society was tiny. Now it's, I don't know? 10% of a Western economy, like Canada or the United States.

But we figured out all these things to try and prevent a war from ever happening again. We created the IMF and the World Bank. And then a year later, the United Nations, and then the GATT around trade. And the WTO, and a whole bunch of other things.

And today, a lot of that is breaking down. Let me just give you one example.

Everyone assumes that we achieve prosperity through having a job - individuals, right? You work, you get paid. It's different than under feudalism and the agrarian system, where you worked and you grew all this stuff, and then you gave it away to the landlord and you got to keep some of it, kind of like with data today. We all create this data, and then Mark Zuckerberg and others expropriate - we get to keep a few cabbages of data. We can talk about that if you want.

I always thought that Schumpeter - anybody here study Schumpeter, creative destruction? - that big innovations happen, big technologies, they smash all institutions and industries and new ones arise. That's what's so great about capitalism.

But we have a wave of technology about to hit us, where I don't think that's going to happen.

You combine artificial intelligence and machine learning - where computers learn to do things that they weren't programmed to do - with blockchain and new transactional platforms, and you get the internet of things: trillions of smart objects all talking to each other and doing transactions. You get autonomous vehicles, you get robots and drones and technology in our bodies and so on.

So you put all that together: in 48 of 50 states, the number one job type for men is truck driver. I think that's gone - not in 100 years, in a decade.

Number two, or the number one, job for women, is cashier.

And it's not just blue collar work, it's knowledge work. Computers can diagnose patients better than doctors, and they can analyze x-rays better than radiologists. They can dispense pharmaceuticals better than druggists - which I'm happy to hear, because twice in my life, I was almost killed by a pharmacist who didn't read the prescription right. And my son was very, very sick as a result of the same thing.

I think this is going to disrupt labor markets, and we're going to have structural unemployment.

So what's the new social contract around that? There's lots to be done. Climate change, poverty. Our economies are growing, and prosperity's shrinking. There's wealth being created, and the middle class is getting smaller in most OECD countries. There's lots to be done, but it may not be done through a traditional job in the sense of a private sector company giving you a salary to do something. I think we'll need something new.

Len: Specifically, you're a supporter of a universal basic income, I believe?

Don: Well I am, but to me that's sort of the beginning. We're wealthy as a country, and many of these societies - you just can't let people starve to death, or be homeless, or have illnesses that are not treated.

But I kind of like the idea of upping the ante on that one to say, "No, everyone has a guaranteed job." It won't necessarily come from companies - but there's lots to be done, so we're going to have other vehicles for funding jobs - civil society, philanthropy and government.

I think that what we put in the commons is going to grow. Now, you can call that socialism or whatever you want. Throwing names on things, to make them a pejorative, is not helpful for me. I just want to talk about the idea. What are we going to do? Who's got a better solution?

Now the trouble is that you contrast that with what's happening today, where there's sort of a survival-of-the-fittest mentality - I almost said "Darwinian," but Darwin was a great guy. It started with Margaret Thatcher and Ronald Reagan, and now it's typified by Steve Bannon and Donald Trump and so on - that it's all about the individual. It's all about individuals taking action.

There was this debate with Steve Bannon. I didn't go to it, but I did read about it in the newspaper. It's all about - people are sick of elites. How ironic is that? Congress can't pass a law reflecting the will of 95% of the population, because of these powerful moneyed interests that control it.

And Donald Trump, I don't know? Is he like the little guy, or would you put him in an elite category? Setting aside the irony of that particular point of view, I think we're going to need a new social contract.

Len: Another aspect of the new social contract that you propose, in addition to perhaps something like a universal basic income, is a portable safety net. I was wondering if you could talk a little bit about that. I think it comes from some of the issues in the gig-economy style of work, where you might be doing a little bit over here, and a little bit over there, and not be a formal employee at all, even though you're working all the time.

Don: I think there's going to be a lot more of that. We're going to have to strengthen the social safety net. The idea of portability is interesting, and it starts with our identity. So right now, the virtual "Sherry" - there's a mirror image of Sherry online, all its data, and the virtual Sherry knows more about the real Sherry than she does herself, in a whole bunch of ways - because she can't remember what she bought a year ago, or said a year ago, or her exact location a year ago, or what her heartbeat was a year ago, or what she got on a test a year ago, or what medication she took a year ago - you know what I'm saying.

The trouble is, the virtual Sherry is not owned by her, it's owned by Facebook and Google, all the big banks and governments - and a lot of countries, governments, big time, are collecting this data. The social score in China - if you don't pay your parking ticket, you go in a demonstration - your kid's not going to get into a good school.

Len: I don't know if everyone here's heard of what Don's talking about. But China's actually constructing a system nationwide where you're being watched all the time, literally by facial recognition and things like that. Your activity's also being tracked in other ways, like if you commit various kinds of infractions - not even necessarily the law, but just like the rules - like if you light up in the airplane or something like that. And they're thinking of establishing a score, and that score will essentially be known by people, and your treatment by other people in society will be determined in part by that score.

Don: So Orwell, he had no idea. This is so much better than Orwell could ever come up with.

So, there's all this data. Now, this data constitutes the new asset class of the digital age. It's the new oil, if you like? Maybe the biggest asset class ever, and it's what's behind the bifurcation of wealth. And Sherry herself is sort of a digital serf. This is feudalism.

We're at Trent, we know about history. Remember under feudalism, you were tied to the land. And Sherry's tied to all this technology and she creates the data, she grows the vegetables. But they get expropriated by the digital landlord and she gets a few cabbages.

And that's a big problem. Because she can't use the data to plan her life or health or anything else. She can't monetize the data. Other people are getting rich from the data that she creates. And the biggest problem to me is that her privacy is being undermined.

People say, "Don, privacy's dead, get over it. If you have nothing to hide, what's your problem?" This is absurd. Privacy is the foundation of freedom, as that social score example shows.

All this data constitutes our identities. We need to get our identities back, so that we can manage them responsibly, and use this data to help us conduct our own lives.

Imagine if our data was portable, and it's in a black box, and it moves around with Sherry, and it's sweeping up all this transactional stuff, exhaust that she leaves behind. It's measuring her heartbeat, and it's capturing all this other stuff. That would be a very, very powerful thing. The idea of a sovereign identity owned by the citizen, not owned by big companies and governments.

Len: It's a really fascinating idea. It's literal self-possession. Your self is partially incarnated in the data about you, and that data is in a sense you.

What you argue is that this is an asset and it's something that belongs to the person themselves, and has been taken away before we even had it in the first place.

How would this work, if one were to have this digital black box of one's own data that one possessed, and then one could sell access to companies that wanted access to it?

Don: This is where blockchain comes in. The way that I think about blockchain - this is the underlying technology of cryptocurrencies like Bitcoin - it's not about Bitcoin; there are now a couple of thousand of these blockchains. It's the technology that enables us to manage value peer to peer. Let me just do a quick aside on this.

For the last four decades - and I've been at this the whole time - we've had the internet of information. That's what the internet's about, right? Information. But if I send you some information - a PDF, a PowerPoint, if we send an mp3 of this podcast - whatever, you're actually not getting the information, you're getting a copy. Even with a website, I keep the original.

That works great for information, but when it comes to assets - things like our identities, intellectual property, money, security, stocks, bonds, contracts, deeds, loyalty points, cultural assets like art and music, your vote - a vote is an asset, something of value that belongs to somebody. When it comes to those things, sending a copy is a terrible idea. You don't want someone copying your vote or your identity.

And if I send you $1,000 it's really important that I don't still have the money, right? Cryptographers have called this the "double spend problem" for a long time. The way that we manage that is through big intermediaries, like all these people that are capturing our data. It was originally banks and credit card companies and governments, and stock exchanges.

But now we have social media companies, big technology companies. And they perform all of the business and transaction logic for every type of commerce. They identify who you are. They identify the asset. That's a dollar, that's a stock. They clear and settle the transactions and they keep records.

There are growing problems with these intermediaries. They exclude a couple of billion people from the global economy, they capture our data, and that's a big problem. They slow things down.

Why does it take seven days for money to go from a Filipino housekeeper in Toronto to her mom in Manila? And why is she charged 10% to 20% for that by Western Union? How much do you get charged for cross-border email? Anyone heard of global email fees? "If you're going to send an email to another country, we're going to charge 20% for that." This is a big problem.

So enter this new thing, blockchain. The way I like to think of it is, it's an internet of value, where anything of value, from money, to stocks, to identities, can be managed, stored, transacted on this platform, in a secure, fully-encrypted, private way, where trust is not achieved by a bank or Facebook or a government. It's achieved by cryptography and collaboration, and some clever code.

On your mobile device, or any other device that you're on - you'd have your identity. And that identity is unhackable. The way the whole system works is that reckless behavior only hurts the individual behaving recklessly, it doesn't hurt everybody.

Today, reckless behavior - if someone hacks into the Facebook server, or if somebody at Facebook makes a big coding error or something like that - then all of a sudden, tens of millions of people's private information is released. That can't happen on a blockchain.

The way I like to think of it is - a blockchain is a highly processed thing. It's sort of - the analogy I came up with was that it's sort of like a Chicken McNugget. And to hack a blockchain would be like taking a Chicken McNugget and turning it back into a chicken.

Now, someday someone will be able to do that, but for now, that's going to be tough.
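The one-way property Don is gesturing at can be sketched as a toy hash chain. This is a simplified illustration only, not any production blockchain; the transaction strings and the `make_chain`/`is_valid` helpers are invented for the example. Each block commits to the hash of the block before it, so rewriting history "un-cooks the McNugget" and breaks every later link:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # SHA-256 is one-way: easy to compute, infeasible to reverse
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_chain(transactions: list) -> list:
    # Each block records the hash of the previous block
    chain, prev = [], "0" * 64
    for tx in transactions:
        block = {"tx": tx, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain: list) -> bool:
    # Recompute every link; any tampering with an earlier block
    # changes its hash and breaks all the links that follow
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["Alice pays Bob 10", "Bob pays Carol 4"])
print(is_valid(chain))                  # True
chain[0]["tx"] = "Alice pays Bob 1000"  # tamper with history
print(is_valid(chain))                  # False
```

Real blockchains add proof-of-work or other consensus on top of this chaining, which is what makes rewriting history economically as well as computationally hard.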

Len: I can't help but interrupt for a moment and say John Oliver actually featured that metaphor on his show. It's hilarious, and Don shows it in his talks now, the clip. [Go to 11:39 of this video to see Don set up the clip from Last Week Tonight - eds.]

Don: John Oliver, on Last Week Tonight, he had a lot of fun with that. Which is why I only use that metaphor with friends.

He says, "Hold it, that's a terrifying idea. If that Chicken McNugget was turned back into a chicken, that chicken would be bleeped up. That chicken would have PTSD and it would be out on the speaker circuit, talking about the things it saw. "My body is whole, but what of my soul? [chicken noises]." Anyway, you have to watch it.

Len: To what you were speaking about, the example of the housekeeper in Toronto having so much trouble sending money home. It's not only a cost in money, but also in time. I believe part of the story is that she would have to travel for hours to get to the right Western Union to send this money. And then it was uncertain how long it would take - five to seven days to get to her mother. And then her mother would have to travel to a Western Union to get the money. But there actually is currently a blockchain-based solution that she uses to get money to her mother in a very interesting and quick way.

Don: Yes, now she takes her mobile and she goes, "$300 - Mom." And the money arrives instantly on her mom's mobile. And then her mom looks at the mobile and there, she sees cars driving around in Manila. They're tellers. It's kind of like Uber. There's a five-star teller, he's seven minutes away. She goes, "Boom." Seven minutes later, this guy knocks on her door, gives her the money in Filipino pesos, she sticks it in her pocket. The whole thing didn't cost 12%, it cost 1.5% and it didn't take seven days - it took seven minutes.

So Western Union is toast. And as far as I'm concerned, all these companies that do foreign exchange are in deep trouble - and that's a good thing. Because they've just - this is the global diaspora. People have left their ancestral lands, and they send money back home. This is upwards of a trillion dollars a year. These people have been getting ripped off.
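The savings in Don's remittance example are easy to work out. Using his figures (12% for the legacy wire service versus 1.5% for the blockchain-based service; actual fees vary by provider):

```python
# Don's example: the housekeeper sends $300 home to Manila.
amount = 300

legacy_fee = amount * 0.12       # the ~12% Don cites for traditional services
blockchain_fee = amount * 0.015  # the 1.5% Don cites for the blockchain service

print(f"Legacy fee:     ${legacy_fee:.2f}")      # Legacy fee:     $36.00
print(f"Blockchain fee: ${blockchain_fee:.2f}")  # Blockchain fee: $4.50
print(f"Saved per transfer: ${legacy_fee - blockchain_fee:.2f}")
```

Scaled across the roughly one trillion dollars a year in remittances Don mentions, the same ten-point spread is what he means by people "getting ripped off."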

I don't want to be too gloomy about all of this. There are all kinds of really exciting solutions to many of these vexing problems that face us today. But we need to think differently to make them happen.

Len: One thing I wanted to ask you about was, you mentioned it earlier - the advent of driverless trucks. Obviously along with that, will probably come driverless cars as well. Do you think this is going to have an impact on the popularity of personal car ownership? Where people can not own cars anymore, because companies just own fleets and people just hail rides, and basically rent?

Don: Well a lot of - who here is under the age of 30? Under the age of 30, who owns a car? What's that? It's like 5%?

Len: 5 or 6, yeah.

Don: Yeah. So young people already are not buying cars. I think that transportation will become a service, rather than a thing. I've been saying this for a long time, and people say, "Come on, how about the hot rod, and we all love our car?" Yeah - why don't we all go back to Happy Days and the Fonz or something like that?

This is something we don't think about: subways cost a billion-five a mile. I like subways as well as the next person, I'm a big believer in public transportation. But all we need to do is smarten up our roads at $1,000 a mile, and you can have a virtual mass transit system, where all of these things are moving along autonomously, a few feet from each other. There's no slowdown because of traffic, because they're autonomous. There are no accidents, and there's no traffic congestion.

Think about the implications of that. Half of the budget of the Toronto Police is traffic. What happens if you free that up? What could you do in Toronto to build a better city, or to deal with homeless people or mental health, or have better housing or whatever else? We look at these old ways of solving problems in society and we don't see how the possibilities enabled by new technology - new forms of social organization - could enable radical changes. Now there are problems, potential problems with that, but -

Len: Thanks for that great explanation. You touched on a really interesting paradox about massive change.

Which is - for example, you said there will be fewer accidents. People might not know this, but over a million people die in automobile accidents every year around the world. And 20 to 40 million are injured. Wouldn't it seem straightforward that we would all do anything we could to solve this? If this was SARS or something like that, we would be doing everything we could to solve that problem.

But what happens if we move to driverless vehicles, and the three million truckers in America lose their jobs? I'm just going to paint the darkest scenario: every automotive section in every Home Hardware goes away, and all the companies and all the people that work for them, that provide those parts, are gone. And that police department loses half its budget.

That's one of the reasons I like this example so much, because the benefits are just so stark and obvious. But when you start to think about the cost, that's when, even people who are immediately convinced, start to walk back in their minds. Because it's a genuine paradox.

Don: Yes. And we could decide, "No, we're not going to do this totally rational thing, because it will cause job loss." But I think ultimately, that's not going to be a viable way of proceeding. Because the numbers are just too compelling. Why not free up all of those resources, save all those people's lives, radically reduce the cost of transportation, and figure out a new social contract whereby all those people can be redeployed?

Now, those truck drivers are not going to become professors, okay? Some of them might. But there are things that they could do. Those things may not be provided by current labor markets, which is why we're going to need a new social contract to figure out what they could do.

Another interesting thing on that: what happens when an autonomous vehicle kills somebody, one person? Tens of millions die, but it kills one person. That's going to be the headline, isn't it?

Len: Yes.

Don: It already has been. The vehicle - it's driven millions of miles, and the only accident it ever had was when it was rear-ended by a human. But there was a case of another one that made a bad choice. I own a Tesla, I don't let that thing go by itself. I put it on autopilot and it's safer than I am - on a highway. But if there's anything odd, like I have to turn or something - it requires me to make it happen.

But again, do we have the kind of wisdom in our systems of public discourse to be able to rationally consider that that one death is far outweighed by all of the benefits of moving to some new platform like this?

Then, there are a million other issues. Who's liable for that death? What happens if a car's got a choice to make? It's going to run into a bunch of school kids at a bus stop, or it's going to save them and kill you, because the only other option is to swerve into a pole.

All of these problems - everywhere I go, people say, "What about this and what about that?" What I like to do is say, "Let's take this problem, and we're going to put it in one of two boxes. Box number one is the reasons why this is a really bad idea, it's not going to work, and we shouldn't do it. Box number two is implementation challenges." And all the stuff that people constantly say to me about why these ideas might not make sense has gone in box number two so far.

Len: My last question, before we go to the audience to ask the questions - we've got about thirteen minutes left. My last question is a bit of a selfish one, but one I've been very much looking forward to asking you.

You've been writing about digital technology and computing since the early days of personal computing. I think you wrote a book or published a report in the early 80s talking about how personal computing was going to become a real thing. Not everybody believed you at the time.

But the reason I want to ask - you touched on something earlier about how - if a driverless car gets in an accident, people freak out. When you ask them, "Would you ever get into a driverless car?" They'll say, "No, I wouldn't trust it." But you trust the stranger that you get into the taxi with. You trust the angry - I'll say, like, a 43-year-old person who is a little bit drunk because they had a bad night - you trust them. You don't know who's in that car, you don't know who's coming at you.

But somehow, if there's a computer involved, people get concerned in a specific way. It's not just in driving - it's all sorts of things. Like with ebooks, the world that I inhabit, there are people who say, "I want a real book."

We even use the term "virtual" to refer to what happens on computers. Of course it's not virtual, it's not spiritual, it's not ghosts - it's real.

But there's something about computers, which I think is related to the invisibility of the work being done. With the conventional car, you can hear it, open it up, you can put the gas in.

What is it about computing that is so kind of uncanny to some people?

Don: Well the first thing - in the 1970s, I was at [?] doing research - I'm really dating myself here - and we were studying how multi-function workstations, connected to a vast network of networks based on a thing called the ARPANET, would change the nature of work and the design of organizations. I wrote a book in 1982 saying, "Computers are not just about data processing. They're going to become a communications tool, and everyone will use one." And for close to a decade, everybody said, "That's garbage." Well - not everybody. But I'd go around, and I don't think anyone in this room would guess the reason why people thought my idea was a dumb one.

The thing that people said to me, why this would never work, was: "Regular people will never learn to type." The stigma around being on a keyboard - it'd be like going to the moon, for a regular manager to think about how to use a keyboard. And if you sat down at a keyboard in an office, people would make fun of you. "Hey Bob, your secretary sick today?"

Len: That's a very real thing, to this day.

Don: But the thing about the fear of technology - again, young people don't have that kind of fear, because it's not there. The technology's not "there." It's like the air. They've grown up bathed in this.

But there is a real legitimate concern here about technology. I'll just give you a couple of examples, about what's actually going on inside the computer.

So you think about Facebook and what goes on behind the scenes when you post something. If I take this phone - we're in the physical world now, I'm holding a phone. If I let go of it, what's going to happen to it? Is it going to go up? Is it going to go sideways? Is it going to go - ? No, it's going to go down. That's because there's a law - an algorithm, if you like - called gravity.

You go onto Facebook and you post something, what happens to it? I often notice that I post something with a URL that takes people to another site, and nobody "likes" it. Maybe there's an algorithm that says, "We want to keep people on Facebook."

I'm sure all of you have had the experience. You go online and you're looking for a cappuccino machine on Amazon. And all of a sudden there's an ad for a cappuccino machine on Facebook. Those are two of a million examples of how, in the digital world, the rules and the algorithms that govern our interactions with other humans are unknown to us. And that's a problem to me.

A second set of problems has to do with artificial intelligence. We create software that does things that it wasn't programmed to do. It's called machine learning. The software can learn. It can adapt itself and it can develop new programs and new capabilities and so on. Harnessed correctly, wow, what a force for good. Solving big problems in medicine, climate, and you name it.

But there is a point where, not just bad actors might get involved - you create some kind of autonomous agent on a blockchain, it's flying around - and you create it to do bad things. You've essentially created a virus with a bank account. And ultimately that thing could go hiring people to do bad things, like really bad things.

After the second TED talk that I gave, Kevin Kelly, the founding executive editor of WIRED, got up and gave a real upbeat talk about AI. And then they had a guy give a doomsday kind of talk about AI. They were both very interesting. And the doomsday speaker - I'm sorry, I forget his name, but he's well known - he said, "We're humans, and we're way more powerful than ants. Well, computers are going to become way smarter and more powerful than humans. You don't just go around looking for ants to kill. But if ants become a big problem, if you become infested by ants - then, as a human, you've got this capability to just wipe them out. Well, supposing that we create these really super powerful technologies, and they decide that we're a problem, we're just ants to them. And they're acting in their own self-interest, and arguably abiding by the terms of their original formation - which was created by humans." That's just an extreme example, and it's not going to be a problem in my life, I don't think. But it might be in the lives of the non-car owners who put up their hands here.

Len: On that note, I'd like -

Don: That was a cheery note, wasn't it?

Len: A cheery note. Actually, I should say that one of the very interesting things that Don speculates about, is the possibility of ownerless businesses - that through AI and things like blockchain technology, with things like smart contracts, where there's a little code that has rules that are carried out when it, say, receives some money - or when it spends some money - you may end up with autonomous organizations that actually have funds that they can make decisions to do things with.
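To make Len's description concrete - "a little code that has rules that are carried out when it receives some money" - here is a deliberately simplified Python sketch. Real smart contracts run on blockchain platforms such as Ethereum, not plain Python, and the names and rules below are invented for illustration:

```python
# Toy sketch of a self-executing contract: code whose rules run
# automatically when it receives funds, with no human approval step.

class ToySmartContract:
    """Holds funds and pays a beneficiary once a funding goal is met."""

    def __init__(self, goal, beneficiary):
        self.goal = goal
        self.beneficiary = beneficiary
        self.balance = 0
        self.paid_out = False

    def receive(self, amount):
        # Rule 1: record incoming money.
        self.balance += amount
        # Rule 2: once the goal is reached, the contract executes
        # itself - the whole balance goes to the beneficiary.
        if self.balance >= self.goal and not self.paid_out:
            self.paid_out = True
            return f"pay {self.balance} to {self.beneficiary}"
        return None

contract = ToySmartContract(goal=300, beneficiary="Mom")
print(contract.receive(100))  # None - goal not yet met
print(contract.receive(200))  # pay 300 to Mom
```

On a real platform, the "rules" are compiled contract code whose execution is verified by the network, which is what lets such a contract hold and spend funds without any person in the loop.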

Don: In fact, one of those existed. When we were writing Blockchain Revolution, we said, "Using all this technology, could you have a company with no people?" There'd be a bunch of distributed applications on a blockchain using smart contracts - just like what it sounds like, contracts that self-execute - and autonomous agents, little bits of AI that are learning.

We almost didn't publish it, because we thought, "People will think this is too futuristic. There's a fine line between vision and hallucination." We called it the "Decentralized Autonomous Enterprise," and we went ahead and we published it.

A week after the book came out, an organization was launched, called, "The Decentralized Autonomous Organization." It had no CEO, no management, no people. It was a venture capital company, and its job was to go raise money, and invest the money. In three weeks, this entity - with no people - raised 164 million US dollars.

Now, it's not a happy ending, because there was a programming error in the smart contract. And the creators of this thing decided to give the money back before this error ended up being a really big problem. But the fact that this could exist - as Bob Dylan put it, "Something is happening here, but you don't know what it is."

Len: I guess in a way, human error is kind of a positive thing sometimes -

Don: That's funny.

Len: That our own failings will be reflected in the things that we deal with and that this is a bit of a check and a balance in its own way, even though it can have negative consequences.

We do have time for a couple of questions. So if anyone wants to raise their hand in the back.

Yes I can hear you, and I'll just repeat your question if I can remember it.

Len: That was a great question. What role is there for humans in a future where there may be autonomous organizations and machines making all these decisions, and even carrying out actions?

Don: This is why I like the idea of, everybody gets a job. Because to me, I'm old school on this. I think you get real satisfaction in life from work, from being productive, making a contribution.

Right now we define work as, "Get me a job," largely funded by a company, where you get wages for that job, and you do something. I'm not sure most jobs give you a full sense of purpose and value, but at least you're making money.

I think that in the new environment, there is a lot to be done. We're going to have to reconfigure all of that.

I don't think there is a post-human world. To me, humans are what matters. Now, animals in our biosphere and all the rest of that matter too. But they matter because they help us have an existence and have a good life.

I'm not really super negative about this. We focused on the problems today, but I think there are real solutions to many of these problems.

I'll tell you one thing. In the future, little companies are going to be able to have the capability of big companies, without all the main liabilities: legacy culture, bureaucracy, systems, and so on. If we do this right, there should be a halcyon age of entrepreneurship.

We can bring two billion people into the global economy overnight with blockchain, if we want to do that. 70% of the land titles in the developing world are not valid. You're in Honduras, a dictator comes to power. He says, "You may have a piece of paper that says you own your farm, but our central government computer says my friend owns your farm." This happened on a mass scale.

You put land titles on a blockchain, they're public. Nobody can mess with it, unless you can turn a Chicken McNugget back into a chicken. Oops.

Len: On that optimistic note, I think our time is just about up. But before thanking Don, I wanted to read something that he's written, and he's said, a few times. "The future is not something to be predicted. The future is something to be achieved."

I think that in that spirit, we should take everything we learned about or we've discussed today as an opportunity to improve the world, and particularly to address the crisis of legitimacy that we're all experiencing - and we'll all be learning about what more has happened in the last hour, when we all go on our phones, in about two minutes.

So, thank you very much Don for coming here, and taking the time.

Don: Thanks Len.

Podcast info & credits
  • Published on November 28th, 2018
  • Interview by Len Epp on November 16th, 2018
  • Transcribed by Alys McDonough