David Weinberger, PhD, is a renowned Harvard Business School researcher, technologist, philosopher and the author of Everyday Chaos: Technology, Complexity, and How We’re Thriving in a New World of Possibility. In his books, talks, and articles, David explores the effect of technology — especially the internet and AI — on our lives, businesses, and ideas.
In this episode of The Orbit Shift Podcast, David talks to us about the butterfly-effect chaos of the internet, how the way founders build products has changed, and the bias in AI and machine learning that should concern us going forward.
Q. How has the internet created a chaotic world?
David: Over the past 20-25 years, we’ve been living on the internet, and the internet has gotten us accustomed to a chaotic world, in the sense that so many small pieces are connected in such intricate patterns that everything affects everything else. Because it’s all linked, the butterfly effect applies: a butterfly landing in Argentina might set off a hurricane in Albuquerque. A small event can start a cascade that results in a significant event.
One of the trivial ways the butterfly effect shows up on the internet is our inability to predict what will take off and go viral. A simple case is the ice bucket challenge, where people poured ice water over their heads to raise money for charity. If I had told you that was going to go viral, you’d probably have laughed at me. Not only did it go wildly viral, it also raised over $100 million for the ALS Association.
Q. How has this changed how products are built?
David: Traditionally, in product management and product marketing, you figure out what your market wants, and you build the product that meets that need. The Ford Model T stayed the same for 19 years and sold 15 million cars because Henry Ford was a genius at anticipating his customers’ needs.
Nowadays, you need to take a Minimum Viable Product (MVP) approach. For example, a hundred years after the Model T, in 2008, Dropbox launched with the smallest feature set it could sell to customers. Then they watched what customers did with it, what they wanted from it, and where it was not working for them, and they set out to build the product from there. Rather than anticipating the market, people now launch with the least amount of prediction possible.
Another example is the iPhone. It has virtually nothing in it that had not been in prior phones. But it revolutionised everything because of the App Store. It was like Apple saying, “We cannot anticipate what everybody is going to want to do with this device. We’ll give people our starting set on the phone, but then anybody in the world can build something that we didn’t think of or didn’t want to invest our time in and make it available through an app store.” That App Store has well over 2 million apps in it now.
Q. A trend we see now is participatory consumption, where the users want to be part of what’s being developed. How does this impact the way products are being built?
David: This is one of the earliest things that the internet enabled. Up to the mid-to-late nineties, a business was a self-contained entity. It controlled all the information that came out of the company. If you wanted to buy a car, you would go to a car dealer, and almost all the information about the vehicle, including the reviews, would come from the company. When the internet started taking off, it began filling up with information coming from customers and outside experts, and this changed the power relationship between businesses and customers.
This has evolved to the level of participatory consumption that we see today. People care about the products they use, and they want their ideas to be listened to, whether it’s good or bad. Smart companies have started listening to their customers and including some of their ideas, but some companies are still struggling to deal with it.
Q. How do companies adapt to this change in power dynamic?
David: It depends on the company and its culture. Some software companies can alter their product more easily than others after listening to their customers.
Companies like Slack or Dropbox provide a means by which customers can alter the product to fit their needs. This works well because, apart from the customer adding personal value to the product, they feel that they have contributed to it when it meets their specific need.
Companies can also set up mechanisms where they can listen to their customers on social media. Something I did not foresee is what’s going on now on Twitter with brands. In marketing, 25 years ago, a brand was inviolable. You never played with your brand. Google showed that you could be a successful company and play with your brand. Every day, their search logo is something different. I also enjoy how companies engage in trash-talking with other brands or competitor brands: it makes them more human.
Q. You say in your book that the future has changed because the internet has changed many things that we do today, and AI is now reimagining it. How should a startup founder look at this?
David: Machine learning, a type of AI, has become one of the required buzzwords today. We are all users of machine learning in one way or another. If you have a smartphone and use it for weather reports or use the auto-correct option, these are all examples of machine learning.
Machine learning can bring tremendous benefits to startup founders and their businesses, especially founders who have a lot of data they want to make predictions on. Machine learning helps us make predictions that we otherwise cannot make efficiently or accurately. Founders can use it to find a vast net of relationships, some of which may be rubbish, but some of which a human could never have thought of.
However, the original sin of machine learning is bias. It uses data that comes from a society that contains biases, and that data will almost inevitably reflect those biases. The machine learning that learns from that data is going to reflect and possibly amplify them. For example, let’s say you’re training a system on employment data because you want to use it to evaluate incoming job applications. In our culture, women have not gotten their fair share of management and senior management jobs. If you train a system on that existing data, then unless you take care, it will look at a woman’s incoming application and, in effect, say, “Women don’t do very well as managers,” and repeat the bias.
Now I’m oversimplifying. You certainly would take out gender information, but the gender information can sneak back in by being correlated with other seemingly innocuous data. The data problem is incredibly real and incredibly important, and a lot of effort is being put into trying to lessen the risk and redress the issues.
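To make the proxy-variable point concrete, here is a toy sketch in Python. Everything in it is made up for illustration: the applicants, the “years of unbroken tenure” feature, and the naive decision rule. The point is only that a model which never sees the gender column can still skew by gender when another feature correlates with gender in the training data.

```python
# Toy illustration (entirely made-up data): dropping the gender column
# does not remove gender signal if another field correlates with it.
# Here "years of unbroken tenure" stands in for a seemingly innocuous
# feature that, in this hypothetical dataset, tracks gender.

applicants = [
    # (gender, years_of_unbroken_tenure, promoted_in_training_data)
    ("M", 10, True), ("M", 9, True), ("M", 8, True), ("M", 2, False),
    ("F", 3, False), ("F", 2, False), ("F", 9, True), ("F", 1, False),
]

# "Train" a naive rule on the data WITHOUT the gender column:
# predict promotion if tenure is at least the mean tenure of people
# who were promoted in the historical data.
promoted_tenures = [t for _, t, promoted in applicants if promoted]
threshold = sum(promoted_tenures) / len(promoted_tenures)

def predict(tenure):
    return tenure >= threshold

# The rule never saw gender, yet its predictions skew by gender,
# because tenure is correlated with gender in the training data.
male_rate = sum(predict(t) for g, t, _ in applicants if g == "M") / 4
female_rate = sum(predict(t) for g, t, _ in applicants if g == "F") / 4
print(male_rate, female_rate)  # → 0.5 0.25
```

In real systems the correlated features are rarely this obvious, which is why redressing the data problem takes so much effort.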
Q. What does this mean for companies in banking and insurance when they’re unable to detect why certain relationships are being made?
David: Credit card companies routinely use machine learning to spot fraud or possible fraud. It is not used in the U.S., by law, to determine your FICO score, i.e., your creditworthiness.
A FICO score is modelled on a set of factors that can be explained to a customer who has been turned down for a loan, and these factors have to be actionable. With machine learning, we could not pinpoint why someone’s score is low, and therefore why they were turned down for a loan, so it is not used for creditworthiness.
We can still use machine learning to detect fraud because it can go through billions of transactions and flag a suspicious transaction, and then that transaction can be examined further for fraud. We don’t need to understand why the transaction has been flagged for fraud. We just need to identify a fraudulent transaction.
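As a minimal sketch of that flag-first, examine-later idea, here is a toy example. It is not machine learning and certainly not any real credit card system; it uses a simple statistical rule (a hypothetical two-standard-deviation threshold on made-up amounts) just to show the shape of the idea: many transactions go in, a few suspicious ones come out for human review.

```python
# Toy sketch, not a real fraud system: flag transactions whose amount
# is far from typical spend, then hand the flagged ones to a human.
# All amounts are invented for illustration.

amounts = [42.0, 15.5, 60.0, 38.0, 22.0, 55.0, 4999.0, 31.0]

# Compute the mean and (population) standard deviation of the amounts.
mean = sum(amounts) / len(amounts)
variance = sum((a - mean) ** 2 for a in amounts) / len(amounts)
std = variance ** 0.5

# Flag anything more than 2 standard deviations from the mean
# for further examination; we don't need to explain *why* beyond that.
flagged = [a for a in amounts if abs(a - mean) > 2 * std]
print(flagged)  # → [4999.0]
```

A real system learns from billions of transactions and many features, but the principle is the same: we only need it to surface suspicious transactions, not to justify each flag in human terms.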
Some states in the U.S. use machine learning to suggest prison sentences or bail amounts. In some cases, those suggestions are followed. To me, this is a terrible system, because if you get sentenced to jail for 25 years, and you look at the person next to you who got two years, and the only difference you can see is race, gender, or age, you have a right to know why. If you’re using a machine learning system that cannot tell you that, then justice has left the building.
Q. Where can our listeners follow your work?
David: They can follow me on my homepage or Twitter. My most recent book is Everyday Chaos: Technology, Complexity, and How We’re Thriving in a New World of Possibility.