Amy Costello: Welcome to Tiny Spark, a podcast of the Nonprofit Quarterly. We focus on what is required to build a more just society—in matters of race, health, the environment, and the economy. I’m Amy Costello.

Rob Reich: Big philanthropy is an exercise of power, and wherever there is concentrated power in a democratic society, the civic attitude toward it should be scrutiny, not gratitude. So, I think gratitude to our big philanthropists is fine if, at the end of the day, the exercise of their power is to support democracy, but that we shouldn’t begin with gratitude; we should begin with scrutiny.

Costello: That was Stanford professor Rob Reich speaking to us in 2018 about his book, Just Giving: Why Philanthropy Is Failing Democracy and How It Can Do Better. And now, Reich is taking on Big Tech.

The book he recently coauthored provides a scathing critique of the sector. But Reich’s new book also highlights some bright spots, including his descriptions of techies who have used their skills to contribute to the common good.

Reich: So, the book opens with a short account of a person named Aaron Swartz, who was a kind of coding genius from an early age, helped to develop the protocols now known as Creative Commons. His whole orientation towards acquiring technical skills was to try to unlock civic potential. You might call him a civic technologist: someone who didn’t think first and foremost about a startup to make a lot of money but wondered how to use technical skills to unlock human potential and to increase the possibility of a common civil society together.

Costello: But to Reich, this humanistic approach to tech is, increasingly, an anomaly in places like Silicon Valley. Today, computer science-types who are solely interested in bettering humanity are certainly around and working hard on a range of initiatives. But Reich and his colleagues are taking aim at the rest of the sector. And if you want to picture the mindset of many of today’s tech hotshots, just imagine walking down a high school hallway, or dropping into one of their parties when Mom and Dad are out of town.

Reich: Right now, in Silicon Valley with Big Tech, we’re in the kind of late-teenager phase of development, where technologists have become fully aware of all of their power. They’re exploring the different ways that they can unleash their agency in the world. But their frontal lobes aren’t completely developed yet, and they don’t have a sense of full connection and responsibility to the larger society. And the way in which that developmental trajectory moves ahead is when lots of other people get involved, lots of other adults in the lives of a teenager, to steer them in a better direction rather than a worse direction, to install some guardrails so that the worst possible outcomes don’t happen.

Costello: So, in his new book, Reich is joining with some other adults and, like any good parents, they’re trying to steer the aspiring techies in their classroom toward a healthier and wiser destination.

Mehran Sahami: My name is Mehran Sahami. I’m a professor in the computer science department at Stanford, and in a previous life, I was an engineer at Google for many years.

Jeremy Weinstein: And I’m the third member of the teaching team and the author team for this book, Jeremy Weinstein. I’m a professor of political science, and in addition to the work that I’ve done as a social scientist, I’ve also served multiple tours in government, so I bring the perspective of a former policymaker.

Costello: The new book by Weinstein, Sahami, and Reich is called System Error: Where Big Tech Went Wrong and How We Can Reboot. The Stanford professors merged all their disciplines and interests—computer science, policy, and even philosophy—and identified the guardrail needed to guide this unwieldy and increasingly powerful sector toward a higher mission.

Reich: Democracy is that guardrail. We welcome and celebrate the innovation of the private sector, and the role of democracy is to come in and to ensure that we harness the great benefits that innovation can bring to society and really try to limit or mitigate the harmful effects that also happen.

Costello: Reich and his colleagues realized that would require more focused education. Computer science is one of the most popular majors at Stanford—it’s taken by nearly 20 percent of the student body. But in recent years, this trio of professors noticed their students were often attracted to the field mainly because of its financial rewards. And, too often, they weren’t giving enough thought to what their societal and ethical obligations should be if they were going to work, and potentially become leaders, in this sector.

Stanford is extremely competitive. Its computer science program is world-renowned, and the average salary for its graduates is more than $125,000.

And just like the philanthropic sector that Reich has critiqued for years, tech too has immense power.

In addition to writing this book, Weinstein, Sahami, and Reich also teach a class together where they confront directly the power that their students will wield as they embark on their careers in tech.

And this moment we’re in right now makes this work critical—because our democracy, that all-important guardrail that Reich described, the one that is so badly needed in order to rein the sector in and keep it working toward the common good? Well, Weinstein says democracy is meeting a powerful foe in tech.

Weinstein: We think about this as a kind of ongoing race between disruption and democracy. So, disruption is the powerful force of innovation that we see emerge in the private sector, driven by our scientists, by our innovators, by those who turn these innovations into products that transform our lives. Historically, in the United States, we’ve seen just tremendous periods of change. In each of these moments, what we have seen is democratic institutions struggle to keep pace, in part because our democratic institutions are trying to pave the way for innovation and creativity, and in part because it’s challenging to get your head around what are the potential harmful effects that you’re trying to mitigate, and then how to build the political consensus around actually taking action. And for anyone who’s watching our democracy in this moment, as we think not only about governing technology, but also the challenges that we’ve seen in recent years with political polarization in the United States, our inability to get our head around how to address climate change and its harmful effects, this is a challenging moment for democracy.

But the central argument of the book is that democracy is the technology that we have in our society to referee the really critical tensions and value tradeoffs that are implicit in technology and that are manifesting themselves in terms of the harmful consequences that we see—harmful consequences like the spread of disinformation and misinformation, or harmful consequences like automation and its effect in particular on low-wage workers. But we need a set of democratic institutions that are really up to the task of governing technology. And that means using this as a moment, not only to hold our elected officials accountable for acting on what we might want to see with respect to privacy, or requirements to audit algorithmic decision-making systems, or a reinvestment in skills and training for those who are experiencing the threat or consequences of automation. Yes, we need a democratic government that addresses those issues.

Costello: But tech is advancing at such a fast clip, how can government possibly keep up in its role as the parent in the room? How can regulatory agencies and lawmakers possibly foresee what tech might get up to, even just a couple of years down the road? How can they anticipate problems and opportunities, rather than just responding when it can feel almost too late?

Weinstein: We also need a democratic politics that’s prepared to address the impacts of technology that aren’t yet visible to us.

Costello: To get there, Weinstein says government needs to start doing some serious heavy lifting.

Weinstein: And that means building a democratic government that not only has technical knowledge and know-how at its disposal, in the hands of politicians and policymakers and experts—something that we’ve massively underinvested in—but also bringing technologists into government so that they are working on behalf of the public interest and not only shareholder interests. And we need to do that in a systematic way. It’s a re-formation and reboot of our government structure, but we need to do it, in part because otherwise our elected politicians and those who serve them really rely for technical understanding and know-how on the lobbyists that are employed by companies that have particular corporate interests in outcomes with respect to regulation. So, we need a government that’s really up to the task of regulating technology, and we need an approach to regulation, taking advantage not only of the federal government but also state and local governments, that enables us to test and experiment with the effects of technologies before these technologies reach such a scale that their harmful effects are felt by all of us.

Costello: Yeah. I mean, one thing that struck me in reading your book was this notion that seemed to come up again and again that we have to get out in front of these technologies before they take off. You note that technology is an amplifier, and it requires us to be explicit about the values we want to promote and how we trade off among them. I’m wondering if one of you would like to speak about this idea that it does seem we’re often playing catch-up when it comes to Big Tech and how we may or may not want to regulate it, and you have multiple examples in your book of really bad things that happen, and then we’re trying to respond after it’s unfolded. Talk to me about the need for us to get out in front of these technologies, out in front in Silicon Valley when a little tiny startup might be about to explode. What questions do we need to be asking? What are the guardrails we need to be putting in place?

Weinstein: Our answer in the book is that we don’t leave mitigating the harmful consequences of technology to government. There’s actually a role for everyone to play in this process. And so, the work that we’re doing as educators at Stanford is part of what we see as the first pillar of a kind of societal response, which is cultivating an ethic of responsibility among technologists, like the ethic of responsibility that emerged among practitioners of medicine over generations. That is, we need those who are designing technologies to understand the values that are being traded off, the way in which choices that they’re making have effects that go far beyond anything that’s immediately located in the technology that they’re building. And we need a set of companies that see this not simply as a compliance issue—that is, “Are we following the law?”—but, more broadly, “We are stakeholders in society, and what are the potential consequences of these technologies? How do we weigh them against some of our other goals?” You need to have that built into your design process. You need to have an orientation, not only about what your broader value commitments are, but also about how you’re going to explore and experiment with these technologies to understand their potential unintended consequences before you build something to scale. And so that’s work that needs to happen inside companies and among technologists, and you need that to happen alongside building a government that is responsive and capable and technically literate, because ultimately, government isn’t always going to know what’s coming down the pike until it appears. We need those who actually wield the power in this ecosystem, which is both technologists, who design products, and financiers who, through venture capital, finance the growth of these companies, to recognize that they have a stake in getting this balance right, and that ultimately the industry will only thrive and prosper if there’s legitimacy and democratic support for the continued growth of this industry.

And that’s really the crisis moment that we’re at: a tremendous lack of trust in the leadership of these companies, a tremendous lack of trust that technologists have the interests of society at heart and not only their own interests. We need a fundamental rebalancing where there’s a role for government, but a role for companies as well.

Costello: Have any of you seen a place, or a company, or a startup, or an idea where this has happened in a thoughtful, ethical, sound way? Can you point to any examples that kind of give you some hope or optimism, or that we can point to to say, you know, this could be a model for how we might move forward in a more responsible manner?

Sahami: Well, one startup that’s trying to make ethics the business of everyone in the company is a startup that works in the machine learning space called Hugging Face. And, you know, despite sort of the cutesy name, part of their understanding is that they don’t want to just relegate the notion of ethics to, say, a Chief Ethics Officer or some compliance group. The CEO has gone on record saying this should be the job of everyone in the company. And that’s where we think we should get to: when everyone’s thinking about the consequences of what they’re building, you get a much broader picture, both from the standpoint of the general direction the companies want to go with their products, the kinds of affordances that they want to provide to consumers…but at the same time the very micro choices that are made by the engineers in terms of developing those products and what kind of values get encoded in what they’re trying to optimize when they build.

Costello: A big concern about Big Tech, and things like AI and optimization, is how it can lead to racist results, sexist results. And you say in the book that whoever makes the choice of what to optimize is effectively deciding what problems are worth solving. And you talk about the glaring lack of racial and gender diversity in the ranks of technologists and startup founders, which means that these choices rest in the hands of a small group of people who are not representative of the wider world. Talk to me a little bit about your specific concerns as they relate to race and racism, and gender and sexism, and how aspects of technology that are moving at a very fast pace right now can help to exacerbate the inequities we are experiencing.

Sahami: Well, so, more and more algorithms are being used to make decisions about consequential matters in our lives—things like who gets access to credit, who gets a mortgage, dating apps that we use, even things like the criminal justice system as to who gets granted bail and who doesn’t. And so these algorithms, when they’re trained on historical data, depending on how that training happens, can encode various kinds of biases. One of them is that, historically, there has been racism and sexism in various kinds of systems. And so when you take that data and feed it into an algorithm to train it, believing that it’s going to be somehow more objective because it’s encoded in a program, you get this false belief that you might be getting rid of that bias when, in fact, all you’re doing is reinforcing it, because what the computer will learn is to perpetuate the biases in that historical data. And in the book, we talk about some examples of that. So, we talk about a system Amazon built, for example, for résumé screening. And what they found after they built that system was that it actually was biased against women. So, the word “woman” appearing on a résumé—for example, in “women’s soccer team”—or the appearance of the names of certain all-women’s colleges would actually cause that résumé to be downranked. And they realized this, and they went back, and they tried to fix it. This is Amazon, one of the most technically sophisticated companies on the planet, and they realized they couldn’t actually fix that bias, and they ended up scrapping the system. Now, the good…

Costello: Why couldn’t they fix it?

Sahami: …well, because of the complexities of building a system like that. These are often “black boxes.” They’re tuning a bunch of parameters, which are just numbers, with a high degree of complexity. So, we’re not talking about tuning five or 10 numbers. We’re talking about tuning millions of numbers, and human beings just can’t look at the set of numbers that encode some complex function and really understand what it’s doing.

Costello: And Sahami says this is key. Those involved with the Amazon résumé initiative saw these biases. They acknowledged that they could not solve for them. And then…

Sahami: The good news was they actually realized that, and so they didn’t deploy the system at scale. But the question we need to think about is, as more and more of these systems are being used, where are the cases where that kind of bias is not actually detected, and these systems then get used and perpetuate the biases that are encoded in them? And that’s why in the book we talk about things like algorithmic audits, to be able to understand the impacts of these algorithms and to have greater transparency, and regulation that actually addresses, when we think these algorithms are ready to deploy, what mechanisms we provide for due process, so someone can challenge the decision that’s made by an algorithm in a meaningful way. So, these are the larger guardrails we need around this kind of technology, because it’s wishful thinking to believe that it’s possible to build it all without this kind of bias actually creeping into systems.

Weinstein: I want to make one broader point, Amy—

Costello: Yes.

Weinstein: —around this question of descriptive representation. I mean, you began your question with “who’s making the technology, who’s financing the technology,” and the reality that it’s unrepresentative of the society for whom technology is designed. And I think that’s a really important thing. We lift this issue up in the book, both with respect to what’s happening in the financing space, but also who’s choosing to become technologists and the work that we’re doing at the university level on this front.

I remember an exchange in one of the early iterations of this class with an undergraduate, where I was asking the undergraduate, who are these technologies being built for? And the answer was “human beings,” okay? “Everybody.” And I said, “Really? Everybody?” Right? And I gave them examples of different technologies. I said, “Are these designed for everyone equally in the United States, or for people who are living in rural areas without access to electricity and internet in developing countries?” And they’re like, “No, no, of course not everybody.” So, then I said, “Well, when we think about questions like the public interest, and whether technology is in the public interest, how would we define the public interest?” And one student’s response was, “Well, I’m a member of the public. So, what’s in my interest is the public interest.”

And I think these exchanges just exemplify the challenge of the kind of narrow perspective that young technologists often have, and how, when combined with an almost utopian, sort of hero worship of technology’s potential, we lose sight of the problems that we’re trying to solve, who has those problems, and whether the solutions are actually designed to address those problems. And we end up with the creation of streams of technology that are designed to really benefit those who already have privilege in society, to meet their pain points, but then potentially have these kinds of harmful consequences on other groups, perpetuating experiences of discrimination and marginalization.

Costello: What I appreciate in your book is the way that you focus on technologists themselves rather than on the technology itself, which I think we often focus on a lot without necessarily thinking as much about the men and women behind it. And your book, in many ways, focuses on these people, many of whom are coming through the halls of Stanford where you all teach. And I’m interested in this diversity question, about who is creating technology and for whom. I think there is an awareness that we need a lot more diversity in every kind of way when it comes to technology, and Silicon Valley especially has been criticized for its lack of diversity, for being mostly white and male. And, as university professors at one of the most prestigious tech places on the planet, what do you notice about…I don’t know what the diversity statistics are on your campus, or within the tech sector and computer science itself. But I’m curious—like, there, on kind of the front lines as you are, where do you see opportunities to diversify the people who are creating this technology? Because I think it’s been a real challenge to diversify this group that is so powerful, so powerful in this country and yet so, you know, homogenous. What are you noticing on campus, and what opportunities do you see for diversifying the people who are creating this technology?

Reich: Amy, I think you’ve got it exactly right here. The diversification of the tech industry is really important—not merely for, let’s say, fairness reasons. There’s extraordinary wealth being created, and it’s concentrated mainly in white dudes at the moment, and it’s important to have much broader opportunity. It’s also important for the reasons we just have been discussing—namely, if we diversify the people who are at the table and building and designing frontier technologies, we should predict that they’ll try to solve different problems. They’ll bring their lived experience to the table, and the kinds of questions or ideas that occur to a group of mainly white guys around a table designing a technology will be different from those that come out of a completely diverse group.

Now, we should also be honest here: Stanford and higher education need to diversify as well. The student body here at Stanford is far more diverse than the faculty happens to be. And at least one, I’d say, promising indicator is that a few years ago, computer science became not merely the largest major on campus for undergraduates in general, but the largest major on campus for women. And there are a lot of efforts, a bunch of which Mehran has taken a lead in developing on campus, to ensure that the kind of gateway classes to acquiring these technical skills are not courses designed to weed people out, as if they were a kind of early training for medical school applications, where early classes separate the wheat from the chaff. The idea is that no matter what your high school background happened to be, the computer science major here will give you an opportunity to succeed. And that’s helped to diversify the major compared to what it used to be, say, a decade or two decades ago. And we need many, many more efforts of that kind, not just at Stanford, but across the landscape of higher education.

Costello: Mehran, what are you noticing with respect to diversity, or the lack of it, on campus? When you see these young, bright minds creating the next big thing, what are you observing, and where do we go from here so that the technology that we are creating can be more inclusive, can be less racist, can be less elitist, can be for more people who might not have money and access?

Sahami: That’s a great question. It’s an area that we’ve done some work on separately, looking at what factors cause people to choose their major, and where this starts. And when you start pulling that thread…I mean, we’re at an institution of higher learning, but when you begin to pull that thread, you find that it actually starts much earlier, right? It starts very early in the K–12 educational process. And there are some social factors in it—like, who takes the keyboard in a class when you have men and women there? The boys are the ones who tend to take the keyboard out of the hands of the women. Part of it is the routing process that goes on in terms of the different kinds of educational pathways that are created. And, so, when we look at the factors that influence, for example, gender dynamics, one example I can tell you is that our introductory class is almost 50/50 between men and women. And so, you might wonder, well, does that mean the major is 50/50? And the major is not; the major is actually about two-thirds men, one-third women at this point, which is…actually, the shocking part of that is that’s better than the national numbers.

But the interesting point there is that women tend to take their first class in computer science later than men do. And so, if they decide they like that class and want to potentially major, many of them don’t have the opportunity to complete the major because they took that first class, say, in their senior year. And so, what really behooves us is to create more educational opportunities early on for everyone to be able to participate in technology. That doesn’t mean that we want to just produce more technologists. What it means is that we want everyone to have an awareness of technology. Then, everyone gets a better sense of the role technology plays in their life and the role they can play in impacting that technology.

Costello: Rob Reich, I did want to circle back to something because you’ve been on this program a couple of times talking about your pretty grave concerns about philanthropists and the power that they wield and the ways that they are kind of destroying our democracy. And I did find it interesting that you’ve now coauthored a book about the power that technologists wield in our society. And I’m wondering if you could just speak briefly about the parallels that you see between technologists and philanthropists—you know, aside from the simple fact that often Silicon Valley billionaires become philanthropists.

Reich: That’s right.

Costello: But, I’m curious about some other parallels you’re seeing, because I imagine you’ve been drawn to this subject area for some of the same reasons you were drawn to kind of critiquing philanthropy: issues of power and democracy. Could you talk a little bit about that?

Reich: Absolutely. So, this is a topic that I get excited about, excited in both a positive and negative sense. My background is as a philosopher, and a lot of the work that I’ve done in the past has been to try to understand democratic theory—democratic institutions—and for anyone who’s paying even the smallest bit of attention these days to the health of American democracy, or indeed democracies across the globe, alarm bells are going off. We see democratic decline, dysfunction, and all kinds of fragility in some of our longstanding democratic institutions.

And so, in the philanthropy book, I was concerned to try to explore how the grand aspirations of philanthropic problem-solvers doing good things on behalf of society might also be in tension with the very basic expectations of democratic institutions and the problem-solving of ordinary citizens, the kinds of things we do together in our capacity as citizens rather than the kinds of things we look to people with a lot of money to do in their capacity as a donor. And here the through-line in technology, of course, is very similar, in that big technologists, Big Tech companies, are wielding extraordinary power in our lives these days, and they’re doing so in a way that doesn’t pay attention to what we in the book call the civic or political externalities—the ways in which, when we look at Twitter, or Facebook, or TikTok, or Snapchat, the newsfeed constitutes an informational ecosystem where our political and civic and news lives take place. It’s not just a delightful thing about cat videos. And unless and until these companies take on board, as an important aspect of their responsibility, the civic and political health of our lives as democratic citizens, we should expect to see many deep problems with Big Tech products and our democratic health, and therefore our lives as citizens.

So, you know, the main message of the book, similar to the main message of the philanthropy book, is that it’s so important in democratic societies that citizens feel empowered to make their voices heard, to feel like we have agency in collectively shaping our lives, rather than, in the first case, looking to philanthropists to do it for us, or in this case, to think that all of the relevant decisions are being made in tech companies: “Let’s leave it to the expert technologists to make choices on our behalf,” and we just experience technology as a kind of force that acts upon us. To the contrary, it’s so important that we all feel like we have a role to play, whether it’s as a technologist, a policymaker, a citizen, a user. Across the board, there are roles for every person to play in shaping a technological future that will support rather than undermine democracy.

Costello: Jeremy Weinstein, I imagine, given your background in policy and government, that you would have some additional reflections listening to Rob, then?

Weinstein: You know, I think this is a moment where so many of us are watching what’s happening in Washington, or watching what’s happening with the extreme divide between our parties, or watching what’s happening at Thanksgiving tables where people can’t talk to one another across ideological divides, and feeling hopeless, feeling that there isn’t a way forward. But I think the urgency in this book is a message from the three of us, these three different perspectives—the philosophical, the engineering, and the social scientific—that we have to reclaim control over this process, that no one is going to fix this problem for us, and that ultimately there is no single right answer to any of these questions. How much privacy do we want, versus how much security do we want? How much do we care about protecting free speech versus protecting people who are the targets of hate speech? How much do we want to realize the benefits of automation, and how do we think about those who are left behind? These are ultimately decisions that have to be made collectively, and there is no other mechanism for doing so than our democratic politics. That means talking to one another across these divides. That means energizing our politicians to take seriously the consequences that are before us. But it also means, as Rob describes, those with power and privilege—in this case, the technology companies—taking seriously the responsibility they have for the health of the society around them, not just their bottom line.

Costello: That seems like a great segue to the last question I wanted to ask, and I wanted to put it to you, Mehran, about the way that many leaders of tech are kind of blind to their larger social and civic responsibilities. Because in the book, you relay a moment when you were fresh out of grad school, and this was in the late ’90s, and you were interviewing for a software engineering position at a small startup. And you finished the day by meeting with one of the company’s founders, who you describe as a serial entrepreneur, and you note that he started the interview by saying to you, “I don’t have any questions for you, but I can tell you that we’re all going to be fucking rich.”

You know, I find so many things to be disturbed about with respect to that interaction, but I did want to ask you, some 20 years on, if the man you are now could go back to that exchange, and sit across from that entrepreneur who said that to you, what would you say?

Sahami: I think the simple question to ask is, “What matters to you other than money? Why are you doing this?” And I think that’s the real question that we’re asking technologists, companies, and the funders of these companies to think about more broadly: technology has been a sector that’s been economically very advantaged. What else do you want to do? What are the ways that you want to bring about a healthier society or healthier individuals? And that’s the bigger picture that we need to grapple with.

Now, we can’t just count on the companies to do that themselves, or come up with the right choices, or self-regulate, as some people call for. These are decisions, as Jeremy alluded to, that need to be made collectively. And I think part of the ethos that exists in Silicon Valley and a lot of the tech sector is this notion of libertarianism, right? There are individuals who make their own choices. So, if someone doesn’t like technology or a particular platform, they have the choice to switch off, or to delete Facebook, or to make a personal choice for themselves. But what that belies is the notion that there really need to be collective choices made around some of these things, that individual choices aren’t enough. And I think a simple example of that is our road system. There are a million people killed globally every year on roads, and if we took the attitude of, “Well, if you think that’s dangerous, or you don’t like it, you just shouldn’t drive,” that shows you the problem. There’s real value people get out of driving, in the same way there is real value people get out of using technologies like social networks and search engines, and so just saying your only choice is to use it or not creates a false dichotomy. What we really need to have is a system of regulation like we have on the roadways: we have traffic lights, we have speed bumps. We have all kinds of rules that make driving safer for everyone, while at the same time expecting people to take safety seriously on a personal level. That’s the same kind of place we need to get to for technology.

Costello: I think we should note that the company of that man you were interviewing with apparently went on, a couple of years later, to reach a market capitalization of more than 10 billion dollars. So, these are important conversations and very important mindsets to be thinking about and critiquing, as you all are doing, so I appreciate that very much.

Sahami: And one thing I’d add is that sometimes Big Tech gets lumped together as one big force or one big sector that has just negative connotations. And what I’d say is, what is Big Tech made of? It’s made up of a bunch of individuals. It’s made up of the technologists, the product managers, the people who are involved in building these companies. And by and large, they’re actually good people. It’s not that we’re saying there is an evil intent, or a malicious intent. It’s rather that we shouldn’t have to count on the positive intent of an individual to get the healthy outcomes we want as a society. Moreover, we shouldn’t give the power to a small group of individuals to make those decisions for us. And that’s really the point of thinking about everyone having a voice in the future we want to see together.

Costello: Mehran Sahami, Jeremy Weinstein, and Rob Reich, professors at Stanford University and coauthors of the new book System Error: Where Big Tech Went Wrong and How We Can Reboot. Thank you all so much for your dedication to this important work and for speaking with me today.

Reich: Thanks so much, Amy.

Weinstein: Thanks so much. It was a pleasure.

Sahami: Thanks for having us. It was wonderful.

 

This article is a transcript of the Tiny Spark podcast.

Photo Credits: Rob Reich (left, Christine Baker), Jeremy Weinstein (center, Christine Baker), Mehran Sahami (right, Amanda Law)

 

ADDITIONAL RESOURCES:

Rob Reich, Mehran Sahami, Jeremy Weinstein, System Error: Where Big Tech Went Wrong and How We Can Reboot.

Jeffrey R. Young, “That Class Where Stanford Profs Projected Hundreds of Zoom Students on a Video Wall,” EdSurge, June 18, 2021.

Victor Luckerson, “The Ethical Dilemma Facing Silicon Valley’s Next Generation,” The Ringer, February 6, 2019.

Lilly Irani, Rumman Chowdhury, “To Really ‘Disrupt,’ Tech Needs to Listen to Actual Researchers,” WIRED, June 26, 2019.

John Naughton, “Why Silicon Valley’s most astute critics are all women,” The Guardian, April 3, 2021.

Tiny Spark Podcast: “Is Big Philanthropy Destroying Democracy?,” November 9, 2018.

On Twitter: @robreich and @mehran_sahami