Tech leaders need to step up to their social responsibilities, but they probably won't.

In a Wired opinion piece titled “It’s Time for Innovators to Take Responsibility for their Creations”, Susan Wu calls upon Silicon Valley CEOs and product creators to step up to their de facto roles as societal leaders and to take responsibility for the products and platforms they create:

It’s crystal clear that Silicon Valley’s chief executives are no longer merely startup founders, product creators, and business executives. They’re societal leaders too, oligarchs shaping the very nature of our identities, communications, and relationships.

In a world where software and algorithms run most every part of our lives—where Google and Facebook control close to 70 percent of all digital advertising, and smartphone penetration is nearing 80 percent—creating innovative software and launching indispensable apps is no longer enough.

As basic social contracts across nearly every aspect of Americans' lives are being dismantled by a voracious, so-called free market system and gluttonous political administration, citizens each have an even more urgent need to acknowledge our responsibility to one another. Today, racking up a stratospheric market valuation without significant consideration of the product or company’s broader societal impact is reckless and irresponsible.

These tools, Wu says, are more than just pieces of software used by individuals to meet their specific needs. In many cases, they have broad cultural impact, either directly or via the externalities they create. Wu suggests that since Silicon Valley folks are already familiar with Maslow’s hierarchy, we ought to broaden Maslow’s categories to include shared societal needs rather than only those needs pertaining to the individual.

While I agree with Wu in principle, I think her proposal faces several practical challenges.

First, much of Silicon Valley is built upon notions of radical individuality—using technology to empower the individual, to give each person her or his own voice. Platforms like Google, Twitter, and Facebook are based on the idea that users should be served information and content specific to them, and that giving them the freedom to express themselves shifts all responsibility for the effects of that expression onto those individuals. We just build the platform, these companies tell us. What people do within it and how they treat one another is their problem, not ours.

Given that, I am not sure how we can expect the people upon whom Wu is calling to even accept the premise of her argument—that there is a shared social impact of the software for which they need to accept responsibility. After all, it was only yesterday that Twitter’s leadership responded to calls to ban Trump from their platform by claiming that doing so would limit free expression and political debate.

Second, I am not sure how we get to a shared conception of Maslow’s hierarchy. It sounds lovely, but such an outcome would require us all to agree not only upon what those categories of shared societal needs would be, but that such a concept even makes sense to begin with.

It is one thing to say that we think a society functions best when each of its members is granted (or inherently has) certain rights. It is something entirely different to claim that there are also shared needs and rights, especially when some of those shared needs may conflict with individual rights.

Personally, I happen to believe that these shared needs exist. We need public education, a social safety net, support for the arts, protection of the environment, a free and open press, and public forums that foster engagement and debate for all citizens. However, I am not the audience that Wu needs to convince.

The Silicon Valley CEOs and product leaders to whom Wu is speaking tend to be a rather libertarian bunch. It is no wonder that the platforms they build ignore both the needs of, and the impact they have upon, the society in which they exist. Moreover, the change in outlook Wu calls for would require a radical rethinking and redesign of much of this software, a prospect that seems exceedingly unlikely. Philosophical and political leanings aside, it is simply easier and cheaper for the tech industry to claim their platforms are agnostic, that they bear no responsibility for how these tools are used or how people behave on them.