"When everyone has a printing press, the ones with the best ideas are the ones people listen to."
- Facebook's The Little Red Book (2012)
In March 2018, BuzzFeed published a leaked Facebook memo about their “growth at all costs” mentality, igniting a firestorm about Facebook’s long-standing attitude towards challenges like fake news, filter bubbles, and state actors.
As in any media frenzy, unfair criticisms were voiced — but at the core of the debate was a truth that has had Facebook employees thinking more deeply than ever before about the implications of their company's products.
This wake-up call has been nearly a decade in the making. For the last several years, I’ve tried to understand why certain Facebook executives have been mistaken for so long about the negative impacts of their product and how the truth distortion field at any large company can overtly — and more often, subtly — distort how employees think.1
To begin, put yourself in the shoes of a typical Facebook employee in the early/mid 2010s thumbing through their Facebook News Feed — composed (unsurprisingly) of posts from many fellow Facebook employees.
You might see a laudatory Wired cover story, shared by a Facebook executive, about how much Facebook cares about disaster response (800+ likes).
You’ll see that Facebook is focused deeply on virtual reality and education, and see how it triggers a Facebook executive to share the story of what education meant for his immigrant parents (400+ likes):
A lot of great OC3 announcements this year but my favorite was our commitment to the intersection of virtual reality and education.
Over [X] years ago, my parents left [Y country] for the U.S. in the pursuit of higher education. But access to all levels of education needs to be ubiquitous …
You might see the heartwarming story of Internet.org, Facebook’s way to accelerate the adoption of affordable Internet in less developed countries (1000+ likes):
[Our Internet.org Innovation Challenge in Africa] was developed to support our vision of a connected world by recognizing those entrepreneurs and start-ups who are building solutions to improve education and the economic health of...
You’d miss much of the debate about Internet.org’s violation of net neutrality, its usefulness in locking out Facebook’s rivals, and whether Facebook’s tactics in India amounted to astroturfing.
You’ll feel a sense of purpose for being led by a wunderkind, whose affirming message strikes you as just what the world needs in these fraught times (as noted in a book distributed to all employees):
Facebook was not originally created to be a company, it was built to accomplish a social mission — to make the world more open and connected.
When sporadic flare-ups do appear, you might favor the posts of your work “family” saying that Facebook takes these concerns seriously.3 And it’ll be easy to dismiss even the fair criticism, since the most unfair charges from external parties will be the ones deemed “relevant” for you to see in your Feed — and in company conversations.
I’ve cited Facebook posts from Facebook executives I know who are smart, kind, and wonderful in person.4
There’s deep irony that a filter bubble is a key reason that Facebook employees may be unaware of what’s going on: As our society is slowly learning, like-minded friends and algorithms focused on engagement surface the facts that justify our biases — and ignore other important realities that we should know.5
In such a news feed, what might be missing?
Before you read further: If you agree with this piece, please share on Facebook — and tag your friends who work there (and at other social networks). I’ve written this piece critically — but also empathetically. You can help break the social media filter bubble and encourage greater reflection.
The comments of Aye Swe, quoted in a recent NYTimes article, are illustrative of the truth distortion. In his village, “[Muslims] are not welcome … because they are violent and they multiply like crazy, with so many wives and children.”
Aye Swe, who had never met a Muslim before, praises Facebook, saying “I have to thank Facebook because it is giving me the true information in Myanmar [about Muslims].” I have multiple friends who’ve tried for years to get Facebook to care more about their impact in Myanmar, to little avail.
It’s also about echo chambers. [Facebook’s] algorithm chooses which updates appear higher up in users’ newsfeeds and which are buried. Humans already tend to cluster among like-minded people and seek news that confirms their biases.
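The mechanism described above can be sketched in a few lines of code. This is a toy illustration under my own assumptions — not Facebook’s actual ranking system: suppose each post carries a political “lean,” and predicted engagement for a given reader rises with agreement. Ranking purely on that signal buries dissenting posts at the bottom of the feed.

```python
# Toy sketch (hypothetical, not Facebook's real model): rank posts
# purely by predicted engagement, where engagement for a partisan
# reader correlates with agreement -- so dissenting posts sink.

def rank_feed(posts, reader_lean):
    """Sort posts so the most 'engaging' appear first.

    Each post is a (headline, lean) pair, with lean in [-1, 1].
    Predicted engagement here is simply closeness of the post's
    lean to the reader's lean (1.0 = total agreement). Real ranking
    models are far more complex, but any engagement-only objective
    shares this failure mode.
    """
    def predicted_engagement(post):
        _, lean = post
        return 1.0 - abs(lean - reader_lean) / 2.0

    return sorted(posts, key=predicted_engagement, reverse=True)


posts = [
    ("Report critical of our side", -0.8),
    ("Neutral policy explainer", 0.0),
    ("Story confirming our views", 0.9),
]

# A strongly partisan reader (lean = 1.0) sees the confirming story
# first and the critical report last.
feed = rank_feed(posts, reader_lean=1.0)
```

No single post is hidden by fiat; the critical report is simply out-scored on engagement every time, which is exactly the “buried” behavior the quote describes.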
Many at Facebook will note how Facebook’s culture encourages rich debate — and how thoughtful their team is. But as we’ll see, when incentives point in a certain direction, employees are recruited for their narrow backgrounds, and certain values are held sacrosanct, the debate will be missing important perspectives.
In July 2018, Mark Zuckerberg would finally say, "we were probably ... too focused on just the positives and not focused enough on some of the negatives."
Though I may be critical now, when I worked at an earlier social media company,6 I too would confidently extol the unalloyed good that social media provided.
My arc has been shared by numerous others in technology: excitement about the potential of early social media, tempered and then disillusioned by the messy realities of fake news, surveillance, data leaks, and especially tribal filter bubbles (which I’ve analyzed over the last 2.5 years, and about which I’ve written a media literacy guide). Like most new technologies, social media can provide enormous good, but it requires structures and norms to mitigate its worst excesses.7
My inquiry began several years ago when my most thoughtful friends at FB (often executives) would echo the company line and dismiss issues like fake news and filter bubbles as few and far between.
Most external criticisms would be cast aside as isolated examples that ignored all the good Facebook was doing for the world — or as whining from an overzealous press angry at being disrupted. Or, since free speech was treated as the only value that mattered, social networks could do nothing to correct fake news, and human nature alone was to blame for the issues.8
Numerous friends — all executives or venture capitalists elsewhere in Silicon Valley — would also remark on seeing this same attitude.
Most line employees, by comparison, were generally unaware of the dangers, focused on what they were hearing from their coworkers and executives. Those who had deep concerns were too scared to speak up, or left Facebook over time.
How do regular employees in any company form their own judgments?
In this series, I explore truth distortion fields, where pieces of information are subtly organized — or downright manipulated — to favor a certain point of view. Truth distortion fields are especially powerful in influencing the views of line employees and communities at the front lines of any industry.
Management will laud what employees do, show them selective facts that justify their views, and hire/promote those who behave similarly to them. Employees in isolated teams with training in a single function may not realize the broad, unintended effects of their company's work. They'll assume the best of their friends and coworkers, without inquiring into the larger effects they're having.
Though I’ll primarily use Facebook as a current example, truth distortion fields can happen in any large company or industry town. In the case of a vastly different industry, tobacco, it might have been supremely hard to see the devastating health costs if your family/community depended on that field.9 Even after science triumphed over tobacco industry efforts to spread misinformation, I wonder how many employees questioned if their industry was being honest.
Instead, many likely saw their work as contributing to the well-being of their families/communities and valued by their customers. They might also be disdainful of the overblown claims from health advocates and government agencies that went against the free choice that each human deserves.
As we’ll see, the truth distortion field occurs even in companies where debate is encouraged and smart people are hired, as is the case at Facebook.10 Even the views of the savviest employees can be distorted by financial incentives, respect for colleagues, wrong metrics, narrow training, and cherry picked stories of the good they do.
Too often in my life, I’ve been influenced by the truth distortion field — an experience which has given me the perspective to examine these misleading frameworks from the inside and outside. Let’s examine what’s going on.
First, it is challenging to believe something not in your financial or career interests. Long before this era, the famous muckraker Upton Sinclair noted that “[i]t is difficult to get a man to understand something, when his salary depends on his not understanding it!”
How big a half-truth would you tell tomorrow to prevent yourself from being fired? And would you be willing to tell bigger lies as you become more senior, take on more family responsibilities, and have even more to lose?
How deeply would you inquire into an issue if it didn’t affect your job and was at odds with your company's self-interest?
Example: Profit and Gender Issues at Uber
At Uber, where gender issues are front and center, an early female employee pointed out that she’d potentially pick a “win at all costs” CEO who would double her equity, even at the cost of a corrosive environment for women. Though the debate is never this neat, her doubt gives a window into how challenging certain tradeoffs can be, even when the downsides personally affect you.
Management’s most powerful lever over employees is their ability to set incentives (quarterly goals, promotion criteria) where professional success requires adhering to a certain set of goals and values.
This extends to the communities, political bodies, and universities that surround companies and industries. They may contest unfavorable facts in cases where they stand to lose donations and political contributions. At a minimum, they’ll celebrate their homegrown champions. These communities can become echo chambers that lack thoughtful discourse.
Rather, it will seem that only outliers will disagree with the false consensus that has developed. And many politicians are motivated by the self-interest of their voters and funders, not the desire to see justice done.
Second, there can be a false sense of mission for employees, where management leads employees to believe their work — on balance — does tremendous good.11
This is especially popular in recent times, as companies are expected to play positive roles in our society in a way they never were before.12 Management gets employee mindshare, which lets them frame discussions and subtly shape their employees’ attitudes.
Mission-setting at its best is a way to get a team aligned and motivated to work towards a challenging goal. A job that feels personally and socially meaningful improves recruitment, employee productivity, public relations, and government lobbying.
But at its worst, a corporate mission can disingenuously motivate employees so that management and investors primarily benefit, while society bears huge negative externalities. Management can be masterful at harnessing employees’ desire to do good for its own corporate ends.
The mission may be vague or weak with high-sounding words, but little teeth in quarterly goals, incentives, or resources. Or the mission may be clear and energizing, but achieving it may have huge negative impacts that are selectively ignored inside a company.
Example: Mission in the Financial Services Industry
At Fannie Mae, the mortgage finance company, employees were pitched an affirming mission of “democratizing home ownership,” while management increasingly used the company in ways that benefited their personal interests (from accounting scandals to lobbying to prevent oversight).
The former head of one of the world’s largest banks pointed out to me that “People don’t like doing the wrong thing and you have to make them feel that they’re doing good.” Even as his bank pitched a powerful mission to their line employees, it would simultaneously pay hefty settlements because his poorly set incentives pushed employees to do the wrong thing.
“Change the world” is often a rallying cry for teams (so much so that it is also much parodied). It’s a powerful way to motivate a team, but it often doesn’t reflect the actual impact people are having. Even when it does, it’s often assumed to mean the same as “make the world better,” even though the actual effects may be mixed (and making the world “better” is a function of the values you hold).
As one Facebook employee noted to me disapprovingly, quarterly goals are indicative of what the company really cares about, no matter what the broader mission statement is. The day-to-day work for many at Facebook is to ensure that metrics keep going up.
This can influence behavior and beliefs of employees looking to rise in the ranks, who quickly see what is actually rewarded. The danger is that management may be disingenuous or short-sighted in their views, affecting the decisions of all below them.
In this era, management teams are masters at pitching their work as providing meaning, purpose, and social impact, even though this benefits their own financial interests. At Harvard’s 2017 commencement, Mark Zuckerberg, presumably highlighting his values at Facebook, noted that “purpose is that sense that we are part of something bigger than ourselves, that we are needed, that we have something better ahead to work for.” Facebook’s competitor Snap recently added a “We contribute to human progress” line to its investor site. Stanford’s Professor Fred Turner argues that “[these companies] urge employees to see their work for the company as a species of personal development.”
Example: Mission and Unintended Consequences at Facebook
After anger at Facebook increased in late 2016, one Facebook employee argued: “The people who work here really do strive to make the world a better place, to build understanding and promote empathy, and while Facebook isn’t perfect, it is, without a doubt, genuinely human and mission-driven.” (10 likes)
Professor Zeynep Tufekci countered that line of thinking, writing “History is full of massive harm done by people with great power who are utterly convinced that because they believe themselves to have good intentions, they cannot do harm or cause grave harm.” As she tweeted recently:
At some point, you'd hope a bunch of smart people [at Facebook] … think, wait, is there something more than how good we are to this thing where we repeatedly argue that, yes, sorry about ignoring that for a long time, but we are good people. Something structural? Something about incentives?
Professor Arvind Narayanan, echoing a growing cynicism amongst technologists, says “If you join the tech world when you graduate, you’ll be inundated with pitches and visions about changing the world. But I hope recent events have convinced you that you’re just as likely to screw up the world working in tech as you are to make it better.”13
Third, employees can ignore the unintended consequences of their work, and focus solely on the benefits they provide.
The harm doesn’t come from their being malevolent people. Rather, the effects of their work are far-reaching, and they selectively focus on its benefits. Management especially downplays facts that point to uncomfortable realities and highlights pieces of information that paint the company in the best light.
A more thoughtful model for employees might be to inquire about the various effects of the work their company does, and reflect on the tradeoffs, especially imagining how they’d feel if the negative results impacted their own child. In tech, having children seems to lead to a critical mindset shift — which suggests that billions of people might have a different experience if a few hundred tech executives had children.
Employees may be too far away to observe the deleterious effects of their work in the field.14 These companies can be gigantic, with each team working on a narrow subset of issues and composed of people with functional expertise in a single area (e.g., dev ops, front end engineering, brand marketing, ad sales).
Nearly every team is only seeing part of the picture within a much larger company, so no one team feels it is contributing to the negative effects.15 At Facebook, the majority of employees don't actually work on ads or the News Feed, even though these are critical elements that underpin Facebook's dominance.
In short, we’re not open to seeing the full impact of our work. This can especially be problematic when employees focus on the good their work accomplishes, while ignoring the bad (“self licensing”). As we saw at Facebook, there’s a tremendous number of initiatives that actually do good — but that doesn’t excuse the other issues.
It’s too reductionist to say that no one pushes back, but when voices go against executive beliefs — or profit, or growth — they’ll be easier to dismiss.
Example: Conflict Across Divisions at Facebook
As one news article noted: “One central tension at Facebook has been that of the legal and policy teams versus the security team. The security team generally pushed for more disclosure about how nation states had misused the site, but the legal and policy teams have prioritized business imperatives…”16
If you work in a function that represents your company externally (communications, marketing, sales, legal) — or is directly responsible for revenue — you'll be especially susceptible to the truth distortion field. You’re often in charge of pitching the most optimistic view of your company, and protecting the company against outside attacks.
Mark Zuckerberg highlights this, saying “I think probably to the dissatisfaction of our sales team here, I make all of our decisions based on what’s going to matter to our community …” (Note to future CEOs: it’s generally your incentives and hiring practices that dictate how your divisions behave, not their fault alone.)
One’s coworkers are often thoughtful people – loving parents, charitable community members, caring friends; we can’t imagine them doing harm. In Silicon Valley especially, I’ve also noticed that early success (both financial and career) is an inoculation against learning and self-inquiry, something that applied to many early Facebook employees.
Executives and employees tend to hire and associate with people who share certain values, while placing less emphasis on competing or contradictory values.17
Fossil fuel executives may place much higher weight on the benefits of cheap transportation versus the value of a cleaner environment. In social media, the value placed on connecting (like-minded) friends is one side of the coin; the other is that this same value creates "in" and "out" groups, encouraging racism, partisanship, tribalism, and potentially even terrorism.
As one former FB executive wrote, “[Facebook] has cultivated so much faith in its mission to connect above all that it recruits exclusively those who pride that mission above everyone else. They recruit people who believe in the universal good of connecting the world and recruit those who care about nothing else …”
Numerous Facebook employees privately noted that before the recent controversy, they found it deeply difficult to convince their superiors about the need for change. “The people whose job is to protect the user always are fighting an uphill battle against the people whose job is to make money for the company,” said former Facebook Platform Operations Manager Sandy Parakilas.
Alec Muffet, a former Facebook employee, noted that "a Facebook executive's post on growth at all costs was a significant factor in why I left the company."
At Theranos, the number two executive angrily conveyed (bold added) that “anyone not prepared to show complete devotion and unmitigated loyalty to the company should ‘get the fuck out’.” At Facebook, Zuckerberg chastised an unknown employee who had spoken to a TechCrunch reporter, saying that “if you ever believe that it’s ever appropriate to leak internal information, you should leave. If you don’t resign, we will almost certainly find out who you are anyway.” (This email was reprinted in the Little Red Book given to every Facebook employee as a reminder.)
Thoughtful, if dissenting, perspectives are then sorely missing from critical internal debates.
Coworkers can come to share deep bonds after working together through tough times.
The risk is that this can breed an us vs. them mentality against the outside world that means you’ll tune out most criticism. To disagree would mean being disloyal to your family. Given the speed at which tech companies can be built, the David mindset of the early days can persist, long after your company has become Goliath.
Example: The Facebook "Family"
Naomi Gleit, an early Facebook employee and its VP of Social Good, pointed out how much she sees her team at Facebook as a family she would do nearly anything for:
I think there are pros and cons of working with people that you’ve grown up with, they’ve sort of become like family. I think that can make it harder sometimes. We disagree, we fight, it’s hard. But at the end of the day you can’t quit your family.
In an environment like this, how much might any employee question their close family?
Other times, management can consciously create an “us vs. them” dynamic, especially by highlighting the most extreme or questionable voices on the other side. Continuous criticism from the outside world also stokes defensiveness, making employees ever more resistant to seeing the truth.
Example: Media Criticism at Theranos
After a team meeting at Theranos, employees shouted “Fuck you, Carreyrou!” — referring to the journalist who was critical of their products. One employee even designed a game based on Space Invaders in which players could shoot the journalist.
How might you take media criticism in an environment like this?
Media frenzies are often filled with partial truths and shoddy journalism, but the siege mentality they provoke will then be used to dismiss even the fair criticism (see Facebook’s internal debates after their recent unflattering memo leaked, or a Twitter founder’s frustration with the anger at Twitter).
Though we don’t realize it, most of us are in a truth distortion field of some kind if we personally benefit from the industry/company in question. We’ll often see selective, cherry-picked facts that justify the company’s favored point of view, rather than the full set of facts to make a judgment for ourselves.
Despite all these issues, a few changes can go a long way: