
Facebook’s biggest threat is coming from this obvious place

Image: Facebook logo on a wall with a shadow.

Addressing trust and privacy issues in tech.

This article was originally published in MarketWatch on April 10, 2018.

Ann Skeet is the director of leadership ethics at the Markkula Center for Applied Ethics at Santa Clara University. Views expressed here are her own.

Investors are now taking note of fundamental conflicts of interest at Facebook and other social media and tech companies. But Facebook, notably, shows these conflicts in a way that makes it easier to spot the ethical risks inherent in some of Silicon Valley’s business practices.

The first conflict of interest is visible in the documents Facebook filed to go public: the competing interests of the groups that support its business model, right out of the gate. Facebook’s original mission statement was “to make the world more open and connected.” The filing went on to describe how the company would create value for users, developers, and advertisers. Within a few bullet points, users are promised control over what they share, developers are promised the ability to personalize social experiences, and advertisers are promised a unique combination of “reach, relevance, social context, and engagement.”

There is a strategic challenge here in the attempt to be many things to many constituents, but also an ethical one, as it remains unclear whose interests are primary. A Facebook user’s desire to keep information private, including perhaps their network of friends, might be at odds with the filing documents’ assertion that advertisers can “specify that we show their ads to a subset of users based on demographic factors and specific interests.” By enabling “platform developers to reach our global user base and use social distribution channels to increase traffic to their apps and websites,” the company might breach its commitment to users’ control over their information.

Traditional media companies have long balanced the competing interests of readers/viewers and advertisers. These companies have managed conflicts of interest with at least three levers: an acceptance of their role in society; a clear set of ethical standards placing the public’s use of the information at the top of the list of interests served; and control over the creation of their product.

Facebook and other social media companies have insisted they are platforms, often citing the new dimension that differentiates them from traditional media: they do not fully control the creation of their product. They have naively held to the assertion that giving users and developers the ability to shape the product absolves them of responsibility for how it affects society.

Facebook’s second conflict of interest, also evident in the company’s foundational documents, is a stock ownership structure that gives majority voting control to its founding CEO, Mark Zuckerberg, undermining the board’s ability to provide true oversight, as we have seen with other social media companies like Snap Inc. In the section on risks to investors there is a heading that reads: “Our CEO has control over key decision-making as a result of his control of a majority of our voting stock.” The end of that section deserves particular emphasis:

“As a result, Mr. Zuckerberg has the ability to control the outcome of matters submitted to our stockholders for approval, including the election of directors and any merger, consolidation, or sale of all or substantially all of our assets. In addition, Mr. Zuckerberg has the ability to control the management and affairs of our company as a result of his position as our CEO and his ability to control the election of our directors. Additionally, in the event that Mr. Zuckerberg controls our company at the time of his death, control may be transferred to a person or entity that he designates as his successor. As a board member and officer, Mr. Zuckerberg owes a fiduciary duty to our stockholders and must act in good faith in a manner he reasonably believes to be in the best interests of our stockholders. As a stockholder, even a controlling stockholder, Mr. Zuckerberg is entitled to vote his shares, and shares over which he has voting control as a result of voting agreements, in his own interests, which may not always be in the interests of our stockholders generally.”

Zuckerberg is playing more than one position here: CEO, controlling shareholder, and board member. Even the most seasoned corporate executives struggle to keep such conflicted positions straight, which is one reason many companies have shifted from having the CEO chair the board to naming an independent outside director as chair. Early Facebook investors were apprised of the reality that they lacked control of the management team, the board, and ownership rights in the company.

We can’t know Zuckerberg’s intentions, and I assume they are good. Yet as well-intentioned as Zuckerberg has always come across in public statements, there are statements attributed to him during Facebook’s early days — such as calling users who gave their personal information to Facebook “dumb” — which suggest he understood well that he could not fulfill the brand promises made to the three conflicted constituencies in his model: users, developers, and advertisers.

Zuckerberg’s age is another factor. Only 19 years old when he started the company, he turned 28 the week it went public.  Moral judgment is developmental, meaning that, like reading and writing, we have to learn decision-making at a basic level before we can evolve our own fully formed conscience. If Zuckerberg is like most people, what he thinks of as “good” now is different from what he thought at both 28 and 19. The experience level and the chronological age of corporate leaders pose real risks.

Finally, history offers stories of companies with a reputation for doing social good whose corporate culture was at odds with that aspiration. “Move fast and break things,” the phrase long used to describe Facebook’s approach to product development, runs counter to both its original mission and its updated one: “Give people the power to build community and bring the world closer together.”

For tech companies to live up to their public aspirations for improving the world and doing good, questioning these commonly accepted business practices would be a meaningful start:

1. Dual class stock giving founders outsize, lasting control.

2. Management by mission statement alone, rather than long-term investment in regular culture assessment and management.

3. Business models that eschew responsibility for societal impact, such as being “just a platform.”

4. Start-up boards composed solely of investors.

Leaders who want to encourage ethical decisions must resolve these conflicts by managing corporate culture, drawing on both its visible and less visible elements, to ensure that their business model and their goals are not at odds.

Additional Reading

Facebook and Our Fake News Problem

The Ethics of Facebook’s Justifications to Exempt Hate and Lies by Politicians

Facebook’s Ethically Incoherent Response to Manipulated Content

Facebook and Political Speech

Of Course the First Amendment Protects Google and Facebook (and It’s Not a Close Question)

Facebook and the French Flag


Photo Credit: AP Images by Thibault Camus.

Apr 19, 2018
