What will it take for us to trust Facebook again?

Ricky Yean
6 min read · Apr 10, 2018


Are we about to see a Facebook bank run?

Building trust is hard because trust has to be built incrementally, and it only takes one (perceived) mistake to undo years of trust-building. Facebook is very publicly dealing with this issue today, but the problem they face goes beyond locking down developer access to user data and investigating potential offenders. Facebook has to design for consistently trustworthy user experiences and roll back initiatives that have created mistrust.

Let’s look at banks, for example. It’s mind-blowing how rarely we question whether banks are trustworthy, because handing banks our money is just what we do as Americans. The banks have so much of our trust that no one really cares that they lend out and reinvest our money instead of keeping it in a vault somewhere.

The way the banks built up our trust is by meeting our expectations every single time over the course of decades, but even then, they are always teetering on the edge of losing it. Every time we withdraw money and get it, every time we check our balance and it checks out, and every time we send a payment and it’s received, the banks earn our trust. However, the slightest hint of failure (even a perceived one) will immediately cause us to panic and explore moving our money somewhere else. The last widespread American bank runs happened during the Great Depression, almost 90 years ago, but the 2007–2008 financial crisis was enough to dramatically lower the percentage of Americans with strong confidence in banks from 41% to 27% today (Gallup). This is in spite of government regulation, FDIC insurance, the Federal Reserve, and other trust infrastructure that has been put in place to help. It takes decades to build trust, and you can lose it in a second.

Technology companies typically do not have to deal with bank-level trust problems because their handling of our data is much less visible than banks’ handling of our money. Google and Apple have all of our browsing data through Chrome and Safari, as well as our contacts and text messages through Android and iCloud. AT&T and Comcast are ISPs, so they can monitor our entire Internet activity. These companies have way more information about us than Facebook, yet they’re not being asked to testify in front of Congress. Facebook’s real problem is that it is the most visible technology company, and that visibility breeds mistrust.

We very explicitly and visibly gave Facebook our data, just like we give bank tellers our money

Facebook has a high bar to clear because from the first time we signed up, we overtly handed over our personal information for the purpose of expressing ourselves and connecting with our social circle. We typed in every piece of personal information and essentially told Facebook, “I am giving you my information now, please treat it with respect.” This is a very explicit act, and it comes with expectations we don’t place on other technology services. Google, for example, has built a profile of me in the background that is arguably more extensive than Facebook’s, but it has never asked me explicitly for it. Everything Google does happens behind the scenes with every search, every visit to an AdSense-powered website, and every time I use Chrome.

Facebook as a product has also evolved dramatically since its founding in 2004. When I gave Facebook my personal information, I never thought it would be used to log into apps on my phone. The first time I accepted a friend request, I did not expect my thoughts to be algorithmically delivered to that friend. This is like depositing your money in a bank, only to discover later that it was being used to…uh, bet on sports at a casino? In order to restore our trust, Facebook needs to create more consistency between expectations and reality.

Facebook relies on our data too heavily and too obviously to create an engaging experience

When we use Facebook, we are constantly reminded of the personal information we gave it because the entire user experience is predicated on our social graph and our interests. We see what our friends are liking and sharing. If we interact with someone’s post, Facebook reinforces our “friendship” with that person by showing us more of their posts in our feed. The makeup of our feed changes so readily with our interaction patterns that we all understand, at some level, that Facebook is tailoring the experience very aggressively. On one hand, that gives us a feeling of control; on the other, it makes Facebook’s targeting prowess way too obvious, inspiring fear.

Other tech companies do this too, just less obviously. For example, when we search Google, Google shows us ads based on that search. However, the search results still feel relatively objective (even though they are personalized), and the targeting feels like it happens one search at a time, not compounded from all the data Google has accumulated about us over time. Imagine if you searched for “basketball scores” and Google drew on your past search history to show you scores specifically for your favorite team, the Houston Rockets. Unless it’s clearly disclosed as location-specific or based on some other factor, it gets creepy pretty quickly when it becomes obvious that Google is keeping a close record of everything we do in its ecosystem and using it aggressively.

Facebook is omnipresent, making it seem like it is tracking our every move even when we are not using Facebook

Facebook is on the sign-up and login screens of every new app we download because of Facebook Login. Every article and video we consume comes with a Like and Share button. This is not like Google AdWords or a Gmail email address: we barely pay attention to AdWords, and we think of our email as a neutral utility. Facebook Login and the Like/Share buttons are very clearly branded, and they require deliberate actions that make us think a lot more about Facebook. Even when we are not using Facebook, we are always using Facebook.

And when we do use Facebook, the rest of the Web finds its way back into our Facebook feed. Advertisers can retarget us inside Facebook, making it seem like either Facebook follows us around the Web or Facebook is selling our name and information to advertisers so they can track us down inside Facebook. This conflates the different contexts and spaces we operate in. Is Facebook following me around? Why am I seeing ads from that website on Facebook? It becomes harder for users to feel in control, and that breeds anxiety.

This context-conflation also bleeds into real life. Facebook can cross our location data with our social graph to figure out who we are hanging out with in real life, then show us posts and ads it showed to the friends we were with. The result is the illusion that Facebook is listening in on our real-life conversations through our phones’ microphones, causing even more anxiety.

Conclusion

What Facebook is battling today is the consequence of years of trust erosion, and it’s going to take years for Facebook to restore that trust. To accomplish this, Facebook needs to 1) create more consistency between expectations and reality whenever it asks us to hand over our data, 2) dial back its aggressive use of our data to create an ultra-personalized experience, and 3) reduce the cognitive dissonance from context-conflation as users move from Facebook to the non-Facebook Web to real life.
