How can companies learn to grow user trust and prioritise user privacy?
Facebook’s recent rebranding to Meta has been a hot topic of discussion in the world of tech, raising numerous questions about the future of the company. The lead-up to this name change, as well as the company’s recent controversies, have served as an example for other tech companies to learn from, particularly on issues pertaining to user privacy and maintaining trust.
Since its inception, Facebook the platform has attracted nearly 3 billion users, and Meta the company has acquired more than 80 companies (1,2). It has become omnipresent both in people's lives and in the world of tech.
Facebook recently announced a corporate restructuring and rebranding, along with a new strategy, product, and name: Meta. During the Facebook Connect virtual reality conference, founder and CEO Mark Zuckerberg outlined his optimistic vision for the company's future and his hope that it would retain its core value of connecting people while also expanding into virtual reality (3). Signposting its entry into “the metaverse”, Meta is carving out its own space in the wider virtual reality world, where users can share experiences, enjoy entertainment, and connect with each other through the new VR headset, Cambria. While these changes seem to signal a shift in approach and a renewed commitment to users, Meta’s involvement in several user-data privacy controversies over recent years means that its new direction to “build the metaverse with privacy in mind” has unsurprisingly been met with suspicion - with many questioning whether this is merely a PR move and a superficial change that avoids addressing core problems within the company (4).
As Meta moves in this new direction, could it provide a guideline for other tech companies on prioritising privacy? Will users be encouraged to engage with this new part of the company, or prove hesitant?
The controversy surrounding Meta's management of personal data across its platforms has led to growing distrust among its users.
The 2018 Cambridge Analytica scandal - where improperly obtained data was used to target US voters to influence the outcome of the 2016 Presidential election - showed the world how a third-party company can manipulate data, and highlighted the lack of control users have over their personal information online (5). In 2020, Facebook agreed to pay $550 million to settle a privacy lawsuit claiming it had unlawfully collected and stored biometric data in violation of Illinois state law (6). Furthermore, the US Federal Trade Commission (FTC) filed its second antitrust case against Meta in August this year, contesting the monopoly the company holds in the world of social media and claiming it is stifling competition by acquiring competitors (Instagram and WhatsApp) (7,8).
Beyond data mismanagement concerns, the company has also faced scrutiny over its attempts to curb harmful online content.
They have faced heavy criticism for failing to clamp down on hate speech, misinformation, and fake news spread on the platform - with claims Facebook played a role in inciting violence against the Rohingya Muslim community in Myanmar and helped spread anti-Muslim hate speech in India (9,10).
More recently, Meta’s former Product Manager and whistleblower, Frances Haugen, released documents that revealed(11,12,13):
Facebook knew about, but failed to curb, its platform being used for misinformation, hate speech, and radicalisation, and made little substantive internal change
The company failed to address the psychological and sociological effects its platforms (which include Instagram) have on its younger users and on teen mental health
It failed to properly police and monitor hateful content in non-Western parts of the world
It failed to curb human trafficking in the Middle East
One of Haugen’s more damning indictments was that the company consistently prioritised profit at the expense of users' rights, and incentivised its employees to emphasise profitability rather than pursue systemic changes to improve the way the platform operates (14).
With even more technological capability and access under its new business approach, one of Meta’s big challenges will be to avoid repeating Facebook's mistakes. Moderating content in a virtual reality world is likely to be even more challenging, and to raise even more privacy concerns and points of consideration (15).
How can companies learn from Meta’s journey?
Data privacy is king
The public backlash against Meta and other organisations that mismanage data is a clear example of what happens when companies do not prioritise user privacy. Prioritising privacy from initial design, through development, and on into deployment is not only good practice for the ethical and lawful handling of data; it can also be key to generating user trust from the outset.
Empower your employees to voice their opinions
Fostering an environment where employees feel comfortable speaking up can not only prevent the reputational cost of problems emerging further down the line, but also build an internal culture of trust within the company.
Maintain transparency
Ensuring transparency in a company’s operations by providing clear insights into how data is collected, used, and protected is critical to maintaining both customer and employee trust.
Being transparent about how compliance is achieved - for example, by carrying out internal reviews and audits - can foster an environment of openness and connectivity.
Build a strong ethical framework for operations
A commitment to conducting business as ethically as possible will help tech companies avoid compromising their brand and eroding consumer trust. One revelation to emerge from the Facebook fallout was that the company had publicly committed to making changes and carrying out procedures that, in reality, it either did not do or failed to follow through on properly, while also ignoring issues flagged by researchers (18).
Reputation is everything
Maintaining a good reputation includes having conviction in your business activities and taking a stand. Facebook avoided tackling certain issues vigorously in order to avoid the perception of political bias, but its efforts to appear neutral allowed the platform to be used nefariously (19). Meta is set to spend over $5 billion on safety and security in 2021 in an attempt to build better relations with consumers and regain the public trust that Facebook lost (20).
While companies are not charities and aim to make a profit, customers prefer companies they feel are trustworthy: ones that have not demonstrated poor practices and that show a commitment to prioritising their customers.
References
https://www.techwyse.com/blog/infographics/facebook-acquisitions-the-complete-list-infographic/
https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/
https://www.cnbc.com/2021/10/28/facebook-changes-company-name-to-meta.html
https://www.cnet.com/news/facebook-to-meta-a-new-name-but-the-same-old-problems/
https://www.nytimes.com/2020/01/29/technology/facebook-privacy-lawsuit-earnings.html
https://www.theguardian.com/technology/2021/aug/19/facebook-antitrust-case-ftc-monopoly
https://thediplomat.com/2020/08/how-facebook-is-complicit-in-myanmars-attacks-on-minorities/
https://www.platformer.news/p/why-these-facebook-research-scandals
https://www.theatlantic.com/ideas/archive/2021/10/facebook-failed-the-world/620479/
Further sources
https://www.cnet.com/how-to/the-facebook-papers-how-to-read-the-reports/
https://www.wired.com/story/facebooks-global-reach-exceeds-linguistic-grasp/
https://edition.cnn.com/2021/10/25/tech/facebook-papers/index.html
https://edition.cnn.com/2021/09/15/tech/instagram-teen-girls/index.html
Images
Title image: Shutterstock