In a recent interview with 60 Minutes, former Facebook employee and whistleblower Frances Haugen gave a damning account of the social media giant’s internal policies, revealing that the company continued to put profits over user safety.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Haugen’s interview came just weeks after The Wall Street Journal reported that Facebook knew its popular photo-sharing app, Instagram, was toxic for teenage users—but chose to hide that information from the public.

“For the past three years, Facebook has been conducting studies into how its photo-sharing app affects its millions of young users,” the newspaper reported. “Repeatedly, the company’s researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls.”

To make matters worse, Facebook employees fully understood the magnitude of their findings. One reportedly acknowledged that “we make body image issues worse for one in three teen girls.”

Expressing skepticism about social media’s impact on mental health is one thing; covering up research that shows a clear link between Instagram use and negative self-image is quite another. Facebook and Instagram were not simply ignorant of their role in fueling the ongoing mental health crisis. They were determined to cover their tracks and deceive the public.

During a congressional hearing in March, for instance, Facebook CEO Mark Zuckerberg openly contradicted his company’s internal findings, claiming that “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits and well-being benefits by helping people feel more connected and less lonely.” Weeks later, Instagram head Adam Mosseri echoed Zuckerberg’s claim, telling reporters that the app has a “quite small” effect on the mental health of its teen users.

This isn’t the first time Facebook has concealed its user experience research. When the public questioned the company’s impact on mental health in 2018, Zuckerberg was quick to promise a better social media experience on the platform.

“We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being,” he said at the time.

“The research shows that when we use social media to connect with people we care about, it can be good for our well-being,” Zuckerberg added. “We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos—even if they’re entertaining or informative—may not be as good.”

Zuckerberg was right. Endless scrolling is not a recipe for stellar mental health. Ironically, though, Facebook and Instagram algorithms today promote more passive content than ever before. Autoplay videos have become a hallmark feature of the Facebook experience, while Instagram has been busy profiting from reels of twerking teenagers.

Quite the social media revolution, Mr. Zuckerberg.

Other mainstream social media platforms are not much healthier. The popular video-sharing platform TikTok, for instance, promotes a wide catalogue of powerful content filters that encourage users to conceal their “imperfections,” thereby reinforcing the unrealistic beauty standards that have proven so harmful to teen mental health. Twitter, meanwhile, is plagued with its own set of insurmountable challenges—the platform’s engagement revolves almost entirely around breaking news and “trending” topics, which are rarely positive or uplifting.

Big Tech platforms have also participated in an ongoing censorship campaign against conservative voices—an effort that has devastating implications for those who make their living online, and that stifles the voices of citizens seeking to participate equally in our democracy.

Of course, there is a good reason Big Tech refuses to make the changes necessary to protect users’ mental health: it simply has too much to lose from correcting its errors. Platforms such as Facebook make enormous profits from engagement and have little financial incentive to reduce content consumption. For that reason, mental health considerations need to be built in from the start of the app development process. That is precisely why our team at CloutHub consulted neuroscience experts to build a social media platform that gives members total control over their online experience.

Legacy social media platforms have lied to us before, and they will lie to us again. Apps that were supposedly designed to make the world a better place have morphed into addiction mills and self-gratification factories that prey on innocent users.

Don’t buy into Zuckerberg’s imminent apology tour. If you’re looking for a new social media experience, consider alternative platforms that have been engineered to offer healthy user experiences—and choose an online community that looks after your mental health.

Jeff Brain is the founder and CEO of a rising social media platform, CloutHub. To learn more about CloutHub, visit www.clouthub.com/home.

The views expressed in this article are the writer’s own.