Imagine scrolling through Netflix and selecting an “explore” option. Click on it, and you can dive into someone else’s algorithm. Maybe you discover a classic mystery film you’ve never heard of or a Korean drama you end up loving. Then you hop over to Facebook. After scrolling through the posts curated for your preferences, you click “explore” and see what the app is curating for someone on the other side of the world whose political beliefs are the opposite of yours. With the click of a button, you can immerse yourself in the experience of someone completely new.

This experience isn’t yet a reality, but Columbia Business School’s Sandra Matz thinks it should be. Matz is both a psychologist and the David W. Zalaznick Associate Professor of Business. Her work focuses on big data, what our digital footprint can teach us about human behavior, and how that information can help individuals and companies make better decisions.

Matz talked to CBS about the power of our digital footprint, its value as a marketing tool, and its unharnessed potential to shape our perspectives. “The more that you understand about your digital environment,” she says, “the more control you will have over changing it.” 

Getting to Know You

Matz’s research centers on the traces of ourselves that we leave online — our digital footprint — and what companies can glean about people’s psychology from those footprints.

Your footprint isn’t just what you intentionally put on social media. It’s your smartphone, which constantly tracks your location, and your credit card charges, which can capture what you buy, where you shop, and even who you’re spending time with. “Each data point alone isn’t necessarily that interesting and revealing. But once you put them together, you can learn an awful lot about people’s psychology,” Matz says.

By integrating our digital footprints via machine learning, computers can create detailed profiles of who we are. How accurate are these profiles? One study found that based on Facebook likes alone, a computer was able to assess personality traits better than a person’s friends, family, and in some cases, even their spouse.

As Matz describes in her forthcoming book, understanding people’s psychology offers a powerful path to predicting and changing their behavior. In collaboration with an online beauty retailer, for example, Matz traced digital footprints to determine if someone was an introvert or an extrovert. Using just that single deduction, she helped the company tailor its targeted ads. Sales increased by 50 percent.

Visiting Other Villages

Having demonstrated that leaning into someone’s preferences can be an effective lever for behavior change, Matz also uses her forthcoming book to explore the opposite: Can we help people leave their digital echo chambers and learn something new about the world?

In a recent talk about big data at CBS, Matz described the town she grew up in — a tiny village in the southwest corner of Germany. It had a population of just 500 people, smaller than many American high schools. As a teenager, she found living in such a small community challenging. “Everybody knows everything about everyone else,” she says, “and in a way, it’s frustrating, because everyone is trying to pull your life in different directions.” But that familiarity came with some great benefits as well: “It created a sense of belonging. And I felt like I got the best recommendations because they knew me.”

She likens her village to the echo chambers we exist in online. “We crave echo chambers; they’re very comfortable,” she says. And the recommendations are perfectly tailored to us. But what if we decided to travel? The companies that collect this information are uniquely positioned to give us a rare glimpse into another person’s village. “These algorithms know the behavior of everybody, all around the world,” she says. “They could let me hop into the filter bubble of other people. I could say, ‘Well, I want to see Google searches, not for what you think I want to see, but show me the Google searches for a 50-year-old Republican in Ohio.’”

“Right now, one of the dangers that we have with these echo chambers is there’s no way for me to see what other people are seeing anymore,” Matz says. “Facebook and Google could offer such a magical machine tomorrow.”

Harnessing Algorithm Diversity

If these companies already have the data to do this, why aren’t they doing it?

Especially in the shadow of the Cambridge Analytica scandal — when Facebook data from millions of users was harvested and used for political ads without consent — letting companies put personal data to new uses is delicate territory. “There’s a lot of emphasis on the potential dangers — and that’s absolutely necessary. You don’t want to think about amazing opportunities when your house is on fire,” Matz says.

But for consumer-facing products, there are fewer ethical concerns, Matz says; the real barrier to entry is a lack of imagination. “One of the reasons it doesn’t happen is that companies don’t see the value,” she says. Leaning into our echo chamber is part of their profit model. “Google only works because they use a recommendation-based approach. People get upset if they have to go to page two in their search result.”

But our experience doesn’t have to be an either-or choice. Companies could sell leaving your echo chamber as an additional feature rather than an inconvenience. “You could ask Facebook to let you explore the news feeds of other users who agree to be part of a ‘Perspectives-Exchange’ or an ‘Echo-Chamber-Swap.’ For a few hours, you could live your online life in their shoes,” Matz suggests. “Imagine Google optimizing their search results to guide you to the content you really should know about. The important gaps in your knowledge about immigration, for example. The arguments for stricter abortion laws you haven’t seen yet — and would likely never look for yourself.”

The data is already there to make this new future a reality, Matz says. “It all comes down to trust, excitement, and interest.”