If knowledge is power, then the vast knowledge contained in big data is powerful enough to change the world.
But simply harnessing bits and bytes into tidy tables or whizbang algorithms won’t move the needle on progress. Companies, organizations, and researchers must view the data through a social, political, and cultural lens if they want to solve problems.
That’s what global data and analytics research firm Nielsen did with its recent report on the state of inclusion and representation in television programming in the U.S. The study, titled “Being Seen on Screen: Diverse Representation and Inclusion on TV,” pointed out that while women make up 52% of the U.S. population, they hold only a 38% share of screen. The numbers are even lower for women of color.
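Nielsen’s comparison can be expressed as a simple representation index — on-screen share divided by population share, where 1.0 means proportional representation. The sketch below is only an illustration using the figures cited in the report; it is not Nielsen’s actual methodology, and the function name is hypothetical.

```python
def representation_index(share_of_screen: float, population_share: float) -> float:
    """Ratio of a group's on-screen presence to its population presence.

    1.0 means proportional representation; values below 1.0 indicate
    underrepresentation on screen.
    """
    if population_share <= 0:
        raise ValueError("population_share must be positive")
    return share_of_screen / population_share

# Figures cited in the Nielsen report: women are 52% of the
# U.S. population but hold a 38% share of screen.
index = representation_index(0.38, 0.52)
print(f"Representation index for women: {index:.2f}")  # ~0.73
```

An index of roughly 0.73 quantifies the gap the report describes: women appear on screen at about three-quarters of the rate their population share would predict.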
“Critically important is not just if diversity is present, but how it is present,” noted Jamie Moldafsky, chief marketing and communications officer for Nielsen and a Wharton graduate.
“Particularly for women of color, when they see themselves represented, they are viewing themselves in programs with themes like dysfunction, rivalry, melodrama, and police stations,” said Moldafsky. “In contrast, white women see themselves in programs with themes like family homes, love, friendships, and husbands.”
She continued: “Asking the right questions, listening to the data, and always, always representing the truth is what makes companies strong. When companies go wrong, it’s often around not listening to the data, not staying focused on the truth, and not taking the right action.”
Moldafsky was the keynote speaker at the 2021 Wharton Annual Analytics Conference, a virtual event focused on sharing industry best practices and the latest research insights from data science. The event was hosted by Wharton Customer Analytics in partnership with Analytics at Wharton and Wharton AI for Business.
“When companies go wrong, it’s often around not listening to the data, not staying focused on the truth, and not taking the right action.”–Jamie Moldafsky
Moldafsky and several other speakers emphasized the importance of using data along with context to drive change. Beyond making good business sense, it’s also a strategy that promotes diversity, equity, and inclusion.
“I believe we have a moral obligation as business leaders and data stewards to use our expertise as agency for positive change,” Moldafsky said. “It’s a very cool time to be in data and analytics, but the responsibility to use our skills as a force for good is a real one. This plays out on global stages in U.S. politics and in the arts and entertainment world.”
Read on for more conference highlights:
Moldafsky: Seeking Truth Through Data Science
With more than 2,000 technologists and data scientists in the company, Nielsen can offer a veritable buffet of data to its clients across the media ecosystem. Moldafsky calls it a “unique perch” from which to view a world that is changing with the times. The events of 2020 — including the death of George Floyd and widespread protests for social justice — have raised the bar for diversity and inclusion. Audiences are demanding greater diversity of content; they want to see themselves and their stories reflected in movies, music, and television. At the same time, content creators have practical business concerns.
One way Nielsen is bridging that gap is by ensuring its audience panels, which are used to collect data samples, accurately represent the demographics of the population. That’s why the company has been a strong advocate of the U.S. Census, establishing Project TrueCount to help support a fair and accurate census. The company also filed a brief with the U.S. Supreme Court in 2019 to oppose the addition of a citizenship question on the census, fearing it would result in the undercounting of immigrants.
“Sometimes staying true to the truth and making sure your data is as accurate as possible requires you to take a stand and to do things that you otherwise might think are beyond the purview of your particular mandate,” Moldafsky said.
“Accurate data is reshaping content in powerful ways,” she continued. “It’s shattering stereotypes about race and challenging audiences to question their assumptions.
“What goes on off-screen is determining what goes on screen. Together we’re positioned to help the industry create lasting change. As a measurement company, we understand that measuring disparities also makes them actionable.”
Alex Vaughan: Fairness in AI
As chief science officer for talent matching platform Pymetrics, Alex Vaughan spends his days rooting bias out of online hiring programs. It’s a topic that’s received much attention in recent years from Wharton researchers who have studied how humans bring their own biases to bear when creating algorithms. Vaughan offered a simple example of a major tech firm that posted an ad looking for a job candidate with 12 years of experience as a Kubernetes administrator. The problem, Vaughan said, is that Kubernetes itself has only been around for about six years.
“When you’re making decisions about people’s lives, as we do in the hiring space, it’s really important that those decisions feel like they’re accessible to someone.”–Alex Vaughan
“This sort of thing seems like a typo or a one-off but is actually pretty pervasive in the hiring realm,” he said. “Oftentimes, the things that you try to get when you look for the right person for the right job are either impossible to find or impossible to define.”
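The kind of impossible requirement Vaughan describes can be caught with a simple sanity check that compares years of experience demanded against how long a technology has existed. The sketch below is hypothetical — the release-year table and function name are illustrative, not part of any real hiring tool.

```python
from datetime import date

# Hypothetical lookup of first public release years (assumed values)
FIRST_RELEASE_YEAR = {
    "Kubernetes": 2014,
    "Docker": 2013,
    "Hadoop": 2006,
}

def requirement_is_possible(tech: str, years_required: int, as_of: date) -> bool:
    """Check whether a job ad's experience requirement is physically possible:
    nobody can have more years of experience than the technology's age."""
    max_possible = as_of.year - FIRST_RELEASE_YEAR[tech]
    return years_required <= max_possible

# The ad Vaughan cited: 12 years of Kubernetes experience, checked in 2021
print(requirement_is_possible("Kubernetes", 12, date(2021, 5, 1)))  # False
```

A trivial rule like this would have flagged the ad before it was ever posted — which is the broader point: bias and error in hiring pipelines are often detectable once you decide to measure for them.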
The solution starts with high-quality data, sound machine learning, and a precise way to measure outcomes. To develop better data, Pymetrics drew on neuroscience to create cognitive and personality assessments that determine whether a candidate has the attributes that fit a job. Vaughan said it’s a more equitable approach than a filter that weeds out candidates based on experience alone. The company’s work relies on three key metrics: prediction quality, fairness, and explainability.
“When you’re making decisions about people’s lives, as we do in the hiring space, it’s really important that those decisions feel like they’re accessible to someone,” Vaughan said. “It’s not just a black box. It’s not just an arbitrary decision.”
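The article doesn’t detail how Pymetrics computes its fairness metric, but one widely used measure in hiring is the adverse impact ratio, checked against the “four-fifths” rule of thumb from U.S. employment guidelines. The sketch below is a generic illustration with made-up numbers, not Pymetrics’ method or data.

```python
def selection_rate(hired: int, applicants: int) -> float:
    """Fraction of applicants from a group who passed the screening step."""
    if applicants <= 0:
        raise ValueError("applicants must be positive")
    return hired / applicants

def adverse_impact_ratio(group_rate: float, reference_rate: float) -> float:
    """Selection rate of a group divided by the highest group's rate.

    Under the four-fifths rule of thumb, a ratio below 0.8 flags
    possible adverse impact worth investigating.
    """
    if reference_rate <= 0:
        raise ValueError("reference_rate must be positive")
    return group_rate / reference_rate

# Hypothetical screening outcomes for two applicant groups:
rate_a = selection_rate(30, 100)   # reference group selected at 30%
rate_b = selection_rate(20, 100)   # comparison group selected at 20%
ratio = adverse_impact_ratio(rate_b, rate_a)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.67 — below the 0.8 threshold
```

A transparent metric like this also speaks to Vaughan’s explainability point: a rejected candidate or an auditor can see exactly what was measured and why a pipeline was flagged, rather than facing a black box.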
Faculty Roundtable: Doing Better with Data
Mary Purk, executive director of Wharton Customer Analytics and AI for Business, closed out the conference with a faculty roundtable discussion on the main takeaway: gathering and analyzing data with the right context.
Management professor Katherine Klein, who is vice dean of the Wharton Social Impact Initiative, said the first barrier is in data collection because systems are decentralized, and information is often siloed. “We need to be mindful of the limitations and the challenges,” she said.
One way to achieve better outcomes is through mixed methods research, which pulls in qualitative, quantitative, and textual data. Eric Bradlow, marketing professor and vice dean of Analytics at Wharton, calls it “data fusion.”
“To me, mixed methods research is the future of understanding consumer behavior,” he said.
Wharton management professor Stephanie Creary, who is an identity and diversity scholar, said it’s difficult for people with different expertise to “get on the same page” and share their knowledge in emergent fields of research. But it is possible, especially when so much is at stake.
Hiring, for example, is so consequential that rooting out bias is imperative, she noted. “It’s really important that we get it right when we’re making talent management decisions,” Creary said, exhorting companies and organizations to improve their tools and audit their processes.
Bradlow agreed and cautioned companies to make sure they are using multidimensional data when hiring.
“There are lots of different ways to think about the data you collect,” he said. “With the right optimization strategy, you can use data as a source for good and make business operations like hiring practices more equitable.”