As a co-founder of the Mobile Innovation Group at Stanford and director of the Greif Center for Entrepreneurial Studies' Technology Commercialization Initiative, associate professor of clinical entrepreneurship Pai-Ling Yin explores the mobile app ecosystem, from industry evolution to platform competition to entrepreneurial strategy.
Her latest study, “Child Apps, Personal Data Regulation and Home-country Compliance,” with Grazia Cecere (Institut Mines-Télécom, École de Management, Paris), Fabrice Le Guel (University of Paris Sud), Vincent Lefrere (University of Paris Sud), and Catherine Tucker (MIT), looks at the connection between where a child’s app was developed geographically and whether it builds in safeguards to protect privacy.
Yin said the U.S. government established the Children's Online Privacy Protection Act, or COPPA, under President Bill Clinton in 1998. Since that time, the internet and its byproducts, such as children's apps, have grown exponentially. Today, 98 percent of children under 8 use mobile devices, and yet researchers still have not found any empirical studies of the market for kids' apps and the effect of privacy regulation.
“We need to be forward thinking in terms of policy if we’re talking about kids.” -- Pai-Ling Yin, associate professor of clinical entrepreneurship
Today, COPPA standards in mobile apps are upheld by private entities, such as the Google Play Store. In 2015, Google launched Designed for Families (DFF), an app certification program that developers can opt into.
Created in part to help parents sift through the flood of new apps hitting the market each month in search of quality digital programs, DFF certification has become one of the few methods available to protect consumers from privacy violations.
Yin said she doesn’t believe most developers set out to profit nefariously from collecting personal data, but, “Facebook is a good example of what can happen. It’s all good until we’re exploited.”
Time to Talk Privacy
Now, she said, “We need to be prepared. It’s time for public discussion about ‘what is privacy?’ and what does it mean for children? How do we think about privacy internationally?”
In her recent paper, Yin and her co-authors found that developers located in regions not covered by privacy regulation collect more sensitive data about children, relative to developers based in the U.S. or those in the Organisation for Economic Co-operation and Development (OECD), an international group of 37 member countries founded in 1961.
However, their findings also show that developers who comply with the Google self-certification program and are located in countries that do not have strong privacy regulation generally collect less data about children. “This suggests there are spillover effects on the behavior of foreign developers from platform efforts to facilitate developer compliance with U.S. privacy regulation,” Yin said.
The study provides some of the first and most comprehensive data about automated data collection practices surrounding very young children. It produced new statistics about the scope and depth of data collection about children and discusses implications for policy. For example, the study finds that developers who do not reveal their geographic location tend to behave worse with respect to children’s privacy. According to the study, “this is an important result from a policy perspective. For instance, the platform might make provision of an address a condition for approval, which could affect the collection of children's personal data.”
Overall, the researchers said their results suggest that the child apps market does not respect children’s personal data. “Parents need to teach children. It’s like drinking, we need rules until kids grow,” Yin said. “We need to be forward thinking in terms of policy if we’re talking about kids.”
Her suggestion: “Parents, get informed, get the facts. And researchers need to collect facts about what exists. Which apps comply, which don’t? How do they differ? There’s no policy around this yet.”