Boston Globe | Invading your privacy? There’s an app for that.
Data privacy laws for cell phone apps need updating
As a society, we’ve come to accept as normal — and sometimes even welcome — daily digital incursions into our privacy. Drive to a restaurant, and the GPS app conveniently offers you a menu. We let our watches monitor our heart rate and input our menstrual cycles into our phones. Many apps use the personal data they harvest in helpful ways that make their products better; some, like Uber, barely function without knowing exactly where you’re located right now.
But the easy availability of the data collected by all those apps can be dangerous. Take the Catholic priest forced to resign after a Catholic news source discovered he was using the gay dating app Grindr, or the firm that published heat maps showing the communities where visitors to Planned Parenthood clinics live. Government agencies purchase cellphone location data, reportedly to track undocumented immigrants. Location data can identify where someone gets health care, worships, or spends their nights.
While location data is theoretically anonymous, in practice it isn't. Data brokers sell records of movement tied to a particular device, and those records are easy to de-anonymize. In 2019, The New York Times obtained a dataset of cellphone location pings from 12 million users and combined it with publicly available information to track the movements of then-President Donald Trump through a member of his Secret Service detail. The newspaper also tracked other senior officials in the White House, on Capitol Hill, and at the Supreme Court.
The ubiquitous collection of digital data — and its availability to anyone willing to pay — cries out for regulation. App developers shouldn’t be stopped from using personal data from their customers in responsible ways. But those that sell that data to third parties must be held to stricter standards. (The Boston Globe collects, but does not sell, location data.)
The Federal Communications Commission bans cellphone carriers from selling customers' data, but that ban does not apply to apps. App developers can earn money by embedding code in their apps that tracks users' locations and sends that information to data brokers. Users can disable location tracking if they know how, but some may not be aware they have that option, and they may not know their location data is being collected in the first place. The Federal Trade Commission last year sued a data broker, arguing that selling geolocation data reveals private information such as visits to churches, homeless shelters, and abortion clinics. But a federal judge dismissed the case.
Ideally, Congress would address the issue, to avoid a patchwork of state regulations. Failing that, states should act.
In Massachusetts, a coalition of reproductive rights advocates and others is supporting a bill that addresses both the collection and the sale of data. It would ban companies from selling or trading cellphone data that can pinpoint someone's location within 1,850 feet. Companies would be allowed to collect this data only with user consent and only for specific purposes: providing services consumers have requested, such as a ride or driving directions; responding to emergencies; and complying with state and federal law.
The group is touting the bill as a way to prevent officials in states where abortion is criminalized from prosecuting residents who obtain an abortion out of state. There are also concerns that debt collectors could use the information to harass people, that abusers could use it to stalk victims, and that people opposed to gender-affirming care could identify and harm providers and patients. Advocates point out that a state law letting abortion providers keep their addresses confidential is meaningless if someone can track where cellphones that spend weekdays at an abortion clinic go after hours.
Location data is “uniquely dangerous,” said Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts. “It shows everywhere we go every single day for our entire lives, because our cellphones are basically attached to us at the hip.”
Massachusetts would be the first state to impose such a ban. A poll conducted by the ACLU of Massachusetts and Beacon Research found that 67 percent of Massachusetts voters surveyed were unaware that a company can sell their location data, and 93 percent oppose the practice.
Still, regulations more targeted than an outright ban may be a better place to start. After all, there are legitimate reasons third parties might want to buy location data: an investor deciding whether to finance a mall, municipal officials wondering which downtown areas are attracting workers, or health researchers determining where people traveled during pandemic lockdowns.
One targeted approach is an opt-out model pioneered in California, under which consumers can opt out of letting a company share their personal data. Massachusetts state Senator Barry Finegold, who co-chairs the Joint Committee on Economic Development and Emerging Technologies, has introduced a comprehensive data security bill that would, among other provisions, let state residents opt out of targeted advertising and of having their personal information sold. It would also require companies to obtain consent before collecting sensitive information, including health, biometric, and location data; give residents the right to delete personal information a company maintains; require companies to disclose how they use personal information; and establish heightened privacy protections for children.
Ten states have passed data privacy legislation, including Connecticut, Colorado, Virginia, Utah, and Texas, almost all of which follow the opt-out model. States vary on whether the opt-out covers both sensitive and non-sensitive data and whether consumers can opt out of having data collected or just sold.
What should be clear is that legislation resting solely on disclosure to consumers would be inadequate. Attorney Chris Hart, who co-chairs Foley Hoag's privacy and data security practice, said the biggest problem today is the opacity of, and lack of public knowledge about, what data is being collected and how it's used. Laws mandating disclosure are largely ineffective, since few app users read privacy policies. Opt-out laws aren't perfect either; Christo Wilson, an associate professor at the Khoury College of Computer Sciences at Northeastern University, said enforcement of California's law is weak and consumer preferences are not always respected. But they would be a step in the right direction.
While crafting the right law will be challenging, Massachusetts has many bright minds in technology who understand what works, what doesn’t, and how to balance legitimate commercial uses of the technology against privacy-threatening practices. As Hart said, “We need to be thinking much more carefully about how we protect individuals, but also let organizations innovate and do what they need to do.”