As an increasing number of apps ask for more and more of our data, consumers have grown wary. Gone are the days of the default “public profile” as more users demand privacy and control over their personal information.
The Daniels Fund Ethics Initiative at the CU Denver Business School hosted a panel event on October 25, 2019 with three representatives from the Denver office of Strava, a hugely popular fitness tracking app, to discuss the ethical considerations of wide-scale data collection from Strava’s perspective.
Strava lets users record and share their fitness activities, from walking to hiking to cycling, and then analyzes the data to track just about any performance metric imaginable. The app makes it easy to see personal bests, share milestones, encourage friends and family, and find the best trails. Strava is immensely popular with both casual and high-performing athletes all over the world, with over 40 million users.
Last year, the app came under scrutiny after releasing a heat map that showed the global activity of its users. The map showed far more than jogging routes, inadvertently revealing the borders of secret military bases and soldiers’ patrol routes and raising serious national security concerns.
During this time, the app also had less-than-ideal privacy settings for users. Like so many others, the app set users up with a public profile by default, had unintuitive privacy controls, and left users sharing sensitive data unknowingly.
Strava showed accountability by quickly and transparently addressing these major concerns, making users’ needs its paramount consideration.
Meg Howe, Senior Product Manager, works on a team focused on privacy, trust, and safety. “Athletes need to feel safe and comfortable using our product,” she said. “For us, it’s all about what the athlete needs in privacy settings.”
Strava has prioritized their users’ needs for privacy with their “Privacy by Design” framework:
- Transparency – athletes understand what they are opting into or out of
- Control – athletes have many customizable and easy-to-use settings
- Consent – athletes are explicitly aware of what they are and are not sharing
“Control plus awareness equals people not being surprised,” said Howe. Strava added several new features to increase privacy and data control for users:
- Privacy Zones: Users can designate a Privacy Zone around an area to hide it from others, preventing them from inadvertently revealing their homes, workplaces, or other places they frequent.
- Beacon tool: Users can send a text to three safety contacts so those contacts can follow the user’s location during an activity. Howe said, “My mom really appreciates that!”
- Right to be forgotten: This tool allows users to completely delete their profile and all data they’ve shared with the app.
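Conceptually, a Privacy Zone works like a radius filter applied to an activity’s GPS track before it is shared. The sketch below is purely illustrative — the function names, the 500-meter radius, and the data layout are assumptions for the example, not Strava’s actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def apply_privacy_zone(track, zone_center, radius_m):
    """Keep only the GPS points that fall outside the privacy zone.

    track: list of (lat, lon) points; zone_center: (lat, lon) to protect.
    """
    lat0, lon0 = zone_center
    return [(lat, lon) for lat, lon in track
            if haversine_m(lat, lon, lat0, lon0) > radius_m]

# Hypothetical run starting at "home": points near home are
# dropped before the activity is shared with others.
home = (39.7447, -104.9950)
run = [(39.7447, -104.9950),   # at the front door
       (39.7460, -104.9950),   # ~150 m away, still inside the zone
       (39.7600, -104.9950)]   # ~1.7 km away, outside the zone
shared = apply_privacy_zone(run, home, radius_m=500)
```

In this sketch, only the point well outside the 500-meter zone survives into the shared track, which is the behavior the feature description above implies.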
In addition, Strava streamlined privacy settings and added more customizable options, making it much more intuitive for users to keep control of their shared activity. The changes resulted in many more users choosing the private mode, and a more positive user experience overall (like avoiding unwanted interactions from strangers).
Danielle Caldwell, Data Protection Officer, is responsible for dealing with the legal implications of data at Strava. She juggles legal compliance with maintaining athlete value on an ongoing basis.
Caldwell must be an expert in data protection, fully understanding the laws in 27 countries to advise the business on compliance matters. She conducts audits with product teams to “make sure that we are living up to our promises to our athletes” and must work hard to stay impartial and free from conflicts of interest to ensure the protection of Strava’s athletes’ data.
“We don’t want any second-class privacy citizens, so we start at the strictest regulations and give everyone the same functionalities – like the right to be forgotten – that extends to every Strava user even though Europe is the only one to require it,” she said.
Caldwell stresses that Strava goes above and beyond simple compliance when it comes to protecting users’ data. “We call it ‘Strava data’, but we operate as if it is our athletes’ data. We act as stewards of that data. We give them control and transparency. This is your data, we’re just using it,” Caldwell said.
Erik Sunde, Customer Success GIS Engineer, works with municipalities all over the world to plan infrastructure based on Strava’s immense pool of commuter data. He shares data with municipalities in a way that aims to benefit both athletes and city planners.
“We want our data to help cities make cycling better, safer and more efficient,” he said.
Strava licenses their users’ data to cities for planning purposes, excluding data from athletes who have chosen to opt out or make their activities private, and only once the data has been rigorously aggregated and deidentified.
“Strava takes aggregation and deidentification so seriously that the data we’re using barely resembles personal information at all,” Sunde said.
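One common way to achieve that kind of aggregation and deidentification is to snap raw GPS points to a coarse grid and suppress cells with too few observations, so no single athlete’s route is recoverable. The sketch below illustrates that general technique; the grid size, threshold, and function name are assumptions for the example and do not describe Strava’s actual pipeline:

```python
from collections import Counter

def aggregate_tracks(points, cell_deg=0.001, min_count=3):
    """Turn raw GPS points into per-grid-cell counts, dropping sparse cells.

    Snapping to a coarse grid (~100 m at cell_deg=0.001) discards precise
    locations; suppressing cells seen fewer than min_count times avoids
    exposing any individual's route.
    """
    cells = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in points
    )
    return {cell: n for cell, n in cells.items() if n >= min_count}

# Three riders pass through one cell; a lone rider's point elsewhere
# is suppressed rather than released.
pts = [(39.7001, -104.9901)] * 3 + [(39.8001, -104.9901)]
agg = aggregate_tracks(pts)
```

Only the well-traveled cell appears in the output, which mirrors the idea that the licensed data describes popular corridors rather than individual commutes.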
Patricia Nickel, the event’s moderator, summarized, “[Strava] couldn’t have anticipated every scenario in which people would use this data. Mistakes are going to happen, and it’s important to be transparent when they do.”
The conversation has shifted recently: consumers who once neither cared about nor had control over their data now demand both. Businesses that respect those demands will not only earn consumers’ trust and support, but will also set a precedent for others to lead with integrity.
The Daniels Fund Ethics Initiative at the CU Denver Business School is a grant awarded by the Daniels Fund aimed at strengthening ethics education for business students and extending ethical behavior beyond campus and into the community. The Business School uses the grant to instill a deep and unwavering ethical foundation through course curricula, events, and community collaboration.