
It seems we’ve gotten used to handing over our personal information to every app that asks and stopped being surprised when we hear of another data breach in the news. But more and more consumers are worried about the steady rise of personal data collection and the nefarious places it often ends up. How do we as users protect ourselves, and how can companies be accountable when they gather our data?

To address these questions, the Daniels Fund Ethics Initiative Collegiate Program at the CU Denver Business School, along with the CU Center for Bioethics and Humanities, invited experts from Strava, a popular fitness-tracking software company, and BioIntelliSense, a medical wearable vital signs monitoring company, to explore the ethical considerations of wide-scale data collection from their perspectives.

Danielle Caldwell, J.D., is Data Protection Officer at Strava, where she plays a unique role for an attorney. “I have the ability to advocate for our users’ privacy and legal rights, where many other lawyers work first and foremost for the company.” A self-described “data-protection enthusiast,” she has years of experience in privacy and regulatory compliance for various healthcare companies and earned a law degree in health law as well as a master’s degree in bioethics.

Strava is a software company whose app lets athletes upload, track, and share their physical activity data. With over 70 million members, it is a hugely popular tool for athletes of all levels, which means Strava collects a great deal of personal data. Danielle says Strava takes privacy very seriously: “The more data you collect, the more responsibility you have for your users.”

Strava strives to build trust through transparency, openly disclosing to its members the purposes for processing their data, how long that data is retained, and with whom it will be shared. Strava also prioritizes asking for members’ consent at every point, giving members control over what data they share and power over how it’s used.

Danielle says her fervent beliefs motivate her every day at Strava: “Privacy is a human right and is something that deserves our attention and preservation.”

James (Jim) Mault, M.D., is Founder and CEO of BioIntelliSense. Trained as a cardiothoracic surgeon, Jim has over 35 years of experience in the health IT and medical device field, having founded five companies and holding over 80 issued and pending patents for a variety of novel innovations in the industry.

BioIntelliSense provides medical-grade, multi-parameter vital signs monitoring through wearable devices easily controlled by the user. The idea is to give recovering patients a way to get out of the hospital (where there are a lot of contagious germs) and back into the comfort of their own home, while still allowing the doctor to monitor and care for them.

“What we’re seeing now is a massive transformation of our healthcare system,” said Jim. The COVID-19 pandemic has accelerated that transformation—remote care has now become the primary means for a lot of care that otherwise would have been facility-based.

BioIntelliSense devices continuously monitor vital signs, biometrics, and symptomatic events, including skin temperature, heart rate, coughing frequency, and fall detection. The data is transmitted via a mobile app to a secure cloud, where analytics are applied. The data is fully encrypted, even at rest, which is significant for any wearable device.
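Neither speaker walked through implementation details, but the encrypt-before-storage pattern described here can be sketched briefly. The following is a minimal, hypothetical illustration in Python using the cryptography package’s Fernet recipe; the field names and key handling are assumptions for illustration, not BioIntelliSense’s actual code.

```python
# Hypothetical sketch of the encrypt-at-rest pattern described above.
# BioIntelliSense's actual pipeline is not public; all names are illustrative.
import json

from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, issued by a key-management service
cipher = Fernet(key)

reading = {"device_id": "demo-001", "skin_temp_c": 36.7, "heart_rate_bpm": 62}

# Serialize and encrypt the reading before it is written to local storage or
# queued for upload, so the measurement is never persisted in plaintext.
token = cipher.encrypt(json.dumps(reading).encode("utf-8"))

# Only a holder of the key (e.g., the secure cloud backend) can recover it.
recovered = json.loads(cipher.decrypt(token))
assert recovered["heart_rate_bpm"] == 62
```

In a real deployment the key would never live alongside the data; it would be held by a key-management service, so a lost or stolen device yields only ciphertext.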

The individual user also has full control of their personal information and can choose to share information with their doctor, family, employer, or even no one.
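That sharing model, where the default is that no one sees the data and the user grants access recipient by recipient, can be illustrated with a small sketch. Everything below is hypothetical, not BioIntelliSense’s actual data model.

```python
# Hypothetical illustration of deny-by-default, per-recipient sharing controls.
from dataclasses import dataclass, field


@dataclass
class SharingSettings:
    """Which recipients may view a user's readings; the default is no one."""
    allowed: set[str] = field(default_factory=set)  # e.g. {"doctor", "family"}

    def grant(self, recipient: str) -> None:
        self.allowed.add(recipient)

    def revoke(self, recipient: str) -> None:
        self.allowed.discard(recipient)

    def can_view(self, recipient: str) -> bool:
        return recipient in self.allowed


settings = SharingSettings()
settings.grant("doctor")
assert settings.can_view("doctor") and not settings.can_view("employer")
```

The design choice worth noting is the deny-by-default set: a recipient sees nothing unless the user explicitly grants access.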

Danielle and Jim answered questions from the audience and moderator Eric G. Campbell, Ph.D., Professor of Medicine & Director of Research at the CU Center for Bioethics and Humanities.

Describe the business model for your company: where do the revenues come from, including any parties that may pay for access to the data you collect?

Danielle Caldwell: Strava recently began charging members for use of its service under a subscription model. Strava may also partner with other organizations for race promotions or challenges, but it does not share any personal data with them; only the athlete can choose to share personal information directly with third parties. Strava also shares its Metro data with qualified city planners at no cost, but that information has been de-personalized and is in aggregate form, so planners only see numbers, for example, how many people have gone through a particular intersection. Strava takes strict measures to mask or exclude rural outliers and only shares public activity data from members who have opted in to share in that specific way.
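Strava has not published the Metro pipeline, but the aggregate-and-suppress approach Danielle describes can be sketched in a few lines. The threshold, identifiers, and data shapes below are hypothetical.

```python
# Illustrative sketch of count aggregation with small-count suppression,
# in the spirit of the Metro data sharing described above (not Strava's code).
from collections import Counter

MIN_COUNT = 5  # hypothetical threshold; tiny counts could identify individuals

# (intersection_id, athlete_id) pairs drawn only from activities whose
# owners opted in to aggregate sharing.
opted_in_crossings = [
    ("5th_and_main", "a1"), ("5th_and_main", "a2"), ("5th_and_main", "a3"),
    ("5th_and_main", "a4"), ("5th_and_main", "a5"),
    ("rural_lane", "a6"),
]

# Aggregate to per-intersection counts, discarding athlete identifiers.
counts = Counter(intersection for intersection, _ in opted_in_crossings)

# Suppress sparse locations (e.g., rural outliers) where a count of one
# or two could point back to a specific person.
published = {loc: n for loc, n in counts.items() if n >= MIN_COUNT}
print(published)  # {'5th_and_main': 5}
```

Discarding identifiers during aggregation and suppressing small counts is what keeps a published figure like “five crossings at 5th and Main” from pointing back at any one athlete.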

James Mault: BioIntelliSense is an FDA-regulated medical device company. Originally, the devices were purchased by hospitals and provider groups. With COVID-19, we saw substantial new demand from the non-provider arena, like universities, employer groups, and even sovereign countries wanting to use our devices to screen for COVID-19 symptoms. So we developed a business model that allows our devices to be purchased essentially over the counter, both B2B and direct-to-consumer. We make it clear that the data will still be treated under HIPAA privacy rules and that users retain control over how their data is shared. We also made it very clear that under no circumstances will we sell any data we collect, even de-identified data.

In research, we’re required to get ‘informed consent.’ It’s not enough to get someone’s consent to participate—we actually have to inform them of the benefits and risks. How are the people who use your products truly informed about the privacy protections and risks?

JM: This has been a big passion of mine. As Chairman of the Health Board at the Consumer Technology Association, we worked with federal agencies to establish a set of guiding principles that included the need for clear, prominent privacy disclosures that are not hidden in legalese and fine print. We as an industry need to take responsibility for making sure people know up front what they are signing up for, with true transparency.

DC: Strava provides the Privacy Label feature, which presents concise privacy information so that users can drill down to the answers they want without having to dig through the full legal privacy notice. Also, when Strava partners with organizations that are doing research with human subjects, we insist that the partner undertake the full informed-consent process for all study participants.

Strava has had a number of major privacy and data-related controversies in a fairly short history as a company, including the Global Heatmap issue, which enabled observers to map military bases, and the Relive controversy, in which Strava prohibited users from linking their data to third parties. What are you and the company doing now to be more proactive about potential future problems?

DC: At Strava, we’re always reviewing our products. Are people using this privacy control as intended? Do they understand the implications of the choices they’re making? We constantly publish help articles to guide people through their decisions, and we evaluate all the time to determine how to improve education and functionality.

Also, with the Global Heatmap and Relive issues, social media and headlines can often sensationalize an issue that isn’t actually that controversial. The best thing we can do is educate people and make sure that the choices they make while using our product are the ones they intend to make. Compliance is an ongoing process. We’re always moving toward making things better, clearer, and more transparent.

Family members often argue against patients coming home too early because the burden of care falls to them. Have you created a technology that simply shifts the financial and treatment burden to untrained family at home?

JM: There’s always a balance between risk and benefit. Every hour that a patient is in the hospital, they are exposed to additional nasty bacteria and viruses that could cause them to get sicker and ultimately die (these are called nosocomial infections). When I was a surgical intern at Duke, patients routinely spent ten days in the hospital after cardiac surgery. Over time, we learned that getting these patients up and out of the hospital in four days resulted in dramatic improvements in their outcomes, reducing infections and recovery time.

There is also a significant advantage to the convenience of not having to drive back and forth to the hospital to visit sick family members. We also ease pressure and anxiety on family members worried about monitoring everything, knowing that the care team is handling it and can be alerted if any problems start to develop. We make it easier to take better care of people no matter where they are.

Past controversies were due to unexpected uses. How can we, as users and companies, do better at anticipating surprising uses of personal data?

DC: One of the best things you can do to protect your data is to read the privacy notice. As you go through it, if something isn’t clear, know that every company has someone who can answer your questions. We have to engage as users so that we understand what we’re signing up for and can make an informed decision to use or not use a product or feature.

The best thing companies can do is be transparent with their users. Transparency features like alerts, pop-ups, and concise guides help users make more informed choices. We disclose our privacy practices so that our users can hold us accountable, and we can answer questions, explain our privacy practices to our members, and demonstrate our compliance.

To get even more insightful audience questions and answers, view the entire event recording here.
