8 Insights from the field on data privacy
The management of data privacy is something we need to consider more and more often when developing service design solutions, yet it is still an area where we struggle. For this reason, we decided to interview designers who have developed expertise in the field. Their stories and perspectives allowed us to identify a set of key insights, meant to stimulate reflection on this important topic and on the way we approach it when involving people in participatory activities or developing service models based on data gathering and interpretation.
Interviews were conducted by students of the VII edition of the Specializing Master in Service Design of POLI.design — Politecnico di Milano, as part of a didactic activity, between 1st June and 12th June 2020. The sample was composed of 16 people: 3 User Researchers; 1 UX Designer; 1 Information Designer; 1 Anthropologist; 1 Co-Founder & Editor-In-Chief; 2 Design Leads; 1 Head of Creative; 1 Founder; 1 Director; 2 Senior Designers; 1 Design Strategist.
1.
Data privacy is a basic human right and as such should be understood by users and designers alike
Privacy considerations have become more relevant as personal information has turned out to be one of the key ingredients of certain types of product and service innovation. Extensive personal data collection should therefore be matched by an increase in designers’ data literacy and awareness of data privacy; at the same time, users should be put in a position of full control over their data, becoming accountable for it as well.
“It’s like a new sector of technology in which designers should promote themselves as the guardians of people’s information.”
“Trying to create something nice, you may come up with something ugly, if you fail to maintain users’ privacy.”
“The user should control what is stored there and have the ability to remove it.”
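The idea of users having full control over their stored data can be sketched as a minimal data store that exposes access and erasure operations. All names below are hypothetical illustrations, not taken from the interviews:

```python
class UserDataStore:
    """Hypothetical store giving each user view and erase controls."""

    def __init__(self):
        self._records = {}  # user_id -> {field: value}

    def store(self, user_id, key, value):
        self._records.setdefault(user_id, {})[key] = value

    def export(self, user_id):
        # Right of access: the user can see everything held about them.
        return dict(self._records.get(user_id, {}))

    def erase(self, user_id):
        # Right to erasure: remove all personal data on request.
        return self._records.pop(user_id, None) is not None


store = UserDataStore()
store.store("u1", "email", "user@example.com")
print(store.export("u1"))  # the user can inspect what is stored
store.erase("u1")
print(store.export("u1"))  # nothing remains after removal
```

In a real service these operations would map onto user-facing controls (a “download my data” page, a “delete my account” button), so that control is not just a legal promise but an actual interface.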
2.
User awareness of data privacy is increasing but it can vary according to contextual factors
When choosing a service, users often pay more attention to features such as personalization than to privacy-related issues. Nonetheless, awareness of data privacy is increasing, as is the need to involve users in the management of their information and to be transparent about the processes for retrieving or erasing personal data from a digital service. Geographical context and cultural factors are among the aspects that can affect the level of user awareness of data privacy.
“Many people don’t know what is actually happening at the back-end of the product they are using.”
“Monitoring and keeping track of data is a huge challenge for users.”
“In India, the regulation model for data privacy is still to be developed.”
“Awareness about the cultural differences is also relevant, however, designers are not always aware of these differences.”
3.
Privacy concerns can influence participant recruitment for primary research
Data privacy concerns can place serious limitations on the recruitment methods and tools available to researchers. Such concerns may drastically restrict recruitment channels, consequently reducing the possibility of building a relevant and varied sample and leading to more biased research results.
“Recruiting participants only from newsletter subscriptions is safe, but the audience range is then restricted and more biased.”
“Even for some GDPR compliant recruiting platforms, there are still unsolved issues.”
“When we use 3rd party recruitment agencies, we need to train their customer support on data privacy communication with users.”
4.
The complexity of legal language affects designers’ capacity to collaborate effectively with legal specialists
Typically, GDPR and privacy-related regulations are not “designer-friendly”, to the extent that businesses often struggle to understand all the requirements and achieve compliance. Because of this, design and legal departments are likely to work in silos, without informing and influencing each other, which widens the distance between the two worlds even more, even though designers could make an important contribution from an ethical point of view by preserving users’ interests.
“It would be useful for designers to have practical guidelines, which would help them embed user’s privacy rights/interests into the design process.”
“We are not lawyers, so it’s very difficult to say that what we understood from the regulations like GDPR is 100% correct.”
“There are a lot of things that are considered legal but are ethically questionable.”
5.
It is difficult to balance design choices and technical requirements with privacy regulations
Service models based on the collection of personal data are inevitably shaped by GDPR and similar regulations, which require effort to maintain the original business and design objectives despite the limitations the rules impose. For better privacy decisions, understanding technical issues is probably as important as understanding legal ones, and designers need to reckon with both.
“Our user onboarding process had 4 steps, now because of the GDPR there are 8 steps, so the effort has doubled and half of the people can’t make it through; we literally lost many customers.”
“Regulations like GDPR are good, but if they are too rigid and strict they can slow down innovation.”
“Not all the stakeholders at the table will be interested in privacy by design.”
“Sometimes, there are technical issues in identifying what data will be needed.”
6.
There are no common frameworks yet, but certain agencies have developed some best practices
Common guidelines and methods on how to properly collect, manage and visualize data are adopted in a scattered and fragmented way. Some best practices developed by forward-thinking agencies, though, can be taken as points of reference: for example, keeping data anonymous by using archetypes instead of user-based personas, or adopting transparent practices and patterns in UX choices in order to secure strong, long-term success for the new product or service.
“Try not to only apply regulations, but craft experiences around these regulations.”
“Instead of saying who exactly gave us the quote, it’s better to speak in terms of archetypes.”
“Data minimization is one of the main approaches we as designers can apply to maintain data privacy: use only as much data as needed to deliver the service and not more than that.”
“Our aim is to offer the best experience, so, for example, we come up with options like autosave, but if a user doesn’t want to use it, it’s okay — it’s his/her choice.”
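The data-minimization principle quoted above can be illustrated with a short sketch: the service declares which fields it actually needs and drops everything else before storage. The field names here are invented for illustration:

```python
# Data minimization sketch: keep only the fields the service actually
# needs to deliver its value, dropping everything else before storage.
REQUIRED_FIELDS = {"email", "delivery_address"}

def minimize(raw_submission):
    """Return only the allow-listed fields from a raw form submission."""
    return {k: v for k, v in raw_submission.items() if k in REQUIRED_FIELDS}

submission = {
    "email": "user@example.com",
    "delivery_address": "Via Durando 10, Milano",
    "birth_date": "1990-01-01",   # not needed to deliver the service
    "phone": "+39 333 0000000",   # not needed either
}
print(minimize(submission))  # only email and delivery_address survive
```

The design choice is to make the allow-list explicit and central, so adding a new field to collection becomes a visible, reviewable decision rather than a default.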
7.
It’s hard to define for how long data can be stored
Although regulations like GDPR provide norms for managing data in studies such as usability tests, there are still many uncertainties about how long data can be stored. Sometimes, even when the researcher reaches an agreement with the company about data deletion, no one actually supervises the process. As a consequence, some companies never delete the data collected through research.
“For usability tests, it’s fine — we delete the data after 3 months or so. But what about generative studies?”
“The GDPR doesn’t specify anything about data storage period for generative studies.”
“We reached agreement with the company about how long the data will be stored, but who’s supervising this?”
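One way to address the supervision gap mentioned in the quotes is to automate the agreed retention period, so that expired records are purged without relying on anyone to remember. A minimal sketch, assuming a 90-day window like the “3 months or so” mentioned for usability tests (the record structure is invented):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # the retention period agreed with the company

def purge_expired(records, now):
    """Keep only records still within the agreed retention window."""
    return [r for r in records if now - r["collected_at"] < RETENTION]

now = datetime(2020, 6, 12)
records = [
    {"id": 1, "collected_at": datetime(2020, 1, 10)},  # older than 90 days
    {"id": 2, "collected_at": datetime(2020, 5, 1)},   # still within window
]
kept = purge_expired(records, now)
print([r["id"] for r in kept])  # [2]
```

Run on a schedule, a purge like this turns the storage agreement into an enforced process rather than a promise that no one supervises.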
8.
Co-design helps overcome inherent design biases and achieve trust
A certain degree of subjectivity can affect data interpretation, leading to biased or unplanned design decisions. Co-design can come to the rescue here too, when dealing with the design of data-related solutions: collecting different perspectives and bringing value through diversity can help mitigate some personal biases. Moreover, giving control back to users through co-creation is a way to build trust on the long road to success.
“Designers are inherently biased. The whole point of design research is to remove that bias.”
“A diverse team can help eliminate some of the biases.”
“A critical approach from the designers’ side is important and necessary when it comes to data-driven services and products. But the differences in sensitivity towards privacy bring difficulties when it comes to scaling these services/products.”
“Team leaders need to give their teams enough time to learn and improve their knowledge and skills about data privacy.”
Confident that co-designing data management and protection processes with team members is the way to improve designers’ skills in data ethics, we created the Data Management Framework for Service Design: a synthetic, designer-friendly reference frame that can help a design team manage data-related issues in everyday practice.
Designed by Nare Krmoyan, Pranjali Pachpute, Xinwei (Camilla) Tao, Klong Swegwan, with the support of expert mentors Agata Brilli and Alessandro Carelli