Our thoughts on privacy laws and UX

Barrier chain fence representing privacy laws and UX

Unless you’ve been living under a rock, you’ll have heard about GDPR. This legal framework for the collection and processing of personally identifiable information within the European Union came into force in May 2018 and has directly impacted the work of marketers, designers, sales teams and, you guessed it… UXers.

Earlier this year, we saw a very interesting talk about privacy laws and UX at UCD Bristol by Lon Barfield, a computer scientist and design thinker. His talk ‘Privacy by design, changing the mindset’ focused on the modern concept of privacy and data protection and challenged how we, as UX designers and thinkers, have (or haven’t) integrated this new concept into our practices.

“Legally, you have to think about privacy when you design.” — Lon Barfield

The idea of ‘Privacy by Design’, along with its set of seven foundational principles, was created by Ann Cavoukian in 1995 for the systems engineering world. Lon has adapted these principles for a UX audience as follows:

  1. Be proactive in doing it
  2. Use the right defaults
  3. Make it part of the design process
  4. Avoid trade-offs and find ways to successfully implement privacy from the start
  5. Data life cycle is part of the design
  6. Be transparent: don’t lie about what you are doing
  7. Keep it user-centric (good UX design!)

Lon talking at UCD Bristol in January 2019

With GDPR and a heightened awareness of online privacy across the country, this is something we’ve had to consider more and more in everything we do, from UX research to design projects. Spurred on by Lon’s talk, we’ve spent some time assessing what’s changed and what’s important to consider when it comes to ensuring we adhere to, and go above and beyond, GDPR. Here are our team’s musings on the subject of privacy laws and UX.

David, User Research Delivery Manager

Even before the introduction of GDPR, data protection played a large role in what we do with participant data. The law has tightened things up regarding what companies must do with identifiable information, and this has also impacted (and improved) how we do things here.

The types of information we gather about a participant include:

  • Full name
  • Email address
  • Phone number
  • Job role
  • Salary
  • Number of children
  • Address (usually just city level)
  • Bank account details (for payment)

We manage GDPR requirements when presenting to our clients by anonymising or removing identifiable data. Identifiable data is completely removed when it is no longer required: once a participant’s incentive has been paid, for example, we remove their bank details, email, phone number and address.

There is information your client doesn’t need to see, even though it may have been part of matching a participant to a persona. An example being:

Joe Bloggs - UX Consultant - £35,000 - 3 children - 22 Madeup Street, Bristol

This information is identifiable, and some of it is simply not needed by the client: they may not want to know how many children the participant has, and they certainly don’t need their address. These details may have been key during participant recruitment but aren’t necessary now. A simple tweak anonymises the data, as below:

Joe - Works in UX - £25,000 to £45,000 - South West

This conveys the required information to the client, but in a GDPR-compliant way. It would be exceptionally difficult to identify this specific person called ‘Joe’ from such generalised information.
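
To make that concrete, here’s a minimal sketch of the kind of transformation David describes. Everything in it (the record shape, the salary bands, the city-to-region mapping, the role generalisation) is an illustrative assumption, not our actual recruitment pipeline:

```python
# Illustrative anonymisation of a participant record. The bands and
# mappings below are made-up examples, not real project data.

SALARY_BANDS = [(0, 25_000), (25_000, 45_000), (45_000, 70_000), (70_000, None)]
CITY_TO_REGION = {"Bristol": "South West", "Manchester": "North West"}
ROLE_TO_SECTOR = {"UX Consultant": "Works in UX"}

def salary_band(salary: int) -> str:
    """Replace an exact salary with a broad band."""
    for low, high in SALARY_BANDS:
        if high is None or salary < high:
            return f"£{low:,}+" if high is None else f"£{low:,} to £{high:,}"

def anonymise(p: dict) -> dict:
    """Keep only what the client needs, in generalised form; everything
    else (children, address, email, phone, bank details) is dropped."""
    return {
        "name": p["full_name"].split()[0],              # "Joe Bloggs" -> "Joe"
        "role": ROLE_TO_SECTOR.get(p["job_role"], "Other"),
        "salary": salary_band(p["salary"]),
        "region": CITY_TO_REGION.get(p["city"], "UK"),
    }

joe = {"full_name": "Joe Bloggs", "job_role": "UX Consultant",
       "salary": 35_000, "children": 3, "city": "Bristol",
       "address": "22 Madeup Street, Bristol"}

print(anonymise(joe))
# {'name': 'Joe', 'role': 'Works in UX', 'salary': '£25,000 to £45,000', 'region': 'South West'}
```

The useful property of doing it this way round is that the anonymised record is a new object: the identifiable fields never leave the research environment, rather than being redacted after the fact.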


Dave Parry, Senior Product Designer

Two-factor authentication (2FA) is a way of securing an account beyond a mere password. The measure helps protect against common ways an account can be breached at the login level. Common passwords (the most common in 2019 being ’123456’ 🤦‍♂️) are given an extra, much-needed level of security by requiring a second authentication factor, such as a code from a text message or authenticator app (as used by Facebook and Google) or a dedicated device, like the ‘calculator’ style device you probably have from your bank.
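
For the curious, the six-digit codes produced by authenticator apps are typically time-based one-time passwords (TOTP, RFC 6238): an HMAC of the current 30-second time step, truncated to a few digits. A minimal sketch using only Python’s standard library, with a made-up shared secret:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, interval: int = 30) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval              # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and authenticator app both hold the shared secret (a made-up
# example here), so both derive the same short-lived 6-digit code.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because both sides derive the code from the same secret and clock, nothing long-lived crosses the network at login time beyond the short-lived code itself.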

Sadly, 2FA is not a silver bullet for client-side security. A weak password and a breached second factor (both can be obtained through social engineering, such as phishing) can let ne’er-do-wells into accounts loaded with sensitive personal and financial information, as well as private media.

Clever, elegant, transparent and discretionary authentication methods are a much-needed practice in the digital space and something I strive for in all my designs.

Some of this means that we need to re-examine our password requirements. The ‘8-12 characters of upper case, lower case, punctuation, hieroglyph, number, forgotten symbol of an old god, sorry no spaces’ style of password is outdated and leads to passwords that are, as XKCD memorably put it, “hard for humans to remember, but easy for computers to guess.” Support for high-entropy ‘correct horse battery staple’ type passphrases, unique for each website, app or device and generated by apps like 1Password, is a good start.
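
To put numbers on that: a passphrase of n words drawn at random from a list of L words carries n × log2(L) bits of entropy. Here’s a quick illustrative sketch (the word list below is a stand-in; real generators use lists like the EFF’s 7,776-word long list):

```python
import math
import secrets

# Stand-in word list; real passphrase generators use a much longer one,
# such as the EFF long list of 7,776 words.
WORDS = ["correct", "horse", "battery", "staple", "orange", "monitor",
         "ragged", "lantern", "copper", "violet", "marble", "quartz"]

def passphrase(n_words: int = 4) -> str:
    """Pick words with a cryptographically secure RNG, not random.choice."""
    return " ".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())
# Four words from a 7,776-word list give 4 * log2(7776) ≈ 51.7 bits,
# roughly double the ~28 bits XKCD estimates for 'Tr0ub4dor&3'.
print(f"{4 * math.log2(7776):.1f} bits")
```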

Image taken from www.rackafracka.com

Industry-standard patterns may also need to be revised and adapted for greater effectiveness. For example, hiding characters in a new-password field is an unspoken standard practice, but how useful is this in a user’s private home or on a personal device? Hiding characters by default may lead to shorter, repeated or forgotten passwords. Better still, why do we even need passwords? Services like Slack use ‘magic link’ emails that provide time-limited links for users to log in without the need for a password. Logging in with Google or Facebook is also an interesting way of removing the need for a service-specific password, though the anxiety around the fate of accounts linked to services we may choose to remove our presence from is real, as I discovered when deleting my Facebook account a year ago.
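
The magic-link pattern can be sketched simply: the server emails a signed, time-limited token, and possession of the inbox becomes the credential. Below is an illustrative version in plain Python; the secret, the 15-minute TTL and the function names are assumptions for the sketch, not how Slack actually implements it:

```python
import base64, hashlib, hmac, time

SECRET = b"server-side-secret"  # placeholder; keep the real one out of source control

def make_token(email: str, ttl_seconds: int = 900) -> str:
    """Build a signed, time-limited token to embed in the emailed link."""
    payload = f"{email}|{int(time.time()) + ttl_seconds}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token: str) -> str | None:
    """Return the email if the token is genuine and unexpired, else None."""
    payload_b64, _, sig = token.rpartition(".")
    payload = base64.urlsafe_b64decode(payload_b64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                      # tampered with or forged
    email, _, expires = payload.decode().rpartition("|")
    if time.time() > int(expires):
        return None                      # link has expired
    return email

token = make_token("joe@example.com")    # emailed as ...?token=<token>
print(verify_token(token))               # "joe@example.com" within 15 minutes
```

A nice property of this shape is that verification is stateless: the signature and embedded expiry carry all the state, so the server needn’t store outstanding tokens (though a real implementation would likely also mark each token as single-use).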

At the heart of the GDPR regulations is privacy and security by design. The trust a user places in an organisation and the clarity and discretion with which that organisation communicates exist in a feedback loop. The more a website or app shows that it’s collecting only highly relevant data, rather than casting a broad net, the more confidence users have in the product or service. Facebook was recently discovered to have betrayed this trust by using the phone numbers users supplied for 2FA for advertising and for establishing connections in their social graph, without clearly stating at the point of capture that they would. Piled on top of the Cambridge Analytica scandal, this leaves Facebook trying to re-establish a trust that may never return.

Adam, Managing Director

My concern about privacy laws is that they’re written by people who have little understanding of the technology and how people use it, and so end up being detrimental to users.

For example, when did you last read the inevitable cookie message when landing on a website? During user testing, we frequently see participants dismiss them immediately by clicking “Accept” or worse still, using the site with a third of the page obscured!

These messages don’t help users manage their privacy; they impede their use of the web.

We’ve seen a proliferation of long, legalistic essays next to GDPR marketing-preference checkboxes, rather than simple statements helping users manage their preferences.

None of these things benefit users or businesses. Whilst the spirit of the GDPR legislation is great, its execution by government and industry is atrocious.

Adi, UX consultant

I don’t feel that the changes in privacy laws have impacted the way I design too much, if I’m honest. That’s because we were already pretty hot on transparency when advising our clients on the design around opting in and out. The only change we’ve made is that, in some situations, we now add a few more checkboxes and extra text where required. When you keep the human you’re designing for at the front of your mind, being open and honest with them about what you’re asking them to sign up for isn’t really difficult. And that’s what we do here: we design everything with the user in mind, so transparency is key. In my view, following GDPR in relation to data collection is just reiterating what you really should have been doing in the first place. It’s just best practice.

Kenton, UX consultant

Thinking about how we share user-testing reports with our clients, Kenton said that “not an enormous amount has changed since GDPR became a thing. Even before then, we would only ever show what is relevant to the client.”

As part of our reporting, we include video highlight reels from our user testing. On this, he said: “we’re careful that the footage itself shows only what the participant can see on their own screen. Moreover, the different clips are identifiable only by a first name, and no faces are shown. Participants are also informed beforehand that footage of their session will be recorded.”

Privacy laws and UX are inextricably linked. As UX practitioners, we put the user at the heart of everything we do. We work to make their journey through whatever website or digital product we're focused on as easy, intuitive and enjoyable as possible. Making sure their privacy is protected is part and parcel of that.
