
Lachlan's Research

ESRC Data PSST! Seminar Series

A couple of weeks ago I ventured over to North Wales to Bangor University for the final seminar in the ESRC Data PSST! Seminar series. As a bit of background, the project has been running for the past couple of years, and has seen many different speakers and attendees meeting up for critique and discussion around the themes of surveillance, transparency, non-state actors and political communication.

Being rather late to the series, I was pleasantly surprised to be invited to come along as a speaker and participant for the final event by PI Vian Bakir. I had attended the previous session at Cardiff University after colleague and friend Gilad Rosner suggested I come along. The Cardiff event focused on the role of non-state actors in surveillance and the challenges posed for traditional notions of transparency. (I’ve put my position statements from both events at the bottom).

For this one, I had a few hours driving over to Bangor. It was a dreich Thursday evening from Ashbourne, but there were bonny sea views on the A55 and Gojira’s album l’Enfant Sauvage for company 🙂 Bangor is quite picturesque as it gazes out over nearby Welsh forests, estuaries, cliffs and mountains. The pre-workshop dinner over in Menai Bridge on Anglesey was at a rather charming fish restaurant too.

 


A very small group (9 or 10 of us) spent the day discussing how best to engage different stakeholders with concerns over transparency, state surveillance and data governance. I particularly enjoyed learning about the concept of translucency from Vian Bakir and Andrew McStay’s work on a typology of transparency. Interesting communication and engagement tools, from provocative short films to art projects, were discussed. An important point raised was how engaging the public and engaging policymakers require different approaches. The former may be more interested in educational or viral type material (like the recent Cassette Boy/Privacy International mashup on the IP Bill), whereas that won’t work for the latter, who may be more responsive to reports, white papers and policy recommendations.

My presentation for this session considered practical approaches to engaging internet of things designers with privacy regulation. The privacy by design cards are a good example, but importantly I also looked at the broader shift towards bringing designers into regulation. Finding the best forums to support designers in their new role is important. Professional bodies like the ACM or IEEE clearly have strong links with their members and can guide on ethics and, to an extent, regulation. Equally, state regulators like the Information Commissioner’s Office have a role in communicating with and supporting designers on their compliance obligations. A particular challenge here is the differing level of resources organisations have to deal with compliance, from startups and SMEs (with little) to multinationals (with more). The nature of support they may require will differ, and we need to better understand how compliance plays out in these different organisations.

It was an enjoyable workshop and thanks again to the organisers for having me along 🙂

I’ve put my position statements from Data PSST! Cardiff (March 2016) and Bangor (May 2016) below.

Seminar 5:

Transparency of Non-State Actors? The Case of Technology Designers and Privacy by Design

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Cardiff (March 2016)

 

My position on transparency and non-state actors is framed in the context of European Data Protection (DP) Law. A key component of the upcoming EU DP reform package is the concept of data protection by design and default (DPbD). Designing privacy protections into a technology has long been considered best practice, and soon it will be mandated by law. It requires privacy concerns to be considered as early as possible in the design of a new technology, taking appropriate measures to address them. Such an approach recognises the regulatory power of technology, which mediates the behaviour of users and can instantiate regulatory norms.

Concurrently, regulation, as a concept, has been broadening and moving beyond notions of state centricity and increasingly incorporating actions of non-state actors. I’d argue privacy by design is a context where technology designers, as non-state actors, are now regulators. How they build systems needs to reflect their responsibilities of protecting their users’ rights and personal data, through technical and social safeguards.

However, the nature of their new role is not well defined, leaving open questions about their legitimacy as regulators. They are not normally subject to traditional metrics of good governance like public accountability, responsibility or transparency. Furthermore, the transnational nature of data flows, as we see with cloud computing for example, adds an extra layer of complication. The new DP law will apply to actors outside of the EU, e.g. in the US, if they are profiling or targeting products and services to EU citizens, meaning there are national, regional and international dimensions to consider. Overall, the fast pace of technological change, contrasted with the slowness of the law, has pushed designers to be involved in regulation, but without appropriate guidance on how to do so.

This is a practical problem that needs to be addressed. An important component is the role of nation states. State and non-state actors need to complement each other, with the state often ‘steering, not rowing’. The model of less centralised regulation cannot mean dispensing with traditional values of good governance. Instead, state regulators need to support and guide non-state actors on how to act in a regulatory capacity. How can transparency, legitimacy and accountability be reformulated for this new class of ‘regulator’: the technology designer? Much work needs to be done to understand how designers need support as regulators, and how the state can respond to this.

Seminar 6:

Making Privacy by Design a Reality?

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Bangor (May 2016)

We have developed a tool that aims to take the principle of data protection by design from theory into practice. Article 23 of the General Data Protection (DP) Reform Package (GDPR) mandates data protection by design and default (DPbD). This requires system designers to be more involved in data protection regulation, early on in the innovation process. Whilst this idea makes sense, we need better tools to help designers actually meet their new regulatory obligations. [1]

Guidance on what DPbD actually requires in practice is sparse, although work from usable privacy and security or privacy engineering does provide some guidance [5, 6]. These may favour technical measures like anonymisation or tools to increase user control over their personal data [7]; or organisational approaches like privacy impact assessments. [2]

By calling on design to be part of regulation, the law is calling upon the system design community, one that is not ordinarily trained or equipped to deal with regulatory issues. Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law is mandating non-lawyers be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required to translate legal principles from law to design. In our case, we bring together information technology law and human computer interaction. [4]

Our data protection by design cards are an ideation technique that helps designers explore the unfamiliar or challenging issues of EU DP law. [8] Our cards focus on the newly passed GDPR, which comes into effect in 2018. They are designed to be sufficiently lightweight for deployment in a range of design contexts eg connected home ecosystems or smart cars. We have been testing them through workshops with teams of designers in industry and education contexts: we are trying to understand the utility of the cards as a privacy by design tool. [9]

A further challenge for privacy by design goes beyond how to communicate regulatory requirements to communities unfamiliar with the law and policy landscape. Whilst finding mechanisms for delivering complex content in more accessible ways, like our cards, is one issue, finding the best forums for engagement with these concepts is another. Two examples could be the role of state regulators and industry/professional associations. State regulatory bodies, like the UK ICO or EU Article 29 Working Party, have a role to play in broadcasting compliance material and supporting technology designers’ understanding of law and regulation. The needs of each business will vary, and support has to adapt accordingly. One example could be the size and resources a business has at its disposal. It is highly likely these will dictate how much support they need to understand regulatory requirements, e.g. an under-resourced Small or Medium-sized Enterprise vs. a multinational with in-house legal services.

Industry and professional associations, like the British Computer Society, the Association for Computing Machinery or the Institute of Electrical and Electronics Engineers, may be suitable forums for raising awareness with members about the importance of regulation too. Sharing best practice is a key element of this, and these organisations are in a good position to feed their experience into codes of practice, like those suggested by Art 40 GDPR.

[1] – L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)

[2] – D Wright and P De Hert Privacy Impact Assessment (2012 Springer)

[3] – A29 WP “Opinion 8/2014 on the recent Developments on the Internet of Things” WP 233

[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil).

EU project page and cards are available at designingforprivacy.co.uk

[5] – J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute

[6] – Danezis et al “Privacy and Data Protection by Design– from policy to engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35 (1)

[7] – H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al “Human-Data Interaction: The Human Face of the Data Driven Society” (2014) http://hdiresearch.org/

[8] IDEO https://www.ideo.com/work/method-cards; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark https://dl.acm.org/citation.cfm?id=1858189

[9] E Luger, L Urquhart, T Rodden, M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, S Korea

 

Some thoughts on Ethics and Big Social Media Research

This week there have been a few stories in the tech news around social media research ethics. These range from the controversial Kirkegaard and Bjerrekær case involving data scraping from OK Cupid and subsequent public release, to new UK Cabinet Office guidelines on Data Science ethics and a new report from the Council for Big Data, Ethics, and Society. This post is not a commentary on these stories, but instead they prompted me to share some notes I’ve got on the topic that have been lurking on my hard drive for a wee while. They are not particularly polished or well structured currently; however, hopefully there are a few useful nuggets in here, and post PhD it would be nice to turn them into something more formal. But anyway, here we go for now, but be warned…there is law 🙂


1. There is a need to balance the end goals of data driven research projects that aim at fostering some notion of the ‘greater good’ against the regulatory context. Utilitarian research goals do not preclude studies from legal compliance requirements. This is especially so for privacy and data protection considerations, as these are fundamental, inalienable human rights which often enable other human values like dignity, autonomy, or identity formation. When dealing with vulnerable populations, the need to respect these elements heightens. Broadly, virtuous research goals do not override legal safeguards.

However, this becomes problematic when handling big social media data for research because of significant technical and regulatory issues for researchers. The ‘velocity, variety and volume’[1] of big data is a challenge computationally. These large data-sets often involve personal data, and accordingly this brings the EU data protection regulations to the fore. The UK ICO has many concerns around the use of big data analytics with personal data, yet they state ‘big data is not a game that is played by different rules’ and the existing DP rules are fit for regulation. [2] They are particularly concerned about a number of issues: ensuring sufficient transparency and openness with data subjects about how their data is used; reflecting on when data re-purposing is compatible with original purposes of collection or not (eg data collected for one purpose is reused for another); the importance of privacy impact assessments; how big data challenges the principle of data minimisation; and preserving data subjects’ access rights. [3]

2) Researchers, as data controllers in their research projects i.e. those determining the purposes and means of data processing,[4] have a range of responsibilities under data protection rules. They need to ensure security, proportionate collection and legal grounds for processing, to name a few. Data subjects have many rights in data protection law, from knowing what data is collected, why, by whom and for what purposes, to objecting to processing on certain grounds. From a data subject’s perspective, they may not even know they are part of a big social media dataset, making it very hard for them to protect their own rights (eg with data scraped from Tweets around a particular topic). Furthermore, data subject rights are set to be extended in the new General Data Protection Regulation[5]. They will be able to restrict data processing in certain circumstances, have their data in a portable format they can move and even have a right to erasure.[6] Researchers need to reflect on the range of subject rights and controller responsibilities in the GDPR, and consider how to protect the rights of data subjects whose personal data are within their big social media datasets. A particular challenge is obtaining user consent. The status quo, where some argue these datasets are too large to reasonably obtain consent from every ‘participant’, is not sustainable…(we return to that below…)

3) To understand why the distinction between personal and sensitive personal data is important, we need to unpack the nature of consent in data protection law. In general, unambiguous user consent is not the only legal ground for processing personal data (eg legitimate interests of the controller), but for handling sensitive personal data, explicit consent is required. To clarify, personal data is “any information relating to an identified or identifiable natural person (‘data subject’)”,[7] but sensitive personal data is information about “racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.”[8] Similarly, legally speaking, consent is a “freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed”.[9] The ‘informed’ requirement means clearly visible, accessible, jargon-free and easily understandable information must be provided directly to subjects before the point of consent. [10] Similarly, the ‘unambiguous’ element requires that “the indication by which the data subject signifies his agreement must leave no room for ambiguity regarding his/her intent”. [11]

With sensitive personal data, the consent needs to be ‘explicit’, but what this requires is not defined in the law. Importantly, for both types of consent, the EU Article 29 Working Party (an advisory institution for DP law) argues that how an indication or agreement is made is not limited to just writing, but covers any indication of wishes, i.e. “it could include a handwritten signature affixed at the bottom of a paper form, but also oral statements to signify agreement, or a behaviour from which consent can be reasonably concluded”.[12] Importantly, passive behaviour is not enough, and action is necessary. [13]

4) From the perspective of social media research, this leaves the mechanisms that can be used for obtaining explicit consent quite open, within these required parameters. However, more innovative approaches for communicating with and informing the end users are required. One proposal for studies using Twitter might be to send direct messages to the Twitter handles of those whose data is captured, explicitly explaining that their data is being used in research and asking permission. Privacy preserving approaches like removing Twitter handles at the point of data collection might be another means, or at least pseudonymising them. However, even these might not be compliant because Twitter handles themselves are likely personal data, and handling these, even prior to providing information about the nature of processing to end users, or prior to anonymisation/pseudonymisation, would still be subject to DP rules. Whilst these are difficult challenges to address, they need to be considered.
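To make the pseudonymisation idea above a little more concrete, here is a minimal sketch (in Python, purely illustrative and not part of the original notes) of replacing Twitter handles with keyed pseudonyms at the point of collection, so the raw handle is never written into the research dataset. The key name and field names are assumptions for the example; and, as noted above, even this step is itself processing of personal data, so DP rules still apply to it.

```python
import hmac
import hashlib

# Hypothetical sketch: replace a Twitter handle with a keyed pseudonym at the
# point of collection, so the raw handle never enters the research dataset.
# The secret key is assumed to be managed separately from the data (and could
# later be destroyed to make re-identification by the researcher impractical).

SECRET_KEY = b"replace-with-a-randomly-generated-key"  # assumption: held securely elsewhere

def pseudonymise_handle(handle: str) -> str:
    """Return a stable pseudonym for a Twitter handle using HMAC-SHA256."""
    digest = hmac.new(SECRET_KEY, handle.lower().encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated purely for readability

def collect_tweet(tweet: dict) -> dict:
    """Keep only the fields needed for analysis and drop the raw handle."""
    return {
        "author": pseudonymise_handle(tweet["handle"]),
        "text": tweet["text"],
        "created_at": tweet["created_at"],
    }

if __name__ == "__main__":
    raw = {"handle": "@example_user", "text": "some tweet text", "created_at": "2016-05-01"}
    print(collect_tweet(raw))
```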

5) Beyond compliance, we also need to consider the distinction between law and ethics. Form contracts, eg privacy policies, end user licence agreements or terms of service, are a dominant approach for obtaining end user consent. No-one reads these, and they cannot change them even if they did.[14] Whilst many of these contract terms are challengeable as unfair under consumer protection laws, for example jurisdiction clauses, this requires testing in a dispute. This costs money, time and resources that many consumers lack. Beyond this, the use of contracts highlights a distinction between law and ethics.

Organisations will argue that contracts including clauses allowing for research provide legal compliance, and research based on such contracts may be ‘legal’. Whether it is ethical is another question, as we saw with the Facebook ‘emotional contagion’ study.[15] However, people rarely read Ts&Cs, challenging any notion of ‘informed consent’,[16] and with the need for explicit consent to process data relating to political opinions, health, sex life or philosophical views of subjects, it is hard to argue such form contracts are enough. Indeed, sentiment analysis of Tweets, for example, may often focus on political responses to different topics from different communities. However, even if you could be convinced by a good legal orator that the legal footing for processing is sound, the uses are still questionable ethically. Fear of sanctions and implications from lack of legal compliance, like litigation, will likely foster more change than the aspiration of ethical practice. Sound ethical practices could be viewed as a carrot, and the law as a stick, but increasingly we need both. Attempts to find a complementary approach between the two are growing. A good example is the European Data Protection Supervisor’s recently established ethics advisory group to help design a new set of ‘digital ethics’ that can help foster trust in organisations.[17]

6) Publicly accessible research data like Tweets are often argued to be fairly used in research, as they are broadcast to the online world at large, but this is not correct. As boyd argues,[18] information is often intended only for a specific networked public made up of peers, a support network or specific community, not necessarily the public at large. When it is viewed outside of those parameters it can cause harm. Indeed, as Nissenbaum states, privacy harms are about information flowing out of the context it was intended for.[19] Legally, people do have a reasonable expectation of privacy, even in public spaces (Von Hannover v Germany).[20]

7) As researchers, our position within these contexts is unclear. We are often in an asymmetric position of power with regards to our participants, and we need to adhere to higher standards of accountability and ethics, especially when dealing with vulnerable populations. How we maintain public trust in our work has to reflect this. It becomes a question of who is looking at this data, how and in what capacity. The context of police analysis of open social media is a comparative example (i.e. not interception of private communications but accessing publicly available information on social media, news sites etc). There, the systematic nature of their observation[21] and their position as a state organisation bring questions about the legality, proportionality or necessity of intrusions into private and family life to the fore. The same questions may not be asked about the general public looking at such data. The discussions and challenges around standards of accountability, transparency and, importantly, legitimacy for the police using open social media[22] have parallels with those of researchers.

8) The DPA provides exemptions from certain DP rules for research purposes, although ‘research’ is not well defined.[23] The UK ICO Anonymisation Code of Practice clarifies to an extent,[24] stating research includes “statistical or historical research, but other forms of research, for example market, social, commercial or opinion research”.[25] Importantly, research should not support measures or decisions about specific individuals, nor be used in a way that causes, or is likely to cause, the data subject substantial damage or distress.[26] The ICO affirms the exemptions are only for specific parts of data protection law, namely incompatible purposes, retention periods and subject access rights – i.e. “if personal data is obtained for one purpose and then re-used for research, the research is not an incompatible purpose. Furthermore, the research data can be kept indefinitely. The research data is also exempt from the subject access right, provided, in addition, the results are not made available in a form that identifies any individual.”[27] All other provisions of DP law still apply, unless it is “anonymised data for research, then it is not processing personal data and the DPA does not apply to that research”.[28]

That being said, the ICO has also stated, “anonymisation should not be seen merely as a means of reducing a regulatory burden by taking the processing outside the DPA. It is a means of mitigating the risk of inadvertent disclosure or loss of personal data, and so is a tool that assists big data analytics and helps the organisation to carry on its research or develop its products and services.”[29] Scholars, like Ohm, have also argued against the assumption that anonymisation of data is a good policy tool, because the dominant anonymisation techniques are at risk of easy deanonymisation. [30] Narayanan and Felten have argued similarly from a more technical perspective. [31] Anonymisation is hard because of the risks of linking data between and across databases, ‘singling out’ individuals from data, or inferences on attributes from the values of other attributes.[32]
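As a toy illustration of the linking risk just described (a hypothetical sketch I have added; the names, fields and values are invented and not drawn from the cited papers), a ‘de-identified’ dataset can be re-identified simply by joining it to a public dataset on shared quasi-identifiers:

```python
# Toy illustration of re-identification by linking quasi-identifiers across
# datasets. All names, fields and values are invented for the example.

deidentified_records = [
    {"pseudonym": "a1b2", "postcode_area": "NG7", "dob": "1985-03-12", "topic": "health"},
    {"pseudonym": "c3d4", "postcode_area": "LL57", "dob": "1990-07-01", "topic": "politics"},
]

public_profiles = [
    {"name": "Alice Example", "postcode_area": "NG7", "dob": "1985-03-12"},
    {"name": "Bob Example", "postcode_area": "LL57", "dob": "1990-07-01"},
]

def link(records, profiles):
    """Match 'anonymous' records to named profiles on shared quasi-identifiers."""
    matches = []
    for r in records:
        for p in profiles:
            if (r["postcode_area"], r["dob"]) == (p["postcode_area"], p["dob"]):
                matches.append({"name": p["name"], "pseudonym": r["pseudonym"], "topic": r["topic"]})
    return matches

# Where the quasi-identifier combination is unique, the record is 'singled out'
# and re-identified, defeating the pseudonymisation.
print(link(deidentified_records, public_profiles))
```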

Refs:

[1] ICO Big Data and Data Protection (2014) p6-7 available at https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf

[2] ICO Big Data and Data Protection (2014) p4

[3] ICO Big Data and Data Protection (2014) p5-6

[4] Art 2 Data Protection Directive 1995

[5] XXXX

[6] Article 17, 17a, Art 18

[7] Art 2(a) DPD

[8] Art 8(1) DPD

[9] Article 2(h); Art 7 – unambiguous consent

[10] Opinion 15/2011 p19-20

[11] Opinion 15/2011 p19-20

[12] Opinion 15/ 2011 p11

[13] Opinion 15/ 2011 p11

[14] http://www.theguardian.com/money/2011/may/11/terms-conditions-small-print-big-problems

[15] http://www.theguardian.com/technology/2014/nov/28/social-networks-personal-data-mps

[16] http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

[17] https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/EDPS/PressNews/Press/2016/EDPS-2016-05-EDPS_Ethics_Advisory_Group_EN.pdf

[18] d boyd ‘It’s Complicated: The Social Lives of Networked Teens’ (2014)

[19] H Nissenbaum ‘Privacy in Context’ (2009)

[20] Von Hannover v Germany; Peck v UK

[21] Rotaru v Romania

[22] See L Edwards and L Urquhart “Privacy in Public Spaces” (2016) Forthcoming – http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2702426

[23] Article 33

[24] https://ico.org.uk/media/for-organisations/documents/1061/anonymisation-code.pdf p44 onwards

[25] ICO Anonymisation Code of Practice (2012) p44 – 48

[26] DPA 1998 s33(1)

[27] ICO Big Data and Data Protection (2014) para 84

[28] ICO Big Data and Data Protection (2014) para 86

[29] ICO Big Data and Data Protection (2014) para 46

[30] P Ohm “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” (2010) UCLA Law Review p1704

[31] A Narayanan and E Felten “No Silver Bullet: De-identification still doesn’t work” (2014) http://randomwalker.info/publications/no-silver-bullet-de-identification.pdf where they argue “Data privacy is a hard problem. Data custodians face a choice between roughly three alternatives: sticking with the old habit of de-identification and hoping for the best; turning to emerging technologies like differential privacy that involve some trade-offs in utility and convenience; and using legal agreements to limit the flow and use of sensitive data. These solutions aren’t fully satisfactory, either individually or in combination, nor is any one approach the best in all circumstances.”

[32] Article 29 Working Party “Opinion 05/2014 on Anonymisation Techniques” (2014)

Royal Holloway: Cybersecurity and the Internet of Things Workshop

Recently I was invited to a Royal Holloway workshop on Cybersecurity and the Internet of Things, both as a speaker and then as a panellist.

Despite getting caught up in London train issues (5th May saw huge delays getting anywhere from Waterloo station), I was glad to get there eventually. It was a great line up, not too big, and I met some really nice people. To top it off, the campus is one of the prettiest I’ve visited in the UK in some time… although saying that, I still feel loyalty to the Scottish Ancients…especially Edinburgh’s Old College 😀

It was a rather cross-disciplinary affair for me as a law/HCI researcher, with speakers bringing their own perspectives on IoT (detailed below) and a number of attendees being mathematicians and computer scientists working on cybersecurity, crypto research etc (there were some social scientists too). The talks focused on: business insight into the IoT market and design challenges in that space; governance issues around IoT algorithms; mapping, modelling and analysing IoT security threats; IoT infrastructure security; political and social implications of IoT, with particular focus on hack-spaces and autonomous cars. The great speakers included:

  • Alex Deschamps-Sonsino (IoT expert, designer, consultant and entrepreneur – creator of designswarm and Good Night Lamp – http://designswarm.com/) – Keynote
  • Dominique Guinard (CTO for EVRYTHNG and author of “Building the Web of Things”) – Keynote
  • Josh Schiffman (HP Labs)
  • Andrea Calderaro (Cardiff University- International Relations Lecturer specialising in Internet Governance, Cyber Security, Digital Rights & Freedoms)
  • Declan McDowell-Naylor (Royal Holloway – Politics and International Relations – PhD student – Politics and Ethics of IoT)
  • Benjamin Aziz –  Senior Lecturer in Computer Security at the School of Computing, University of Portsmouth
  • David McCann (Bristol University – PhD student – IoT Security)

In my own talk, I discussed my current research, focusing on mapping the intersection between regulation and HCI, framing the discussion around regulatory aspects of security/privacy issues in the IoT. The talk was followed by a good number of questions, and after this, I was invited to join the panel with Josh Schiffman (HP Labs), David McCann (Bristol) and the chair, Prof Paul Dorey (RHUL/CSO Confidential). I particularly enjoyed this as it was a nice interactive session with a breadth of questions from across the room on various technical and social aspects of cybersecurity and the IoT.

Overall, it was a very friendly and enjoyable event, with many CDT students from both Royal Holloway and Oxford attending. So many thanks again to Royal Holloway’s Nick Robinson who did a great job of organising the event🙂

 

SSN 2016: Power, Performance and Trust


Last week I presented at the 7th Biennial Surveillance Studies Network Conference, Surveillance: Power, Performance and Trust, in sunny Barcelona. This international conference, hosted at the University of Barcelona, is the biggest in the field of surveillance studies, itself a growing multidisciplinary domain.


Over 4 days, as you can imagine, a lot of ground was covered. Many of the papers focused on the social implications of emerging technologies from different epistemic perspectives, although primarily from across the social sciences and humanities. A glance at the abstract booklet gives you a sense of the diversity, featuring perspectives from, between and combining geography, management, business, theology, critical theory, communications studies, STS, criminology, politics, law etc. The topics covered in panels were vast, with themes like border security, state intelligence, mobility, trust politics and surveillance, vigilantism, identity and migration, body cams, health, social media, quantified self, big data, CCTV…the list goes on.


I went a couple of years ago, and also got a scholarship to go this time, which was nice. The first panel I went to, Politics, had a nice talk by Colin Bennett looking at voter surveillance and how this factors into electoral campaigns (having since watched some of House of Cards S4, the Pollyhop story arc captures some of the issues raised in the talk). In the Education panel I saw some nice work going on in Canada through co-design workshops with youths for privacy education, and another project looking at the Canadian educational policy landscape around cyberbullying, as part of the eQuality Project. In the afternoon I enjoyed some talks around policing and communities, particularly David Murakami Wood’s talk about a Tokyo community association’s experience with CCTV. I also really enjoyed the panel on body worn cameras (BWC), a hot topic this year. I found William Webster and Charles Leleux’s work on BWCs used across different public sector organisations in Scotland particularly interesting; check out this panel from CPDP 2016 on BWCs.

Among others, I also enjoyed listening to results from some ethnographic projects on security and policing, one on the Rio World Cup and techno-scientific approaches to spotting security risks. A nice panel on Art and Film unpacked cinematic representations of drones on the Mexico-US border, and another looked at the relationship between data and intimacy in Spike Jonze’s Her. A last highlight was the panel on criminal justice, which included a range of talks: representations of policing through TV observational documentaries (like Cops); identification practices (particularly fingerprints and DNA) of Portuguese suspects (arguidos) and impacts on their identity; the role of ‘super recognisers’ matching suspect faces on CCTV in the UK and Australia (an interesting contrast to smart CCTV approaches); and lastly a reflective talk on field work and ethics with covert policing.

Lastly, in my panel on Art, I presented some of my PhD work on the theoretical challenges of bringing law and HCI closer together in the context of the Internet of Things and Privacy by Design. I was joined by interesting talks on community participation projects in cities, and different art projects engaging with surveillance.

My abstract is here:

Regulation by Design for Ambient Domestic Computing: Lessons from Human Computer Interaction

This paper will look at the role of design in addressing regulatory challenges posed by ambient technologies embedded in our domestic environment. Many terms capture the essence of these technologies from internet of things and ubiquitous computing to ambient intelligence and home automation. Broadly we define these as technologies physically embedded around us that sense and process human data to provide contextually appropriate services.

These systems have varying levels of visibility (physically and psychologically) and autonomy (from minimal to semi autonomous behaviour). They may prompt a direct interaction (eg through an interface or smartphone app) or/and try to understand our human needs by sensing our presence or movements (eg smart thermostats managing our home heating based on movement).

The relationship between the human and ambient computer is one of daily interaction where technology often mediates routines and human experiences in the home. The goal of many of these technologies is to become assimilated into daily life to the extent they become ‘unremarkable’. There is often a complex ecosystem of actors involved in the provision of both devices and services, from the manufacturers developing and managing the systems, to the third party advertisers seeking access to the data.

Increasingly we see policy and law moving towards involving non state actors in the practice of regulation. A key example is regulators looking to designers to enable regulation by design. From nudges to privacy by design we see a recognition of the power of design as a mechanism to address hard regulatory problems and the importance of designers as mediators.

We recognise that the system designers of these new ambient technologies have a responsibility to their users and they act in some capacity as regulators through their ability to define how the human uses and engages with the technology.

Importantly, the technology is not neutral, it is a product of active choices and decisions of system designers (from system architects and programmers to interface and user experience designers).

We are particularly interested in human agency concerns, which are themselves broad. Narrowing down the problem space is problematic but user control over personal data, (dis)trust in the infrastructure and the importance of decision making and choice when interacting with these systems are particular interests. We consider the range of tools available to system designers within the field of ‘human computer interaction’ to address regulatory concerns.

When designing new ambient technologies, HCI practitioners use methods to build situated knowledge of the practices in the social settings that technologies will be built for, from workplaces to homes and public spaces, often by speaking to and observing users of these systems. They do this to make sure the systems, experiences and interactions fit the context of use. These same design tools and knowledge could be repurposed to understand regulatory issues faced by users in context. Accordingly, we reflect on approaches from HCI that help system designers engage with their regulatory role, eg value sensitive design or the Scandinavian school of participatory design.

BILETA 2016: IoT & PbD

This week I’ve been at the 31st Annual Conference of the British and Irish Legal Education and Technology Association. This is the biggest conference of the Information Technology Law community in the UK, and had a great line up this year. The remit of the conference is broad. A Storify feed from the conference gives a sense of this.

I attended a number of interesting talks and panels. The three keynotes looked at legal education in the US (Eric Goldman), the need for an MIT-type model for teaching law, where students learn tech skills (eg data analytics, stats etc) (Dan Katz), and the nature of the right to be forgotten (Lilian Edwards). There were a couple of panels I attended too – one chaired by Google on the Right to Be Forgotten with Harkinder Obhi (a lawyer who worked on the Google Spain case) and Edina Harbinja (the conference chair), with speakers Jeff Ausloos, Paul Bernal, Lilian Edwards and Giancarlo Frosio; another on Surveillance and the IP Bill chaired by Lilian Edwards, with Ross Anderson, Eric King, Graham Smith, Jim Killock and Andrew Cormack; and a session on privacy with talks on how legal and ethical factors are being considered in the cross disciplinary SECINCORE project (C Easton), data portability (A Diker), and cross border data transfer post Schrems (J Rauhofer). The Google PhD workshop was an interesting highlight, following a Privacy Law Scholars type model where 3 PhD papers were reviewed in detail by one expert and then publicly discussed in the workshop, before voting on a winner (Lawrence Diver this time for his great paper ‘The Lawyer in the Machine…’). The War of Words, a Socratic method of debate, was also an interesting, if slightly intimidating experience!

Within my session, there were presentations on the legalities of software agents generating copyrighted works (eg poetry or paintings) by J Zatrain, and another on the regulatory challenges of ad blocking by D Clifford and V Verdoodt.

I presented a paper on my PhD work called Privacy by Design and the Internet of Things: From Rhetoric to Practice Using Information Privacy Cards. This focused in particular on the regulatory challenges of the internet of things, the solution of regulation by design (using the example of privacy by design) and putting forth work I’ve been doing from a legal, theoretical and design perspective. For the latter I discussed the new privacy by design cards we’ve been developing at Horizon and MSR, and particularly, the process for adapting and translating the new General Data Protection Regulation into cards.

The abstract is below for anyone who is interested🙂


Privacy by Design and the Internet of Things: From Rhetoric to Practice using Information Privacy Cards
Lachlan Urquhart, University of Nottingham

This paper discusses a tool that has been developed to help move the principle of data protection by design from theory to practice. Article 23 of the General Data Protection Reform Package mandates data protection by design and default. This, in turn, increases the role of technology designers in regulation.[1]

However, guidance on what that actually requires in practice is sparse. Different technical measures to ensure privacy by default exist, such as anonymisation or encryption. Equally, organisational approaches like privacy impact assessments [2] can be of assistance. However, the regulatory challenges posed by emerging technologies, like internet of things ecosystems,[3] require a more accessible means of holistically bringing information privacy law principles into system design.

By calling on design to be part of regulation, the law is calling upon the system design community: a community that is not ordinarily trained or equipped to deal with regulatory issues. Implementing Article 23 in practice will require far greater engagement with and support of the system design community.

Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law is mandating non-lawyers be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required [4] to translate legal principles from law to design. This is no easy task.

Technical and human centric approaches to engaging with the regulatory challenges of emerging technologies have emerged in the fields of usable privacy and security (eg P3P) [5], privacy engineering [6] or more recently, human data interaction (eg personal data containers). [7]

By looking at the interface between privacy law and human computer interaction we’ve developed a new, practical tool to engage designers: information privacy cards.

Ideation cards [8] have an established lineage in design as a tool to help designers explore and engage with unfamiliar or challenging issues.

They are also sufficiently lightweight and can be deployed in a range of design contexts, for example at different stages within the agile software development process. We have developed a set that draws on European data protection law principles. We have tested different iterations of them with designers and found a number of barriers between the two communities that need to be overcome.[9]

For example, data protection knowledge of system designers (ranging from software architects to user interface specialists) is limited and needs-driven. Meeting DP regulations is also often seen as a limitation on system functionality, and as not really the job of designers.

Our new iteration of the cards translates a range of user rights and designer responsibilities from the whole post trilogue General Data Protection Reform Package. Through workshops with teams of designers in industry and education contexts we are trying to understand the utility of the cards as a privacy by design tool.

In this paper we will discuss our findings so far, seeking feedback from the IT law community. We present a number of issues and lessons from this work on what privacy by design actually means in practice, and the challenges and barriers between the design and legal communities. We situate many of these discussions within the context of the internet of things.

[1] – L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)

[2] – D Wright and P De Hert Privacy Impact Assessment (2012 Springer)

[3] – A29 WP “Opinion 8/2014 on the recent Developments on the Internet of Things” WP 233

[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil).

EU project page and cards are available at designingforprivacy.co.uk

[5] – J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute

[6] – Danezis et al “Privacy and Data Protection by Design– from policy to engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35 (1)

[7] – H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al “Human-Data Interaction: The Human Face of the Data Driven Society” (2014) http://hdiresearch.org/

[8] IDEO https://www.ideo.com/work/method-cards; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark https://dl.acm.org/citation.cfm?id=1858189

[9] E Luger, L Urquhart, T Rodden, M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, S Korea

 

 

Upcoming Conference: Surveillance and Society 2016

Alongside BILETA 2016, I’ve another conference coming up in April: Surveillance and Society 2016. I will be presenting a paper, “Regulation by Design for Ambient Domestic Computing: Lessons from Human Computer Interaction”. I’ll post how it was afterwards, but for now the abstract is below:

This paper will look at the role of design in addressing regulatory challenges posed by ambient technologies embedded in our domestic environment.

Many terms capture the essence of these technologies from internet of things and ubiquitous computing to ambient intelligence and home automation. Broadly we define these as technologies physically embedded around us that sense and process human data to provide contextually appropriate services. These systems have varying levels of visibility (physically and psychologically) and autonomy (from minimal to semi autonomous behaviour). They may prompt a direct interaction (eg through an interface or smartphone app) or/and try to understand our human needs by sensing our presence or movements (eg smart thermostats managing our home heating based on movement).

The relationship between the human and ambient computer is one of daily interaction where technology often mediates routines and human experiences in the home. The goal of many of these technologies is to become assimilated into daily life to the extent they become ‘unremarkable’. There is often a complex ecosystem of actors involved in the provision of both devices and services, from the manufacturers developing and managing the systems, to the third party advertisers seeking access to the data.

Increasingly we see policy and law moving towards involving non state actors in the practice of regulation. A key example is regulators looking to designers to enable regulation by design. From nudges to privacy by design we see a recognition of the power of design as a mechanism to address hard regulatory problems and the importance of designers as mediators.

We recognise that the system designers of these new ambient technologies have a responsibility to their users and they act in some capacity as regulators through their ability to define how the human uses and engages with the technology.

Importantly, the technology is not neutral, it is a product of active choices and decisions of system designers (from system architects and programmers to interface and user experience designers).

We are particularly interested in human agency concerns, which are themselves broad. Narrowing down the problem space is problematic but user control over personal data, (dis)trust in the infrastructure and the importance of decision making and choice when interacting with these systems are particular interests. We consider the range of tools available to system designers within the field of ‘human computer interaction’ to address regulatory concerns.

When designing new ambient technologies, HCI practitioners use methods to build situated knowledge of the practices in the social settings that technologies will be built for, from workplaces to homes and public spaces, often by speaking to and observing users of these systems. They do this to make sure the systems, experiences and interactions fit the context of use. These same design tools and knowledge could be repurposed to understand regulatory issues faced by users in context. Accordingly, we reflect on approaches from HCI that help system designers engage with their regulatory role, eg value sensitive design or the Scandinavian school of participatory design.

A Legal Turn in HCI: Towards Regulation by Design for the Internet of Things

I’ve uploaded a new working paper onto SSRN. This is with Prof Tom Rodden and draws on work from the theoretical part of my PhD. The paper is called “A Legal Turn in HCI: Towards Regulation by Design for the Internet of Things”. There is a lot in this paper so I’m going to pull out some of the key arguments over a series of blog posts – please get in touch with any thoughts 🙂

Here is the abstract:

This discursive paper explores the role of law in HCI through the concept of ‘regulation by design’. Technology designers are increasingly being called upon by law and policy to act in a regulatory capacity, for example in ‘privacy by design’. This is problematic as technology designers are not traditionally involved in regulation and regulators may not fully appreciate what these designers do. We argue that to practically and conceptually achieve ‘regulation by design’ requires greater understanding of and interaction between the regulation and design communities.

This paper consolidates and assimilates work from the fields of human-computer interaction and technology regulation. It is framed within the context of privacy by design and the Internet of Things. It lays out theoretical tools and conceptual frameworks available to each community and explores barriers and commonalities between them, proposing a route forward.

It contends five main points: 1) regulation by design involves prospective, as opposed to just retrospective, application of law; 2) HCI methods need to be repurposed to engage with legal and regulatory aspects of a system; 3) the legal framing of regulation and design is still anchored in systems theory but human computer interaction has a range of rich approaches for understanding the social, and ‘regulation by design’ needs to use these; 4) designers are now regulators and this brings a range of responsibilities; and lastly, 5) design and human values perspectives in HCI need to be extended to legal values and participatory design is a strong candidate for doing this.

Please get in touch with thoughts and feedback. Email address in paper.

 

 

Class Slides: Current Trends in Data Protection Law and Policy

Last month I taught a class on an Advanced Research Methods and Ethics course at Nottingham. It was for first year Horizon PhD students and I was teaching about current issues and trends in data protection law. The class was not made up of lawyers but of students from across a wide spectrum of backgrounds, e.g. psychology, sociology, computer science, business, engineering etc. It was good fun and I got nice feedback too, which was encouraging 🙂


BILETA 2016: Privacy by Design and the Internet of Things

I recently found out my abstract has been reviewed and accepted for BILETA 2016. I’ll be presenting there in April 2016, and will have more details once the programme is finalised. This is a major UK Information Technology Law Conference, hosted at the University of Hertfordshire this year, and it is normally a great event. It will be good to catch up with colleagues too, as the chair is a former LL.M colleague, Edina Harbinja, and one of the keynote speakers is my external PhD supervisor, Professor Lilian Edwards.

In the meantime, here is my abstract.

Privacy by Design and the Internet of Things: From Rhetoric to Practice using Information Privacy Cards

Lachlan Urquhart LL.B, LL.M Doctoral Candidate – Mixed Reality Lab and Horizon, School of Computer Science, University of Nottingham

This paper discusses a tool that has been developed to help move the principle of data protection by design from theory to practice. Article 23 of the General Data Protection Reform Package mandates data protection by design and default. This, in turn, increases the role of technology designers in regulation.[1] However, guidance on what that actually requires in practice is sparse. Different technical measures to ensure privacy by default exist, such as anonymisation or encryption. Equally, organisational approaches like privacy impact assessments[2] can be of assistance. However, the regulatory challenges posed by emerging technologies, like internet of things ecosystems,[3] require a more accessible means of holistically bringing information privacy law principles into system design.

By calling on design to be part of regulation, the law is calling upon the system design community: a community that is not ordinarily trained or equipped to deal with regulatory issues. Implementing Article 23 in practice will require far greater engagement with and support of the system design community.

Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law is mandating non-lawyers be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms. Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required[4] to translate legal principles from law to design. This is no easy task.

Technical and human centric approaches to engaging with the regulatory challenges of emerging technologies have emerged in the fields of usable privacy and security (eg P3P)[5], privacy engineering[6] or more recently, human data interaction (eg personal data containers).[7] By looking at the interface between privacy law and human computer interaction we’ve developed a new, practical tool to engage designers: information privacy cards.

Ideation cards[8] have an established lineage in design as a tool to help designers explore and engage with unfamiliar or challenging issues. They are also sufficiently lightweight and can be deployed in a range of design contexts, for example at different stages within the agile software development process. We have developed a set that draws on European data protection law principles.

We have tested different iterations of them with designers and found a number of barriers between the two communities that need to be overcome.[9] For example, data protection knowledge of system designers (ranging from software architects to user interface specialists) is limited and needs-driven. Meeting DP regulations is also often seen as a limitation on system functionality, and as not really the job of designers. Our new iteration of the cards translates a range of user rights and designer responsibilities from the whole post trilogue General Data Protection Reform Package. Through workshops with teams of designers in industry and education contexts we are trying to understand the utility of the cards as a privacy by design tool.

In this paper we will discuss our findings so far, seeking feedback from the IT law community. We present a number of issues and lessons from this work on what privacy by design actually means in practice, and the challenges and barriers between the design and legal communities. We situate many of these discussions within the context of the internet of things.

[1] L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)

[2] D Wright and P De Hert Privacy Impact Assessment (2012 Springer)

[3] A29 WP “Opinion 8/2014 on the recent Developments on the Internet of Things” WP 233

[4] We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil). EU project page and cards are available at designingforprivacy.co.uk

[5]  J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute

[6] Danezis et al “Privacy and Data Protection by Design – from policy to engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35 (1)

[7] H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al “Human-Data Interaction: The Human Face of the Data Driven Society” (2014) http://hdiresearch.org/

[8] IDEO https://www.ideo.com/work/method-cards; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark https://dl.acm.org/citation.cfm?id=1858189

[9] E Luger, L Urquhart, T Rodden, M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, S Korea
