Lachlan's Research

From Leeds to Braga to Leiden

The past month has involved a bit more travelling to various conferences and workshops. It started in Leeds at a workshop on electronic monitoring on 6th April. This is part of the Tracking People Seminar Series, with this one focusing on ethical and legal debates. Personal highlights for me were talks by Mike Nellis, discussing both technical and criminological dimensions of electronic tagging, and Michael Nagenborg, giving a philosophical discussion of the ethical aspects of tracking. It was nice to catch up with Michael again, as I’ll be participating in the ICT and the City stream he co-chairs at ETHICOMP 2017 in Turin next month, presenting the paper “Ethical Dimensions of User Centric Regulation” (more on this in due course).

After the Easter break, I went to the main UK IT law event, BILETA 2017, which had gone on a sunny excursion this year to Universidade do Minho, Braga, Portugal. I was presenting a paper written with CDT student Neelima Sailaja and Horizon Director Derek McAuley on the legal, commercial and technical challenges around realising the new EU Right to Data Portability in practice. This included discussion of the importance of personal information management systems, like the Databox project, in realising the right.
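To make the portability idea concrete: Article 20 GDPR entitles data subjects to receive their personal data “in a structured, commonly used and machine-readable format”. As a rough sketch of what that could mean on the controller side (my own illustration in Python, not code from the paper; the field names are hypothetical):

```python
import json

# Hypothetical user record held by a service (field names are illustrative).
user_record = {
    "user_id": "u-1001",
    "name": "Example User",
    "email": "user@example.com",
    "viewing_history": ["prog-1", "prog-2"],
}

def export_portable(record):
    """Serialise personal data to a structured, commonly used and
    machine-readable format (here, JSON), echoing Art 20's wording."""
    return json.dumps(record, indent=2, sort_keys=True)

portable = export_portable(user_record)
parsed = json.loads(portable)  # another controller could re-import it
```

The legal and commercial questions the paper tackles (what counts as “provided” data, interoperability between services, and so on) are of course far harder than the serialisation step itself.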

Hosted at the Escola de Direito, it was two packed days of parallel sessions on fake news, algorithmic governance, post-mortem privacy, IP, living in smart cities, biometric criminal identification, and cybercrime. There were very enjoyable keynotes from Burkhard Schafer (hello again, viva examiner!) on law and algorithms, former Spanish Data Protection Commissioner José Luis Piñar on the GDPR, and Joe Cannataci on his work as UN Special Rapporteur on Privacy. As an aside, the conference dinner had all the Portuguese ingredients of Bacalhau, Vinho Verde and Fado! Next year it will be in slightly colder Aberdeen, but will seek to combine the unusual mix of privacy, haggis and a ceilidh…

After BILETA I flew over to the Netherlands for a week-long interdisciplinary workshop called Privacy by Design Beyond the Screen, hosted at the Lorentz Centre in Leiden. This was an intensive event bringing together invited specialists on Privacy by Design from many backgrounds to discuss all aspects of the concept, from theoretical framings to the practicalities of doing PbD in practice. Kindly organised by Bert-Jaap Koops, Tjerk Timan and Jaap-Henk Hoepman in the Lorentz’s hospitable space (they invite proposals to host and fund workshops if you’re interested), the workshop involved a mix of presentations, group discussions and break-out sessions. I presented my PhD research on the concept of user-centric regulation, and over the course of five days we discussed the pros and cons of many different ways of conceptualising PbD, from more legalistic discussions about which notion of privacy is appropriate, to design frameworks like value sensitive design and requirements engineering.
We also had some interesting discussions about the differences between PbD as a process and as a product, with insights from product design too. As part of group work, we looked in depth at the privacy implications of augmented reality glasses used by police officers attending domestic violence cases. We have a lot of material to sift through, but hopefully a few short papers will emerge in due course!

New Directions in IT law: learning from HCI

Yesterday, my new journal article with Tom Rodden “New Directions in Information Technology Law: Learning from Human-Computer Interaction” was published in the International Review of Law, Computers and Technology. It is part of a special edition on law and algorithms edited by Joseph Savirimuthu from Liverpool University. Other articles in the edition consider accountability in algorithms, algorithmic surveillance, deep learning, and health wearables. The abstract is provided below and there are allegedly 50 free copies available at this link, so help yourselves and snap one up before they all go (around 20 left at last count) 🙂


Abstract: Effectively regulating the domestic Internet of Things (IoT) requires a turn to technology design. However, the role of designers as regulators still needs to be situated. By drawing on a specific domain of technology design, human–computer interaction (HCI), we unpack what an HCI-led approach can offer IT law. By reframing the three prominent design concepts of provenance, affordances and trajectories, we offer new perspectives on the regulatory challenges of the domestic IoT. Our HCI concepts orientate us towards the social context of technology. We argue that novel regulatory strategies can emerge through a better understanding of the relationships and interactions between designers, end users and technology. Accordingly, closer future alignment of IT law and HCI approaches is necessary for effective regulation of emerging technologies.

In other news, there are a couple of working papers up on SSRN looking for feedback if anyone feels so inclined 🙂 One is written with Neelima Sailaja and Derek McAuley on the new GDPR right to data portability, considering the legal, technical and business dimensions of realising the right in practice. There is also a paper on Artcodes and intellectual property law up there too, unpacking the copyright, trademark and design right dimensions (with a focus on Creative Commons licensing too). Last, but by no means least(!), I passed my PhD viva with no corrections (just a few wee typos), examined by Burkhard Schafer (Edinburgh) and Derek McAuley, at the start of March! So I’m feeling very pleased 🙂

New Paper: Ethical Dimensions of User Centric Regulation

A new working paper has been added to the Social Science Research Network called Ethical Dimensions of User Centric Regulation. This paper is set to be presented at CEPE/ETHICOMP 2017 in Turin, Italy later in the year, in the stream ‘ICT and the City’.

We question the ethical role of information technology (IT) designers in IT regulation, unpacking the nature of their responsibilities. We illustrate our argument through the emerging technological setting of smart cities and use our concept of user centric regulation (UCR) to consider what a closer alignment of IT design and regulation could mean in practice.

We situate how IT designers can respond to their ethical and legal duties to end users. Our concept asserts that human computer interaction (HCI) designers are now regulators, but as they are not traditionally involved in the practice of regulation, the nature of their role is ill-defined. We believe designers need support in understanding what their new role entails, particularly in managing ethical dimensions that go beyond law and compliance.

We use conceptual analysis to consolidate perspectives from across Human Computer Interaction, Information Technology Law and Regulation, Computer Ethics, Philosophy of Technology, and beyond. We focus particular attention on the implications of designers mediating users’ interactions with technologies, and consider the distinction between intended and actual use, where regulation needs to accommodate both.


User Centric Regulation for the Domestic Internet of Things

At the end of last week I returned to bonny Edinburgh for a talk at the Law School called “User Centric Regulation for the Domestic Internet of Things”. It was nice to return as an alumnus presenting my PhD research, and I was kindly hosted by the University of Edinburgh’s IT, IP and Media Law Group. The enjoyable event involved detailed discussions on the interplay between designers and lawyers in addressing the regulatory challenges stemming from the internet of things. I presented not just theoretical and legal perspectives but also a range of empirical and design perspectives to situate the role of technologists in regulation. The empirical data draws on interviews, questionnaires, workshops, focus groups and so forth, and this work is in forthcoming journal papers and book chapters, with working versions here, here and here.


User-Centric Regulation for the Domestic Internet of Things (Lachlan Urquhart, University of Nottingham)

Fri 2nd Dec, 2pm

“We are delighted to announce another discussion group event which will take place on Friday 2 December at 2pm in the Neil MacCormick Room (9.01) of David Hume Tower. Our speaker, Lachlan Urquhart, previously studied at Edinburgh and is now a Research Fellow in Information Technology Law at the Horizon Digital Economy Research Institute (University of Nottingham). He will present on the following topic:


Increasingly, technology designers are being called upon to address regulatory challenges posed by emerging technologies. However, their role in regulation is not settled and needs to be situated both conceptually and practically. We present a multidisciplinary response through examining what the field of human computer interaction (HCI) can offer. We do this by presenting a number of conceptual, empirical and design led perspectives from the interface between IT law and HCI. We ground these within the case study of doing information privacy by design for the domestic internet of things. HCI focuses on how users interact with technologies in practice. In designing user experiences, HCI practice draws on a range of approaches and concepts to develop a rich picture of the social context of technology use. By reframing these to consider regulatory and ethical dimensions, we argue the role of technology designers in regulation can be better understood. 

The talk will be followed by a Q&A section and refreshments will be provided. All students and staff are welcome and no registration is required.”


Two Forthcoming Papers

I’ve put working versions of two forthcoming papers up onto Social Science Research Network (SSRN).

The first, with Tom Rodden, is going to be in a special edition of the International Review of Law, Computers and Technology on algorithms and law being edited by Joseph Savirimuthu. It is titled “New Directions in Information Technology Law: Learning from Human Computer Interaction” and can be found here. The abstract is as follows:

Effective regulation of emerging technologies, like the domestic internet of things (IoT) and the underpinning algorithms, requires a range of approaches. In this paper we focus on the use of technology design as a regulatory tool.

Within IT law, there has long been recognition that technology design can be used to shape and regulate individual behaviour (Lessig, 2006; Reidenberg, 1998). In this paper, we assert that regulation, as a concept, has broadened sufficiently that designers are now regulators. Accordingly, we need deeper understanding of their epistemological positions to better situate their role within technology regulation.

We therefore look at a specific domain of design, human computer interaction (HCI), and three prominent concepts from this community. We present these concepts to reframe regulatory dimensions of the domestic IoT, showing what HCI designers can offer as regulators and, more broadly, highlighting channels for conceptual alignment of the HCI and IT law communities.

Understanding how technologies impact rights of users, and how designers can respond effectively, requires a turn to the context of use. The user centric focus of HCI can provide valuable perspectives on designing effective regulatory strategies. Furthermore, we argue current models of technology regulation in IT law do not give sufficient weight to the lived, contextual experiences of how users interact with technologies in situ.

To understand what an HCI led approach can offer IT law and technology regulation, we focus on three prominent concepts: trajectories (Benford et al, 2009), affordances (Norman, 2013) and provenance. We reframe these design concepts within the context of regulation.

The second is going to be a chapter for the book Future Law, being edited by Lilian Edwards, Burkhard Schafer and Edina Harbinja. It is titled “White Noise from the White Goods? Conceptual & Empirical Perspectives on Ambient Domestic Computing” and can be found here. The abstract for this one is:

Within this chapter we consider the emergence of ambient domestic computing systems, both conceptually and empirically. We critically assess visions of post-desktop computing, paying particular attention to one contemporary trend: the internet of things (IoT). We examine the contested nature of this term, looking at the historical trajectory of similar technologies, and the regulatory issues they can pose, particularly in the home. We also look to the emerging regulatory solution of privacy by design, unpacking practical challenges it faces. The novelty of our contribution stems from a turn to practice through a set of empirical perspectives. We present findings that document the practical experiences and viewpoints of leading experts in technology law and design.



Towards User Centric Regulation

It’s been a busy few months since my last update on this blog! I submitted my PhD and I’ve now started a new job too! A couple of weeks ago I moved desks from the MRL over to Horizon to start as a Research Fellow in Information Technology Law. This shiny new role will involve working between IT Law and HCI across a range of topics and projects within Horizon.

Given all the recent activity, the rest of this post is a bit of a round-up of things I didn’t manage to blog about over the last months…hence, at times, it reads a bit like a stream of consciousness!

In the end, my PhD was titled “Towards User Centric Regulation: Exploring the Interface between Information Technology Law and Human Computer Interaction” and was submitted on time on 30 September. I’ve attached the abstract below for anyone interested in the work. I’m doing a talk at the Law School, University of Edinburgh (room/time TBC) on 2nd December called “User Centric Regulation for The Domestic Internet of Things” for anyone who wants to hear more about it.

I’ve also got a couple of articles I’ve been working on that are coming out soon.

The first, co-authored with Tom Rodden, is called “New Directions in Information Technology Law: Learning from Human Computer Interaction” and is coming out in the International Review of Law, Computers and Technology as part of a special edition on Algorithms and the Law (edited by Joseph Savirimuthu). The paper looks at ways to bring conceptual tools from HCI, like provenance, affordances and trajectories, into IT law by reframing these as mechanisms to situate the role of technology designers in regulation.

The second is a chapter I’ve written for a new book called Future Law, edited by Lilian Edwards, Burkhard Schafer and Edina Harbinja. This will feature a range of contributors from the wonderful annual Gikii conference and should prove to be a brilliant edition. My chapter is called “White Noise from the White Goods: Conceptual and Empirical Perspectives on Ambient Domestic Computing“.

I’ll add more in depth blog posts on these papers in due course, including links to the working paper versions on SSRN.

On that note, the paper “A Legal Turn in HCI…” that I posted on SSRN earlier this year was picked up and given a great review in the online journal Jotwell by Daithí Mac Síthigh from Newcastle University. He provides a fantastic summary and some rather encouraging comments on this exploratory, multidisciplinary piece!

Less related to my PhD, and more in fact to my Masters (!), my longstanding paper with Lilian Edwards on the legalities of social media policing and open source intelligence eventually came out in the International Journal of Law and Information Technology – it’s available here 🙂

Lastly (promise), here is the PhD abstract:

Towards User Centric Regulation: Exploring the Interface between Information Technology Law and Human Computer Interaction

This thesis investigates the role of technology designers in regulation. Emerging information technologies are complex to regulate. They require new strategies to support traditional approaches. We focus on the use of technology design as a regulatory tool. Whilst this solution has significant conceptual traction, what it means in practice is not clear. Deeper investigation of the role of the design community in regulation is necessary to move these strategies from theory into practice. We structure our analysis by asking: how can we understand the role of designers in regulation of emerging technologies?

We answer this question from four primary perspectives: conceptual, legal, practical and design. We situate our investigation within the context of the domestic internet of things and information privacy by design. We adopt an overtly multidisciplinary approach, critically assessing how to bring together the human computer interaction and information technology law communities. To do this, we utilise a range of qualitative methods, including case studies, documentary and legal analysis, semi-structured expert interviews, questionnaires, focus groups, workshops, and the development, testing and evaluation of a design tool. Our contributions are as follows:

Conceptually, we provide a critical investigation of the role of technology designers in regulation by consolidating, evaluating and aligning a range of theoretical perspectives from human computer interaction (HCI) and information technology (IT) law. We draw these together through the concept of user centric regulation. This concept advocates a user focused, interaction led approach to position the role of designers in regulation. It draws on the turn to human values and societal issues in HCI, and the increasing reliance in IT law on design for regulation of emerging technologies.

Legally, we present two detailed case studies of emerging technologies (domestic internet of things and smart metering) mapping the emerging legal landscape and challenges therein. We situate the role of designers, as regulators, within this space, and show how they can respond accordingly through their user centric focus.

Practically, we analyse experiences from leading experts in technology design and regulation to understand the challenges of doing information privacy by design (PbD) for the IoT. We present our findings within the framing of technological, business and regulatory perspectives.

Lastly, we present a design tool, ‘information privacy by design cards’, to support designers in doing PbD. This tool has been designed, tested and refined, providing us with a practical approach to doing user centric regulation. Based on our findings from using the cards, we provide the concept of regulatory literacy to clearly conceptualise the role of designers in regulation.


User Centric Regulation; Information Technology Law; Human Computer Interaction; Privacy by Design; Internet of Things; Smart Metering


ESRC Data PSST! Seminar Series

A couple of weeks ago I ventured over to North Wales to Bangor University for the final seminar in the ESRC Data PSST! Seminar series. As a bit of background, the project has been running for the past couple of years, and has seen many different speakers and attendees meeting up for critique and discussion around the themes of surveillance, transparency, non-state actors and political communication.

Being rather late to the series, I was pleasantly surprised to be invited to come along as a speaker and participant for the final event by PI Vian Bakir. I had attended the previous session at Cardiff University after colleague and friend Gilad Rosner suggested I come along. The Cardiff event focused on the role of non-state actors in surveillance and the challenges posed for traditional notions of transparency. (I’ve put my position statements from both events at the bottom).

For this one, I had a few hours driving over to Bangor. It was a dreich Thursday evening from Ashbourne, but there were bonny sea views on the A55 and Gojira’s album L’Enfant Sauvage for company 🙂 Bangor is quite picturesque as it gazes out over nearby Welsh forests, estuaries, cliffs and mountains. The pre-workshop dinner over in Menai Bridge on Anglesey was at a rather charming fish restaurant too.



A very small group (9 or 10 of us) spent the day discussing how best to engage different stakeholders with concerns over transparency, state surveillance and data governance. I particularly enjoyed learning about the concept of translucency from Vian Bakir and Andrew McStay’s work on a typology of transparency. Interesting communication and engagement tools, from provocative short films to art projects, were discussed. An important point raised was that engaging the public and engaging policymakers require different approaches. The former may be more interested in educational or viral material (like the recent Cassette Boy/Privacy International mashup on the IP Bill), whereas the latter may be more responsive to reports, white papers and policy recommendations.

My presentation for this session considered practical approaches to engaging internet of things designers with privacy regulation. The privacy by design cards are a good example, but importantly I looked at the broader shift towards bringing designers into regulation too. Finding the best forums to support designers in their new role is important. Professional bodies like the ACM or IEEE clearly have strong links with their members and can guide on ethics and, to an extent, regulation. Equally, state regulators like the Information Commissioner’s Office have a role in communicating with and supporting designers on their compliance obligations. A particular challenge here is the differing level of resources organisations have to deal with compliance, from startups and SMEs (with little) to multinationals (with more). The nature of support they require will differ, and we need to better understand how compliance plays out in these different organisations.

It was an enjoyable workshop, and thanks again to the organisers for having me along 🙂

I’ve put my position statements from Data PSST! Cardiff (March 2016) and Bangor (May 2016) below.

Seminar 5:

Transparency of Non-State Actors? The Case of Technology Designers and Privacy by Design

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Cardiff (March 2016)


My position on transparency and non-state actors is framed in the context of European Data Protection (DP) Law. A key component of the upcoming EU DP reform package is the concept of data protection by design and default (DPbD). Designing privacy protections into a technology has long been considered best practice, and soon it will be mandated by law. It requires privacy concerns to be considered as early as possible in the design of a new technology, with appropriate measures taken to address them. Such an approach recognises the regulatory power of technology, which mediates the behaviour of users and can instantiate regulatory norms.

Concurrently, regulation, as a concept, has been broadening and moving beyond notions of state centricity and increasingly incorporating actions of non-state actors. I’d argue privacy by design is a context where technology designers, as non-state actors, are now regulators. How they build systems needs to reflect their responsibilities of protecting their users’ rights and personal data, through technical and social safeguards.

However, the nature of their new role is not well defined, leaving open questions about their legitimacy as regulators. They are not normally subject to traditional metrics of good governance like public accountability, responsibility or transparency. Furthermore, the transnational nature of data flows, as we see with cloud computing for example, adds an extra layer of complication. The new DP law will apply to actors outside the EU, e.g. in the US, if they are profiling or targeting products and services to EU citizens, meaning there are national, regional and international dimensions to consider. Overall, the fast pace of technological change, contrasted with the slowness of the law, has pushed designers to be involved in regulation, but without appropriate guidance on how to do so.

This is a practical problem that needs to be addressed. An important component is the role of nation states. State and non-state actors need to complement each other, with the state often ‘steering, not rowing’. The model of less centralised regulation cannot mean dispensing with traditional values of good governance. Instead, state regulators need to support and guide non-state actors on how to act in a regulatory capacity. How can transparency, legitimacy and accountability be reformulated for this new class of ‘regulator’, the technology designer? Much work needs to be done to understand how designers need support as regulators, and how the state can respond to this.

Seminar 6:

Making Privacy by Design a Reality?

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Bangor (May 2016)

We have developed a tool that aims to take the principle of data protection by design from theory into practice. Article 25 of the General Data Protection Regulation (GDPR) mandates data protection by design and default (DPbD). This requires system designers to be more involved in data protection regulation, early on in the innovation process. Whilst this idea makes sense, we need better tools to help designers actually meet their new regulatory obligations. [1]

Guidance on what DPbD actually requires in practice is sparse, although work from usable privacy and security or privacy engineering does provide some direction [5, 6]. These may favour technical measures, like anonymisation or tools to increase user control over personal data [7], or organisational approaches, like privacy impact assessments [2].
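To give one concrete example of such a technical measure (my own illustration, not drawn from the cited works), keyed pseudonymisation replaces direct identifiers with digests, so records for the same person remain linkable within a dataset without exposing the raw identifier:

```python
import hashlib
import hmac

# Illustrative key only; in practice it would be stored separately from the data.
SECRET_KEY = b"example-pseudonymisation-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 digest.
    The same input always maps to the same pseudonym, so records for
    one user stay linkable; reversing it requires the secret key."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

alice_1 = pseudonymise("alice@example.com")
alice_2 = pseudonymise("alice@example.com")
bob = pseudonymise("bob@example.com")
# alice_1 == alice_2, and both differ from bob
```

It is worth stressing that pseudonymised data generally remains personal data under the GDPR; techniques like this reduce identifiability rather than remove it.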

Calling on design to be part of regulation means calling upon the system design community, one that is not ordinarily trained or equipped to deal with regulatory issues. Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law is mandating that non-lawyers be involved in regulatory practices. We argue there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required to translate legal principles from law to design. In our case, we bring together information technology law and human computer interaction. [4]

Our data protection by design cards are an ideation technique that helps designers explore the unfamiliar or challenging issues of EU DP law. [8] Our cards focus on the newly passed GDPR, which comes into effect in 2018. They are designed to be sufficiently lightweight for deployment in a range of design contexts, e.g. connected home ecosystems or smart cars. We have been testing them through workshops with teams of designers in industry and education contexts: we are trying to understand the utility of the cards as a privacy by design tool. [9]

A further challenge for privacy by design goes beyond how to communicate regulatory requirements to communities unfamiliar with the law and policy landscape. Whilst finding mechanisms, like our cards, for delivering complex content in more accessible ways is one issue, finding the best forums for engagement with these concepts is another. Two examples could be the role of state regulators and industry/professional associations. State regulatory bodies, like the UK ICO or the EU Article 29 Working Party, have a role to play in broadcasting compliance material and supporting technology designers’ understanding of law and regulation. The needs of each business will vary, and support has to adapt accordingly. One example is the size and resources a business has at its disposal: these will likely dictate how much support it needs to understand regulatory requirements, e.g. an under-resourced small or medium-sized enterprise vs. a multinational with in-house legal services.

Industry and professional associations, like the British Computer Society, the Association for Computing Machinery or the Institute of Electrical and Electronics Engineers, may be suitable forums for raising awareness with members about the importance of regulation too. Sharing best practice is a key element of this, and these organisations are in a good position to feed their experience into codes of conduct, like those envisaged by Art 40 GDPR.

[1] – L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)

[2] – D Wright and P De Hert Privacy Impact Assessment (2012 Springer)

[3] – A29 WP “Opinion 8/2014 on the recent Developments on the Internet of Things” WP 233

[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California, Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil).

EU project page and cards are available at

[5] – J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute

[6] – Danezis et al “Privacy and Data Protection by Design – from policy to engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35(1)

[7] – H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al “Human-Data Interaction: The Human Face of the Data Driven Society” (2014)

[8] – IDEO; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark

[9] – E Luger, L Urquhart, T Rodden, M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, S Korea


Some thoughts on Ethics and Big Social Media Research

This week there have been a few stories in the tech news around social media research ethics. These range from the controversial Kirkegaard and Bjerrekær case, involving data scraping from OKCupid and subsequent public release, to new UK Cabinet Office guidelines on data science ethics and a new report from the Council for Big Data, Ethics, and Society. This post is not a commentary on these stories; instead, they prompted me to share some notes on the topic that have been lurking on my hard drive for a wee while. They are not particularly polished or well structured currently; however, hopefully there are a few useful nuggets in here, and post-PhD it would be nice to turn them into something more formal. Anyway, here we go for now, but be warned… there is law 🙂


1. There is a need to balance the end goals of data driven research projects that aim at fostering some notion of the ‘greater good’ against the regulatory context. Utilitarian research goals do not preclude studies from legal compliance requirements. This is especially so for privacy and data protection considerations, as these are fundamental, inalienable human rights which often enable other human values like dignity, autonomy, or identity formation. When dealing with vulnerable populations, the need to respect these elements heightens. Broadly, virtuous research goals do not override legal safeguards.

However, this becomes problematic when handling big social media data for research because of significant technical and regulatory issues for researchers. The ‘velocity, variety and volume’[1] of big data is a challenge computationally. These large datasets often involve personal data, which brings the EU data protection regime to the fore. The UK ICO has many concerns around the use of big data analytics with personal data, yet it states ‘big data is not a game that is played by different rules’ and that the existing DP rules are fit for purpose.[2] It is particularly concerned about issues such as: ensuring sufficient transparency and openness with data subjects about how their data is used; reflecting on when data re-purposing is or is not compatible with the original purposes of collection (e.g. data collected for one purpose being reused for another); the importance of privacy impact assessments; how big data challenges the principle of data minimisation; and preserving data subjects’ access rights.[3]

2) Researchers, as data controllers in their research projects (i.e. those determining the purposes and means of data processing),[4] have a range of responsibilities under data protection rules. They need to ensure security, proportionate collection and legal grounds for processing, to name a few. Data subjects have many rights in data protection law, from knowing what data is collected, why, by whom and for what purposes, to objecting to processing on certain grounds. From a data subject’s perspective, they may not even know they are part of a big social media dataset, making it very hard for them to protect their own rights (e.g. with data scraped from tweets around a particular topic). Furthermore, data subject rights are set to be extended in the new General Data Protection Regulation.[5] Subjects will be able to restrict data processing in certain circumstances, receive their data in a portable format they can move, and even exercise a right to erasure.[6] Researchers need to reflect on the range of subject rights and controller responsibilities in the GDPR, and consider how to protect the rights of data subjects whose personal data sit within their big social media datasets. A particular challenge is obtaining user consent. The argument that these datasets are too large to reasonably obtain the consent of every ‘participant’ is not sustainable (we return to this below).

3) To understand why the distinction between personal and sensitive personal data is important, we need to unpack the nature of consent in data protection law. In general, unambiguous user consent is not the only legal ground for processing personal data (others include, e.g., the legitimate interests of the controller), but for handling sensitive personal data, explicit consent is required. To clarify, personal data is “any information relating to an identified or identifiable natural person (‘data subject’)”,[7] whereas sensitive personal data is information about “racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life”.[8] Legally speaking, consent is a “freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed”.[9] The ‘informed’ requirement means that clearly visible, accessible, jargon-free and easily understandable information must be provided directly to subjects before the point of consent.[10] Similarly, for the ‘unambiguous’ element, “the indication by which the data subject signifies his agreement must leave no room for ambiguity regarding his/her intent”.[11]

With sensitive personal data, the consent needs to be ‘explicit’, but what this requires is not defined in the law. Importantly, for both types of consent, the EU Article 29 Working Party (an advisory institution for DP law) argues that how an indication of agreement is made is not limited to writing, but can be any indication of wishes: “it could include a handwritten signature affixed at the bottom of a paper form, but also oral statements to signify agreement, or a behaviour from which consent can be reasonably concluded”.[12] Importantly, passive behaviour is not enough; action is necessary.[13]

4) From the perspective of social media research, this leaves the mechanisms that can be used for obtaining explicit consent quite open, within these required parameters. However, more innovative approaches to communicating with and informing end users are required. One proposal for studies using Twitter might be to send direct messages to the Twitter handles of those whose data is captured, explicitly explaining that their data is being used in research and asking permission. Privacy-preserving approaches, like removing Twitter handles at the point of data collection, or at least pseudonymising them, might be another route. However, even these might not be compliant, because Twitter handles are themselves likely personal data, and handling them, even prior to providing information about the nature of processing to end users or prior to anonymisation/pseudonymisation, would still be subject to DP rules. Whilst these are difficult challenges to address, they need to be considered.
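To make the pseudonymisation option concrete, here is a minimal Python sketch (the key, function name and record format are all hypothetical, not taken from any real study): handles are replaced with keyed hashes at the point of collection, so the raw handle is never stored in the research dataset. Because the controller retains the key and could re-link pseudonyms, the data would legally remain pseudonymised personal data rather than anonymised data, so DP rules would still apply.

```python
import hmac
import hashlib

# Hypothetical secret key: in practice this would be generated randomly and
# held separately from the dataset (e.g. in a key store), so only the research
# team can re-link a pseudonym, e.g. if a subject exercises their rights.
SECRET_KEY = b"replace-with-a-securely-stored-random-key"

def pseudonymise_handle(handle: str) -> str:
    """Replace a Twitter handle with a keyed-hash pseudonym at collection time."""
    digest = hmac.new(SECRET_KEY, handle.lower().encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# At the point of collection, store the pseudonym rather than the handle.
record = {"handle": pseudonymise_handle("@example_user"), "text": "some tweet text"}
```

A keyed hash (HMAC) is used rather than a plain hash because the space of Twitter handles is small enough that unkeyed hashes could be reversed by brute force.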

5) Beyond compliance, we also need to consider the distinction between law and ethics. Form contracts, e.g. privacy policies, end user licence agreements or terms of service, are a dominant approach for obtaining end user consent. No one reads these, and they could not change them even if they did.[14] Whilst many of these contract terms, for example jurisdiction clauses, are challengeable as unfair under consumer protection laws, this requires testing in a dispute, which costs money, time and resources that many consumers lack. Beyond this, the use of contracts highlights a distinction between law and ethics.

Organisations will argue that contracts including clauses allowing for research provide legal compliance, and research based on such contracts may be ‘legal’. Whether it is ethical is another question, as we saw with the Facebook ‘emotional contagion’ study.[15] People rarely read Ts&Cs, challenging any notion of ‘informed consent’,[16] and given the need for explicit consent to process data relating to the political opinions, health, sex life or philosophical views of subjects, it is hard to argue such form contracts are enough. Indeed, sentiment analysis of tweets, for example, may often focus on the political responses of different communities to different topics. Yet even if a good legal orator could convince you that the legal footing for processing is sound, the uses remain ethically questionable. Fear of sanctions and the implications of non-compliance, like litigation, will likely foster more change than the aspiration of ethical practice. Sound ethical practice could be viewed as a carrot and the law as a stick, but increasingly we need both. Attempts to find a complementary approach between the two are growing: a good example is the European Data Protection Supervisor’s recently established ethics advisory group, which will help design a new set of ‘digital ethics’ that can foster trust in organisations.[17]

6) Publicly accessible research data like tweets are often argued to be fair game for research, as they are broadcast to the online world at large, but this is not correct. As boyd argues,[18] information is often intended only for a specific networked public made up of peers, a support network or a specific community, not necessarily the public at large. When it is viewed outside of those parameters it can cause harm. Indeed, as Nissenbaum states, privacy harms are about information flowing out of the context it was intended for.[19] Legally, people do have a reasonable expectation of privacy, even in public spaces (Von Hannover v Germany).[20]

7) As researchers, our position within these contexts is unclear. We are often in an asymmetric position of power with regard to our participants, and we need to adhere to higher standards of accountability and ethics, especially when dealing with vulnerable populations. How we maintain public trust in our work has to reflect this. It becomes a question of who is looking at this data, how, and in what capacity. Police analysis of open social media is a comparative example (i.e. not interception of private communications, but accessing publicly available information on social media, news sites, etc.). There, the systematic nature of their observation[21] and their position as a state organisation bring questions about the legality, proportionality and necessity of intrusions into private and family life to the fore. The same questions may not be asked of the general public looking at such data. The discussions and challenges around standards of accountability, transparency and, importantly, legitimacy for police use of open social media[22] have parallels with those facing researchers.

8) The DPA provides exemptions from certain DP rules for research purposes, although ‘research’ is not well defined.[23] The UK ICO Anonymisation Code of Practice clarifies to an extent,[24] stating that research includes not only “statistical or historical research, but other forms of research, for example market, social, commercial or opinion research”.[25] Importantly, research should not support measures or decisions about specific individuals, nor be used in a way that causes, or is likely to cause, the data subject substantial damage or distress.[26] The ICO affirms the exemptions cover only specific parts of data protection law, namely incompatible purposes, retention periods and subject access rights: “if personal data is obtained for one purpose and then re-used for research, the research is not an incompatible purpose. Furthermore, the research data can be kept indefinitely. The research data is also exempt from the subject access right, provided, in addition, the results are not made available in a form that identifies any individual.”[27] All other provisions of DP law still apply, unless the research uses “anonymised data for research, then it is not processing personal data and the DPA does not apply to that research”.[28]

That being said, the ICO has also stated: “anonymisation should not be seen merely as a means of reducing a regulatory burden by taking the processing outside the DPA. It is a means of mitigating the risk of inadvertent disclosure or loss of personal data, and so is a tool that assists big data analytics and helps the organisation to carry on its research or develop its products and services.”[29] Scholars like Ohm have also argued against the assumption that anonymisation of data is a good policy tool, because the dominant anonymisation techniques are at risk of easy deanonymisation.[30] Narayanan and Felten have argued similarly from a more technical perspective.[31] Anonymisation is hard because of the risks of linking data between and across databases, ‘singling out’ individuals from data, and inferring attributes from the values of other attributes.[32]
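The ‘singling out’ risk can be illustrated with a toy k-anonymity check (the records and function below are invented purely for illustration): if any combination of quasi-identifiers maps to a group of size one, that record can be singled out, and potentially linked back to an individual, despite the absence of names or handles.

```python
from collections import Counter

# Toy, invented records: each row holds quasi-identifiers that could be linked
# with other datasets even after direct identifiers have been stripped out.
records = [
    {"city": "Leeds", "age_band": "20-29", "interest": "politics"},
    {"city": "Leeds", "age_band": "20-29", "interest": "sport"},
    {"city": "Braga", "age_band": "30-39", "interest": "politics"},
]

def k_anonymity(rows, quasi_ids):
    """Return the size of the smallest group sharing the same quasi-identifier
    values; k == 1 means at least one individual can be singled out."""
    groups = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return min(groups.values())

# The third record is unique on (city, age_band), so that person is singled out.
k = k_anonymity(records, ["city", "age_band"])
```

The point of the sketch is simply that anonymisation cannot be judged field by field: it is combinations of otherwise innocuous attributes that single people out.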


[1] ICO Big Data and Data Protection (2014) p6-7 available at

[2] ICO Big Data and Data Protection (2014) p4

[3] ICO Big Data and Data Protection (2014) p5-6

[4] Art 2 Data Protection Directive 1995

[5] XXXX

[6] Arts 17, 17a and 18 GDPR

[7] Art 2(a) DPD

[8] Art 8(1) DPD

[9] Art 2(h) DPD; Art 7 DPD – unambiguous consent

[10] Opinion 15/2011 p19-20

[11] Opinion 15/2011 p19-20

[12] Opinion 15/2011 p11

[13] Opinion 15/2011 p11





[18] d boyd ‘It’s Complicated: The Social Lives of Networked Teens’ (2014)

[19] H Nissenbaum ‘Privacy in Context’ (2009)

[20] Von Hannover v Germany; Peck v UK

[21] Rotaru v Romania

[22] See L Edwards and L Urquhart ”Privacy in Public Spaces” (2016) Forthcoming –

[23] DPA 1998 s33

[24] p44 onwards

[25] ICO Anonymisation Code of Practice (2012) p44 – 48

[26] DPA 1998 s33(1)

[27] ICO Big Data and Data Protection (2014) para 84

[28] ICO Big Data and Data Protection (2014) para 86

[29] ICO Big Data and Data Protection (2014) para 46

[30] P Ohm “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” (2010) UCLA Law Review p1704

[31] A Narayanan and E Felten “No Silver Bullet: De-identification still doesn’t work” (2014) where they argue “Data privacy is a hard problem. Data custodians face a choice between roughly three alternatives: sticking with the old habit of de-identification and hoping for the best; turning to emerging technologies like differential privacy that involve some trade-offs in utility and convenience; and using legal agreements to limit the flow and use of sensitive data. These solutions aren’t fully satisfactory, either individually or in combination, nor is any one approach the best in all circumstances.”

[32] Article 29 Working Party “Opinion 05/2014 on Anonymisation Techniques” (2014)

Royal Holloway: Cybersecurity and the Internet of Things Workshop

Recently I was invited to a Royal Holloway workshop on Cybersecurity and the Internet of Things, both as a speaker and then as a panellist.

Despite getting caught up in London train issues (5th May saw huge delays getting anywhere from Waterloo station), I was glad to get there eventually. It was a great line-up, not too big, and I met some really nice people. To top it off, the campus is one of the prettiest I’ve visited in the UK in some time…although, saying that, I still feel loyalty to the Scottish Ancients, especially Edinburgh’s Old College 😀

It was a rather cross-disciplinary affair for me as a law/HCI researcher, with speakers bringing their own perspectives on IoT (detailed below) and a number of attendees being mathematicians and computer scientists working on cybersecurity, crypto research, etc. (there were some social scientists too). The talks focused on: business insight into the IoT market and design challenges in that space; governance issues around IoT algorithms; mapping, modelling and analysing IoT security threats; IoT infrastructure security; and the political and social implications of IoT, with particular focus on hack-spaces and autonomous cars. The great speakers included:

  • Alex Deschamps-Sonsino (IoT expert, designer, consultant and entrepreneur; creator of designswarm and Good Night Lamp) – Keynote
  • Dominique Guinard (CTO for EVRYTHNG and author of “Building the Web of Things”) – Keynote
  • Josh Schiffman (HP Labs)
  • Andrea Calderaro (Cardiff University- International Relations Lecturer specialising in Internet Governance, Cyber Security, Digital Rights & Freedoms)
  • Declan McDowell-Naylor (Royal Holloway – Politics and International Relations – PhD student – Politics and Ethics of IoT)
  • Benjamin Aziz (University of Portsmouth – Senior Lecturer in Computer Security, School of Computing)
  • David McCann (Bristol University – PhD student – IoT Security)

In my own talk, I discussed my current research, focusing on mapping the intersection between regulation and HCI, and framing the discussion around regulatory aspects of security and privacy issues in the IoT. The talk was followed by a good number of questions, after which I was invited to join the panel with Josh Schiffman (HP Labs), David McCann (Bristol) and the chair, Prof Paul Dorey (RHUL/CSO Confidential). I particularly enjoyed this, as it was a nice interactive session with a breadth of questions from across the room on various technical and social aspects of cybersecurity and the IoT.

Overall, it was a very friendly and enjoyable event, with many CDT students from both Royal Holloway and Oxford attending. So many thanks again to Royal Holloway’s Nick Robinson who did a great job of organising the event 🙂
