Consultation response: Online Harms and the Ethics of Data

As part of the UK Government’s consultation on Online Harms and the Ethics of Data, our Policy Director Claire Levens provides insights we have gained from parents, teenagers and academics.

Insights from Living the Future: The Technological Family and Connected Home

We’re delighted to respond to this consultation and will be drawing heavily on our recently published report: ‘Living the Future: The Technological Family and Connected Home’ in the course of this document.  Living the Future was an independent academic work written by Professor Lynne Hall at Sunderland University. It was commissioned and edited by Internet Matters and funded by Huawei.  You can read the report in full here.

The report used a range of methodologies before, during and after lockdown to understand families’ use of home tech and their expectations for the future.  This was enhanced by a literature review and a Delphi study comprising academic and industry experts, online safety specialists and schools, a parent poll, workshops for teenagers, plus in-depth family interviews.  Full details of the methodology mix can be found on page 4 of the report.

We wanted to produce a different type of report – one that focused on the future and asked society to think about the opportunities and risks of connected tech and the challenges families may face regarding data and children’s wellbeing and safety.  Being in the field before, during and after lockdown has provided a window on the accelerated demand for connectivity and the consequences, both of connectivity and of experiencing disconnection.  Lack of access to tech is an issue of social justice – one with which we know the Committee is already engaged.

What parents, carers and professionals tell us

In addition to this report, we’ll also be drawing upon our experience of running focus groups with parents, carers and professionals caring for teenagers with SEND, and on our regular parental polling.  We poll 2,000 parents three times a year to understand their online safety concerns and any experiences of risk or harm encountered by their children.

Response to consultation questions

How is data collected, and is this ethical?

If by ‘ethical’ the Committee means morally good or correct, then our response has to be that data collection from the home is not usually ethical.  This conclusion stems from our research insight that many of the connected devices in the home – from smart speakers and TVs to connected toys – seem in effect to treat the people in the home as data subjects rather than as people.  The cost of seemingly free content is data.  Our data.

In our research 42% of families already had a smart device and 39% had one on a wish list.  Homes are becoming increasingly porous as personal data streams to and from smart devices.  By 2025, Voice Assistants will feel synonymous with the house, customised to those living within it and perhaps even regulating in-house communications.  Yet they are unlikely to become ‘one of the family’: there is a lack of trust, and families are unsure about their Voice Assistant’s collection, retention and use of what they are saying.

Reinforcing this distrust is an underlying sense that privacy is compromised by the lack of transparency and comprehensibility around what happens to this voice data and who manages, accesses, manipulates and benefits from it.  The Terms & Conditions families sign up to could justify a great many, possibly not-yet-known, uses of data.  But if you don’t sign up, you don’t get access.  So whether this is for gaming or for commerce, the price of participation is the provision of personal data.  This, of course, raises the ethical question of what ‘informed consent’ means now, and what it could be if we reimagined it.

The 5Rights Framework, the result of extensive consultation with children led by the redoubtable Baroness Kidron, has as its second right the right to know.  This is the right to: ‘know who and what and why and for what purposes your data is being exchanged.  And a meaningful choice about whether to engage in the exchange.’  This seems to us a useful test of whether data capture is ethical or not.  If this is the standard, we would suggest there is a long way to go.

How is data aggregated, synthesised and/or inferred?

  • for example, how is data collected on people put together?
  • how much do consumers understand about this?

Our remarks are confined to the second part of this question, about consumer understanding.  Lynne Hall found that whilst families may have initial fears about data deployment and privacy, these fears fade as convenience trumps concern.  If convenience is the primary route around data concerns for product producers, then those concerns have to be addressed before the devices are offered to consumers.  Designing in data minimisation and privacy should be standard practice, especially for personal and biometric data.  Passive acceptance should not be a proxy for informed consent.

Internet Matters’ work with parents of children with SEND suggests that they would be willing to provide all sorts of personal information if it meant their teenager would have a more positive experience online (Life online for Children with SEND).  That parents would be comfortable identifying their child to mainstream social media platforms as someone with additional needs indicates that they have little understanding of the value of this data.  This is a concern now, and something we need to find a way to help parents consider, as there may well be unknown future consequences.

This suggests that there is a role for all players here.  First, regulators are inevitably playing catch-up: although the Age Appropriate Design Code is a good start, more needs to be done as more and more smart devices become mainstream.  Second, tech companies need to reflect on why they are collecting the data they currently are and to what end; it may be that regulatory oversight is required to prompt that reflection.  Third, we have to have an inclusive public conversation about what sort of data capture we are content with, what we are concerned about, and how we can genuinely provide informed consent for both adults and children across a range of platforms.

How is data used?

  • to serve up content online?
  • In other, real-life applications (e.g. life insurance; banking; justice system; etc)

There is a paradox here: at a time when ever more content is available, consumers are being served more and more of the same content.  As algorithms perform as programmed to personalise content, the ever-increasing volume of content is funnelled ever more narrowly to make selection easier.  As an example, the report cites data indicating that 70% of YouTube content is watched through auto-recommendation rather than search.  This means there is a real risk that the diversity of content is effectively denied.

Where the responsibility for this lies is an interesting ethical debate, as arguably the responsibility for what we consume, our digital diet, rests with the individual or their carer.  However, the relationship between individuals and the technology in their lives is an asymmetrical one.  Taking on the massed ranks of big tech psychologists simply to turn off a streaming service is not a fair fight.  Moreover, the de facto removal of active choice and participation in our own media selection is a significant and worrying development.  We need a diversity of media inputs to make sense of the world around us and to become digitally literate citizens.  Replacing that active choice with a funnel of homogenous content is neither morally good nor correct, and is therefore not ethical.

Why should people care about this, and what mechanisms exist to make them care?

It’s sometimes helpful to consider an offline analogy.  If an eight-year-old were roaming around a community littering the place with pieces of paper bearing their personal data, most people would rightly be concerned.  Likewise, if a teenager left her contact details in an adult club, highlighting when her parents were out of the house, we would be concerned.  Our concerns would be for their safety, their wellbeing and their privacy.  As offline, so online.  If Paw Patrol is the most-watched programme on a streaming service, and the soundtrack to Glee or Frozen II is regularly requested from a smart speaker, assumptions will be made about the ages of the occupants of that household.  Add to that the insights from the online food shop, and income brackets can be identified.

The question then becomes: if these data points exist about the home and its occupants and are shared elsewhere, does that matter?  One’s answer may depend on one’s philosophical or political perspective.  You could say yes, it matters, because I have no wish to share those details with a company over which I have no control, or even insight into the use to which it puts my data.  You could equally say no, I have nothing to hide and I really love the personalisation that comes with being ‘known’ by brands.  But the fundamental point here is that consumers / subscribers / viewers / voters, call them what you will, should be in a position to make an active choice, and to do that they have to have accurate, transparent and digestible information about what happens to their data.

Perhaps there are no clear-cut yes or no answers here, more a recognition that this conversation has not been and is not being had.  Yet our report suggests that the deployment of smart devices is set to increase, and has been accelerated by lockdown.

How you get people to care about this is extremely tricky.  From our experience of getting parents to engage with the online safety of their children, we know that there are four primary opportunities to attract their attention:

  1. When a new device is bought/brought home
  2. When a child gets their first mobile phone (usually around age 11)
  3. When something goes wrong / something has happened
  4. When a new app, platform or game is requested by a child.

At these points, parents either search online for information or ask schools for help.  It may be that the Committee could use these insights in its report to call for:

  1. More information to be provided at the point of purchase
  2. More protection for minors
  3. Easily available and effective routes of redress
  4. Easily available information on how to prevent data leakage

However, getting parental attention on these issues is hard, expensive and requires dedicated, sustained effort.  We suspect driving public awareness of data collection will be equally challenging, if not more so.  We therefore suggest that, in tandem with any public awareness campaign, companies should be challenged to do more voluntarily, with further regulation a realistic prospect in the event of non-compliance.

Is a Digital Bill of Rights (or similar) useful and practical to enshrine data rights and make them practically applicable to people?

For rights to be practically applicable to people, they have to be relevant, meaningful and respected.  Additionally, people have to know how to complain and seek redress if they think their rights have been infringed.  To make a material difference, any Bill of Rights has to be accompanied by a simple and effective remedy.  These are simple words to write, but they belie a wealth of complexity to make real.

We think there are a few steps to take before the creation of a Bill of Rights:

  1. To understand what it would take to get people to care about this issue
  2. To understand what voluntary options are available from the tech companies
  3. To explore what effective regulation looks like
  4. To understand what a meaningful call to action might be
  5. To understand how a Bill of Rights could contribute to creating a culture change so that data rights are respected and valued.

Consideration must also be given to how people will exercise such rights: what do they mean in practice, and are the people most in need of these enshrined rights the least likely to use them?  Questions arise, too, about what support mechanisms are most useful to those sections of society and who is best placed to create, promote and safeguard them.  In our judgement, a cultural change is required as the outcome of a wider conversation about data usage and ethics.  Internet Matters would be delighted to play our part in that conversation, with parents, professionals and families across the UK.
