Looking at a ‘least access’ approach, from both a data protection and safeguarding point of view

Supporting the EdTech Community with By Design and By Default principles

"Safety on Death Star" by Kalexanderson is marked with CC BY-NC-ND 2.0.

It was so good to be back at Bett a few weeks ago, and there is a raft of really good reports, talks and narratives about what people found, were inspired by or even scared by. My regular tweets and retweets covered some of these, so I won’t try to list them all here.

Instead, I’d like to look at something raised by Abid Patel during his fantastic session on how schools can take action to protect themselves against cyber attacks, where he talked about the principle of least access. To many of us, this is nothing new, but sometimes it is hard to conceptualise and tie to real-world issues. It just so happens that I have been discussing this as part of a wider issue that also applies to data protection and safeguarding, never mind cyber security.

After discussing their use of Google Workspace for Education with a school I come into contact with, I reached out to both the Google education team and a number of Google Certified Trainers/admins. You see, I had spotted that it would be all too easy to leave school email accounts open to use with 3rd party services. Well, what is the problem with that? Surely it means staff and students can stick to one account when they log in to things? Well, it doesn’t quite work like that. Let’s take a step back and think about how we log into things.

Generally, when we sign up for something using our email address, we simply use that as a key piece of information (as the username, generally) to create an account, say with something like the BBC. They only use the details we provide for our name, age and so on. The password we create is only used for the BBC and nothing else. Presuming we have not reused that username and password elsewhere, if anything happens to it then there is not too much damage. If we need to remember lots of passwords for that username/email address, there are some really good password managers out there. From a cybersecurity point of view, it means that anyone with access to that username and that unique password can only affect a single service, for example, access to the BBC.

From a data protection point of view, it also means that if this account is compromised, then it doesn’t affect data in other services you may access (unless there is something in that compromised data that gives you information to access the other accounts). In the same way, from a safeguarding point of view, the less access someone has to a compromised account, the less damage can occur to individuals and the greater protection there is from misuse, abuse or other issues affecting the safeguarding of children and young people.

Risks

But what does this have to do with Google? Well, there is often the complaint that Google sell on data to 3rd parties. Let me just explain what that means from a data protection position. The school is usually the decision maker in why any personal data needs to be used, how it is used, how long it is kept for, etc. It may be that this choice is established by them choosing a particular vendor to do the work on their behalf, and this makes the vendor a data processor and the school the data controller. With BigTech firms, this can be hard to manage, especially when there are lots of linked services and other vendors being managed at the same time. They all struggle with it to some extent, and being transparent is a key element of helping schools deal with this. This was why I was so happy to have the Google education team and various Google experts contribute to the discussion. However, a 3rd party is a very specific term under data protection legislation, and it differs from that in contract law. A 3rd party is a separate data controller who uses personal data for their own purposes. Yes, this is heavily paraphrased from the ICO materials, but it should give the right idea. An example would be a school sharing personal data with the DfE for uses decided by the DfE. The school is clear about this in their privacy notice, and families are able to contact the DfE if they need to as a result.

So what happens when there are other ways to log in to different sites and systems? We hear a lot about single sign-on (SSO) in schools, and for most, this will mean using one account to log in to lots of different systems controlled by the school. Generally, this will mean that the school is using an account from one EdTech vendor (e.g. Google) to log into another EdTech vendor that the school has an established relationship with. This is not Google sharing data with a 3rd party, but the school has given an instruction for Google to work with another one of the school’s data processors. All good so far. But this raises the issue of a compromised Google account being used to access lots of things that the school uses. This is why Abid, and many others during Bett, talked about 2 factor authentication (2FA) or multi-factor authentication (MFA) as a way of increasing security. And it works. It truly does and makes life for all involved a lot safer.

But there is a major caveat, and one that more schools need to look into when using SSO from *any* provider, not just Google. I talk about Google here, because I was provided with a lot of helpful information but the principle applies to many such services.

You will come across many places on the internet that provide options to sign in with different services. Google, Microsoft, Apple, Facebook, OpenID and many more. This makes it easy for you to keep things simple. The problems arise for schools when the site a particular user is signing up for is not one that has an agreement with the school. That site then has access to a chunk of personal data from one of your accounts that it has no right to. The user account is the responsibility of the school to protect, as the school is the data controller. The data has now just been given to a 3rd party. OK, but this is just a bit of personal data and nothing serious, surely? Well, that site now knows the name of the individual, their email address, the school, possibly other groups they may be in (such as age, sex, interests) and, from a safeguarding point of view, the individual may think that the site they are on is safe and protected as they used a school account to access it.
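To make that chunk of personal data concrete, here is a minimal sketch of the sort of profile claims a ‘Sign in with Google’ flow can hand to a relying site via an OpenID Connect ID token. The claim names (sub, email, name, hd) are standard Google ID token claims, but every value below is invented for illustration:

```python
# Invented sample of the claims a site might receive when a pupil
# uses a school Google account to sign in. "hd" is the hosted
# (school) domain claim, which tells the site which organisation
# the account belongs to.
sample_id_token_claims = {
    "sub": "110169484474386276334",        # stable Google account identifier
    "email": "j.smith@example-school.sch.uk",
    "email_verified": True,
    "name": "Jamie Smith",
    "hd": "example-school.sch.uk",         # reveals the school itself
}

def data_shared_with_site(claims):
    """Pick out the directly personal data a relying site learns."""
    return {k: claims[k] for k in ("name", "email", "hd") if k in claims}

print(data_shared_with_site(sample_id_token_claims))
```

Even this small set of claims identifies the individual, their school and a contactable address, which is exactly the data that ends up with a 3rd party the school has no agreement with.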

The problem is that using such SSO is commonplace on many sites across the internet. For chat sites, auction sites, blog sites, social networks, payment systems, media and file storage services, dating sites, image sharing, and more.

Managing Risks

So what can be done about it? Let’s take this from a Google point of view first and foremost. As Google Workspaces are set up for schools and trusts, there are simple checks that can be made to ensure that only authorised sites can use the SSO capabilities. Because of the international nature of Google, some of the admin console may not be labelled as helpfully as we would like but please take into account how the term ‘3rd party’ gets misused and that sometimes they truly do mean it as a data processor/sub-processor instead!

When it comes to using the school Google account to sign into something new you have to consider what it is, what the purpose is, what data is being used, and so on. These could be bits of the core services you are turning on, additional services from Google, things from the marketplace, verified 3rd parties or, dare we say it, unverified 3rd parties.

What is an unverified 3rd party? Let’s look at verified 3rd parties first. “A verified app is a third-party app that’s been reviewed by Google to ensure compliance with security and privacy requirements. Third-party apps that haven’t been verified by Google might be subject to restrictions. Note that many well-known apps might not be listed as verified.” (https://support.google.com/a/answer/9987046, 11th October 2021)

That doesn’t mean verified 3rd parties are good or unverified are bad, but it gives you an idea of whether they have done anything to meet some basic standards set out by Google.

The key term in all this is ‘3rd party’. This means they are not you, nor are they Google. This still sounds like a good thing though, as it means being able to sign into Zoom, Loom and a bunch of other sites. However, they may not be a ‘3rd party’ in the data protection sense. You might have a contract or agreement with them, and they may be a data processor, doing something you want them to do. 

The downside is that, without any controls, you can find the school accounts being used to sign in to social media, gaming sites, chat and dating sites, or indeed anything else that says ‘Sign in with Google’. Not only is this a concern for data protection and privacy, it also introduces safeguarding risks. If children and young people can use a school account to sign into other areas, that puts an onus on schools to help manage it. And, as already mentioned, children and young people may believe the site is safe, may think they are protected because they are using a school account and also may share far more details with others than they normally would as a result.

Controls

As I mentioned before though, there are tools to help deal with this and you need to know where they are and what to do with them. In discussions with Google education staff and noted Google Certified Trainer, Abid Patel, we were able to see the difficulties that could arise and approaches that schools could take. Again, these are areas that the Education team have pointed people at before, but perhaps more complete context has been needed.

First, let’s look at some articles that have already been released.

Back in November 2019, App Access Control was launched to help admins get a better grip on how apps were using the identity tools, and this was followed up with more information in July 2020 (https://workspaceupdates.googleblog.com/2020/07/block-apps-access-control-google-data.html). Additional information was provided in April 2021 (https://workspaceupdates.googleblog.com/2021/04/restrict-third-party-api-access-to-sensitive-data-with-new-admin-setting.html). When we look at this, we see that schools need to consider 3 things very carefully.

  • They need to consider how they are going to make sure that any planned integration would involve 3rd party app controls.
  • They need to consider how specific 3rd party apps could be pre-authorised and available, and what any request/contact process would be for other 3rd party apps.
  • They need to consider how to communicate all this in a meaningful manner to staff, children, their families and EdTech partners/vendors.

All very manageable areas and it doesn’t take a lot to work into the existing guidance.

Planning integrations

Having done complex Change Management, extensive Data Protection Impact Assessments and detailed Project Management scenarios, I know it is all too easy to overthink it, create burdensome red tape and stifle the adoption of seriously helpful learning tools or resources for school management.

The basic thing to do is make sure that you know who you are going to be working with, have a clear idea why (and who makes the decisions), know what personal data will be involved, where it will be held or moved about, and be sure that you know how to technically and organisationally keep control. 

The most detailed article on the technical side has to be this one, (https://support.google.com/a/topic/10021546?hl=en&ref_topic=7556686). We need to check what the integrations options are and how we then make sure that this is configured within any access control. 

Reviewing and managing 3rd parties

More schools will be playing catch up though, and will need to review where Google accounts have already been used. This is also something to chat about with your Data Protection Officer, as they will be pleased to be able to see what apps have been used and log that actions are being taken to manage them. By reviewing the list of 3rd party apps that have been used against the APIs, we can then investigate each one as required.

You can make use of the token audit log to get an idea of which apps your users have signed into over the last 180 days (https://support.google.com/a/answer/6124308?hl=en). This is extremely useful to help identify the impact if you were to block all 3rd party API access to your users. From here, you can gather which stakeholders may need to be contacted and ensure that they are included within any risk assessment/DPIA. 
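Once you have that token audit data out of the admin console, a simple script can summarise which apps have actually been used and how often. This is a hedged sketch only: the record shape below is modelled on the activity records the Admin SDK Reports API returns for the ‘token’ application, but the sample data and values are invented:

```python
from collections import Counter

# Invented sample records, shaped like token audit activity entries:
# each has an actor (the user) and authorize events carrying a
# client_id and app_name parameter.
sample_activities = [
    {"actor": {"email": "pupil1@example-school.sch.uk"},
     "events": [{"name": "authorize",
                 "parameters": [
                     {"name": "client_id", "value": "1234.apps.googleusercontent.com"},
                     {"name": "app_name", "value": "SomeQuizSite"}]}]},
    {"actor": {"email": "pupil2@example-school.sch.uk"},
     "events": [{"name": "authorize",
                 "parameters": [
                     {"name": "client_id", "value": "1234.apps.googleusercontent.com"},
                     {"name": "app_name", "value": "SomeQuizSite"}]}]},
]

def app_usage(activities):
    """Count sign-in (authorize) events per app, keyed by client_id."""
    counts = Counter()
    for activity in activities:
        for event in activity.get("events", []):
            params = {p["name"]: p.get("value") for p in event.get("parameters", [])}
            if "client_id" in params:
                counts[(params["client_id"], params.get("app_name", "unknown"))] += 1
    return counts

print(app_usage(sample_activities))
```

A summary like this gives you a quick sense of which apps would be affected if you blocked all 3rd party API access, and which stakeholders to contact before you do.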

Communication and transparency

We also need to remember that communication and transparency remain key ways in which you can protect and educate your users. The ability to present users with a customisable message, should they try to use the school account with a 3rd party site, gives you a range of options. 

If you have yet to turn on an allowed list of 3rd party apps, you will be able to see the details of the Client_ID in the list of accessed apps. You can make notes on all those you wish to keep prior to turning on the ‘allowed list’.
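As a quick sanity check before you flip the switch, it is worth comparing the client IDs you have seen in use against the ones you intend to approve, so you know exactly what would start being blocked. A small sketch, with entirely made-up client IDs:

```python
# Client IDs observed in the accessed-apps report (invented examples).
seen_client_ids = {
    "1234.apps.googleusercontent.com",   # e.g. a video tool the school uses
    "5678.apps.googleusercontent.com",   # an unrecognised quiz site
}

# Client IDs you plan to put on the 'allowed list'.
approved_client_ids = {"1234.apps.googleusercontent.com"}

# Anything seen but not approved will be blocked once the list is on.
would_be_blocked = seen_client_ids - approved_client_ids
print(sorted(would_be_blocked))
```

Anything left in that difference is either a site to add to the allowed list, or a conversation to have with staff and students about why it is going away.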

Once you have turned on the ‘allowed list’, you might want to customise the provided messages so that you point them towards policies or messages about safeguarding and security. You might want to direct them to a Google Form where they can request that the site is added to the list of allowed 3rd party sites or you might simply want the block message to be in a more friendly format, such as the example below. Please remember that any URL in the text will not work as an active link so you may need to explain the need to Copy and Paste.

One of the useful things that this message can provide is additional technical information in the collapsed ‘Request Details’ section. This will include the client ID, so it is always helpful to check with a ‘test’ account if you want more details.

tl;dr

You need to check where people can use their school Google account to sign in under SSO, make sure it is only school-approved sites and you review where accounts may have been used elsewhere. This is to support safeguarding, cybersecurity and data protection in general.

You can find my guide to managing this here, and thanks go to Abid Patel, Dave Leonard, Kim Nilsson and all in the GEG communities who have contributed to and supported this work.

 
