Out-Law Analysis
31 Jul 2018, 11:21 am
The warning from the Information Commissioner's Office (ICO) concerns the use of data stemming from research initiatives in which universities are involved. Considered alongside the growing cyber threat facing the sector, however, it should spur universities to improve the security of all the data they hold and to better control how that data is shared and used.
Opportunities to play a leading role in artificial intelligence (AI) developments are likely to arise for universities that can demonstrate robust policies and practices on cyber and data security.
Universities hold vast amounts of data of various types.
For example, in the context of research projects, data can include commercially sensitive information, intellectual property and personal data, potentially including data of a sensitive nature, such as the medical or genetic data used in projects aimed at developing new drugs.
In addition, as major employers with large student populations, universities hold personal data records relating to academics, staff and students.
Protecting all this information is a challenge. Universities often operate disparate IT systems that reflect the organic growth of the institution over time; these systems vary in age and complexity and contain multiple points of vulnerability. Campuses, too, are often spread across multiple locations over a wide area. Universities must therefore provide adequate data protection measures in both the physical and the virtual sense.
The volume and value of the data universities hold make them targets for cyber attack, particularly in relation to data from projects on which universities collaborate with major businesses.
In 2017, The Times reported the results of a freedom of information exercise which found that hundreds of cyber attacks on universities succeed each year. It highlighted the prominence of phishing and ransomware attacks on the sector and the risk of valuable data being compromised. University College London reported in June 2017 that it had been subject to a ransomware attack.
A number of universities have also previously come under scrutiny from the ICO over data security issues, including more basic human errors in handling personal data, such as the accidental disclosure of personal data in email attachments.
In an indication of how seriously the ICO treats data security in the higher education sector, the watchdog fined the University of Greenwich £120,000 in May this year over a data breach. It was the first time the ICO had fined a university for a breach of data protection law.
In a more recent report, the ICO again highlighted its concerns about the data protection practices at universities.
The ICO said: "What is clear is that there is room for improvement in how higher education institutions overall handle data in the context of academic research and whilst well-established structures exist in relation to the ethical issues that arise from research, similar structures do not appear to exist in relation to data protection. Given the rapid developments in big data and digital technologies, research could increasingly involve personal data sourced from social media and other third party sources. It is therefore essential that higher education institutions have in place the correct processes and due diligence arrangements to minimise the risk to data subjects and to the integrity of academic research practices."
"We have therefore recommended that Universities UK work with the ICO to consider the risks arising from use of personal data by academics in a private research capacity and when they work with their own private companies or other third parties. Universities UK has committed to do so, and will convene a working group of higher education stakeholders to consider the wider privacy and ethical implications of using social media data in research, both within universities and in a private capacity," it said.
The severe financial penalties that can now be imposed on organisations that breach data protection law under the General Data Protection Regulation (GDPR) provide a regulatory and reputational incentive to address cyber and data risks properly: fines of up to €20 million, or 4% of annual global turnover, whichever is higher, now await organisations that fail to meet their legal obligations on data protection.
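To put that cap in concrete terms, the short Python sketch below works through the "whichever is higher" arithmetic. The turnover figures are purely hypothetical, and actual fines are set case by case by the regulator, so this illustrates the statutory upper bound only.

    def gdpr_fine_cap(annual_global_turnover_eur):
        # Upper bound for the most serious infringements:
        # the higher of EUR 20 million and 4% of annual global turnover.
        return max(20_000_000, 0.04 * annual_global_turnover_eur)

    # Hypothetical institution with EUR 400m annual global turnover:
    # 4% is EUR 16m, so the EUR 20m floor applies.
    print(gdpr_fine_cap(400_000_000))  # 20000000

    # Hypothetical institution with EUR 600m annual global turnover:
    # 4% is EUR 24m, which exceeds the EUR 20m floor.
    print(gdpr_fine_cap(600_000_000))  # 24000000.0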
Beyond the threat of penalties, however, there are positive reasons for universities to update their cyber and data security practices.
Universities have a central role to play in helping the UK government achieve the vision set out in its AI 'sector deal'.
That deal, signed up to by the government and sections of the technology industry, sets out how business, academia and government might work in partnership to drive improvements in UK productivity through support for AI. It is made up of a package of measures including up to £0.95 billion of financial support for the AI sector from public and private investment, as well as improved tax credits for AI research and development.
Inherent in the use of AI is the use of data: it is data that informs how AI systems learn and operate.
The government has recognised this and has further committed to identifying barriers to data sharing and to working with industry to "explore frameworks and mechanisms for safe, secure and equitable data transfer", such as through the use of 'data trusts', which were recommended last year in a government-commissioned review into how to grow the AI industry in the UK.
Data trusts can help "facilitate the sharing of data between organisations holding data and organisations looking to use data to develop AI", that review said.
In a recent consultation on the role, objectives and focus of the new Centre for Data Ethics and Innovation, the government again endorsed the idea of data trusts.
It said: "Data is at the core of the UK government's ambition to build the world’s leading digital economy and government. This will require the right incentives and structures for the creation, collection and analysis of data. It will also need to encourage data to be created, shared and traded efficiently across markets, including those of public or national interest. This will require establishing novel data sharing frameworks, such as the data trusts proposed by the recent Hall-Pesenti review of AI, as well as work to enhance interoperability across data networks."
It is not yet clear which organisations might act as data trusts in the context of AI projects, but universities, as existing research hubs with links to and collaborations with businesses interested in the potential of AI, are perhaps well placed to do so.
Universities that embed robust cyber and data security practices, processes and procedures across their whole organisation will be strongly positioned to lead in this area of innovation. That process will not be easy and will take time, but, given the opportunities ahead, universities should not wait for a major incident before taking action.
Chris Martin and Joanne McIntosh are experts in technology law in the higher education sector at Pinsent Masons, the law firm behind Out-Law.com.