
ICO releases second Tech Horizons Report commenting on eight priority technologies


By Christopher Air & Stuart Hunt


Published 08 March 2024


In December 2022, the Information Commissioner's Office published its first 'Tech Horizons Report'. As the first in an annual series supporting the ICO25 strategic plan and a commitment to comment on emerging technologies, the 2022 report focused on four technologies: Consumer Healthtech, Next Generation Internet of Things (IoT), Immersive Technology and Decentralised Finance. Those technologies are now subject to a call for applications to the ICO Regulatory Sandbox.

The second edition of the Tech Horizons Report (published in February 2024) takes a similar approach but doubles the number of technological developments discussed, identifying eight priority technologies and discussing the privacy risks associated with each. As before, the report sets out recommendations on how organisations and businesses should address these risks.

Across these emerging technologies, the ICO emphasises that there are a number of overarching trends and dynamics to be addressed in their development, which we summarise as follows:

  • Innovations across fields such as immersive technologies, genomics and neurotechnology will allow the collection of novel types of intimate data. Appropriate safeguards must be built-in from the outset.
  • The collection of substantial amounts of personal data to help personalise experiences will challenge individuals' understanding of how their data is used, and what the combination of different information may reveal.
  • The use of AI across the technology spectrum has created a range of developments for the ICO to keep up to date with.
  • The involvement of multiple organisations in the processing of people's data is making it challenging for individuals to exercise their information rights.
  • The possible misuse of some of the technologies in the report should compel organisations to put appropriate safeguards in place; new solutions should not introduce new types of harm or discrimination.

The technologies identified in this report are those that the ICO believes will be the subject of widespread adoption in the next two to seven years. They have gone through a selection process that included analysis of the magnitude of the privacy risk (assessed internally by the ICO and with input from external experts), the expected market penetration of each technology, and scenario-building. The technologies are:

  • Genomics;
  • Immersive virtual worlds;
  • Neurotechnologies;
  • Quantum computing;
  • Commercial use of drones;
  • Personalised AI;
  • Next-generation search; and
  • Central bank digital currencies (CBDCs).

The report summarises the technology behind the topics, current adoption, and the associated privacy and data protection risks.


Genomics

The sequencing of the human genome has improved understanding of a broad range of traits in healthcare. Currently, most genomic research, investment and market growth is directed at diagnostics, drug development and precision medicine, with benefits already seen in the diagnosis and treatment of rare conditions.

Non-healthcare application of genomics is limited, but wider use may emerge in the future. Potential applications such as intervention in early-childhood education, sports nutrition or screening of health and personality in employment have been identified. The application of genomics to insurance has also been discussed, with reference to determining the risk of disease; yet the use of genetic information in insurance remains limited to specific circumstances currently.

The data protection and privacy risks associated with genomics include accuracy and fairness. The existing limitations of the science mean that predictive capabilities are likely to be contested. An overreliance on any predictive capabilities, combined with a lack of understanding, could result in a failure to make limitations and biases clear. Any processing of genomic information must also ensure that the intended processing is fair.

The special category nature of genomic information, and its uniqueness, will render it difficult to effectively anonymise. Research insights may only be gained through long-term research, meaning that strong security and controls over the use of the information are important. Genomic information may also affect related family members, and consideration of complex issues around lawful bases for processing will be needed.

Immersive virtual worlds

Commonly referred to as the 'metaverse', immersive virtual worlds are spaces where users can interact with each other and use digital services, particularly e-commerce and gaming. These can include diverse types of extended reality, including augmented reality (AR), virtual reality (VR) and mixed reality (combining elements of AR and VR). The ICO highlights scepticism about whether the metaverse will be fully realised, due to social and technological barriers. Those challenges are likely to be resolved in the medium to long term, with the full potential of immersive virtual worlds likely to be realised in a further 10 to 15 years.

Immersive technologies generate numerous data protection challenges. The collection of user information such as eye and facial movement tracking may be subject to special category information procedures due to their biometric nature. The use of immersive technology by children should prompt proactive consideration of safeguards and age assurance technologies, as well as compliance with privacy and safety by design requirements.


Neurotechnologies

Invasive and non-invasive technologies are capable of recording and processing neuro-data (i.e. information about the human brain) to obtain information, control devices or modulate neural activity. These include medical devices such as prosthetics, implants and wearable devices. In addition, commercial use for employee monitoring is growing, and market developments are likely to include wellbeing wearables and further workplace deployment to track employee safety.

The collection of neural information will pose increased privacy and security risks. Inferences may be made about highly sensitive matters such as a person's mental health or sexuality. There is no definitive definition of neural information under the UK GDPR, and it may not be considered special category information unless used for medical or identification purposes. People using neurotechnology are also unable to control the specific information generated and shared, raising issues of consent, and the use of such information may give rise to neurodiscrimination. The ICO is clear that further collaboration between the appropriate regulators is necessary to develop technical clarity and effective regulation of the use of neurotech information.

Quantum computing

Through combining sophisticated technologies with the principles of quantum mechanics, quantum computing may be able to resolve highly complex computational problems beyond the capacity of existing computers. As with immersive virtual worlds, the timescales for widespread integration of quantum computing are unclear, with significant technical and engineering challenges remaining.

When quantum computing does become widespread, and in the absence of mitigation, it could undermine the encryption that currently protects the financial system, biometric information and digital signatures. The threat of 'harvest now, decrypt later' describes circumstances where threat actors collect high-value encrypted information now in order to access it later, once sufficiently powerful quantum computers are available. Where the emerging use of quantum computing involves the processing of personal information, organisations must comply with their data protection obligations. It is expected that initial use of quantum computing in the UK may require the use of international quantum capacity, meaning organisations will need to consider international transfer requirements.
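The 'harvest now, decrypt later' threat can be sketched in a few lines. The example below is illustrative only: a deliberately tiny RSA key and brute-force factoring stand in for real 2048-bit keys and Shor's algorithm, but the point it demonstrates is the same one the report makes, namely that ciphertext stored today remains readable forever once the underlying key can be broken.

```python
# Illustrative sketch (not real cryptography): 'harvest now, decrypt later'
# with a toy RSA key. Brute-force factoring stands in for Shor's algorithm.

def factor(n):
    """Recover the prime factors of n -- trivial at toy scale, and feasible
    for real key sizes on a sufficiently powerful quantum computer."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no factors found")

# Toy RSA key (real deployments use 2048+ bit moduli).
p, q = 61, 53
n = p * q                       # public modulus (3233)
e = 17                          # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

# Today: a threat actor intercepts and stores the ciphertext.
message = 42
harvested_ciphertext = pow(message, e, n)

# Later: the modulus is factored, the private key is recovered, and the
# harvested data becomes readable.
p2, q2 = factor(n)
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
recovered = pow(harvested_ciphertext, d_recovered, n)
```

Mitigation is therefore about migrating long-lived secrets to post-quantum cryptography before such computers exist, not merely after.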

Commercial use of drones

The use of unmanned aerial vehicles (UAVs) in commercial settings is increasing, particularly in the areas of delivery and e-commerce, monitoring and crowd control. Increased commercial use of drones is so far limited by aviation regulatory requirements. However, the Government has publicly committed to delivering an enabling regulatory framework to support the drone industry.

The key data protection and privacy implications of the wider commercial use of drones include surveillance, which the ICO is already monitoring. Drones can collect large amounts of data, likely to consist of both essential data, such as personal information programmed prior to an operational flight, and data collected inadvertently when recording people while in flight. Organisations are encouraged to use blurring or obfuscation as proactive mitigation measures to address concerns about transparency and to facilitate people's rights. The ICO also raises the possibility of drone IDs being utilised, as in other countries such as the US, Japan and Switzerland.
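The blurring and obfuscation the ICO encourages can be as simple as pixelating regions of a frame where people appear. A minimal sketch follows, using a frame represented as a plain grid of grayscale values; a real pipeline would detect faces and operate on video frames via an imaging library, but the mitigation step itself is just averaging away identifying detail.

```python
# Minimal sketch of pixelating a region of drone footage (e.g. around a
# detected face) so people captured incidentally are not identifiable.
# The frame is a 2D list of grayscale values; region bounds are illustrative.

def pixelate(frame, top, left, height, width, block=2):
    """Replace each block x block cell in the region with its average value,
    returning a new frame and leaving the original untouched."""
    out = [row[:] for row in frame]
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            cells = [(y, x)
                     for y in range(by, min(by + block, top + height))
                     for x in range(bx, min(bx + block, left + width))]
            avg = sum(frame[y][x] for y, x in cells) // len(cells)
            for y, x in cells:
                out[y][x] = avg
    return out

# A tiny 4x4 test frame with a gradient of pixel values.
frame = [[(y * 4 + x) * 10 for x in range(4)] for y in range(4)]
blurred = pixelate(frame, top=0, left=0, height=4, width=4, block=2)
```

Applying such obfuscation at the point of capture, rather than in post-processing, is the stronger privacy-by-design position.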

Personalised AI

The customisation of large language models will create more tailored user experiences based on individual users' personal preferences, characteristics and search patterns. Amid the rapid development of AI over the past couple of years, a variety of services marketing themselves as personal AIs are already on the market, allowing for automated email and instant message responses.

The increased uptake of personalised AI may improve workplace productivity, educational outcomes and some of the features of existing generative AI. The expansion of these benefits will carry data protection and privacy implications. The ICO has previously set out eight privacy-related issues that developers will need to consider when developing generative AI systems. However, as AI becomes increasingly personalised, developments in areas such as education could result in the processing of special category data. Threat actors may also exploit personalisation to obtain intimate details from users' personal lives. Organisations must therefore implement appropriate technical and organisational measures, including suitable encryption.

Next-generation search

The incorporation of embedded AI capabilities such as voice and image-based elements will form part of the next generation of search engines. The ICO identifies the most visible recent development as the use of generative AI, either as a separate search engine or within existing search engines. However, multimodal search and queryless search are also expected to become relevant in the coming years. These innovations will carry numerous implications for data protection and privacy. The quantity of information collected will allow for greater personalisation, but organisations developing these methods will need to comply with the principle of data minimisation and ensure that any information processed is adequate, relevant and limited to what is necessary.
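In practice, data minimisation for a search service means deciding up front which fields of a query event are actually necessary for the stated purpose and discarding the rest before storage. The sketch below illustrates the idea; the field names and retention set are hypothetical, not drawn from any real search product.

```python
# Illustrative sketch of data minimisation for a next-generation search query
# log: retain only fields needed for the stated purpose (improving ranking),
# and drop identifying or excessive data before storage. Field names are
# hypothetical examples.

RETAINED_FIELDS = {"query_text", "results_clicked", "language"}

def minimise(raw_event: dict) -> dict:
    """Keep only fields that are adequate, relevant and necessary."""
    return {k: v for k, v in raw_event.items() if k in RETAINED_FIELDS}

raw_event = {
    "query_text": "nearest pharmacy",
    "results_clicked": [2],
    "language": "en-GB",
    "user_id": "u-492871",             # identifying: dropped
    "precise_location": (51.5, -0.1),  # excessive for ranking: dropped
    "voice_recording": b"...",         # potentially biometric: dropped
}
stored = minimise(raw_event)
```

An allow-list, as here, is generally safer than a block-list: any new field added upstream is excluded by default until someone justifies retaining it.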

In addition, organisations will need to consider issues of transparency, hallucination concerns in respect of personal data complicating accuracy obligations, and intersections with immersive technologies. The ICO expects to publish a foresight report on the future of search and discovery in 2024, which will discuss these issues further.

Central bank digital currencies (CBDCs)

A new form of digital money, central bank digital currencies are likely to be increasingly used in the future to complement physical cash and other payment mechanisms. Several nations are now at late stages of their pilots, including France and China, with the UK Treasury and Bank of England currently assessing the case for retail CBDCs (which are intended for use by private sector businesses and individuals to make payments).

As a general principle, those developing CBDCs will need to consider whether personal information is processed within their deployment. For example, CBDC transaction data may be pseudonymised or encrypted, yet central banks will need to ensure that any information they process cannot be combined with other sources to reidentify users; otherwise, sensitive information about users and their spending habits could be revealed.
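The pseudonymisation mentioned above can be sketched with a keyed hash: the ledger operator sees stable pseudonyms rather than raw account identifiers. The example is hypothetical (the key, field names and truncation are illustrative, not any proposed CBDC design), and it also shows the report's caveat in miniature: pseudonyms are stable and therefore linkable across transactions, which is precisely what enables reidentification when spending patterns are matched against other data sources.

```python
# Hypothetical sketch of pseudonymising CBDC transaction records with a keyed
# hash (HMAC-SHA256). Pseudonymised data remains personal data under the UK
# GDPR where reidentification is possible.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-guard-this-key"  # illustrative value only

def pseudonymise(account_id: str) -> str:
    """Derive a stable pseudonym; irreversible without SECRET_KEY."""
    return hmac.new(SECRET_KEY, account_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"payer": "GB00-1234-5678", "payee": "GB00-8765-4321", "amount": 25.00}
stored = {
    "payer": pseudonymise(record["payer"]),
    "payee": pseudonymise(record["payee"]),
    "amount": record["amount"],
}
# Same input -> same pseudonym: the ledger can link a user's transactions,
# so combining it with external data could still reveal who is spending what.
```

Key management and rotation, and limits on how long linkable records are retained, would be central to any real deployment.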


What next?

The ICO states that it intends to address these innovative technologies proactively as they mature, which will involve bringing the public into decisions about their risks and benefits. More specifically, the ICO will share further insights on quantum computing and genomics in foresight reports, and innovators will be invited to work with the Regulatory Sandbox to ensure that data protection is engineered into these technologies.