
Tech Wrap Up October 2023

By Distilled Post Editorial Team

NHS data platform contract announcement delayed further

Earlier this month, NHS England’s Chief Data and Analytics Officer Ming Tang reiterated the ‘need to bring the public with us’ with regard to NHS data. She was speaking in the context of NHS England’s forthcoming centralised patient data platform, known as the Federated Data Platform (FDP). The FDP represents a significant attempt to amalgamate patient data and medical records into a standardised, centralised framework, with the aim of improving efficiency across the system and providing patients with more joined-up care.

The US tech company Palantir is considered the front-runner in the race to be awarded the £480m contract to build the platform, partly owing to its prior handling of patient data as the organiser of the NHS Covid-19 data store during the pandemic. Critics argue that this gives Palantir an unfair advantage, and patient data advocates have raised concerns that insufficient safeguards are in place to ensure patients retain control over corporate access to their medical records. The contract announcement was originally expected at the end of September, then delayed until mid-October. Now that that deadline has also passed, questions will inevitably arise over the awarding process and what might be behind the delay.

UK’s landmark online safety bill set to receive royal assent

The much-discussed online safety bill is set to become law, with Ofcom responsible for enforcing it. The act is intended to ensure that tech companies operating in the UK have adequate moderation processes in place to prevent the dissemination of harmful material. It primarily focuses on protecting children, shielding the public from illegal content, and helping adult users avoid harmful (but not illegal) content on platforms such as Facebook or X.

The act creates various new criminal offences, including for those who use social media to encourage self-harm and for those who share pornographic deepfakes. Many companies already have systems in place to moderate user-generated content, but under the act this will now occur under the watchful eye of Ofcom. Various elements of the bill have prompted concerns from data privacy advocates, most notably its provision on combating child sexual abuse material (CSAM). The bill allows Ofcom to order a messaging platform to actively search for such content, but this poses an existential threat to end-to-end encrypted messaging, since the provision could feasibly require companies to scan private messages.

Investigation finds UK Government frequently using AI to make key decisions

An investigation by the Guardian has revealed that government officials are using AI and algorithms to make key decisions across sectors. The investigation found that the Department for Work and Pensions was using an algorithm to help allocate benefits, while the Home Office was using one to identify sham marriages. AI and algorithms carry an inherent risk of bias: the marriage licensing algorithm used by the Home Office is claimed to have disproportionately flagged people of certain nationalities, and a facial recognition tool used by the Metropolitan Police was found to make more mistakes recognising black faces than white ones. While AI is often heralded as a revolutionary tool for improving efficiency and service delivery in government, such examples serve as a reminder of the risks inherent in over-reliance on the technology.