NHS contact-tracing app code hints at security and privacy bugs early on

London, UK. The NHS recently announced plans to unveil its own coronavirus contact-tracing app, rather than joining forces with Apple and Google, in order to gain better visibility into citizens’ movements.

Suffice it to say, the plan has certainly raised eyebrows among privacy activists, lockdown sceptics, and opponents of “big government.”

On the bright side, the NHS coronavirus app is open source, with code for the beta versions of the Android and iOS apps released on GitHub last week.

At this time, the app also appears to be voluntary to install: users opt in to provide data on their movements, self-report coronavirus infections, and ultimately alert those who’ve been in close contact with them, as a safety practice.

The idea is to offer more insight to the government and to one another, and to better promote social distancing through sound use of data.

While it is a bold move for the government to develop a project of such national scale so rapidly, it is commendable that they chose to open-source it.

Because of this, we can hope any security vulnerabilities will be discovered and remedied before adversaries get a head start. After all, for an app like this, adequate security and user privacy controls are a must.

Dissecting the code

Commits don’t lie. I’ve said this before: if the source code reveals too much about what’s lacking in terms of security, malicious actors, hackers well qualified to read it, may begin exploiting the vulnerabilities before the “good guys” can patch them.

Of course, this greater transparency is, at the same time, a benefit of open source too. Bugs catch the attention of the public eye, get reported sooner, and are typically resolved faster than they would be in a proprietary system.

The very first iteration of the source code has attracted a number of interesting GitHub issues, which, given the recency of the release, obviously remain open.

Android repository

There is already, allegedly, at least one security (crypto-related) bug in the Android and iOS code: from the looks of it, the app doesn’t generate “private keys” correctly, as per the norms of cryptography.

Because the keys are generated on an external web service rather than on the user’s device itself, the guarantee of privacy is rendered moot, in a strict cybersecurity context.

The issue reporter, Michael Brown, states: “This implementation flaw is separate from the basic design flaw of any centralised approach to contact tracing. The basic design flaw allows a government to trace the movements and meetings of its citizens.

This implementation flaw additionally allows the government to forge records of such movements and meetings, and to create valid digital signatures for the forged records.” Of course, whether such malpractice will occur in practice is hard to say.

Still, the flaw goes to the heart of cryptography basics, and the premise that data confidentiality and integrity can only be guaranteed if the private key remains… private.
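For illustration, here is a minimal sketch, in Kotlin, of what on-device key generation could look like using the Android Keystore; the key alias and function names here are hypothetical, and this is not the app’s actual code. The point is that the private key is created inside the device’s keystore and never leaves it, so no web service ever holds a copy it could use to forge signatures.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore
import java.security.PrivateKey
import java.security.Signature

// Hypothetical alias; any stable per-app identifier would do.
const val KEY_ALIAS = "contact_tracing_signing_key"

fun generateLocalKeyPair() {
    val generator = KeyPairGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore"
    )
    generator.initialize(
        KeyGenParameterSpec.Builder(KEY_ALIAS, KeyProperties.PURPOSE_SIGN)
            .setDigests(KeyProperties.DIGEST_SHA256)
            .build()
    )
    // The private key is generated inside the Keystore and is not exportable.
    generator.generateKeyPair()
}

fun signRecord(payload: ByteArray): ByteArray {
    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    val privateKey = keyStore.getKey(KEY_ALIAS, null) as PrivateKey
    return Signature.getInstance("SHA256withECDSA").run {
        initSign(privateKey)
        update(payload)
        sign()
    }
}
```

Contrast that with generating the key on a remote service: the server, and anyone who compromises it, holds a copy of the key, which is precisely what Brown’s issue warns about.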

What good would so much data be if its integrity remains questionable?

Another GitHub issue, just short of turning into an online debate, concerns whether ProGuard should be enabled in the build or not. ProGuard is a widely used open-source tool for obfuscating* and optimizing code.

Since this is a “privacy sensitive” app, a commentator noted, we might be better off leaving ProGuard out altogether. This would enable the security research community to easily reverse engineer and analyze the finalised live versions of the Android APKs, for greater transparency.
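As a hedged illustration, a hypothetical build.gradle.kts fragment along those lines would simply leave minification off for release builds; this follows the standard Android Gradle plugin DSL, not the app’s actual build script.

```kotlin
// Hypothetical build.gradle.kts fragment, for illustration only.
android {
    buildTypes {
        getByName("release") {
            // Leaving ProGuard/R8 off ships readable bytecode, so researchers
            // can audit the exact APK that users receive.
            isMinifyEnabled = false
        }
    }
}
```

The tradeoff is that obfuscation also mildly deters casual tampering, which is presumably why the question sparked debate in the first place.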

Lastly, there’s the obvious social practice of shared devices awaiting a solution. It isn’t unusual for members of the same household to share each other’s digital devices; kids using their parents’ iPad, for example, could lead to discrepancies in data reporting.

*Obfuscation: scrambling code so as to make it virtually incomprehensible to a human reader, in a quest to achieve security via obscurity.

iOS repository

From the discussion on GitHub, it appears the Apple version of the app isn’t immune to the “secretKey” (private key) bug either. There’s also slight disapproval of the “centralized” data approach the app apparently employs.

In the same thread, justification is provided for how the centralized data approach might be an acceptable privacy tradeoff for the overall greater goal the app aims to achieve.

Some users surmised that the Google Analytics tracking functionality might constitute a GDPR violation as the code, at this time, does not ask for prior consent from the user. That one might be easy to work around via legal disclaimers and terms of service, in my opinion.
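If prior consent were required instead, a minimal sketch (assuming the Firebase flavour of Google Analytics, with a hypothetical preference key) could gate collection on an explicit opt-in:

```kotlin
import android.content.Context
import com.google.firebase.analytics.FirebaseAnalytics

// Hypothetical consent gate: collection stays off until the user opts in.
fun applyAnalyticsConsent(context: Context) {
    val prefs = context.getSharedPreferences("privacy", Context.MODE_PRIVATE)
    val hasConsent = prefs.getBoolean("analytics_consent", false) // hypothetical key
    FirebaseAnalytics.getInstance(context)
        .setAnalyticsCollectionEnabled(hasConsent)
}
```

Calling something like this early in startup, before any events are logged, would keep collection disabled by default; Firebase also supports disabling automatic collection via a manifest flag until consent is recorded.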

Privacy first

The biggest concern here is the privacy of citizens, and the slight yet real risk of the app becoming a doorway to government surveillance.

Even if no credible findings yet indicate the plausibility of such adversarial motives, not every member of the public may feel comfortable enough to trust the app.

As if socially distancing from strangers on the street weren’t awkward enough, do we now want to become GPS beacons, transmitting our coordinates every minute, to the government and to everyone using the app?

There are, obviously, the privacy concerns experts have raised, but the use of Bluetooth technology, and the recent security vulnerabilities impacting Bluetooth devices, also warrants a closer look.
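For context, contact-tracing apps of this kind typically have each handset advertise a rotating identifier over Bluetooth Low Energy while scanning for the same from nearby devices, using received signal strength (RSSI) as a rough proximity measure. A minimal scanning sketch on Android might look like the following; the service UUID is a made-up placeholder, not the app’s real one.

```kotlin
import android.bluetooth.BluetoothAdapter
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanFilter
import android.bluetooth.le.ScanResult
import android.bluetooth.le.ScanSettings
import android.os.ParcelUuid
import java.util.UUID

// Placeholder UUID for illustration only; the real app defines its own.
val SERVICE_UUID = ParcelUuid(UUID.fromString("00000000-0000-1000-8000-00805f9b34fb"))

fun startProximityScan() {
    val scanner = BluetoothAdapter.getDefaultAdapter().bluetoothLeScanner
    val filter = ScanFilter.Builder().setServiceUuid(SERVICE_UUID).build()
    val settings = ScanSettings.Builder()
        .setScanMode(ScanSettings.SCAN_MODE_LOW_POWER)
        .build()
    scanner.startScan(listOf(filter), settings, object : ScanCallback() {
        override fun onScanResult(callbackType: Int, result: ScanResult) {
            val rssi = result.rssi // rough distance proxy
            val rotatingId = result.scanRecord?.getServiceData(SERVICE_UUID)
            // A tracing app would record (rotatingId, rssi, timestamp) locally.
        }
    })
}
```

The larger this always-on Bluetooth footprint, the more it matters that the surrounding stack is free of the known vulnerabilities mentioned above.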

As stated, once released on the app stores, it seems the app would be voluntary to install. Researchers suspect, however, that the app’s data reporting efforts will only be fruitful in “curbing the spread of coronavirus” if at least 56 to 60 per cent of the population download it.

Given the low adoption rates of similar apps in other nations, the projections can’t be too optimistic. For example, only one in five people (20%) have signed up for Singapore’s TraceTogether app.

It is difficult to justify why so much time, effort, and money has been spent on this potentially “useless and unlawful” app. Then again, no solution is perfect, and we are trying to find innovative ways to combat the unknown unknowns brought forward by COVID-19.

While the design of the app and its planned objectives may make some cringe, the fact that it’s open source at least provides a slight degree of reassurance to beta testers.

© 2020. Ax Sharma. All Rights Reserved.

Ax Sharma

Ax Sharma is an Indian-origin British security researcher, journalist and TV subject matter expert with a focus on malware analysis and cybercrime investigations. His areas of interest include open source software security, threat intel analysis, and reverse engineering. Frequently featured by leading media outlets like the BBC, Channel 5, Fortune, WIRED, The Register, among others, Ax is an active community member of the OWASP Foundation and the British Association of Journalists (BAJ).
