
Digital driver’s license billed as harder than plastic to forge is easily forged


In late 2019, the government of New South Wales in Australia rolled out digital driver’s licenses. The new licenses allowed people to use their iPhone or Android device to show proof of identity and age during roadside police checks or at bars, stores, hotels, and other venues. ServiceNSW, as the government body is usually referred to, promised it would “provide additional levels of security and protection against identity fraud, compared to the plastic [driver’s license]” citizens had used for decades.

Now, 30 months later, security researchers have shown that it’s trivial for just about anyone to forge fake identities using the digital driver’s licenses, or DDLs. The technique allows people under the legal drinking age to change their date of birth and fraudsters to assume fake identities. The process takes well under an hour, doesn’t require any special hardware or expensive software, and generates fake IDs that pass inspection by the electronic verification system used by police and participating venues. All of this despite assurances that security was a key priority for the newly created DDL system.

“To be clear, we do believe that if the Digital Driver’s Licence was improved by implementing a more secure design, then the above statement made on behalf of ServiceNSW would indeed be true, and we would agree that the Digital Driver’s Licence would provide additional levels of security against fraud compared to the plastic driver’s licence,” Noah Farmer, the researcher who identified the flaws, wrote in a post published last week.

A better mousetrap hacked with minimal effort

“When an unsuspecting victim scans the fraudster’s QR code, everything will check out, and the victim won’t know that the fraudster has combined their own identification photo with someone’s stolen Driver’s Licence details,” he continued. As things have stood for the past 30 months, however, DDLs make it “possible for malicious users to generate [a] fraudulent Digital Driver’s Licence with minimal effort on both jailbroken and non-jailbroken devices without the need to modify or repackage the mobile application itself.”

DDLs require an iOS or Android app that displays each person’s credentials. The same app allows police and venues to verify that the credentials are authentic. Features designed to confirm the ID is authentic and current include:

  • Animated NSW Government logo.
  • Display of the last refreshed date and time.
  • A QR code that expires and reloads.
  • A hologram that moves when the phone is tilted.
  • A watermark that matches the licence photo.
  • Address details that don’t require scrolling.

Surprisingly simple

The technique for overcoming these safeguards is surprisingly simple. The key is the ability to brute-force the PIN that encrypts the data. Since it’s only four digits long, there are only 10,000 possible combinations. Using publicly available scripts and a commodity computer, someone can learn the correct combination in a matter of a few minutes, as this video, showing the process on an iPhone, demonstrates.

ServiceNSW Digital Driver’s Licence proof-of-concept: Brute-forcing PIN.
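The loop at the heart of that step can be sketched in a few lines. The sketch below assumes, purely for illustration, that the credential file is AES-GCM encrypted under a key derived by hashing the four-digit PIN; the ServiceNSW app’s actual key derivation and cipher may differ.

    import Foundation
    import CryptoKit

    // Illustrative only: try all 10,000 possible four-digit PINs against an
    // encrypted blob. The key derivation (SHA-256 of the PIN) and the cipher
    // (AES-GCM) are assumptions made for this sketch, not ServiceNSW's scheme.
    func bruteForcePIN(ciphertext: Data) -> (pin: String, plaintext: Data)? {
        guard let box = try? AES.GCM.SealedBox(combined: ciphertext) else { return nil }
        for candidate in 0...9999 {
            let pin = String(format: "%04d", candidate)
            let key = SymmetricKey(data: SHA256.hash(data: Data(pin.utf8)))
            if let plaintext = try? AES.GCM.open(box, using: key) {
                return (pin, plaintext)  // decryption succeeded, so this is the PIN
            }
        }
        return nil
    }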

Once a fraudster gets access to someone’s encrypted DDL data—either with permission, by stealing a copy stored in an iPhone backup, or through remote compromise—the brute force gives them the ability to read and modify any of the data stored in the file.

From there, it’s a matter of using simple brute-force software and standard smartphone and computer functions to extract the file storing the credential, decrypt it, change the text, re-encrypt it, and copy it back to the device. The precise steps on an iPhone are:

  • Use iTunes backup to copy the contents of the iPhone storing the credential the fraudster wants to modify
  • Extract the encrypted file from the backup stored on the computer
  • Use brute-force software to decrypt the file
  • Open the file in a text editor and modify the birth date, address, or other data they want to fake
  • Re-encrypt the file
  • Copy the re-encrypted file to the backup folder
  • Restore the backup to the iPhone

With that, the ServiceNSW app will display the fake ID and present it as genuine.
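Under the same illustrative assumptions as the earlier sketch (an AES-GCM blob keyed from the recovered PIN, and a JSON payload with a hypothetical date_of_birth field), the decrypt, edit, and re-encrypt steps might look like this:

    import Foundation
    import CryptoKit

    // Sketch of the decrypt-edit-re-encrypt steps. The field name "date_of_birth"
    // and the crypto details are assumptions for illustration, not the actual
    // format of the ServiceNSW credential file.
    func tamper(ciphertext: Data, pin: String, newDateOfBirth: String) throws -> Data {
        let key = SymmetricKey(data: SHA256.hash(data: Data(pin.utf8)))
        let box = try AES.GCM.SealedBox(combined: ciphertext)
        let plaintext = try AES.GCM.open(box, using: key)

        // Treat the payload as JSON and overwrite the birth date.
        var licence = try JSONSerialization.jsonObject(with: plaintext) as! [String: Any]
        licence["date_of_birth"] = newDateOfBirth
        let edited = try JSONSerialization.data(withJSONObject: licence)

        // Re-encrypt with the same key so the app accepts the restored file.
        return try AES.GCM.seal(edited, using: key).combined!
    }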

The following video shows the entire process from start to finish.

Death by 1,000 flaws

A variety of design flaws make this simple hack possible.

The first is a lack of adequate encryption. A key based on a four-digit PIN is woefully inadequate. Apple provides a function named SecRandomCopyBytes for producing random bytes that can be used to generate secure keys. “If this was used to encrypt the Digital Driver’s Licence rather than the 4 digit PIN, it would make the task of brute-forcing much harder if not completely infeasible for attackers,” Farmer wrote.
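For reference, this is how the function Farmer names is typically called; a key produced this way would live in the device Keychain rather than being reproducible from a guessable PIN. The surrounding code is a sketch, not ServiceNSW’s implementation.

    import Foundation
    import Security

    // Generate 32 random bytes (a 256-bit key) with SecRandomCopyBytes, the
    // Apple API the researcher suggests, instead of stretching a four-digit PIN.
    func makeRandomKey() -> Data? {
        var bytes = [UInt8](repeating: 0, count: 32)
        let status = SecRandomCopyBytes(kSecRandomDefault, bytes.count, &bytes)
        return status == errSecSuccess ? Data(bytes) : nil
    }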

The next major flaw is that, astonishingly, DDL data is never validated against the back-end database to make sure that what’s stored on the iPhone matches the records maintained by the government department. With no way to natively validate the data, there’s no way to tell when it has been tampered with, so attackers can display falsified data in the Service NSW app without the fraud being prevented or detected.
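A hypothetical sketch of the missing check follows: the app fetches an authoritative digest of the licence from the back end and refuses to trust local data that doesn’t match. The endpoint URL and response format here are invented for illustration; ServiceNSW’s actual API is not public.

    import Foundation
    import CryptoKit

    // Hypothetical: compare a digest of the locally stored licence data against
    // one returned by the back end. The URL and response format are invented for
    // illustration only.
    func licenceMatchesBackend(localData: Data, licenceNumber: String) async throws -> Bool {
        let url = URL(string: "https://backend.example/ddl/\(licenceNumber)/digest")!
        let (body, _) = try await URLSession.shared.data(from: url)
        let serverDigest = String(decoding: body, as: UTF8.self)
        let localDigest = SHA256.hash(data: localData)
            .map { String(format: "%02x", $0) }
            .joined()
        return localDigest == serverDigest
    }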

The third shortcoming is that using the “pull-to-refresh” function—a cornerstone of the DDL verification scheme intended to ensure the most current information is showing—fails to refresh any of the data stored in the electronic credential. Instead, it updates only the QR code. A better response would be for the pull-to-refresh function to download the latest copy of the DDL from the ServiceNSW database.

Fourth, the QR code transmits only the DDL holder’s name and status as either over or under the age of 18. The QR code is supposed to allow the person checking the ID to scan it with their own ServiceNSW app to validate that the data presented is authentic. To bypass the check, a fraudster only needs to take the driver’s license details from a stolen or otherwise-obtained DDL and substitute them into the DDL stored locally on their own phone.

“When an unsuspecting victim scans the fraudster’s QR code, everything will check out, and the victim won’t know that the fraudster has combined their own identification photo with someone’s stolen Driver’s Licence details,” Farmer explained. Had the system returned the legitimate image data, the scanning party would easily see that the fraudster had forged the DDL, since the face returned by Service NSW wouldn’t match the face displayed on the app.

The last flaw the researcher identified was that the app allows the data it stores to be backed up and restored at all. While all files stored in the Documents and Library/Application Support/ folders are backed up by default, iOS allows developers to easily exclude certain files from backup by calling NSURL setResourceValue:forKey:error: with the NSURLIsExcludedFromBackupKey key.
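In Swift, the call Farmer references looks like the sketch below; the file URL would be wherever the app stores the credential, which isn’t specified here.

    import Foundation

    // Mark a file so iOS excludes it from iTunes/iCloud backups. This is the
    // Swift spelling of the NSURL setResourceValue:forKey:error: call with
    // NSURLIsExcludedFromBackupKey that the researcher mentions.
    func excludeFromBackup(_ fileURL: URL) throws {
        var url = fileURL
        var values = URLResourceValues()
        values.isExcludedFromBackup = true
        try url.setResourceValues(values)
    }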

With a reported 4 million NSW residents using the DDLs, the gaffe could have serious consequences for anyone who relies on DDLs to verify identities, ages, addresses, or other personal information. It’s not clear how or even if Service NSW plans to respond. Given time differences between San Francisco and New South Wales, officials with the department weren’t immediately available for comment.

Farmer noted a tweet that called out a hotel bar for accepting only DDLs and refusing service to someone who had only a physical ID. “I know 10 kids that you let in regularly with fake digital licenses because they are easy to make,” the person claimed.

While the veracity of that claim can’t be verified, it certainly sounds plausible, given the ease and effectiveness of the hack shown here.


The cryptopocalypse is nigh! NIST rolls out new encryption standards to prepare



In the not-too-distant future—as little as a decade, perhaps, nobody knows exactly how long—the cryptography protecting your bank transactions, chat messages, and medical records from prying eyes is going to break spectacularly with the advent of quantum computing. On Tuesday, a US government agency named four replacement encryption schemes to head off this cryptopocalypse.

Some of the most widely used public-key encryption systems—including those using the RSA, Diffie-Hellman, and elliptic curve Diffie-Hellman algorithms—rely on the difficulty of certain mathematical problems to protect sensitive data. These problems include (1) factoring a key’s large composite number (usually denoted as N) to derive its two factors (usually denoted as P and Q) and (2) computing the discrete logarithm that keys are based on.

The security of these cryptosystems depends entirely on classical computers’ difficulty in solving these problems. While it’s easy to generate keys that can encrypt and decrypt data at will, it’s impossible from a practical standpoint for an adversary to calculate the numbers that make them work.
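As a toy illustration of the first problem, trial division recovers P and Q instantly when N is tiny; this sketch is for intuition only, since real RSA moduli are thousands of bits long and defeat every known classical method.

    // Toy illustration: recover the factors P and Q of a small "RSA modulus" N by
    // trial division. Trivial at this scale; the 795-bit factoring record described
    // below took roughly 4,000 core-years of far more sophisticated computation.
    func factor(_ n: Int) -> (p: Int, q: Int)? {
        var p = 2
        while p * p <= n {
            if n % p == 0 { return (p, n / p) }
            p += 1
        }
        return nil
    }

    print(factor(61 * 53)!)   // prints "(p: 53, q: 61)"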

In 2019, a team of researchers factored a 795-bit RSA key, making it the largest RSA key ever to be broken. The same team also computed a discrete logarithm for a different key of the same size.

The researchers estimated that the sum of the computation time for both of the new records was about 4,000 core-years using Intel Xeon Gold 6130 CPUs (running at 2.1GHz). Like previous records, these were accomplished using a complex algorithm called the Number Field Sieve, which can be used to perform both integer factoring and finite field discrete logarithms.

Quantum computing is still in the experimental phase, but the results have already made it clear it can solve the same mathematical problems instantaneously. Increasing the size of the keys won’t help, either, since Shor’s algorithm, a quantum-computing technique developed in 1994 by the American mathematician Peter Shor, works orders of magnitude faster in solving integer factorization and discrete logarithm problems.

Researchers have known for decades these algorithms are vulnerable and have been cautioning the world to prepare for the day when all data that has been encrypted using them can be unscrambled. Chief among the proponents is the US Department of Commerce’s National Institute of Standards and Technology (NIST), which is leading a drive for post-quantum cryptography (PQC).

On Tuesday, NIST said it selected four candidate PQC algorithms to replace those that are expected to be felled by quantum computing. They are: CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON, and SPHINCS+.

CRYSTALS-Kyber and CRYSTALS-Dilithium are likely to be the two most widely used replacements. CRYSTALS-Kyber is used for establishing digital keys that two computers that have never interacted with each other can use to encrypt data. The remaining three, meanwhile, are used for digitally signing encrypted data to establish who sent it.

“CRYSTALS-Kyber (key-establishment) and CRYSTALS-Dilithium (digital signatures) were both selected for their strong security and excellent performance, and NIST expects them to work well in most applications,” NIST officials wrote. “FALCON will also be standardized by NIST since there may be use cases for which CRYSTALS-Dilithium signatures are too large. SPHINCS+ will also be standardized to avoid relying only on the security of lattices for signatures. NIST asks for public feedback on a version of SPHINCS+ with a lower number of maximum signatures.”

The selections announced today are likely to have significant influence going forward.

“The NIST choices certainly matter because many large companies have to comply with the NIST standards even if their own chief cryptographers don’t agree with their choices,” said Graham Steel, CEO of Cryptosense, a company that makes cryptography management software. “But having said that, I personally believe their choices are based on sound reasoning, given what we know right now about the security of these different mathematical problems, and the trade-off with performance.”

Nadia Heninger, an associate professor of computer science and engineering at University of California, San Diego, agreed.

“The algorithms NIST chooses will be the de facto international standard, barring any unexpected last-minute developments,” she wrote in an email. “A lot of companies have been waiting with bated breath for these choices to be announced so they can implement them ASAP.”

While no one knows exactly when quantum computers will be available, there is considerable urgency in moving to PQC as soon as possible. Many researchers say it’s likely that criminals and nation-state spies are recording massive amounts of encrypted communications and stockpiling them for the day they can be decrypted.


Google allowed sanctioned Russian ad company to harvest user data for months


ProPublica is a Pulitzer Prize-winning investigative newsroom.

The day after Russia’s February invasion of Ukraine, Senate Intelligence Committee Chairman Mark Warner sent a letter to Google warning it to be on alert for “exploitation of your platform by Russia and Russian-linked entities,” and calling on the company to audit its advertising business’s compliance with economic sanctions.

But as recently as June 23, Google was sharing potentially sensitive user data with a sanctioned Russian ad tech company owned by Russia’s largest state bank, according to a new report provided to ProPublica.

Google allowed RuTarget, a Russian company that helps brands and agencies buy digital ads, to access and store data about people browsing websites and apps in Ukraine and other parts of the world, according to research from digital ad analysis firm Adalytics. Adalytics identified close to 700 examples of RuTarget receiving user data from Google after the company was added to a US Treasury list of sanctioned entities on Feb. 24. The data sharing between Google and RuTarget stopped four months later on June 23, the day ProPublica contacted Google about the activity.

RuTarget, which also operates under the name Segmento, is owned by Sberbank, a Russian state bank that the Treasury described as “uniquely important” to the country’s economy when it hit the lender with initial sanctions. RuTarget was later listed in an April 6 Treasury announcement that imposed full blocking sanctions on Sberbank and other Russian entities and people. The sanctions mean US individuals and entities are not supposed to conduct business with RuTarget or Sberbank.

Of particular concern, the analysis showed that Google shared data with RuTarget about users browsing websites based in Ukraine. This means Google may have turned over such critical information as unique mobile phone IDs, IP addresses, location information, and details about users’ interests and online activity, data that US senators and experts say could be used by Russian military and intelligence services to track people or zero in on locations of interest.

Last April, a bipartisan group of US senators sent a letter to Google and other major ad technology companies warning of the national security implications of data shared as part of the digital ad buying process. They said this user data “would be a goldmine for foreign intelligence services that could exploit it to inform and supercharge hacking, blackmail, and influence campaigns.”

Google spokesperson Michael Aciman said that the company blocked RuTarget from using its ad products in March and that RuTarget has not purchased ads directly via Google since then. He acknowledged the Russian company was still receiving user and ad buying data from Google until ProPublica and Adalytics alerted Google to the activity.

“Google is committed to complying with all applicable sanctions and trade compliance laws,” Aciman said. “We’ve reviewed the entities in question and have taken appropriate enforcement action beyond the measures we took earlier this year to block them from directly using Google advertising products.”

Aciman said this action includes not only preventing RuTarget from further accessing user data, but from purchasing ads through third parties in Russia that may not be sanctioned. He declined to say whether RuTarget had purchased ads via Google systems using such third parties, and he did not comment on whether data about Ukrainians had been shared with RuTarget.

Krzysztof Franaszek, who runs Adalytics and authored the report, said RuTarget’s ability to access and store user data from Google could open the door to serious potential abuse.

“For all we know they are taking that data and combining it with 20 other data sources they got from God knows where,” he said. “If RuTarget’s other data partners included the Russian government or intelligence or cybercriminals, there is a huge danger.”

In a statement to ProPublica, Warner, a Virginia Democrat, called Google’s failure to sever its relationship with RuTarget alarming.

“All companies have a responsibility to ensure that they are not helping to fund or even inadvertently support Vladimir Putin’s invasion of Ukraine. Hearing that an American company may be sharing user data with a Russian company—owned by a sanctioned, state-owned bank no less—is incredibly alarming and frankly disappointing,” he said. “I urge all companies to examine their business operations from top to bottom to ensure that they are not supporting Putin’s war in any way.”


Google closes data loophole amid privacy fears over abortion ruling


Google is closing a loophole that has allowed thousands of companies to monitor and sell sensitive personal data from Android smartphones, an effort welcomed by privacy campaigners in the wake of the US Supreme Court’s decision to end women’s constitutional right to abortion.

It also took a further step on Friday to limit the risk that smartphone data could be used to police new abortion restrictions, announcing it would automatically delete the location history on phones that have been close to a sensitive medical location such as an abortion clinic.

The Silicon Valley company’s moves come amid growing fears that mobile apps will be weaponized by US states to police new abortion restrictions in the country.

Companies have previously harvested and sold information on the open market including lists of Android users using apps related to period tracking, pregnancy and family planning, such as Planned Parenthood Direct.

Over the past week, privacy researchers and advocates have called for women to delete period-tracking apps from their phones to avoid being tracked or penalised for considering abortions.

The US tech giant announced last March that it would restrict the feature, which allows developers to see which other apps are installed and deleted on individuals’ phones. That change was meant to be implemented last summer, but the company failed to meet that deadline, citing the pandemic among other reasons.

The new deadline of July 12 will hit just weeks after the overturning of Roe vs Wade, a ruling that has thrown a spotlight on how smartphone apps could be used for surveillance by US states with new anti-abortion laws.

“It’s long overdue. Data brokers have been banned from using the data under Google’s terms for a long time, but Google didn’t build safeguards into the app approvals process to catch this behavior. They just ignored it,” said Zach Edwards, an independent cyber security researcher who has been investigating the loophole since 2020.

“So now anyone with a credit card can purchase this data online,” he added.

Google said: “In March 2021, we announced that we planned to restrict access to this permission, so that only utility apps, such as device search, antivirus, and file manager apps, can see what other apps are installed on a phone.”

It added: “Collecting app inventory data to sell it or share it for analytics or ads monetisation purposes has never been allowed on Google Play.”

Despite widespread usage by app developers, users remain unaware of this feature in Android software—a Google-designed programming interface, or API, known as the “Query All Packages.” It allows apps, or snippets of third-party code inside them, to query the inventory of all other apps on a person’s phone. Google itself has referred to this type of data as high-risk and “sensitive,” and it has been discovered being sold on to third parties.

Researchers have found that app inventories “can be used to precisely deduce end users’ interests and personal traits,” including gender, race, and marital status, among other things.

Edwards has found that one data marketplace, Narrative.io, was openly selling data obtained by intermediaries in this way, including data from smartphones using the Planned Parenthood app and various period-tracking apps.

Narrative said it removed pregnancy tracking and menstruation app data from its platform in May, in response to the leaked draft outlining the Supreme Court’s forthcoming decision.

Another research company, Pixalate, discovered that consumer apps, like a simple weather app, were running bits of code that exploited the same Android feature and were harvesting data for a Panamanian company with ties to US defense contractors.

Google said it “never sells user data, and Google Play strictly prohibits the sale of user data by developers. When we discover violations we take action,” adding it had sanctioned multiple companies believed to be selling user data.

Google said that, from July 12, it would restrict the Query All Packages feature to only those developers who require it. App developers will be required to fill out a declaration explaining why they need access, and notify Google of this before the deadline so it can be vetted.

“Deceptive and undeclared uses of these permissions may result in a suspension of your app and/or termination of your developer account,” the company warned.

Additional reporting by Richard Waters.

© 2022 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
