Law enforcement needs to protect citizens and their data

Over the past several years, the law enforcement community has grown increasingly concerned about the conduct of digital investigations as technology providers enhance the security protections of their offerings—what some of my former colleagues refer to as “going dark.”

Data once readily accessible to law enforcement is now encrypted, protecting consumers’ data from hackers and criminals. However, these efforts have also had what Android’s security chief called the “unintended side effect” of making this data inaccessible to law enforcement. Consequently, many in the law enforcement community want the ability to compel providers to allow them to bypass these protections, often citing physical and national security concerns.

I know first-hand the challenges facing law enforcement, but these concerns must be addressed in a broader security context, one that takes into consideration the privacy and security needs of industry and our citizens in addition to those raised by law enforcement.

Perhaps the best example of the law enforcement community’s preferred solution is Australia’s recently passed Assistance and Access Bill, an overly broad law that allows Australian authorities to compel service providers, such as Google and Facebook, to re-engineer their products and bypass encryption protections so that law enforcement can access customer data.

While the bill includes limited restrictions on law enforcement requests, the vague definitions and concentrated authorities give the Australian government sweeping powers that ultimately undermine the security and privacy of the very citizens they aim to protect. Major tech companies, such as Apple and Facebook, agree and have been working to resist the Australian legislation and a similar bill in the UK.

Newly created encryption backdoors and work-arounds will become targets for criminals, hackers, and hostile nation-states, offering fresh opportunities for data compromise and attack through those tools and the flawed code that inevitably accompanies some of them. These weaknesses undermine providers’ efforts to secure their customers’ data, creating new and powerful vulnerabilities even as companies struggle to address existing ones.

And these vulnerabilities would affect not only private citizens but governments as well, including services and devices used by the law enforcement and national security communities. This comes amidst government efforts to significantly increase corporate responsibility for the security of customer data through laws such as the EU’s General Data Protection Regulation. Who will consumers, or the government, blame when a government-mandated backdoor is used by hackers to compromise user data? Who will be responsible for the damage?

Companies have a fiduciary responsibility to protect their customers’ data, which includes not only personally identifiable information (PII) but also intellectual property, financial data, and national security secrets.

Worse, the vulnerabilities created under laws such as the Assistance and Access Bill would be subject almost exclusively to the decisions of law enforcement authorities, leaving companies unable to make their own decisions about the security of their products. How can we expect a company to protect customer data when their most fundamental security decisions are out of their hands?

Thus far law enforcement has chosen to downplay, if not ignore, these concerns—focusing singularly on getting the information it needs. This is understandable—a law enforcement officer should use every power available to them to solve a case, just as I did when I served as a State Trooper and as an FBI Special Agent, including when I served as Executive Assistant Director (EAD) overseeing the San Bernardino terror attack case during my final months in 2015.

Decisions regarding these types of sweeping powers should not and cannot be left solely to law enforcement. It is up to the private sector, and our government, to weigh competing security and privacy interests. Our government cannot sacrifice the ability of companies and citizens to properly secure their data and systems in the name of often vague physical and national security concerns, especially when there are other ways to remedy law enforcement’s concerns.

That said, these security responsibilities cut both ways. Recent data breaches demonstrate that many companies have a long way to go to adequately protect their customers’ data. Companies cannot reasonably cry foul over the negative security impacts of proposed law enforcement data access while continuing to neglect and undermine the security of their own users’ data.

Providers and the law enforcement community should be held to robust security standards that ensure the security of our citizens and their data—we need legal restrictions on how government accesses private data and on how private companies collect and use the same data.

There may not be an easy answer to the “going dark” issue, but it is time for all of us, in government and the private sector, to understand that enhanced data security through properly implemented encryption and data use policies is in everyone’s best interest.

The “extraordinary” access sought by law enforcement cannot exist in a vacuum—it will have far-reaching and significant impacts well beyond the narrow confines of a single investigation. It is time for a serious conversation in which law enforcement and the private sector recognize that their security interests are two sides of the same coin.

A new app helps Iranians hide messages in plain sight

Image: Anti-government graffiti reading “Death to the dictator” in Farsi, sprayed on a wall north of Tehran on September 30, 2009. (Getty Images)

Amid ever-increasing government Internet control, surveillance, and censorship in Iran, a new Android app aims to give Iranians a way to speak freely.

Nahoft, which means “hidden” in Farsi, is an encryption tool that turns up to 1,000 characters of Farsi text into a jumble of random words. You can send this mélange to a friend over any communication platform—Telegram, WhatsApp, Google Chat, etc.—and then they run it through Nahoft on their device to decipher what you’ve said.

Released last week on Google Play by United for Iran, a San Francisco–based human rights and civil liberties group, Nahoft is designed to address multiple aspects of Iran’s Internet crackdown. In addition to generating coded messages, the app can also encrypt communications and embed them imperceptibly in image files, a technique known as steganography. Recipients then use Nahoft to inspect the image file on their end and extract the hidden message.
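
Nahoft’s audited steganographic scheme is not detailed here, but the general idea of hiding a payload imperceptibly inside an image can be illustrated with a classic least-significant-bit approach. The sketch below is illustrative only, assuming a lossless PNG cover image and the Pillow library; it is not the app’s actual implementation.

```python
# Minimal LSB steganography sketch (illustrative only; not Nahoft's audited scheme).
# Each bit of the payload overwrites the least-significant bit of one color channel.
from PIL import Image


def embed(cover_path: str, out_path: str, payload: bytes) -> None:
    img = Image.open(cover_path).convert("RGB")
    # Prefix the payload with a 4-byte length header so the reader knows when to stop.
    data = len(payload).to_bytes(4, "big") + payload
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

    flat = [channel for pixel in img.getdata() for channel in pixel]
    if len(bits) > len(flat):
        raise ValueError("payload too large for this cover image")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | bit  # clear the lowest bit, then set it to the payload bit

    img.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    img.save(out_path, "PNG")  # a lossless format preserves the embedded bits


def extract(stego_path: str) -> bytes:
    flat = [channel for pixel in Image.open(stego_path).convert("RGB").getdata() for channel in pixel]
    bits = [channel & 1 for channel in flat]
    raw = bytes(int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits) - 7, 8))
    length = int.from_bytes(raw[:4], "big")
    return raw[4:4 + length]
```

In practice the payload would already be ciphertext, so even someone who suspects steganography and extracts the raw bits recovers nothing readable without the key.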

Iranians can use end-to-end encrypted apps like WhatsApp for secure communications, but Nahoft, which is open source, has a crucial feature in its back pocket for when those aren’t accessible. The Iranian regime has repeatedly imposed near-total Internet blackouts in particular regions or across the entire country, including for a full week in November 2019. Even without connectivity, though, if you already have Nahoft downloaded, you can still use it locally on your device. Enter the message you want to encrypt, and the app spits out the coded Farsi message. From there you can write that string of seemingly random words in a letter, or read it to another Nahoft user over the phone, and they can enter it into their app manually to see what you were really trying to say.
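
The exact wordlist and encoding Nahoft uses are not documented here, but the core trick, mapping ciphertext bytes onto ordinary words so the result can be dictated over the phone or copied into a letter, can be sketched roughly as follows. WORDS is a hypothetical stand-in for a real list of 256 innocuous Farsi words.

```python
# Rough sketch of byte-to-word encoding (hypothetical; not Nahoft's actual wordlist or format).
# The input would already be ciphertext; the words only make it look like ordinary text.
WORDS = [f"word{i:03d}" for i in range(256)]  # placeholder for 256 real, innocuous words
INDEX = {word: i for i, word in enumerate(WORDS)}


def encode(ciphertext: bytes) -> str:
    """Map each ciphertext byte to a word, producing a string that can be dictated or copied."""
    return " ".join(WORDS[b] for b in ciphertext)


def decode(text: str) -> bytes:
    """Recover the ciphertext bytes from the word sequence."""
    return bytes(INDEX[word] for word in text.split())
```

Because the output is made of real words rather than obvious ciphertext, it can travel over any channel, or be read aloud, without standing out.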

“When the Internet goes down in Iran, people can’t communicate with their families inside and outside the country, and for activists everything comes to a screeching halt,” says Firuzeh Mahmoudi, United for Iran’s executive director, who lived through the 1979 Iranian revolution and left the country when she was 12. “And more and more the government is moving toward layered filtering, banning different digital platforms, and trying to come up with alternatives for international services like social media. This is not looking great; it’s the direction that we definitely don’t want to see. So this is where the app comes in.”

Iran is a highly connected country. More than 57 million of its 83 million citizens use the Internet. But in recent years the country’s government has been extremely focused on developing a massive state-controlled network, or intranet, known as the “National Information Network” or SHOMA. This increasingly gives the government the ability to filter and censor data, and to block specific services, from social networks to circumvention tools like proxies and VPNs.

This is why Nahoft was intentionally designed as an app that functions locally on your device rather than as a communication platform. In the case of a full Internet shutdown, users will need to have already downloaded the app to use it. But in general, it will be difficult for the Iranian government to block Nahoft as long as Google Play is still accessible there, according to United for Iran strategic adviser Reza Ghazinouri. Since Google Play traffic is encrypted, Iranian surveillance can’t see which apps users download. So far, Nahoft has been downloaded 4,300 times. It’s possible, Ghazinouri says, that the government will eventually develop its own app store and block international offerings, but for now that capability seems far off. In China, for example, Google Play is banned in favor of offerings from Chinese tech giants like Huawei and a curated version of the iOS App Store.

Ghazinouri and journalist Mohammad Heydari came up with the idea for Nahoft in 2012 and submitted it as part of United for Iran’s second “Irancubator” tech accelerator, which started last year. Operator Foundation, a Texas nonprofit development group focused on Internet freedom, engineered the Nahoft app. And the German penetration testing firm Cure53 conducted two security audits of the app and its encryption scheme, which draws from proven protocols. United for Iran has published the findings from these audits along with detailed reports about how it fixed the problems Cure53 found. In the original app review from December 2020, for example, Cure53 found some major issues, including critical weaknesses in the steganographic technique used to embed messages in photo files. All of these vulnerabilities were fixed before the second audit, which turned up more moderate issues like Android denial-of-service vulnerabilities and a bypass for the in-app auto-delete passcode. Those issues were also fixed before launch, and the app’s GitHub repository contains notes about the improvements.

The stakes are extremely high for an app that Iranians could rely on to circumvent government surveillance and restrictions. Any flaws in the cryptography’s implementation could put people’s secret communications, and potentially their safety, at risk. Ghazinouri says the group took every precaution it could think of. For example, the random word jumbles the app produces are specifically designed to seem inconspicuous and benign. Using real words makes it less likely that a content scanner will flag the coded messages. And United for Iran researchers worked with Operator Foundation to confirm that current off-the-shelf scanning tools don’t detect the encryption algorithm used to generate the coded words. That makes it less likely that censors will be able to detect encoded messages and create a filter to block them.

You can set a passcode needed to open Nahoft and set an additional “destruction code” that will wipe all data from the app when entered.
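
A rough sketch of how that passcode-versus-destruction-code behavior might work is shown below; the function and storage names are hypothetical, not Nahoft’s actual code, and a real app would store only salted hashes of both codes.

```python
# Hypothetical sketch of a passcode vs. "destruction code" check (not Nahoft's actual code).
import hashlib
import hmac
import shutil


def _digest(code: str, salt: bytes) -> bytes:
    # Derive a comparison value from the entered code; never store or compare raw codes.
    return hashlib.pbkdf2_hmac("sha256", code.encode("utf-8"), salt, 100_000)


def handle_code_entry(entered: str, salt: bytes, passcode_hash: bytes,
                      destruction_hash: bytes, data_dir: str) -> bool:
    """Return True to unlock the app; entering the destruction code wipes local data."""
    digest = _digest(entered, salt)
    if hmac.compare_digest(digest, destruction_hash):
        shutil.rmtree(data_dir, ignore_errors=True)  # erase all locally stored messages
        return False
    return hmac.compare_digest(digest, passcode_hash)
```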

“There has always been a gap between communities in need and the people who claim to work for them and develop tools for them,” Ghazinouri says. “We’re trying to shrink that gap. And the app is open source, so experts can audit the code for themselves. Encryption is an area where you can’t just ask people to trust you, and we don’t expect anyone to trust us blindly.”

In a 2020 academic keynote, “Crypto for the People,” Brown University cryptographer Seny Kamara made a similar point. The forces and incentives that typically guide cryptographic inquiry and creation of encryption tools, he argued, overlook and dismiss the specific community needs of marginalized people.

Kamara has not audited the code or cryptographic design of Nahoft, but he told WIRED that the goals of the project fit with his ideas about encryption tools made by the people, for the people.

“In terms of what the app is trying to accomplish, I think this is a good example of an important security and privacy problem that the tech industry and academia have no incentive to solve,” he says.

With Iran’s Internet freedom rapidly deteriorating, Nahoft could become a vital lifeline to keep open communication going within the country and beyond.

This story originally appeared on wired.com.

SpaceX Starlink will come out of beta next month, Elon Musk says

Image: Screenshot from the Starlink order page, with the street address blotted out.

SpaceX’s Starlink satellite-broadband service will emerge from beta in October, CEO Elon Musk said last night. Musk provided the answer of “next month” in response to a Twitter user who asked when Starlink will come out of beta.

SpaceX began sending email invitations to Starlink’s public beta in October 2020. The service is far from perfect, as trees can disrupt the line-of-sight connections to satellites and the satellite dishes go into “thermal shutdown” in hot areas. But for people in areas where wired ISPs have never deployed cable or fiber, Starlink is still a promising alternative, and service should improve as SpaceX launches more satellites and refines its software.

SpaceX has said it is serving over 100,000 Starlink users in a dozen countries from more than 1,700 satellites. The company has been taking preorders for post-beta service and said in May that “over half a million people have placed an order or put down a deposit for Starlink.”

It is still possible to place pre-orders and submit $99 deposits at the Starlink website, but the site notes that “Depending on location, some orders may take 6 months or more to fulfill.” The deposits are fully refundable.

First 500,000 to order will “likely” get service

There are capacity limits imposed by the laws of physics, and SpaceX hasn’t guaranteed that every person who pre-ordered will actually get Starlink. Musk said in May that the first 500,000 people will “most likely” get service, but that SpaceX will face “[m]ore of a challenge when we get into the several million user range.”

We asked Musk today how many orders will be fulfilled by the end of 2021 and will update this article if we get a response. Musk has said the capacity limits will primarily be a problem in densely populated urban areas, so rural people should have a good chance at getting service.

SpaceX has US permission to deploy 1 million user terminals across the country and is seeking a license to deploy up to 5 million terminals. The number of Starlink pre-orders is up to 600,000 and SpaceX is reportedly speeding up its production of dishes to meet demand, as PCMag wrote last week. 

No changes to pricing yet

In beta, SpaceX has been charging a one-time fee of $499 for the user terminal, mounting tripod, and router, plus $99 per month for service. SpaceX hasn’t announced any changes to the pricing, but that could change when it moves from beta to commercial availability.

In April, SpaceX president and COO Gwynne Shotwell said that Starlink will likely avoid “tiered pricing” and “try to keep [pricing] as simple as possible and transparent as possible.” Shotwell said that SpaceX would keep Starlink in beta “until the network is reliable and great and something we’d be proud of.” SpaceX is also working on ruggedized user terminals for aircraft, ships, large trucks, and RVs.

SpaceX has a Federal Communications Commission license to launch nearly 12,000 low-Earth orbit satellites and is seeking permission to launch an additional 30,000. Amazon, which plans its own satellite constellation, has been urging the FCC to reject the current version of SpaceX’s next-generation Starlink plan. Satellite operator Viasat supported Amazon’s protest and separately urged a federal appeals court to halt SpaceX launches, but judges rejected Viasat’s request for a stay.

Telegram emerges as new dark web for cyber criminals

Telegram has exploded as a hub for cybercriminals looking to buy, sell, and share stolen data and hacking tools, new research shows, as the messaging app emerges as an alternative to the dark web.

An investigation by cyber intelligence group Cyberint, together with the Financial Times, found a ballooning network of hackers sharing data leaks on the popular messaging platform, sometimes in channels with tens of thousands of subscribers, lured by its ease of use and light-touch moderation.

In many cases, the content resembled that of the marketplaces found on the dark web, a group of hidden websites that are popular among hackers and accessed using specific anonymizing software.

“We have recently been witnessing a 100 per cent-plus rise in Telegram usage by cybercriminals,” said Tal Samra, cyber threat analyst at Cyberint.

“Its encrypted messaging service is increasingly popular among threat actors conducting fraudulent activity and selling stolen data… as it is more convenient to use than the dark web.”

The rise in nefarious activity comes as users flocked to the encrypted chat app earlier this year after changes to the privacy policy of Facebook-owned rival WhatsApp prompted many to seek out alternatives.

Launched in 2013, Telegram allows users to broadcast messages to a following via “channels” or create public and private groups that are simple for others to access. Users can also send and receive large data files, including text and zip files, directly via the app.

The platform said it has more than 500 million active users and topped 1 billion downloads in August, according to data from SensorTower.

But its use by the cyber criminal underworld could increase pressure on the Dubai-headquartered platform to bolster its content moderation as it plans a future initial public offering and explores introducing advertising to its service.

According to Cyberint, the number of mentions in Telegram of “Email:pass” and “Combo”—hacker parlance used to indicate that stolen email and passwords lists are being shared—rose fourfold over the past year, to nearly 3,400.

In one public Telegram channel called “combolist,” which had more than 47,000 subscribers, hackers sell or simply circulate large data dumps of hundreds of thousands of leaked usernames and passwords.

Image: Ad for data posted on Telegram.

A post titled “Combo List Gaming HQ” offered 300,000 emails and passwords that it claimed were useful for hacking video game platforms such as Minecraft, Origin, or Uplay. Another purported to have 600,000 logins for users of the services of Russian Internet group Yandex, others for Google and Yahoo.

Telegram removed the channel on Thursday after it was contacted by the Financial Times for comment.

Yet email password leaks account for only a fraction of the worrisome activity on the Telegram marketplace. Other types of data traded include financial data such as credit card information, copies of passports and credentials for bank accounts and sites such as Netflix, the research found. Online criminals also share malicious software, exploits and hacking guides via the app, Cyberint said.

Meanwhile, links to Telegram groups or channels shared inside forums on the dark web jumped to more than 1 million in 2021, from 172,035 the previous year, as hackers increasingly direct users to the platform as an easier-to-use alternative or parallel information center.

The research follows a separate report earlier this year by vpnMentor, which found data dumps circulating on Telegram from previous hacks and data leaks of companies including Facebook, marketing software provider Click.org, and dating site Meet Mindful, among others.

“In general, it appears that most data leaks and hacks are only shared on Telegram after being sold on the dark web—or the hacker failed to find a buyer and decided to share the information publicly and move on,” vpnMentor said.

Still, it dubbed the trend “a serious escalation in the ongoing surge of cyber crime,” noting that some users in these groups appeared less tech savvy than a typical dark web user.

Telegram said it was unable to verify the vpnMentor findings because the researchers had not shared details identifying which channels these alleged leaks were in.

Samra said the transition for cybercriminals from the dark web to Telegram was taking place in part because of the anonymity afforded by encryption—but noted that many of these groups were also public.

Image: Post from a Telegram channel called “combolist.”

Telegram is also more accessible, provides better functionality, and is generally less likely to be tracked by law enforcement when compared to dark web forums, he added.

“In some cases, it’s easier to find buyers on Telegram rather than a forum because everything is smoother and quicker. Access is easier… and data can be shared much more openly.”

Hackers are less inclined to use WhatsApp both for privacy reasons and because it displays users’ numbers in group chats, unlike Telegram, Cyberint said. Encrypted app Signal remains smaller and tends to be used for more general messaging among people who know each other rather than forum-style groups, it added.

Telegram has long taken a more lax approach to content moderation than larger social media apps such as Facebook and Twitter, attracting scrutiny for allowing hate groups and conspiracy theories to flourish. In January, it began shutting down public extremist and white supremacist groups—for the first time—in the wake of the Capitol riots amid concerns it was being used to promote violence.

The Cyberint research—particularly the uncovering of public, searchable groups for cybercriminals—raises further questions about Telegram’s content moderation policies and enforcement at a time when chief executive Pavel Durov has said the company is preparing to sell advertisements in public Telegram channels.

It also comes as the company prepares to head for public markets after raising more than $1 billion through bond sales in March from investors including Mubadala Investment Company, the Abu Dhabi sovereign wealth fund, and Abu Dhabi Catalyst Partners, a joint venture between Mubadala and the $4 billion New York hedge fund Falcon Edge Capital.

Telegram said in a statement that it “has a policy for removing personal data shared without consent.” It added that each day, its “ever growing force of professional moderators” removes more than 10,000 public communities for terms of service violations following user reports.

© 2021 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
