

At least nine global MSPs hit in APT10 attacks: ACSC



(Image: ACSC)

Nine global service providers are known to have been compromised in attacks by China’s APT10 group, according to Alastair MacGibbon, head of the Australian Cyber Security Centre (ACSC).

On Friday, the US formally attributed these attacks to China’s Ministry of State Security (MSS) in its indictment of two Chinese nationals who it alleges are members of the group.

APT10 is the name given to the group by FireEye. Other names assigned to it are Red Apollo (PwC), CVNX (BAE Systems), Stone Panda (CrowdStrike), POTASSIUM (Microsoft), and MenuPass (Trend Micro).

Hewlett Packard Enterprise (HPE) and IBM are among the MSPs affected, Reuters reported on Friday. The companies were infiltrated “multiple times in breaches that lasted for weeks and months”, the report said, although neither company has commented officially.

“We’re not naming any managed service providers,” MacGibbon told ZDNet.

“One, we said we wouldn’t name them. And two, I can’t be sure, and none of our allies can be sure, that we know all of the compromised global providers,” he said.

“We know of, I think, nine global service providers that have been compromised… And they’re the ones we know about. Very Rumsfeldian, but it’s what you don’t know that is problematic. The unknown unknowns.”

The ACSC does not know how many of the MSPs’ customers have been affected either, MacGibbon said.

In part that is because of the “subtlety” and “sophistication” of the attacks, and in part because of the way the MSPs have built their systems to be “scalable and global in nature”.

“[This] often means that they don’t segment, and do other things to their networks, that you would argue is sensible,” he said.

Australian customers of compromised MSPs have not been named, but MacGibbon says that globally the targets have been organisations like mining companies, tech companies, and those involved in advanced manufacturing.

“It’s commercial secrets. It’s not about the traditional strategic intelligence. It’s not about, frankly, defence systems, or secrets from governments… [It’s] all of those things where a country may want to win in that competition, stealing the lifeblood from their competitors in the West,” MacGibbon said.

“The reason why I said on RN [the Australian Broadcasting Corporation’s Radio National] that I don’t believe but can’t prove that government entities are being victims is because generally the way government uses outsourced IT providers is different to how some corporates will. We put in place some different architectures.”

But why name China now?

A key question is why China is being called out now.

In April 2017, Chinese Premier Li Keqiang and then-Australian Prime Minister Malcolm Turnbull signed an agreement to refrain from the cyber-enabled theft of intellectual property, trade secrets, or confidential business information.

The activities of APT10 had been revealed just two weeks beforehand in PwC’s Operation Cloud Hopper report, produced in conjunction with BAE Systems and the UK’s National Cyber Security Centre (NCSC).

“As a result of our analysis of APT10’s activities, we believe that it almost certainly benefits from significant staffing and logistical resources, which have increased over the last three years, with a significant step-change in 2016,” PwC wrote at the time.

In April this year, the NCSC warned that third-party suppliers were now an organisation’s weakest link, citing the success of Cloud Hopper as an example.

And as recently as last month, joint Fairfax Media/Nine News reporting confirmed that China’s Ministry of State Security is behind Cloud Hopper.

MacGibbon uses what he calls his “tortured” doctor analogy. If you’re in pain, a doctor might at first advise a couple of days off work, and to come back if the pain persists. Next might come manipulation of the limb, and so on.

“We’re now into what I call radical surgery phase. We’ve tried other things. Clearly, to dislodge the threat actor themselves, and to send a message to them, in this case APT10 working on behalf of the Ministry of State Security (MSS) in China.

“That’s an important lever we need to pull to get them to change.”

This is presumably part of the coordinated diplomatic campaign against nation-states breaching the so-called “cyber norms”, the same campaign that named Russia as the nation-state actor behind the NotPetya attack and blamed North Korea for the WannaCry incident.

But the ACSC’s announcement is also intended to drive action inside Australia’s economy.

The ACSC’s website has posted advice for Australian businesses in the wake of the MSP breaches.

“[MSPs] need to change the way they do their business, because if they are compromised it could potentially compromise all of their customers. Then those that consume those services, what can you do to architect this arrangement to still get the benefits of outsourced IT and reduce the risks,” MacGibbon said.

“So it’s a wake-up call, and we’re using, frankly, naming the MSS as a fulcrum to create leverage to change the way we behave domestically.”

MacGibbon acknowledges that it’s “not the best time of year” to launch an awareness campaign, however. He cites the US indictments as a trigger for it happening now.

“Once everyone’s eaten enough turkey and had enough ham, we’ll be back out again to drive change, where we hope that members of boards, CEOs, and customers start asking questions on how to change the way they construct their IT systems.”

Related Coverage

US charges two Chinese nationals for hacking cloud providers, NASA, the US Navy

The two Chinese nationals were members of the infamous APT10 cyber-espionage group, DOJ said.

DHS aware of ongoing APT attacks on cloud service providers

Attacks most likely linked to APT10, a Chinese cyber-espionage group, also known as Red Apollo, Stone Panda, POTASSIUM, or MenuPass.

Advanced Chinese hacking campaign infiltrates IT service providers across the globe

‘Cloud Hopper’ campaign by sophisticated APT10 hacking group uses advanced phishing and customised malware to conduct espionage.

Elite Chinese hackers target board directors at some of the world’s largest firms

The APT10 hacking group has struck again, this time using a watering hole attack to compromise the National Foreign Trade Council website and gather sensitive data about its directors.

Top 4 security threats businesses should expect in 2019 (TechRepublic)

Cybercriminals are developing more sophisticated attacks, while individuals and enterprises need to be more proactive in security practices.




Cloud Data Security


Data security has become an integral part of the technology stack for modern applications. Protecting application assets and data against cybercriminal activities, insider threats, and basic human negligence is no longer an afterthought. It must be addressed early and often, both in the application development cycle and the data analytics stack.

The requirements have grown well beyond the simplistic features provided by data platforms, and as a result a competitive industry has emerged to address the security layer. The capabilities of this layer must be more than thorough; they must also be usable and streamlined, adding minimal overhead to existing processes.

To measure the policy management burden, we designed a reproducible test that included a standardized, publicly available dataset and a number of access control policy management scenarios based on real-world use cases we have observed for cloud data workloads. We tested two options: Apache Ranger with Apache Atlas, and Immuta. The study contrasts a largely role-based access control model with object tagging (OT-RBAC) against a pure attribute-based access control (ABAC) model, as implemented by these respective technologies.
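
To make the contrast concrete, here is a minimal Python sketch of the two models. It is our own illustration, not code from Ranger or Immuta, and every name in it (role_tag_grants, User, Resource) is hypothetical:

    # Illustrative sketch only; not Ranger's or Immuta's actual API.
    from dataclasses import dataclass, field

    @dataclass
    class User:
        name: str
        roles: set = field(default_factory=set)    # RBAC: coarse role grants
        attrs: dict = field(default_factory=dict)  # ABAC: e.g. {"region": "EU"}

    @dataclass
    class Resource:
        name: str
        tags: set = field(default_factory=set)     # object tags, e.g. {"pii"}
        attrs: dict = field(default_factory=dict)

    # OT-RBAC: an explicit (role, tag) grant must exist for every combination,
    # so new roles, tags, or datasets usually mean new policy entries.
    role_tag_grants = {("analyst", "public"), ("auditor", "pii")}

    def otrbac_allows(user, resource):
        return any((role, tag) in role_tag_grants
                   for role in user.roles for tag in resource.tags)

    # ABAC: one rule over attributes covers all matching users and resources,
    # so onboarding new users or tables needs no new policy entries.
    def abac_allows(user, resource):
        return ("pii" not in resource.tags
                or user.attrs.get("region") == resource.attrs.get("region"))

    u = User("ana", roles={"analyst"}, attrs={"region": "EU"})
    r = Resource("customers", tags={"pii"}, attrs={"region": "EU"})
    print(otrbac_allows(u, r), abac_allows(u, r))  # False True

Under the OT-RBAC model, each new role, tag, or dataset combination typically demands another explicit grant; under the ABAC model, the single attribute rule already covers newly added users and resources.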

This study captures the time and effort involved in managing the ever-evolving access control policies at a modern data-driven enterprise. With this study, we show the impacts of data access control policy management in terms of:

  • Dynamic versus static
  • Scalability
  • Evolvability

In our scenarios, Ranger alone required 76x more policy changes than Immuta to accomplish the same data security objectives, while Ranger with Apache Atlas required 63x more. For our advanced use cases, Immuta required only one policy change each, while Ranger could not fulfill the data security requirement at all.

This study exposed the limitations of extending legacy Hadoop security components into cloud use cases. Apache Ranger uses static policies in an OT-RBAC model built for the Hadoop ecosystem, with very limited support for attributes. The difference between it and Immuta’s ABAC model became clear: by leveraging dynamic variables, nested attributes, and global row-level security policies, Immuta can be implemented and updated far more quickly than Ranger.
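
To illustrate why dynamic attributes reduce the policy count (again a hedged, hypothetical sketch rather than Immuta’s actual policy language), the single attribute-driven row-level rule below serves every department, present or future, with no further policy changes:

    # Hypothetical row-level filter: one attribute-driven rule replaces a
    # separate static policy for every department value.
    rows = [
        {"customer": "a", "dept": "finance"},
        {"customer": "b", "dept": "hr"},
    ]

    def row_filter(user_attrs):
        # Dynamic variable: the user's own department is substituted at query
        # time, so onboarding a new department requires no policy change.
        return lambda row: row["dept"] == user_attrs.get("dept")

    visible = list(filter(row_filter({"dept": "finance"}), rows))
    print(visible)  # [{'customer': 'a', 'dept': 'finance'}]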

Using Ranger as a data security mechanism creates a high policy-management burden compared to Immuta as organizations migrate to and expand their cloud data use. Immuta, by contrast, is shown here to provide scalability, clarity, and evolvability for a complex enterprise’s data security and governance needs.

The chart in Figure 1 reveals the difference in cumulative policy changes required for each platform configuration.

Figure 1. Difference in Cumulative Policy Changes

The assessment and scoring rubric and methodology are detailed in the report. We leave it to you, the reader, to judge their fairness and to discern for yourself what is of value. We hope this report is informative and helpful in uncovering some of the challenges and nuances of data governance platform selection. You are encouraged to compile your own representative use cases and workflows and to review these platforms in a way that is applicable to your requirements.



GigaOm Radar for Data Loss Prevention


Data is at the core of modern business: It is our intellectual property, the lifeblood of our interactions with our employees, partners, and customers, and a true business asset. But in a world of increasingly distributed workforces, a growing threat from cybercriminals and bad actors, and ever more stringent regulation, our data is at risk, and the impact of losing it, or losing access to it, can be catastrophic.

With this in mind, a strong data management and security strategy must be high on the agenda of any modern enterprise, and the security of our data has to be a primary concern. Ensuring we know how, why, and where our data is used is crucial, as is the need to be sure that data does not leave the organization without appropriate checks and balances.

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. People and processes are key to any data loss prevention (DLP) strategy, as, of course, is technology.

This has led to a reevaluation of both the technology behind DLP and the approach to it: a recognition that we must evolve an approach that is holistic, intelligent, and able to apply context to our data usage. DLP must form part of a broader risk management strategy.

Within this report, we evaluate the leading vendors offering solutions that can form part of your DLP strategy: tools that understand data and evaluate insider risk to help mitigate the threat of data loss. This report aims to give enterprise decision-makers an overview of how these offerings can form part of a wider data security approach.



Key Criteria for Evaluating Data Loss Prevention Platforms


Data is a crucial asset for modern businesses and has to be protected in the same way as any other corporate asset, with diligence and care. Loss of data can have catastrophic effects, from reputational damage to significant fines for breaking increasingly stringent regulations.

While the risk of data loss is not new, the landscape we operate in is evolving rapidly. Data can leave data centers in many ways, whether accidental or malicious. The routes for exfiltration also continue to grow, ranging from email, USB sticks, and laptops to ever-more-widely-adopted cloud applications, collaboration tools, and mobile devices. This is driving a resurgence in the enterprise’s need to ensure that no data leaves the organization without appropriate checks and balances in place.
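
As a toy illustration of such a check (our own sketch; production DLP engines combine document fingerprinting, machine learning, and contextual analysis rather than a lone pattern), the snippet below refuses to let text containing card-like numbers leave the organization:

    import re

    # Toy exfiltration check: block outbound text containing card-like numbers.
    CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

    def may_leave_org(message: str) -> bool:
        """Return True if the message passes this (very crude) DLP check."""
        return not CARD_PATTERN.search(message)

    print(may_leave_org("Quarterly numbers attached."))           # True
    print(may_leave_org("Card: 4111 1111 1111 1111, exp 12/26"))  # False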

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. Policy, people, and technology are critical components in a data loss prevention (DLP) strategy.

As with any information security strategy, technology plays a significant role. DLP technology has traditionally played a part in helping organizations to mitigate some of the risks of uncontrolled data exfiltration. However, both the technology and threat landscape have shifted significantly, which has led to a reevaluation of DLP tools and strategy.

The modern approach to the challenge needs to be holistic and intelligent, capable of applying context to data usage by building a broader understanding of what the data is, who is using it, and why. Systems in place must also be able to learn when user activity should be classified as unusual so they can better interpret signs of a potential breach.
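
As one hedged sketch of what “learning unusual activity” can mean in practice (hypothetical data and threshold; real insider-risk tools use far richer behavioural models), the snippet below flags a transfer volume that deviates sharply from a user’s own baseline:

    import statistics

    # Per-user history of daily outbound transfer volumes in MB (hypothetical).
    history = {"ana": [12, 9, 15, 11, 10, 13, 12]}

    def is_unusual(user, todays_mb, threshold=3.0):
        # Flag activity more than `threshold` standard deviations above the
        # user's own baseline: a crude stand-in for the behavioural models
        # real insider-risk tools employ.
        baseline = history[user]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0  # guard against zero spread
        return (todays_mb - mean) / stdev > threshold

    print(is_unusual("ana", 500))  # True: a sudden 500 MB spike stands out
    print(is_unusual("ana", 14))   # False: within normal variation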

This advanced approach is also driving new ways of defining the discipline of data loss prevention. Dealing with these risks cannot be viewed in isolation; rather, it must be part of a wider insider risk-management strategy.

Stopping the loss of data, accidental or otherwise, is no small task. This GigaOm Key Criteria report details DLP solutions and identifies key criteria and evaluation metrics for selecting such a solution. The corresponding GigaOm Radar report identifies vendors and products in this sector that excel. Together, these reports will give decision-makers an overview of the market to help them evaluate existing platforms and decide where to invest.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.

