CISOs given cyber leadership role in Australia’s new Information Security Manual

The Australian Signals Directorate (ASD) Essential Eight strategies for mitigating cyber attacks, and a focus on risk management, are at the core of the Australian government’s latest Information Security Manual (ISM) released on Tuesday.

“We’re pushing the Essential Eight more, absolutely, because we know that’s good advice,” Alastair MacGibbon, head of the Australian Cyber Security Centre (ACSC), told ZDNet.

Many controls that were previously given the priority of “should” are now a “must”.

The ISM now includes specific mandatory controls: limiting and logging privileged access to systems, applications, and information; blocking Adobe Flash content, web advertisements, and Java from the internet; and disabling unrequired functionality in Microsoft Office, web browsers, and PDF readers.

Microsoft Office macros are now severely restricted.

“Microsoft Office macros are only allowed to execute in documents from Trusted Locations where write access is limited to personnel whose role is to vet and approve macros,” the ISM reads. “Microsoft Office macros in documents originating from the Internet are blocked.”
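
For illustration, this control maps to Microsoft’s “Block macros from running in Office files from the Internet” group policy, which can also be set directly in the registry. The sketch below is a minimal Python example under stated assumptions: a Windows host, Office 2016 or later (the 16.0 policy hive), and per-user policy keys; the key path and value name come from Microsoft’s documentation rather than the ISM itself.

    # Minimal sketch: enable "Block macros from running in Office files
    # from the Internet" for Word via the per-user policy registry hive.
    # Assumes Windows and Office 2016+ (the 16.0 hive); repeat for Excel,
    # PowerPoint, etc. by changing the application name in the path.
    import winreg

    POLICY_PATH = r"Software\Policies\Microsoft\Office\16.0\Word\Security"

    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 1 = block macros in documents carrying the internet Mark of the Web
        winreg.SetValueEx(key, "blockcontentexecutionfrominternet", 0,
                          winreg.REG_DWORD, 1)

In practice, agencies would deploy this through Group Policy rather than per-machine scripts, alongside the Trusted Locations restrictions quoted above.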

The scope of the chief information security officer (CISO) role is also expanded under the latest ISM.

Previously, the CISO was described as setting the “strategic direction” for an agency’s information security; “facilitating communications” between the security, ICT, and business personnel; and ensuring compliance.

The 2018 edition of the ISM is more direct — the CISO’s role is to “provide cyber security leadership for their organisation”.

The responsibilities of system owners have been clarified. System owners are now required to “register each system with the system’s authorising officer”, and “monitor security risks and the effectiveness of security controls for each system”, rather than those responsibilities existing at lower levels.

See: Autonomous cyber defences are the future

The ISM also now specifies what an incident response plan needs to contain: the expected response to each type of incident; internal and external reporting procedures; the steps necessary to ensure the integrity of evidence; and the criteria for referring an incident to law enforcement or the ACSC.
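
As a rough illustration only (the structure and field names below are our own shorthand, not ISM terminology), those required elements could be captured in a simple machine-readable form:

    # Illustrative sketch of the elements the ISM expects an incident
    # response plan to contain; field names are our own shorthand.
    from dataclasses import dataclass, field

    @dataclass
    class IncidentResponsePlan:
        # Expected response keyed by incident type, e.g. "malware", "data spill"
        responses_by_incident_type: dict[str, str] = field(default_factory=dict)
        internal_reporting_procedure: str = ""
        external_reporting_procedure: str = ""
        evidence_integrity_steps: list[str] = field(default_factory=list)
        law_enforcement_referral_criteria: list[str] = field(default_factory=list)
        acsc_referral_criteria: list[str] = field(default_factory=list)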

“There is an increased responsibility in 2018 on system owners to truly protect their systems through proper risk management. It is not compliance versus risk. It’s the right type of compliance,” MacGibbon said.

“To me, compliance is hygiene, and we need good hygiene because that’s what makes you secure. What makes you more secure is proper risk management on top of good hygiene.

“So we are expecting a maturation on the part of systems owners, yes.”

MacGibbon cited the example of legacy systems that can’t be patched, because the patches don’t exist. The question then becomes one of how the agency achieves “the intent rather than the black letter” of the ISM controls.

Rather than scrapping a legacy application running on Windows XP, for example, an agency could put in place controls that limit the likelihood that it would be exposed to the relevant threats.

“Risk management doesn’t mean you can cut corners. Risk management actually means you’re more effective at addressing risk,” MacGibbon said.

The ISM is a key element of the cybersecurity policies for Australian government agencies. It fleshes out the cybersecurity components of the Protective Security Policy Framework (PSPF) administered by the Attorney-General’s Department, which also covers personnel security, physical security, and governance.

In previous years, the ISM was split into an Executive Companion, a set of Information Security Principles, and the Information Security Controls themselves.

The 2018 edition is a single volume, intended for CISOs, chief information officers, cyber security professionals, and information technology managers. Discussions of governance issues are more closely integrated with the specifications of technical controls.

The new ISM also removes language referring to the roles of IT Security Advisor (ITSA), IT Security Manager (ITSM), and IT Security Officer (ITSO), as well as the security classification CONFIDENTIAL, all of which have been removed from the PSPF.

Also: Culture the missing link for cybersecurity’s weakest link

Other evolutionary changes include more detail on application hardening; more detail on backup, restoration, and preservation strategies; and more detail on the requirements for cyber security awareness raising and training.

“We can’t change the 2017 version of the ISM to the 2018 version of the ISM to be radically different, but you’ll see a directional shift towards more effective risk management,” MacGibbon said.

“We know there are ways we could [solve] a lot of the problems that we see, and that’s through following the ASD/ACSC advice. So we’re saying follow that advice, but how you follow it is the question.”

Cloud Data Security

Data security has become an integral part of the technology stack for modern applications. Protecting application assets and data against cybercriminal activities, insider threats, and basic human negligence is no longer an afterthought. It must be addressed early and often, both in the application development cycle and in the data analytics stack.

The requirements have grown well beyond the simplistic features provided by data platforms, and as a result a competitive industry has emerged to address the security layer. The capabilities of this layer must be more than thorough; they must also be usable and streamlined, adding minimal overhead to existing processes.

To measure the policy management burden, we designed a reproducible test that included a standardized, publicly available dataset and a number of access control policy management scenarios based on real-world use cases we have observed for cloud data workloads. We tested two options: Apache Ranger with Apache Atlas, and Immuta. This study contrasts a largely role-based access control model with object tagging (OT-RBAC) against a pure attribute-based access control (ABAC) model, using these respective technologies.

This study captures the time and effort involved in managing the ever-evolving access control policies of a modern data-driven enterprise. We show the impact of data access control policy management in terms of:

  • Dynamic versus static
  • Scalability
  • Evolvability

In our scenarios, Ranger alone took 76x more policy changes than Immuta to accomplish the same data security objectives, while Ranger with Apache Atlas took 63x more policy changes. For our advanced use cases, Immuta only required one policy change each, while Ranger was not able to fulfill the data security requirement at all.

This study exposed the limitations of extending legacy Hadoop security components into cloud use cases. Apache Ranger uses static policies in an OT-RBAC model designed for the Hadoop ecosystem, with very limited support for attributes, and the contrast with Immuta’s ABAC model became clear. By leveraging dynamic variables, nested attributes, and global row-level security policies, Immuta can be implemented and updated far more quickly than Ranger.
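
To make the distinction concrete, the sketch below contrasts the two models in simplified Python. It is our own illustration, not actual Ranger or Immuta policy syntax: the role-based side needs a new entry for every role/table combination, while the attribute-based side is one predicate evaluated against user and data attributes at query time.

    # Simplified illustration of the two access control models; not
    # actual Ranger or Immuta policy syntax.

    # OT-RBAC style: static policies enumerating role/object pairs.
    # Every new table or role typically means another policy entry.
    RBAC_POLICIES = {
        ("analyst_us", "sales_us"): "allow",
        ("analyst_eu", "sales_eu"): "allow",
        # ... one entry per role/table combination
    }

    def rbac_allows(role, table):
        return RBAC_POLICIES.get((role, table)) == "allow"

    # ABAC style: one dynamic policy comparing user attributes with data
    # tags, so new tables and users need no policy change at all.
    def abac_allows(user_attrs, data_tags):
        return (user_attrs.get("region") == data_tags.get("region")
                and "sales" in user_attrs.get("purposes", set()))

    # A newly registered EU sales table is covered automatically:
    print(abac_allows({"region": "eu", "purposes": {"sales"}},
                      {"region": "eu", "domain": "sales"}))  # True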

Using Ranger as a data security mechanism creates a high policy-management burden as organizations migrate to the cloud and expand their data use. Immuta, by contrast, is shown here to provide scalability, clarity, and evolvability for a complex enterprise’s data security and governance needs.

The chart in Figure 1 reveals the difference in cumulative policy changes required for each platform configuration.

Figure 1. Difference in Cumulative Policy Changes

The assessment and scoring rubric and methodology are detailed in the report. We leave it to the reader to judge their fairness, and we strongly encourage you to discern for yourself what is of value. We hope this report is informative and helpful in uncovering some of the challenges and nuances of data governance platform selection. You are encouraged to compile your own representative use cases and workflows and to review these platforms in a way that reflects your own requirements.

GigaOm Radar for Data Loss Prevention

Data is at the core of modern business: It is our intellectual property, the lifeblood of our interactions with our employees, partners, and customers, and a true business asset. But in a world of increasingly distributed workforces, a growing threat from cybercriminals and bad actors, and ever more stringent regulation, our data is at risk, and the impact of losing it, or losing access to it, can be catastrophic.

With this in mind, a strong data management and security strategy must be high on the agenda of any modern enterprise. The security of our data has to be a primary concern. Ensuring we know how, why, and where our data is used is crucial, as is the need to be sure that data does not leave the organization without appropriate checks and balances.

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. People and processes are key to any data loss prevention (DLP) strategy, as, of course, is technology.

This has led to a reevaluation of both DLP technology and strategy, and a recognition that we must evolve an approach that is holistic, intelligent, and able to apply context to our data usage. DLP must form part of a broader risk management strategy.

Within this report, we evaluate the leading vendors offering solutions that can form part of your DLP strategy: tools that understand data and evaluate insider risk to help mitigate the threat of data loss. This report aims to give enterprise decision-makers an overview of how these offerings can form part of a wider data security approach.

Key Criteria for Evaluating Data Loss Prevention Platforms

Data is a crucial asset for modern businesses and has to be protected in the same way as any other corporate asset, with diligence and care. Loss of data can have catastrophic effects, from reputational damage to significant fines for breaking increasingly stringent regulations.

While the risk of data loss is not new, the landscape we operate in is evolving rapidly. Data can leave data centers in many ways, whether accidental or malicious. The routes for exfiltration also continue to grow, ranging from email, USB sticks, and laptops to ever-more-widely-adopted cloud applications, collaboration tools, and mobile devices. This is driving a resurgence in the enterprise’s need to ensure that no data leaves the organization without appropriate checks and balances in place.

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. Policy, people, and technology are critical components in a data loss prevention (DLP) strategy.

As with any information security strategy, technology plays a significant role. DLP technology has traditionally played a part in helping organizations to mitigate some of the risks of uncontrolled data exfiltration. However, both the technology and threat landscape have shifted significantly, which has led to a reevaluation of DLP tools and strategy.

The modern approach to the challenge needs to be holistic and intelligent, capable of applying context to data usage by building a broader understanding of what the data is, who is using it, and why. Systems in place must also be able to learn when user activity should be classified as unusual so they can better interpret signs of a potential breach.
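
As a toy illustration of that idea (our own simplification, not any vendor’s method), a baseline-and-deviation check might flag a user whose daily outbound data volume departs sharply from their own history:

    # Toy sketch: flag a user's daily outbound data volume as unusual
    # when it deviates strongly from that user's historical baseline.
    # A z-score threshold stands in for the far richer behavioural
    # models real DLP platforms use; it is illustrative only.
    from statistics import mean, stdev

    def is_unusual(history_mb, today_mb, threshold=3.0):
        if len(history_mb) < 2:
            return False  # not enough history to form a baseline
        mu, sigma = mean(history_mb), stdev(history_mb)
        if sigma == 0:
            return today_mb != mu
        return abs(today_mb - mu) / sigma > threshold

    # A user who normally sends ~50 MB a day suddenly sends 800 MB.
    print(is_unusual([48, 52, 50, 47, 55, 49], 800))  # True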

This advanced approach is also driving new ways of defining the discipline of data loss prevention. Dealing with these risks cannot be viewed in isolation; rather, it must be part of a wider insider risk-management strategy.

Stopping the loss of data, accidental or otherwise, is no small task. This GigaOm Key Criteria report details DLP solutions and identifies key criteria and evaluation metrics for selecting such a solution. The corresponding GigaOm Radar report identifies vendors and products in this sector that excel. Together, these reports will give decision-makers an overview of the market to help them evaluate existing platforms and decide where to invest.

How to Read this Report

This GigaOm report is one of a series of documents that helps IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.
