
Cybersecurity: How a layered approach keeps this F1 team’s data secure


Gary Foote, CIO at Rich Energy Haas F1, says information is critical to the success of the Formula 1 racing team – and that’s why the establishment of a data security strategy is one of his key business priorities.

“We’re in a competitive sport, so our asset is the data that helps make us more reliable and quicker on a circuit,” he says. “The data that we’re generating – from conception in design, to being used creatively by partners, and onto the race track – is crucial and we have a complex data model where information is flying all over the place but must be kept secure.”

Foote says his organisation is focused on two main types of data: business information, such as finance figures and HR records; and the product information relating to the car, which is best described as Haas F1’s intellectual property.

SEE: Digital transformation: A CXO’s guide (ZDNet special report) | Download the report as a PDF (TechRepublic)

He says both of these data types are “hugely important” and must be managed across a dispersed geography: “We’ve got data movement all over the place and, as a result, data security becomes a bigger challenge than it might be for other organisations and even some of the other teams in F1.”

Haas F1 has an operations centre in the UK, where Foote is primarily based, and an office in the US, which houses the firm’s sister racing team, Stewart-Haas Racing. There are two additional centres relating to design partners in Italy, and the team also maintains a mobile operation that moves around the globe fulfilling race-day duties.

One of the main explanations for that geographical dispersal is the team’s business model. In an attempt to keep costs low, Foote says the model relies on outsourcing as much non-core expertise as possible.

“And when you outsource expertise, by definition you outsource data,” he says. “Our data movement strategy is complex – overly complex to a degree. As an example, our computational fluid dynamic specialists are based in the US office, but they use a high-performance computing cluster based in the UK, and we provide the results from that platform to aerodynamicists who are based in Italy.”

It’s Foote’s responsibility as the team’s CIO to ensure that there is adequate protection for the Haas F1 brand and its assets on a global scale. His approach has been to create a business strategy for data security that aims to neutralise the complex geographical structure of the business.

“We’re looking for strong layers of protection placed strategically, with good vendor and product selection. We want to create a sensibly layered strategy, where products complement each other,” says Foote.

[Image] Foote: “We want to create a sensibly layered strategy, where products complement each other.” (Image: Haas F1)

Technology partnerships can play a key role in F1. The sport is heavily dependent on its corporate sponsors, yet Foote describes his team as “completely vendor agnostic” when it comes to technology suppliers. 

“We have the freedom and finances to deploy the products that we want to. Rather than having a confined toolbox, we’re able to collect the products to use for the problem we’re trying to solve. But it’s really important that they complement each other,” says Foote, who says the products his team chooses must fit his security strategy.

“Layers are great but we also need to keep a check on compatibility and complexity. I talk to my guys quite a lot about the problems you introduce into IT when you create complexity. It’s not good having a hugely redundant disaster recovery network if it’s so over-complex that when it breaks it’s impossible to fix. By making things simpler, you can make things better.”

One of the key products Foote has chosen to implement recently is Nominet’s NTX cybersecurity platform to help keep the team’s data networks secure. The technology analyses domain name system (DNS) traffic to predict, detect and block threats to the network before they cause harm.
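Nominet has not detailed NTX’s internals here, but the general shape of DNS-layer filtering can be sketched: each query name is checked against threat intelligence and simple heuristics, such as the high character entropy typical of algorithmically generated malware domains. The Python fragment below is an illustration only, with an invented blocklist and an arbitrary threshold – not the NTX platform’s actual logic:

```python
import math
from collections import Counter

# Hypothetical threat-intel blocklist; a real platform would use live feeds.
BLOCKLIST = {"evil-c2.example", "phish-login.example"}

def shannon_entropy(s: str) -> float:
    """Entropy of the character distribution; DGA domains tend to score high."""
    counts = Counter(s)
    total = len(s)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def verdict(qname: str) -> str:
    """Classify a DNS query name as 'block' or 'allow'."""
    name = qname.rstrip(".").lower()
    if name in BLOCKLIST:
        return "block"  # known-bad domain
    label = name.split(".")[0]
    # 3.5 bits is an arbitrary illustrative threshold, not a tuned value.
    if len(label) >= 12 and shannon_entropy(label) > 3.5:
        return "block"  # looks algorithmically generated
    return "allow"

for q in ["www.haasf1team.com", "xkqz9f2jw7h3a1.example", "evil-c2.example"]:
    print(q, "->", verdict(q))
```

In production, the hard work lies in the quality of the intelligence feeds and keeping the false-positive rate low, not in the lookup itself.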

“We want to add layers to protect our data from threats coming in and threats going out, such as best-intended user actions that might accidentally lead to malware,” he says. “We’ve got to continually stay at the leading edge of security technology.”

SEE: A winning strategy for cybersecurity (ZDNet special report) | Download the report as a PDF (TechRepublic)

Foote says the selection process was driven by going to market and seeing who could offer products and services to bolster the team’s security strategy. The organisation’s policy for system implementation is to leave race-day systems until last, preferring the technology to be proven in a business environment first.

The Nominet DNS platform, which has now been in place for about six months, analyses and categorises potentially billions of queries in its attempts to eliminate malware, phishing and data theft from the network. Foote is already seeing business benefits. “Nominet are putting a set of eyes on a huge amount of traffic that I physically don’t have the manpower internally to do,” he says.

Foote refers to cybersecurity as “an ever-changing landscape” and says Haas F1 must keep up with the latest trends. When new security technologies emerge, Foote wants his team to analyse these tools and investigate how such layers might help block actors intent on doing harm.

“The key thing for me is security – and it’s almost number one in terms of our technical remit here. F1 is incredibly fast-paced and the worry is less about data going to competitors and more about exposure to individuals who want to use it as a platform for their own gain. Being a global company in a global sport on a global stage means we’re quite a target. Any security technology that’s learning as we go is a big benefit to us,” he says.


Cloud Data Security

Data security has become an immutable part of the technology stack for modern applications. Protecting application assets and data against cybercriminal activities, insider threats, and basic human negligence is no longer an afterthought. It must be addressed early and often, both in the application development cycle and the data analytics stack.

The requirements have grown well beyond the simplistic features provided by data platforms, and as a result a competitive industry has emerged to address the security layer. The capabilities of this layer must be more than thorough; they must also be usable and streamlined, adding a minimum of overhead to existing processes.

To measure the policy management burden, we designed a reproducible test that included a standardized, publicly available dataset and a number of access control policy management scenarios based on real-world use cases we have observed for cloud data workloads. We tested two options: Apache Ranger with Apache Atlas, and Immuta. This study contrasts a largely role-based access control model with object tagging (OT-RBAC) against a pure attribute-based access control (ABAC) model, using these respective technologies.

This study captures the time and effort involved in managing the ever-evolving access control policies at a modern data-driven enterprise. With this study, we show the impacts of data access control policy management in terms of:

  • Dynamic versus static
  • Scalability
  • Evolvability

In our scenarios, Ranger alone required 76x more policy changes than Immuta to accomplish the same data security objectives, while Ranger with Apache Atlas required 63x more. For our advanced use cases, Immuta required only one policy change each, while Ranger was unable to fulfill the data security requirement at all.

This study exposed the limitations of extending legacy Hadoop security components into cloud use cases. Apache Ranger uses static policies in an OT-RBAC model for the Hadoop ecosystem, with very limited support for attributes. The difference between it and Immuta’s ABAC model became clear: by leveraging dynamic variables, nested attributes, and global row-level security policies, Immuta can be implemented and updated far more quickly than Ranger.
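The report’s figures come from its full scenarios, but the scaling difference between the two models is easy to show in miniature. The following Python sketch is hypothetical, reflecting neither Ranger’s nor Immuta’s actual APIs: static role-to-object grants must be enumerated one by one and grow with every new role and table, while a single attribute rule covers every matching combination:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class User:
    name: str
    department: str   # attribute carried by the user

@dataclass(frozen=True)
class Table:
    name: str
    owner_dept: str   # attribute carried by the data

# OT-RBAC style: one static rule per (role, table) pairing.
# Adding a department or a table means adding more entries by hand.
static_policies = {
    ("finance_role", "revenue"),
    ("finance_role", "budgets"),
    ("hr_role", "salaries"),
    # ... grows as roles x tables
}

def rbac_allows(role: str, table: Table) -> bool:
    return (role, table.name) in static_policies

# ABAC style: a single dynamic rule expressed over attributes.
# New departments and tables are covered with zero policy changes.
def abac_allows(user: User, table: Table) -> bool:
    return user.department == table.owner_dept

alice = User("alice", "finance")
budgets = Table("budgets", "finance")
print(rbac_allows("finance_role", budgets))  # True, but needed an explicit entry
print(abac_allows(alice, budgets))           # True via attribute match alone
```

The policy-change counts in the study follow directly from this shape: every new role-object combination is a change in the static model, and usually none in the attribute model.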

As organizations migrate to the cloud and expand their data use, using Ranger as a data security mechanism creates a high policy-management burden compared with Immuta, which this study shows provides scalability, clarity, and evolvability for a complex enterprise’s data security and governance needs.

The chart in Figure 1 reveals the difference in cumulative policy changes required for each platform configuration.

Figure 1. Difference in Cumulative Policy Changes

The assessment and scoring rubric and methodology are detailed in the report. We leave the question of fairness for the reader to determine, and we strongly encourage you to discern for yourself what is of value. We hope this report is informative and helpful in uncovering some of the challenges and nuances of data governance platform selection. You are encouraged to compile your own representative use cases and workflows and review these platforms in a way that is applicable to your requirements.

GigaOm Radar for Data Loss Prevention

Data is at the core of modern business: It is our intellectual property, the lifeblood of our interactions with our employees, partners, and customers, and a true business asset. But in a world of increasingly distributed workforces, a growing threat from cybercriminals and bad actors, and ever more stringent regulation, our data is at risk and the impact of losing it, or losing access to it, can be catastrophic.

With this in mind, ensuring a strong data management and security strategy must be high on the agenda of any modern enterprise. Security of our data has to be a primary concern. Ensuring we know how, why, and where our data is used is crucial, as is the need to be sure that data does not leave the organization without appropriate checks and balances.

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. People and processes are key, as, of course, is technology in any data loss prevention (DLP) strategy.

This has led to a reevaluation of both the technology and the approach to DLP: a recognition that we must evolve an approach that is holistic, intelligent, and able to apply context to our data usage. DLP must form part of a broader risk management strategy.

Within this report, we evaluate the leading vendors who are offering solutions that can form part of your DLP strategy—tools that understand data as well as evaluate insider risk to help mitigate the threat of data loss. This report aims to give enterprise decision-makers an overview of how these offerings can be a part of a wider data security approach.

Key Criteria for Evaluating Data Loss Prevention Platforms

Data is a crucial asset for modern businesses and has to be protected in the same way as any other corporate asset, with diligence and care. Loss of data can have catastrophic effects, from reputational damage to significant fines for breaking increasingly stringent regulations.

While the risk of data loss is not new, the landscape we operate in is evolving rapidly. Data can leave data centers in many ways, whether accidental or malicious. The routes for exfiltration also continue to grow, ranging from email, USB sticks, and laptops to ever-more-widely-adopted cloud applications, collaboration tools, and mobile devices. This is driving a resurgence in the enterprise’s need to ensure that no data leaves the organization without appropriate checks and balances in place.

Keeping ahead of this challenge and mitigating the risk requires a multi-faceted approach. Policy, people, and technology are critical components in a data loss prevention (DLP) strategy.

As with any information security strategy, technology plays a significant role. DLP technology has traditionally played a part in helping organizations to mitigate some of the risks of uncontrolled data exfiltration. However, both the technology and threat landscape have shifted significantly, which has led to a reevaluation of DLP tools and strategy.

The modern approach to the challenge needs to be holistic and intelligent, capable of applying context to data usage by building a broader understanding of what the data is, who is using it, and why. Systems in place must also be able to learn when user activity should be classified as unusual so they can better interpret signs of a potential breach.
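The report does not prescribe an algorithm, but a common baseline for “unusual” is simple statistical deviation from a user’s own history. The Python sketch below, with invented traffic numbers and an arbitrary z-score threshold, illustrates the idea:

```python
import statistics

def is_unusual(history_mb: list[float], today_mb: float,
               z_threshold: float = 3.0) -> bool:
    """Flag today's outbound volume if it sits far outside the user's baseline."""
    mean = statistics.mean(history_mb)
    stdev = statistics.stdev(history_mb)
    if stdev == 0:
        return today_mb > mean  # flat history: any increase is notable
    return (today_mb - mean) / stdev > z_threshold

# Invented example: a user who normally uploads ~50 MB/day suddenly moves 2 GB.
baseline = [48.0, 52.0, 47.5, 55.0, 50.5, 49.0, 51.0]
print(is_unusual(baseline, 2048.0))  # True -> worth a closer look
print(is_unusual(baseline, 53.0))    # False -> within normal variation
```

Real DLP engines combine many such signals with content classification and context; a single volume spike is a prompt for review, not proof of exfiltration.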

This advanced approach is also driving new ways of defining the discipline of data loss prevention. Dealing with these risks cannot be viewed in isolation; rather, it must be part of a wider insider risk-management strategy.

Stopping the loss of data, accidental or otherwise, is no small task. This GigaOm Key Criteria report details DLP solutions and identifies key criteria and evaluation metrics for selecting such a solution. The corresponding GigaOm Radar report identifies vendors and products in this sector that excel. Together, these reports will give decision-makers an overview of the market to help them evaluate existing platforms and decide where to invest.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.
