Key Criteria for Evaluating Hybrid Cloud Data Protection

Cloud and edge computing have changed the way enterprises do IT. To best meet business and user needs, many enterprises are embracing hybrid cloud solutions, with multi-cloud around the corner. Data is now created, stored, and consumed everywhere, on several types of devices concurrently, and at any time of the day. At the same time, machine-generated data is growing dramatically faster than human-generated data. However, this is only the tip of the iceberg: data quantity and diversity are just two of many aspects to consider when evaluating a new data protection solution. Challenges now faced by enterprises include:

  • Exponential data growth: Machine-generated data now takes the lion’s share, and backup and restore processes change accordingly. For example, it is far less likely that a restore will involve retrieving a single, accidentally deleted file.
  • Disparate data types: Structured and unstructured data created by traditional applications is now joined by container and SaaS data, metadata, and blobs. These newer, more complex data types are usually self-consistent, capable of reproducing an entire application or recreating the state of a cloud service if necessary.
  • Pressing SLAs: Digital transformation initiatives have touched every process, and it is becoming harder and harder to stop any part of the infrastructure or to tolerate a long RTO (recovery time objective) or RPO (recovery point objective). Instant and continuous backups, as well as fast recovery, are mandatory for an ever-growing number of applications (see the sketch after this list).
  • Data dispersion and consolidation: Data is now created, stored, and consumed on mobile devices, on PCs, in data centers, and in many other places. To build effective data protection that can also underpin a data management strategy, backup repositories must be consolidated into a single physical or virtual domain.
  • New threats: Traditional risks, such as natural disasters and human error, are now joined by cyberattacks like ransomware, which are even more dangerous and harder to detect.
  • Regulatory compliance: A growing number of demanding regulations (such as GDPR and CCPA) require strong data protection and tools that can find information quickly and remove or mask it properly when retrieved, as in the case of the right to be forgotten.
  • Data management and reusability: Backups can either be considered a liability or be transformed into an asset for the organization. By indexing data and making it searchable, the data’s real value is revealed, making it reusable for other users or applications across the entire organization.
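
To make the “Pressing SLAs” point above more concrete, the short sketch below checks a set of backup jobs against RPO and RTO targets. This is a minimal illustration only: the job names, time windows, and restore estimates are hypothetical assumptions, not figures from any specific product or from this report.

```python
# Minimal sketch: checking hypothetical backup jobs against RPO/RTO targets.
# All names and numbers below are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class BackupJob:
    name: str
    last_successful_backup: datetime   # when the most recent restore point was taken
    rpo: timedelta                     # maximum tolerable data-loss window
    rto: timedelta                     # maximum tolerable time to restore service
    estimated_restore_time: timedelta  # measured or estimated recovery time

def check_sla(job: BackupJob, now: datetime) -> list[str]:
    """Return a list of SLA violations for a single backup job."""
    issues = []
    # RPO check: anything created after the last restore point would be lost.
    if now - job.last_successful_backup > job.rpo:
        issues.append(f"{job.name}: RPO exceeded "
                      f"({now - job.last_successful_backup} since last backup, target {job.rpo})")
    # RTO check: restoring would take longer than the business can tolerate.
    if job.estimated_restore_time > job.rto:
        issues.append(f"{job.name}: RTO at risk "
                      f"(estimated restore {job.estimated_restore_time}, target {job.rto})")
    return issues

if __name__ == "__main__":
    now = datetime.utcnow()
    jobs = [
        BackupJob("erp-db", now - timedelta(minutes=20), timedelta(minutes=15),
                  timedelta(hours=1), timedelta(minutes=40)),
        BackupJob("file-share", now - timedelta(hours=2), timedelta(hours=24),
                  timedelta(hours=8), timedelta(hours=3)),
    ]
    for job in jobs:
        for issue in check_sla(job, now):
            print(issue)
```

The same idea scales from a handful of jobs to continuous, policy-driven monitoring; the point is simply that RPO and RTO are measurable targets that backup schedules and recovery capabilities either meet or miss.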

For these reasons, data protection operations are more difficult and more critical than in the past.

Report Methodology

A Key Criteria report analyzes the most important features of a technology category to understand how they impact an enterprise and its IT organization. Features are grouped into four categories:

  1. Table Stakes
  2. Key Criteria
  3. Critical Features: Impact Analysis
  4. Near-term game-changing technology

The goal is to help organizations assess capabilities and build a mid-to-long-term infrastructure strategy. In a mature market, solutions fall into three target categories: enterprise, high-performance, and specialized. These differ in their characteristics and in how they integrate with existing infrastructures. That said, the assessment depends more on the specific user’s needs than on the organization’s vertical.

Table Stakes

Table stakes are system characteristics and features that are important when choosing the right solution. They include architectural choices that depend on the size of the organization, its requirements, the expected growth over time, and the types of workloads. Table stakes are mature features, and implementing them will not add any business advantage nor significantly change the TCO or ROI of the infrastructure.

Key Criteria

Key Criteria features really differentiate one solution from another. Depending on actual user needs, they have a positive impact on one or more of the evaluation metrics. Therefore, implementation details are essential to understanding the benefits they bring to the infrastructure, processes, or business.
Beyond table stakes and key criteria, aspects like architectural design and implementation regain importance and need to be analyzed in detail. In some cases, the features described in the Key Criteria section are the core of the solution, and the rest of the system is designed around them. This can be an important benefit for organizations that see them as a real practical advantage, but it also poses some long-term risks. Over time, the differentiation introduced by a feature becomes less relevant and it falls into the table-stakes group, while new system capabilities introduce new benefits or address new needs, with a positive impact on metrics like efficiency, manageability, flexibility, and so on.

The Key Criteria section brings several benefits to organizations of all sizes and with different business needs. It gives the reader a brief description of each functionality or technology, its benefits in general terms, and what to expect from a good implementation. To give a complete picture, we also include examples of the most interesting implementations currently available in the market.

Critical Impact of Features on the Metrics

Technology, functionality, and architecture designs that have demonstrated their value are adopted by other vendors, become a standard, and lose their status as differentiators. Initially, though, the implementation of these key criteria is crucial for delivering real value, perhaps with some trade-offs. The most important metrics for the evaluation of a technology solution include:

  • Architecture
  • Scalability
  • Flexibility
  • Efficiency
  • Manageability and ease of use
  • Partner ecosystem

This section describes the impact individual features have on the metrics at the moment of report publication. Each feature is scored from one to five, with a score of five having the most impact on an enterprise. These scores are not absolute and should always be verified against the organization’s requirements and use case. Strategic decisions can then be based on the impact each metric can have on the infrastructure, system management, and IT processes already in place, with particular emphasis on ROI and TCO.
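
As an illustration of how these one-to-five scores can be weighed against an organization’s own priorities, the sketch below combines per-metric scores with organization-specific weights. The feature names, scores, and weights are invented for the example and do not come from the report itself.

```python
# Minimal sketch: combining one-to-five feature scores with organization-specific
# metric weights. All features, scores, and weights are hypothetical.

METRICS = ["architecture", "scalability", "flexibility",
           "efficiency", "manageability", "partner_ecosystem"]

# How much each metric matters to this (hypothetical) organization, summing to 1.0.
weights = {
    "architecture": 0.15, "scalability": 0.20, "flexibility": 0.15,
    "efficiency": 0.20, "manageability": 0.20, "partner_ecosystem": 0.10,
}

# Impact scores (1 = lowest impact, 5 = highest) for two hypothetical features.
feature_scores = {
    "instant_recovery": {"architecture": 4, "scalability": 3, "flexibility": 3,
                         "efficiency": 5, "manageability": 4, "partner_ecosystem": 2},
    "ransomware_detection": {"architecture": 3, "scalability": 2, "flexibility": 3,
                             "efficiency": 3, "manageability": 4, "partner_ecosystem": 3},
}

def weighted_score(scores: dict[str, int]) -> float:
    """Weight a feature's per-metric scores by the organization's priorities."""
    return sum(weights[m] * scores[m] for m in METRICS)

if __name__ == "__main__":
    # Rank the hypothetical features by their weighted impact for this organization.
    for feature, scores in sorted(feature_scores.items(),
                                  key=lambda kv: weighted_score(kv[1]), reverse=True):
        print(f"{feature}: {weighted_score(scores):.2f} (out of 5)")
```

Changing the weights to reflect a different set of priorities can reorder the features, which is exactly why the scores should always be read in the context of the organization’s requirements and use case.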

Near-term Game-changing Technology

In this section, we analyze the most interesting technologies on the horizon over the next 12 to 18 months. Some already exist in some form, but usually as part of niche products or for very specific use cases. In either case, the implementations available today are not mature enough to be grouped with the key criteria. Yet when implemented correctly and efficiently, these technologies can really make a difference to the metrics.

Over time, game-changing features become key criteria, and the cycle repeats. Therefore, to get the best ROI, it is important to check what vendors are offering today and what they plan to release in the near future.

The Five Pillars of (Azure) Cloud-based Application Security

This 1-hour webinar from GigaOm brings together experts in Azure cloud application migration and security, featuring GigaOm analyst Jon Collins and special guests from Fortinet: Daniel Schrader, Director of Product Marketing for Public Cloud, and Aidan Walden, Global Director of Public Cloud Architecture and Engineering.

These interesting times have accelerated the drive towards digital transformation, application rationalization, and migration to cloud-based architectures. Enterprise organizations are looking to increase efficiency without impacting performance or increasing risk, whether that risk stems from infrastructure resilience or from end-user behavior.

Success requires a combination of best practices and appropriate use of technology, depending on where the organization is on its cloud journey. Elements such as zero-trust access and security-driven networking need to be deployed in parallel with security-first operations, breach prevention, and response.

If you are looking to migrate applications to the cloud and want to be sure your approach maximizes delivery whilst minimizing risk, this webinar is for you.

Data Management and Secure Data Storage for the Enterprise

This free 1-hour webinar from GigaOm Research brings together experts in data management and security, featuring GigaOm Analyst Enrico Signoretti and special guest from RackTop Systems, Jonathan Halstuch. The discussion will focus on data storage and how to protect data against cyberattacks.

Most of the recent news coverage and analysis of cyberattacks focuses on hackers gaining access to and control of critical systems. Yet it is rarely mentioned that the most valuable asset for the organizations under attack is the data contained in these systems.

In this webinar, you will learn about the risks and costs of a poor data security management approach, and how to improve your data storage to prevent and mitigate the consequences of a compromised infrastructure.

CISO Podcast: Talking Anti-Phishing Solutions

Simon Gibson earlier this year published the report, “GigaOm Radar for Phishing Prevention and Detection,” which assessed more than a dozen security solutions focused on detecting and mitigating email-borne threats and vulnerabilities. As Gibson noted in his report, email remains a prime vector for attack, reflecting the strategic role it plays in corporate communications.

Earlier this week, Gibson’s report was a featured topic of discussion on David Spark’s popular CISO Security Vendor Relationship Podcast. In it, Spark interviewed a pair of chief information security officers (Mike Johnson, CISO for Salesforce, and James Dolph, CISO for Guidewire Software) to get their take on the role of anti-phishing solutions.

“I want to first give GigaOm some credit here for really pointing out the need to decide what to do with detections,” Johnson said when asked for his thoughts about selecting an anti-phishing tool. “I think a lot of companies charge into a solution for anti-phishing without thinking about what they are going to do when the thing triggers.”

As Johnson noted, the needs and vulnerabilities of a large organization aligned on Microsoft 365 are very different from those of a smaller outfit working with G Suite. A malicious Excel macro-laden file, for example, poses a credible threat to a Microsoft shop and therefore argues for a detonation solution to detect and neutralize malicious payloads before they can spread and morph. On the other hand, a smaller company is more exposed to business email compromise (BEC) attacks, since spending authority is often spread among many employees in these businesses.

Gibson’s radar report describes both in-line and out-of-band solutions, but Johnson said cloud-aligned infrastructures argue against traditional in-line schemes.

“If you put an in-line solution in front of [Microsoft] 365 or in front of GSuite, you are likely decreasing your reliability, because you’ve now introduced this single point of failure. Google and Microsoft have this massive amount of reliability that is built in,” Johnson said.

So how should IT decision makers go about selecting an anti-phishing solution? Dolph answered that question with a series of questions of his own:

“Does it nail the basics? Does it fit with the technologies we have in place? And then secondarily, is it reliable, is it tunable, is it manageable?” he asked. “Because it can add a lot of overhead, especially if you have a small team, if these tools are really disruptive to the email flow.”

Dolph concluded by noting that it’s important for solutions to provide insight that can help organizations target their protections, as well as support both training and awareness around threats. Finally, he urged organizations to consider how they can measure the effectiveness of solutions.

“I may look at other solutions in the future and how do I compare those solutions to the benchmark of what we have in place?”

Listen to the Podcast: CISO Podcast
