
Around 62% of all Internet sites will run an unsupported PHP version in 10 weeks

According to statistics from W3Techs, roughly 78.9 percent of all Internet sites today run on PHP.

But on December 31, 2018, security support for PHP 5.6.x will officially cease, marking the end of all support for any version of the ancient PHP 5.x branch.

This means that starting next year, the roughly 62 percent of all Internet sites still running a PHP 5.x version will stop receiving security updates for their sites' underlying technology, exposing hundreds of millions of websites, if not more, to serious security risks.

If a hacker finds a vulnerability in PHP 5.x after the New Year, a huge number of sites and their users will be at risk.

“This is a huge problem for the PHP ecosystem,” Scott Arciszewski, Chief Development Officer at Paragon Initiative Enterprises, told ZDNet in an interview. “While many feel that they can ‘get away with’ running PHP 5 in 2019, the simplest way to describe this choice is: Negligent.”

“To be totally fair: It’s likely that any major, mass-exploitable flaw in PHP 5.6 would also affect the newer versions of PHP,” Arciszewski added.

“PHP 7.2 will get a patch from the PHP team, for free, in a timely manner; PHP 5.6 will only get one if you’re paying for ongoing support from your OS vendor.

“If anyone finds themselves running PHP 5 after the end of the year, ask yourself: Do you feel lucky? Because I sure wouldn’t.”

[Image: php-eols.png, showing PHP branch end-of-life dates]

The PHP community has known of this deadline for quite a while. After PHP 5.6 became the most widely used PHP version back in the spring of 2017, PHP maintainers realized it would be a disaster to cut off security updates for the branch at the height of its popularity, so they extended the EOL date to the end of 2018.

Since then, several developers and security researchers have warned about the “ticking PHP time bomb,” although not as many as the infosec community would have wished.

There has not been a concerted effort to get people to move to the newer PHP 7.x, but some content management system (CMS) projects have, one by one, started raising their minimum requirements and warning users to move to more modern hosting environments.

Of the big three, WordPress, Joomla, and Drupal, only Drupal has taken the official step of raising its minimum requirement to PHP 7, though the change only arrives in March 2019. Ironically, the 7.0.x branch itself reaches EOL on December 3, 2018, so the new minimum doesn't actually solve anything, but it's still a step forward.

Joomla’s minimum requirement remains PHP 5.3, while WordPress’ minimum requirement remains PHP 5.2.

“The biggest source of inertia in the PHP ecosystem regarding versions is undoubtedly WordPress, which still refuses to drop support for PHP 5.2 because there are more than zero systems in the universe that still run WordPress on an ancient, unsupported version of PHP,” Arciszewski said, describing the WordPress team's infamous stubbornness in keeping its minimum requirement at a PHP version that reached EOL in 2011.

WordPress, which is used by more than a quarter of all sites on the Internet, would without a doubt shift a lot of people's views on the necessity of modern PHP versions if the project moved its minimum PHP requirement to the newer PHP 7.x branch.

“What PHP versions should be supported [by WordPress], however, has been a major debate for some time,” said Sean Murphy, Director of Threat Intelligence at Defiant, the company behind the Wordfence security plugin for WordPress, in an email exchange with ZDNet.

“There is an ongoing initiative by the WordPress team to notify users when they are using a legacy version of PHP and give them the information and tools they need to request a newer version from their hosting provider,” he added. “Here are notes from this team’s recent meeting.”

Murphy believes that one of the biggest challenges of rolling out PHP version upgrades to a large number of sites is the flood of support requests that follows, one reason why many CMS projects and web hosting providers are reluctant to do so.

But Murphy also points out that “good hosting providers” will always deploy new users on new versions of PHP by default, instead of letting customers choose, and will update existing clients to new versions of PHP only when requested.

But unless customers are aware that their version of PHP has reached end-of-life, very few will ask to be moved to a newer version.

Here's where WordPress' planned notifications for users running sites on outdated PHP versions will help, prompting people either to update their servers or to ask their hosting provider for a more modern hosting environment.
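
As a rough illustration, and not WordPress's actual implementation, a check of this kind only needs to compare the running interpreter's version against the oldest branch still receiving security fixes (here assumed to be 7.1 after December 31, 2018):

```php
<?php
// Hypothetical sketch of an outdated-PHP warning, not WordPress's actual code.
// After December 31, 2018, PHP 7.1 is the oldest branch still receiving
// security fixes, so anything older warrants a warning.
$oldestSupported = '7.1.0';

if (version_compare(PHP_VERSION, $oldestSupported, '<')) {
    echo 'Warning: this site runs PHP ' . PHP_VERSION
       . ', which no longer receives security updates. '
       . 'Update the server or ask your hosting provider for PHP 7.x.' . PHP_EOL;
}
```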

While some WordPress security experts are alarmed about the impending EOL of the PHP 5.6 branch, and with it, indirectly, of the entire PHP 5.x line, Murphy is not one of them.

“A PHP vulnerability […] would indeed be very bad, but there hasn’t been any that I know of in recent history,” he said.

“Based on past PHP vulnerabilities, the threat is mostly with PHP applications,” Murphy added, suggesting that attackers would likely continue to focus on PHP libraries and CMS systems.

But not all share Murphy's opinion. Arciszewski, for example, believes that PHP 5.6 and the older branches will be probed for new vulnerabilities more than usual. These branches are now EOL, immensely popular, and unsupported: plentiful, poorly secured targets, exactly the conditions that draw in attackers.

“Yes, that is absolutely a risk factor,” Arciszewski said. “We saw something similar happen after Windows XP support was dropped, and I suspect we’ll see the same happen to the PHP 5 branch.

“Maybe that will be the necessary catalyst for companies to take PHP 7 adoption seriously? I can only hope.”

And if server administrators and website owners need more convincing, we'll close this article with the same conclusion Martin Wheatley used for his “ticking PHP time bomb” piece over the summer.

Yes it does cost time and money, but what’s worse, a small monthly support fee, or a headline “Site hacked, thousands of user details stolen” followed by a fine for up to 20 million euros or 4% of your turnover under GDPR… I know what I’d rather pay.

Key Criteria for Evaluating Security Information and Event Management Solutions (SIEM)

Security Information and Event Management (SIEM) solutions consolidate multiple security data streams under a single roof. Initially, SIEM supported early detection of cyberattacks and data breaches by collecting and correlating security event logs. Over time, it evolved into sophisticated systems capable of ingesting huge volumes of data from disparate sources, analyzing data in real time, and gathering additional context from threat intelligence feeds and new sources of security-related data. Next-generation SIEM solutions deliver tight integrations with other security products, advanced analytics, and semi-autonomous incident response.
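
To make “collecting and correlating security event logs” concrete, here is a minimal, hypothetical sketch of one classic correlation rule: flagging a source address that produces repeated authentication failures within a short window. The event format, field names, and thresholds are illustrative, not any vendor's schema.

```php
<?php
// Minimal sketch of a SIEM-style correlation rule (hypothetical log format):
// alert on any source IP with 5+ failed logins inside a 60-second window.
$events = [
    ['ts' => 1546300800, 'type' => 'auth_failure', 'ip' => '203.0.113.7'],
    ['ts' => 1546300815, 'type' => 'auth_failure', 'ip' => '203.0.113.7'],
    // ... events ingested from firewalls, servers, and applications
];

$window = 60;    // seconds
$threshold = 5;  // failures before an alert fires
$failures = [];  // recent failure timestamps, keyed by source IP

foreach ($events as $e) {
    if ($e['type'] !== 'auth_failure') {
        continue;
    }
    $failures[$e['ip']][] = $e['ts'];
    // Keep only the failures that fall inside the sliding window.
    $failures[$e['ip']] = array_filter(
        $failures[$e['ip']],
        function ($ts) use ($e, $window) { return $e['ts'] - $ts <= $window; }
    );
    if (count($failures[$e['ip']]) >= $threshold) {
        echo "ALERT: possible brute-force attempt from {$e['ip']}\n";
    }
}
```

Real SIEM platforms run thousands of such rules, enriched with threat intelligence and asset context, but the underlying pattern of grouping, windowing, and thresholding events is the same.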

SIEM solutions can be deployed on-premises, in the cloud, or a mix of the two. Deployment models must be weighed with regard to the environments the SIEM solution will protect. With more and more digital infrastructure and services becoming mission critical to every enterprise, SIEMs must handle higher volumes of data. Vendors and customers are increasingly focused on cloud-based solutions, whether SaaS or cloud-hosted models, for their scalability and flexibility.

The latest developments for SIEM solutions include machine learning capabilities for incident detection, advanced analytics features such as user behavior analytics (UBA), and integrations with other security solutions, such as security orchestration, automation and response (SOAR) and endpoint detection and response (EDR) systems. Even though these additional capabilities are a natural progression for the SIEM environment, customers are finding it increasingly difficult to deploy, customize, and operate SIEM solutions.

Other improvements include better user experience and lower time-to-value for new deployments. To achieve this, vendors are working on:

  • Streamlining data onboarding
  • Preloading customizable content—use cases, rulesets, and playbooks
  • Standardizing data formats and labels
  • Mapping incident alerts to common frameworks, such as the MITRE ATT&CK framework (a sketch of such a mapping follows this list)
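
On that last point, a normalized alert mapped to ATT&CK can be as simple as attaching tactic and technique identifiers to each record. The field names below are illustrative, not any vendor's schema; TA0006 (Credential Access) and T1110 (Brute Force) are real ATT&CK identifiers.

```php
<?php
// Hypothetical normalized alert tagged with MITRE ATT&CK identifiers.
$alert = [
    'id'       => 'alert-0001',
    'source'   => 'edr',
    'severity' => 'high',
    'attack'   => [
        'tactic'    => 'TA0006', // Credential Access
        'technique' => 'T1110',  // Brute Force
    ],
    'observed' => '2019-01-01T00:00:00Z',
];

echo json_encode($alert, JSON_PRETTY_PRINT), PHP_EOL;
```

Labeling every alert this way lets analysts pivot between products in a common vocabulary instead of reconciling vendor-specific categories by hand.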

Vendors and service providers are also expanding their offerings beyond managed SIEM solutions to à la carte services, such as content development services and threat hunting-as-a-service.

There is no one-size-fits-all SIEM solution. Each organization will have to evaluate its own requirements and resource constraints to find the right solution. Organizations will weigh factors such as deployment models or integrations with existing applications and security solutions. However, the main decision factor for most customers will revolve around usability, affordability, and return on investment. Fortunately, a wide range of solutions available in the market can almost guarantee a good fit for every customer.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.

Key Criteria for Evaluating Secure Service Access

Since the inception of large-scale computing, enterprises, organizations, and service providers have protected their digital assets by securing the perimeter of their on-premises data centers. With the advent of cloud computing, the perimeter has dissolved, but in most cases the legacy approach to security has not. Many corporations still manage the expanded enterprise and remote workforce as an extension of the old headquarters office/branch model serviced by LANs and WANs.

Bolting new security products onto their aging networks increased costs and complexity exponentially, while at the same time severely limiting their ability to meet regulatory compliance mandates, scale elastically, or secure the threat surface of the new any place/any user/any device perimeter.

The result? Patchwork security ill-suited to the demands of the post-COVID distributed enterprise.

Converging networking and security, secure service access (SSA) represents a significant shift in the way organizations consume network security, enabling them to replace multiple security vendors with a single, integrated platform offering full interoperability and end-to-end redundancy. Encompassing secure access service edge (SASE), zero-trust network access (ZTNA), and extended detection and response (XDR), SSA shifts the focus of security consumption from being either data center or edge-centric to being ubiquitous, with an emphasis on securing services irrespective of user identity or resources accessed.

This GigaOm Key Criteria report outlines critical criteria and evaluation metrics for selecting an SSA solution. The corresponding GigaOm Radar Report provides an overview of notable SSA vendors and their offerings available today. Together, these reports are designed to help educate decision-makers, making them aware of various approaches and vendors that are meeting the challenges of the distributed enterprise in the post-pandemic era.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.

Key Criteria for Evaluating Edge Platforms

Edge platforms leverage distributed infrastructure to deliver content, computing, and security closer to end devices, offloading networks and improving performance. We define edge platforms as the solutions capable of providing end users with millisecond access to processing power, media files, storage, secure connectivity, and related “cloud-like” services.

The key benefit of edge platforms is bringing websites, applications, media, security, and a multitude of virtual infrastructures and services closer to end devices compared to public or private cloud locations.

The need for content proximity started to become more evident in the early 2000s as the web evolved from a read-only service to a read-write experience, and users worldwide began both consuming and creating content. Today, this is even more important, as live and on-demand video streaming at very high resolutions cannot be sustained from a single central location. Content delivery networks (CDNs) helped host these types of media at the edge, and the associated network optimization methods allowed them to provide these new demanding services.

As we moved into the early 2010s, we experienced the rapid cloudification of traditional infrastructure. Roughly speaking, cloud computing takes a server from a user’s office, puts it in a faraway data center, and allows it to be used across the internet. Cloud providers manage the underlying hardware and provide it as a service, allowing users to provision their own virtual infrastructure. There are many operational benefits, but at least one unavoidable downside: the increase in latency. This is especially true in this dawning age of distributed enterprises for which there is not just a single office to optimize. Instead, “the office” is now anywhere and everywhere employees happen to be.

Even so, this centralized, cloud-based compute methodology works very well for most enterprise applications, as long as there is no critical sensitivity to delay. But what about use cases that cannot tolerate latency? Think industrial monitoring and control, real-time machine learning, autonomous vehicles, augmented reality, and gaming. If a cloud data center is a few hundred or even thousands of miles away, the physical limitations of sending an optical or electrical pulse through a cable mean there are no options to lower the latency. The answer to this is leveraging a distributed infrastructure model, which has traditionally been used by content delivery networks.
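
The arithmetic is unforgiving: light in fiber propagates at roughly two-thirds of its vacuum speed, about 200,000 km/s, so distance alone sets a latency floor. The figures below are a back-of-the-envelope sketch with an assumed 2,000 km one-way distance, not a measurement:

```php
<?php
// Back-of-the-envelope propagation delay to a distant cloud region.
// Assumes ~200,000 km/s for light in fiber and ignores routing,
// queuing, and processing delays, which only add to the total.
$distanceKm = 2000;      // hypothetical one-way distance to the data center
$fiberKmPerSec = 200000; // approximate speed of light in fiber

$roundTripMs = (2 * $distanceKm) / $fiberKmPerSec * 1000;
printf("Best-case round trip over %d km: %.0f ms\n", $distanceKm, $roundTripMs); // ~20 ms
```

No amount of network engineering can beat this floor, which is why the only remedy for latency-critical workloads is moving the computation closer to the user.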

As CDNs have brought the internet’s content closer to everyone, CDN providers have positioned themselves in the unique space of owning much of the infrastructure required to bring computing and security closer to users and end devices. With servers close to the topological edge of the network, CDN providers can offer processing power and other “cloud-like” services to end devices with only a few milliseconds latency.

While CDN operators are in the right place at the right time to develop edge platforms, we’ve observed a total of four types of vendors that have been building out relevant—and potentially competing—edge infrastructure. These include traditional CDNs, hyperscale cloud providers, telecommunications companies, and new dedicated edge platform operators, purpose-built for this emerging requirement.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Vendor Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.
