

New tool automates phishing attacks that bypass 2FA




A new penetration testing tool published at the start of the year by a security researcher can automate phishing attacks with an ease never seen before and can even blow through login operations for accounts protected by two-factor authentication (2FA).

Named Modlishka (the English pronunciation of the Polish word for mantis), the tool was created by Polish researcher Piotr Duszyński.

Modlishka is what IT professionals call a reverse proxy, modified to handle traffic meant for login pages and phishing operations.


It sits between a user and a target website, like Gmail, Yahoo, or ProtonMail. Phishing victims connect to the Modlishka server (hosting a phishing domain), and the reverse proxy component behind it makes requests to the site it wants to impersonate.

The victim receives authentic content from the legitimate site (let's say, for example, Google), but all traffic and all of the victim's interactions with the legitimate site pass through and are recorded on the Modlishka server.
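For readers who want to picture the mechanics, the core of such a reverse proxy is small. The sketch below, written against Go's standard library, is purely illustrative and is not Modlishka's actual code: the target address and port are assumptions, and a real tool must also rewrite the Host header, cookies, and links inside responses so the proxied pages keep working. What it shows is the single choke point through which every request passes and can be observed.

package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Hypothetical target; a tool of this kind proxies whichever site it is told to impersonate.
	target, err := url.Parse("https://accounts.example.com")
	if err != nil {
		log.Fatal(err)
	}

	// Standard-library reverse proxy: requests arriving at the phishing domain
	// are forwarded to the real site, and its responses are relayed back.
	proxy := httputil.NewSingleHostReverseProxy(target)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Every request the visitor makes passes through this handler,
		// which is where the operator can observe and record the traffic.
		log.Printf("relaying %s %s from %s", r.Method, r.URL.Path, r.RemoteAddr)
		proxy.ServeHTTP(w, r)
	})

	// A real deployment would serve HTTPS on the phishing domain with a valid certificate.
	log.Fatal(http.ListenAndServe(":8080", nil))
}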

Modlishka backend panel (Image: Piotr Duszyński)

Any passwords a user enters are automatically logged in the Modlishka backend panel, while the reverse proxy also prompts users for 2FA tokens when their accounts are configured to request one.

If attackers are on hand to collect these 2FA tokens in real time, they can use them to log into victims' accounts and establish new, legitimate sessions.

A demonstration video shows a Modlishka-powered phishing site seamlessly loading content from the real Google login interface without using templates, while logging credentials and any 2FA code the user enters.

Because of this simple design, Modlishka doesn’t use any “templates,” a term used by phishers to describe clones of legitimate sites. Since all the content is retrieved from the legitimate site in real time, attackers don’t need to spend much time updating and fine-tuning templates.

Instead, all attackers need is a phishing domain name (to host on the Modlishka server) and a valid TLS certificate to avoid alerting users to the lack of an HTTPS connection.

The final step is to configure a simple config file that redirects victims to the real, legitimate site at the end of the phishing operation, before they spot the sketchy-looking phishing domain.

In an email to ZDNet, Duszyński described Modlishka as a point-and-click and easy-to-automate system that requires minimal maintenance, unlike previous phishing toolkits used by other penetration testers.

“At the time when I started this project (which was in early 2018), my main goal was to write an easy to use tool, that would eliminate the need of preparing static webpage templates for every phishing campaign that I was carrying out,” the researcher told us.

“The approach of creating a universal and easy to automate reverse proxy, as a MITM actor, appeared to be the most natural direction. Despite some technical challenges, that emerged on this path, the overall result appeared to be really rewarding,” he added.

“The tool that I wrote is sort of a game changer, since it can be used as a ‘point and click’ proxy, that allows easy phishing campaign automation with full support of the 2FA (an exception to this is a U2F protocol based tokens – which is currently the only resilient second factor).

“There are some cases that require manual tuning (due to obfuscated JavaScript code or, for example, HTML tag security attributes like 'integrity'), but these are fully supported by the tool and will also be improved in the future releases,” Duszyński told ZDNet.

An Amnesty International report released in December showed that advanced state-sponsored actors have already started using phishing systems that can bypass 2FA.

Now, many fear that Modlishka will lower the entry barrier, allowing so-called “script kiddies” to set up phishing sites within minutes, even with few technical skills. Furthermore, the tool would let cyber-crime groups easily automate the creation of phishing pages that are easier to maintain and harder for victims to detect.

When we asked why he released such a dangerous tool on GitHub, Duszyński had a pretty intriguing answer.

“We have to face the fact that without a working proof of concept, that really proves the point, the risk is treated as theoretical, and no real measures are taken to address it properly,” he said.

“This status quo, and lack of awareness about the risk, is a perfect situation for malicious actors that will happily exploit it.”

Duszyński said that while his tool can automate the process of a phishing site passing through 2FA checks based on SMS and one-time codes, Modlishka is ineffective against U2F-based schemes that rely on hardware security keys.

Modlishka is currently available on GitHub under an open source license. Additional information is also available on Duszyński’s blog.



Defeating Distributed Denial of Service Attacks


It seems like every day the news brings new stories of cyberattacks, whether ransomware, malware, crippling viruses, or, more frequently of late, distributed denial of service (DDoS) attacks. According to Infosec magazine, in the first half of 2020 there was a 151% increase in the number of DDoS attacks compared to the same period the previous year. The same report states that experts predict as many as 15.4 million DDoS attacks within the next two years.

These attacks can be difficult to detect until it’s too late, and then they can be challenging to defend against. There are solutions available, but there is no one magic bullet. As Alastair Cooke points out in his recent “GigaOm Radar for DDoS Protection” report, there are different categories of DDoS attacks.

And different types of attacks require different types of defenses. You’ll want to adopt each of these three defense strategies against DDoS attacks to a certain degree, as attackers are never going to limit themselves to a single attack vector:

Network Defense: Attacks targeting the OS and network operate at either Layer 3 or Layer 4 of the OSI stack. These attacks don’t flood the servers with application requests but attempt to exhaust TCP/IP resources on the supporting infrastructure. DDoS protection solutions defending against network attacks identify the attack behavior and absorb it into the platform.

Application Defense: Other DDoS attacks target the actual website itself or the web server application by overwhelming the site with random data and wasting resources. DDoS protection against these attacks might handle SSL decryption with hardware-based cryptography and prevent invalid data from reaching web servers.

Defense by Scale: There have been massive DDoS attacks, and they show no signs of stopping. The key to successfully defending against a DDoS attack is to have a scalable platform capable of deflecting an attack led by a million bots with hundreds of gigabits per second of network throughput.
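To make the application-defense idea concrete, here is a minimal Go sketch of one common building block, per-client request rate limiting, using only the standard library. It illustrates the principle rather than any commercial product's implementation, and the 100-requests-per-second threshold is an arbitrary figure chosen for the example; real platforms enforce limits at the network edge, across many scrubbing nodes, and combine them with behavioral detection.

package main

import (
	"net"
	"net/http"
	"sync"
	"time"
)

// limiter counts requests per client IP within the current one-second window.
type limiter struct {
	mu     sync.Mutex
	counts map[string]int
}

func (l *limiter) allow(ip string, max int) bool {
	l.mu.Lock()
	defer l.mu.Unlock()
	l.counts[ip]++
	return l.counts[ip] <= max
}

func main() {
	l := &limiter{counts: make(map[string]int)}

	// Reset all counters every second, giving a crude requests-per-second cap.
	go func() {
		for range time.Tick(time.Second) {
			l.mu.Lock()
			l.counts = make(map[string]int)
			l.mu.Unlock()
		}
	}()

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		ip, _, _ := net.SplitHostPort(r.RemoteAddr)
		if !l.allow(ip, 100) { // arbitrary per-IP threshold for the example
			http.Error(w, "rate limit exceeded", http.StatusTooManyRequests)
			return
		}
		w.Write([]byte("ok"))
	})

	http.ListenAndServe(":8080", nil)
}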

Table 1. Impact of Features on Metrics

DDoS attacks are growing more frequent, more powerful, and more sophisticated. Amazon reports mitigating a massive DDoS attack a couple of years ago in which peak traffic volume reached 2.3 Tbps. Deploying DDoS protection across the spectrum of attack vectors is no longer a “nice to have,” but a necessity.

In his report, Cooke concludes that “Any DDoS protection product is only part of an overall strategy, not a silver bullet for denial-of-service hazards.” Evaluate your organization and its needs, read more about each solution evaluated in the Radar report, and carefully match the right DDoS solution to your requirements.

Learn More About the Reports: GigaOm Key Criteria for DDoS and GigaOm Radar for DDoS



Assessing Providers of Low-Power Wide Area Networks




Companies are taking note of how Low-Power Wide Area Network (LPWAN) technology can provide long-distance communications for certain use cases. While its slow data transfer rates and high latency won't drive high-intensity video streaming or other bandwidth-hungry applications, it can provide inexpensive, low-power communication over long distances.

According to Chris Grundemann and Logan Andrew Green’s recent report “GigaOm Radar for LPWAN Technology Providers (Unlicensed Spectrum) v1.0,” this growing communications technology is suitable for use cases with the following characteristics:

  • Requirement for long-distance transmission: 10 km (6 miles) or more of wireless connectivity from sensor to gateway
  • Low power consumption, with battery life lasting up to 10 years
  • Terrain and building penetration to circumvent line-of-sight issues
  • Low operational costs (device management or connection subscription cost)
  • Low data transfer rates of roughly 20 kbps (see the short calculation below)
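As a rough illustration of what a 20 kbps link means in practice, the short Go calculation below estimates how long a small sensor payload takes to transmit at that rate; the 50-byte payload size is an assumption made for this example, not a figure from the report.

package main

import "fmt"

func main() {
	const dataRateBps = 20000.0 // roughly 20 kbps
	const payloadBytes = 50.0   // hypothetical size of one sensor reading

	seconds := payloadBytes * 8 / dataRateBps
	fmt.Printf("Sending %.0f bytes at 20 kbps takes about %.0f ms\n", payloadBytes, seconds*1000)
	// Prints: Sending 50 bytes at 20 kbps takes about 20 ms
}

Plenty of headroom for periodic telemetry, in other words, and nowhere near enough for video, which is exactly the trade-off described above.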

These use cases could include large-scale IoT deployments within heavy industry, manufacturing, government, and retail. The LPWAN technology providers evaluated in this Radar report are currently filling a gap in the IoT market. They are certainly poised to benefit from the anticipated rapid adoption of LPWAN solutions.

Depending on the use case you’re looking to fulfill, you can select from four basic deployment models from these LPWAN providers:

  • Physical Appliance: This option would require a network server on-premises to receive sensor data from gateways.
  • Virtual Appliance: Network servers could also be deployed as virtual appliances, running either on-premises or in the cloud.
  • Network Stack as a Service: With this option, the LPWAN provider fully manages your network stack and provides you with the service. You only need devices and gateways to satisfy your requirements.
  • Network as a Service: This option is provided by mobile network operators, with the provider operating the network stack and gateways. You would only need to connect to the LPWAN provider.

Figure 1. LPWAN Connectivity

The LPWAN providers evaluated in this report are well-positioned from both a business and technical perspective, as they can function as a single point of contact for building IoT solutions. Instead of cobbling together other solutions to satisfy connectivity protocols, these providers can set up your organization with a packaged IoT solution, reducing time to market and virtually eliminating any compatibility issues.

The unlicensed spectrum aspect is also significant. The LPWAN technology providers evaluated in this Radar report use at least one protocol in the unlicensed electromagnetic spectrum bands. There’s no need to buy FCC licenses for specific frequency bands, which also lowers costs.

Learn More: GigaOm Enterprise Radar for LPWAN



The Benefits of a Price Benchmark for Data Storage


Why Price Benchmark Data Storage?

Customers, understandably, are highly driven by budget when it comes to data storage solutions. The costs of switching, upkeep, and upgrades are high-risk factors for businesses, so decision makers need to look for longevity in their chosen solution. Many factors influence how data needs to be handled within storage, including whether it is frequently accessed or consists of rarely accessed legacy data.

Storage performance may also be shaped by geographic location, whether for remote workers or global enterprises that need to access and share data instantly, or by the necessity of automation. Each element presents a new price point that needs to be considered by customers and vendors alike.

A benchmark gives a comparison of system performance based on a key performance indicator, such as latency, capacity, or throughput. Competing systems are analyzed in like-for-like conditions, with each solution configured for its best performance, giving a clear representation of relative performance. Price benchmarks for data storage are ideal for marketing, showing customers exactly how much value for money a solution offers compared with competing vendors.

Benchmark tests reinforce marketing collateral and tenders with verifiable evidence of performance capabilities and of how transactional costs relate to them. Customers are more likely to invest in long-term solutions backed by demonstrable evidence that can be corroborated. Fully disclosed testing environments, processes, and results give customers the proof they need and help vendors stand out from the crowd.

The Difficulty in Choosing

Storage solutions vary greatly, from cloud options to those that utilize on-premises software. Data warehouses have different focuses which impact the overall performance, and they can vary in their pricing and licensing models. Customers find it difficult to compare vendors when the basic data storage configurations differ and price plans vary. With so many storage structures available, it’s hard to explain to customers how output relates to price, appeal to their budget, and maintain integrity, all at the same time.

Switching storage solutions is also a costly, high-risk decision that requires careful consideration. Vendors need to create compelling and honest arguments that provide reassurance of ROI and high quality performance.

Vendors should begin by pitching their costs at the right level; they need to be profitable but also appealing to the customer. Benchmarking can give an indication of how competitor cost models are calculated, allowing vendors to make judgements on their own price plans to keep ahead of the competition. 

Outshining the Competition

Benchmark testing gives an authentic overview of storage transaction-based price-performance, with tests carried out in environments that imitate real-life conditions. Customers gain a better understanding of how the product performs in terms of transactions per second, and of how competitors process storage data in comparison.
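As a simple illustration of how a transactional price-performance figure is derived, the Go sketch below divides the total cost of a tested configuration by its measured throughput. The vendor names and numbers are invented for the example and are not drawn from any benchmark or report.

package main

import "fmt"

func main() {
	// Hypothetical, made-up figures for two competing storage configurations.
	systems := []struct {
		name       string
		totalCost  float64 // total cost of the tested configuration, in dollars
		throughput float64 // measured transactions per second
	}{
		{"Vendor A", 250000, 4200},
		{"Vendor B", 310000, 6100},
	}

	for _, s := range systems {
		// Price-performance: dollars per transaction per second (lower is better).
		fmt.Printf("%s: $%.2f per transaction/second\n", s.name, s.totalCost/s.throughput)
	}
}

Reported this way, a more expensive system can still win on value if its throughput is proportionally higher.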

The industry standard for this kind of benchmarking is the TPC Benchmark E (TPC-E), a standard widely recognized among storage vendors. Tests need to be performed in credible environments; by giving full transparency on how those environments are constructed, vendors and customers can understand how the results are derived. This also demonstrates that each system has been configured to deliver the best performance its platform can offer.

A step-by-step account allows external parties to recreate the tests from the information provided. This transparency in reporting produces more trustworthy and reliable outcomes that offer a higher level of insight to vendors. Readers can also examine the testing and results themselves and draw independent conclusions.

Next Steps

Price is the driving factor for business decisions, and the selection of data storage is no different. Businesses often look toward low-cost solutions that offer high capacity, and current trends have pushed customers toward cloud solutions, which are often cheaper and more flexible. The marketplace is crowded with options: new start-ups are continually emerging, and long-serving vendors need to reinvent and upgrade their systems to keep pace.

Vendors need evidence of price-performance so customers can be reassured that their choice will offer longevity and functionality at an affordable price point. Industry-standard benchmarking identifies how performance is impacted by price and which vendors are best in the market, giving customers the confirmation they need to invest.

 
