Security

The real future of healthcare is cultural change, not just AI and other technology


“It actually is quite easy to be a futurist with regards to where are we going with health,” says Dr Ron Grenfell, director of health and biosecurity at CSIRO.

“It takes 15+ years to get evidence into practice,” he told the Commonwealth Bank’s Future of Health conference in Sydney last week. The “inertia of the system” will hold back the adoption of a lot of technology that’s being pitched as the future of health.

That, in your writer’s view, is one of the two big conceptual challenges at the heart of so many discussions of the digital transformation of healthcare. Vendors are pitching technologies like AI and chatbots to reduce the workload of humans, yet the healthcare sector is way behind the pace.

Dr Kevin Cheng is founder of Australian healthcare provider Osana. They use cloud communications provider 8×8 for their own needs, and use cloud-based medical records, but they run into the usual problems when communicating with other providers.

“I tried really hard not to buy a fax machine for our startup, but we failed,” Cheng said during a roundtable in Sydney last week, to much knowing laughter.

“When I talk to allied health and specialists, we’re often crossing IT barriers. It’s hard to get people on the phone to talk, so we’re very transactional … the other clinician could be sitting in a room next door, but we’re literally writing letters to each other and not talking,” he said.

Cheng believes Australia is lagging behind other high-tech nations. GPs in the US are now doing many of their consultations virtually, he said, whereas in Australia that generally only happens in remote locations.

“We’re having to create our own scorecards and dashboards in our own datasets, because there’s no reporting analytics that is on the market that fits our workflows,” Cheng said.

Phil Kernick, co-founder and chief technology officer of information security firm CQR Consulting, confirmed that belief.

“Nowadays doctors use computers for everything, and it doesn’t matter which industry you’re in, these are run badly. They’re run inefficiently. They’re run insecurely,” he said.

When it was “just” data, that didn’t matter so much. But software is now integrated into diagnostic and therapeutic devices, and if vendors are to be believed, AI will soon be taking control.

See: AI and the NHS: How artificial intelligence will change everything for patients and doctors

“I have a real concern that as everything moves to technology, and when we get into AI and machine learning something, we stop understanding how the technology works,” Kernick said.

We’re building systems that have a “very shaky foundation, and there are no regulations around this,” he said.

“If you look at the Therapeutic Goods Act, you look at how we regulate medical equipment, there are no software security standards. The information page actually says we take a risk-based approach, and use the same risk-based and safety-first approach to all systems, whether they include software or not. I mean, that’s just waffle. It doesn’t mean anything.”

Making patients the actual focus of healthcare

Cheng says that Osana’s strategy is to put the patient’s health at the centre of their business, focusing on prevention and outcomes, rather than the transactional fee-for-service treatment model.

“Patients are going to be consumers, so they’re our customers, and that means that we need to practice in a different way. We want to be partners with patients in their health and well-being,” he said, and data and apps will be part of that.

Dr Bertalan Mesko, director of The Medical Futurist Institute, says that the healthcare sector could and should go much further.

“By 2050 the most important change will be that patients will become the point of care,” he told the CommBank conference from Budapest. Not just becoming more engaged, or “empowered”, but the actual point of care and service delivery, using their own apps and devices to gather data, rather than travelling to medical facilities for diagnostic tests.

This isn’t so much a technological revolution, according to Mesko, but a cultural revolution. In your writer’s view, that’s the second big conceptual challenge.

“Since Hippocrates, for 2000 years, medicine has been quite straightforward. Medical professionals know everything, and they let patients come to them for help, they tell them what to do, and patients go home, and either they comply with what they were told or not. Usually half of them do, and half of them don’t. That’s quite a bad success rate,” Mesko said.

Medical knowledge, and even the patient’s own data, were held in the medical professionals’ “ivory tower,” he said. But that’s changing.

See: VR, AR and the NHS: How virtual and augmented reality will change healthcare

“With crowdsourcing and crowdfunding, with Amazon and social media, with open access to medical papers, and all of these online communities out there, now patients can get access to the same resources,” Mesko said.

“The hierarchy of the doctor-patient relationship is transforming into an equal partnership.”

And sometimes patients race way ahead of their doctors. Diabetes patients, for example, have combined a continuous glucose monitor, an insulin pump, and a small computer such as a Raspberry Pi, to create what is in effect a do-it-yourself pancreas.

“Many of us have no medical or engineering training and we work on improvements in the evening or at the weekend, for free,” Dana Lewis, founder of the Open Artificial Pancreas System project, told The Guardian in July.

“Commercial devices similar to ours are now being trialled and gradually coming on to the market: we’re happy to be helping companies to speed up development. The most important thing is that people don’t have to wait,” she said.
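The DIY loop described above, glucose readings in and pump commands out, can be sketched as a simple correction-dose calculation. Everything below is illustrative only: the target, sensitivity factor, and dosing rule are invented for this sketch, are not the OpenAPS algorithm, and are certainly not medical advice.

```python
# Hypothetical sketch of the core calculation in a closed-loop insulin
# controller, in the spirit of DIY artificial-pancreas projects.
# Constants and the dosing rule are invented for illustration.

TARGET_MGDL = 110   # assumed target blood glucose (mg/dL)
ISF = 50            # assumed insulin sensitivity factor (mg/dL per unit)

def correction_units(glucose_mgdl: float, insulin_on_board: float) -> float:
    """Correction dose: excess glucose over target divided by sensitivity,
    minus insulin already active in the body, floored at zero."""
    excess = glucose_mgdl - TARGET_MGDL
    dose = excess / ISF - insulin_on_board
    return max(dose, 0.0)  # never command a negative dose

# Example: glucose 210 mg/dL with 0.5 U already on board:
# (210 - 110) / 50 - 0.5 = 1.5 units
print(correction_units(210, 0.5))
```

A real system wraps a calculation like this in a loop that polls the CGM every few minutes and commands the pump, with many safety limits this sketch omits.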

Governments and regulators “seem to be pretty terrified about these developments and technologies”, Mesko said.

“When patients find out that there’s a solution technologically for their health problem, they will not wait for regulators to come up with a solution. They will make those solutions themselves,” he said.

“It’s possible for a government to come up with a digital health policy — not just a healthcare policy or a health IT policy, those are different things — a digital health policy that focuses on the cultural aspects of the changes technologies initiate.”

It’s the Terminator scenario forever

This is not to say that the technology isn’t important. AI-powered chatbots can take care of routine patient interactions, for example, leaving clinicians more time for managing patients’ health.

According to Murray Brozinsky, chief strategy officer of Conversa Health, the company’s chatbots have saved Northwell Health some $3,400 per patient when used to help manage patients after hip or knee replacement surgery.

Rather than having a clinician call a patient every week to see how they’re doing, a chatbot can check in daily, or whenever the patient has a question. Using what Brozinsky prefers to call “augmented intelligence”, any problems can be escalated more quickly.
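The escalation logic behind such a daily check-in can be sketched as a simple triage rule. The questions, thresholds, and the escalate/routine split below are invented for illustration; Conversa’s actual product logic is not described in the source.

```python
# Illustrative sketch of the escalation rule a post-surgery check-in chatbot
# might apply to a patient's daily answers. Thresholds are hypothetical.

def triage(pain_score: int, wound_redness: bool, fever_c: float) -> str:
    """Return 'escalate' if any check-in answer suggests a problem a
    clinician should review, otherwise 'routine'."""
    if pain_score >= 7 or wound_redness or fever_c >= 38.0:
        return "escalate"
    return "routine"

print(triage(pain_score=3, wound_redness=False, fever_c=36.8))  # routine
print(triage(pain_score=8, wound_redness=False, fever_c=36.8))  # escalate
```

The point of the “augmented intelligence” framing is that the rule only decides *when* to hand off to a human; the clinician still makes the clinical call.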

Mesko, like many other medtech boosters, thinks AI will be the key technological change between now and 2050, but he says it’s important to be clear about what that means.

Artificial narrow intelligence is what we have now, in everything from a car’s braking system to Amazon’s recommendation engine.

Read: IoT and the NHS: Why the Internet of Things will create a healthcare revolution

Artificial general intelligence would mean having one algorithm with the cognitive ability of one human.

“We are far away from that,” Mesko said.

“And then we would have artificial superintelligence, meaning one algorithm would have the cognitive power of humanity, basically meaning that we are doomed. It’s the ‘Terminator’ scenario forever.”

“So I think we have to draw a line under which point it would be great to develop AI. It will be just before reaching artificial general intelligence.”


GigaOm Radar for Security Orchestration, Automation, and Response (SOAR)


Security Orchestration, Automation, and Response (SOAR) emerged as a product category in the mid-2010s. At that point, SOAR solutions were very much an automation and orchestration engine based on playbooks and integrations. Since then, the platforms have developed beyond the initial core SOAR capabilities to offer more holistic experiences to security analysts, with the aim of developing SOAR as the main workspace for practitioners.

Newer features offered by this holistic experience include case management, collaboration, simulations, threat enrichment, and visual correlations. Additionally, SOAR vendors have gradually implemented artificial intelligence (AI) and machine learning (ML) technologies to enable their platforms to learn from past events and fine-tune existing processes. This is where evolving threat categorization and autonomous improvement become differentiators in the space. While these two metrics are not critical for a SOAR platform, they may offer advantages in terms of reduced mean time to resolution (MTTR), resilience against employee turnover, and overall flexibility.
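The playbook-and-integration model these platforms are built on can be sketched as an alert passing through ordered, automated steps: enrichment, categorization, response. The step names, alert fields, and decision rules below are hypothetical, not any vendor’s schema.

```python
# Minimal sketch of the SOAR playbook model: an alert dict flows through
# ordered steps. Fields, steps, and rules are invented for illustration;
# real platforms would call threat-intel and firewall integrations here.
from typing import Callable, Dict, List

Alert = Dict[str, str]

def enrich(alert: Alert) -> Alert:
    # Stand-in for a threat-intelligence lookup on the source IP.
    alert["reputation"] = "malicious" if alert["src_ip"].startswith("203.") else "unknown"
    return alert

def categorize(alert: Alert) -> Alert:
    alert["severity"] = "high" if alert["reputation"] == "malicious" else "low"
    return alert

def respond(alert: Alert) -> Alert:
    # Stand-in for a firewall-block or case-management integration.
    alert["action"] = "block_ip" if alert["severity"] == "high" else "open_case"
    return alert

def run_playbook(alert: Alert, steps: List[Callable[[Alert], Alert]]) -> Alert:
    for step in steps:  # orchestration: ordered, automated steps
        alert = step(alert)
    return alert

result = run_playbook({"src_ip": "203.0.113.9", "type": "bruteforce"},
                      [enrich, categorize, respond])
print(result["action"])  # block_ip
```

The AI/ML features the vendors are adding amount to learning these categorization and response rules from past cases rather than hand-writing them, which is where the MTTR and turnover-resilience benefits mentioned above would come from.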

We’ve observed a lot of acquisition activity in the SOAR space. This was to be expected considering that, after 2015, a sizable number of pure-play SOAR vendors entered the market. Larger players with a wider security portfolio are acquiring these SOAR-specific vendors in order to enter the automation and orchestration market. We expect to see more SOAR acquisitions as security tools converge, very likely into next-generation Security Information and Event Management (SIEM) products and services.

SIEM is a great candidate for a central management platform for security activities. It was designed to be a single source of truth, an aggregator of multiple security logs, but has been limited historically in its ability to carry out actions. In the past few years, however, SIEMs have either started developing their own automation and orchestration engines or integrated with third-party SOAR vendors. Through a number of acquisitions and developments, multiple players with wider security portfolios have begun to offer SOAR capabilities natively as part of other security solutions.

Going forward, we expect SOAR solutions to be further integrated into other products. This will include not only SIEM, but also solutions such as Extended Detection and Response (XDR) and IT automation. The number of pure-play SOAR vendors is unlikely to increase, although a handful may remain as fully agnostic solutions that enterprises can leverage in instances when their existing next-generation SIEM platforms do not meet all their use cases. However, for pure-play SOAR vendors to remain competitive, they will need to either expand into other security areas or consistently outperform their integrated counterparts.

How to Read this Report

This GigaOm report is one of a series of documents that help IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.

The post GigaOm Radar for Security Orchestration, Automation, and Response (SOAR) appeared first on Gigaom.


GigaOm Radar for Disaster Recovery as a Service (DRaaS)


Very few organizations see disaster recovery (DR) for their IT systems as a business differentiator, so they often prefer to outsource the process and consume it as a service (DRaaS) that’s billed monthly. There are many DRaaS providers with varying backgrounds, whose services are often shaped by that background. Products that started as customer-managed DR applications tend to have the most mature orchestration and automation, but vendors may face challenges transforming their application into a consumable service. Backup as a Service (BaaS) providers typically have great consumption models and off-site data protection, but they might be lacking in rich orchestration for failover. Other DRaaS providers come from IaaS backgrounds, with well-developed, on-demand resource deployment for recovery and often a broader platform with automation capabilities.

Before you invest in a DRaaS solution, be clear about the value you expect from it. If your motivation is simply not to operate a recovery site, you probably want a service that uses technology similar to what you’re using at the protected site. If the objective is to spend less effort on DR protection, you will be less concerned about similarity and more with simplicity. And if you want to enable regular and granular testing of application recovery with on-demand resources, advanced failover automation and sandboxing will be vital features.

Be clear as well on the scale of disaster you are protecting against. On-premises recovery will protect against shared component failure in your data center. A DRaaS location in the same city will allow a lower RPO and provide lower latency after failover, but might be affected by the same disaster as your on-premises data center. A more distant DR location would be immune to your local disaster, but what about the rest of your business? It doesn’t help to have operational IT in another city if your only factory is under six feet of water.

DR services are designed to protect enterprise application architectures that are centered on VMs with persistent data and configuration. A lift-and-shift cloud adoption strategy leads to enterprise applications in the cloud, requiring cloud-to-cloud DR that is very similar to DRaaS from on-premises. Keep in mind, however, that cloud-native applications have different DR requirements.


The post GigaOm Radar for Disaster Recovery as a Service (DRaaS) appeared first on Gigaom.


GigaOm Radar for DDoS Protection


With ransomware getting all the news coverage when it comes to internet threats, it is easy to lose sight of distributed denial of service (DDoS) attacks even as these attacks become more frequent and aggressive. In fact, the two threats have recently been combined in a DDoS ransom attack, in which a company is hit with a DDoS and a ransom is then demanded in exchange for not launching a larger DDoS. Clearly, a solid mechanism for thwarting such attacks is needed, and that is exactly what a good DDoS protection product will include. This will allow users, both staff and customers, to access their applications with no indication that a DDoS attack is underway. To achieve this, the DDoS protection product needs to know about your applications and, most importantly, have the capability to absorb the massive bandwidth generated by botnet attacks.

All the DDoS protection vendors we evaluated have a cloud-service element in their products. The scale-out nature of cloud platforms is the right response to the scale-out nature of DDoS attacks using botnets, thousands of compromised computers, and/or embedded devices. A DDoS protection network that is larger, faster, and more distributed will defend better against larger DDoS attacks.

Two public cloud platforms we review have their own DDoS protection, both providing it for applications running on their public cloud and offering only cloud-based protection. We also look at two content delivery networks (CDNs) that offer only cloud-based protection but also have a large network of locations for distributed protection. Many of the other vendors offer both on-premises and cloud-based services that are integrated to provide unified protection against the various attack vectors that target the network and application layers.

Some of the vendors have been protecting applications since the early days of the commercial internet. These vendors tend to have products with strong on-premises protection and integration with a web application firewall or application delivery capabilities. These companies may not have developed their cloud-based protections as fully as the born-in-the-cloud DDoS vendors.

In the end, you need a DDoS protection platform equal to the DDoS threat that faces your business, keeping in mind that such threats are on the rise.

