Google ponders the shortcomings of machine learning

Critics of the current wave of artificial intelligence technology have grown louder over the last couple of years, and this week Google, one of the biggest commercial beneficiaries of that wave, offered a response, if perhaps not an answer, to them.

In a paper from Google's Brain and DeepMind units, researchers address shortcomings of the field and offer techniques they hope will move machine learning further along the path toward "artificial general intelligence," something closer to human reasoning.

The research acknowledges that current "deep learning" approaches to AI still fall far short of human cognitive skills. Without discarding what has been achieved with techniques such as "convolutional neural networks," or CNNs, the shining success of machine learning, the authors propose ways to impart broader reasoning skills.

Also: Google Brain, Microsoft plumb the mysteries of networks with AI

The paper, “Relational inductive biases, deep learning, and graph networks,” posted on the arXiv pre-print service, is authored by Peter W. Battaglia of Google’s DeepMind unit, along with colleagues from Google Brain, MIT, and the University of Edinburgh. It proposes the use of network “graphs” as a means to better generalize from one instance of a problem to another.

Battaglia and colleagues, calling their work “part position paper, part review, and part unification,” observe that AI “has undergone a renaissance recently,” thanks to “cheap data and cheap compute resources.”

However, “many defining characteristics of human intelligence, which developed under much different pressures, remain out of reach for current approaches,” especially “generalizing beyond one’s experiences.”

Hence, “A vast gap between human and machine intelligence remains, especially with respect to efficient, generalizable learning.”

The authors cite some prominent critics of AI, such as NYU professor Gary Marcus.

In response, they argue for “blending powerful deep learning approaches with structured representations,” and their solution is something called a “graph network.” These are models of collections of objects, or entities, whose relationships are explicitly mapped out as “edges” connecting the objects.

“Human cognition makes the strong assumption that the world is composed of objects and relations,” they write, “and because GNs [graph networks] make a similar assumption, their behavior tends to be more interpretable.”

Also: Google Next 2018: A deeper dive on AI and machine learning advances

The paper explicitly draws on more than a decade of work on "graph neural networks." It also echoes some of Google Brain's recent interest in using neural nets to figure out network structure.

But unlike that prior work, the authors make the surprising assertion that their approach doesn't need to use neural networks per se.

Rather, modeling the relationships of objects is something that not only spans the various machine learning models — CNNs, recurrent neural networks (RNNs), long short-term memory (LSTM) systems, and so on — but also extends to approaches that are not neural nets, such as set theory.

The researchers argue that many things one would like to reason about broadly — particles, sentences, objects in an image — come down to graphs of relationships among entities.


(Image: Google Brain, DeepMind, MIT, University of Edinburgh)
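
To make "entities and relations" concrete, here is a toy, hand-written sketch of how such a system — say, a few particles connected by springs — might be encoded as a graph, using the nodes/edges/senders/receivers layout the paper works with. The specific attributes chosen here (positions, velocities, spring constants, gravity) are illustrative assumptions, not data from the paper's experiments.

```python
# Illustrative only: three particles (nodes) connected by two springs (edges).
# Node attributes hold per-entity state; edge attributes hold per-relation
# parameters; "senders"/"receivers" give the direction of each relation.
particle_system = {
    "nodes": [                      # one row per particle: [x, y, vx, vy]
        [0.0, 0.0, 0.0, 0.0],
        [1.0, 0.0, 0.0, 0.0],
        [2.0, 0.0, 0.0, 0.0],
    ],
    "edges": [[10.0], [10.0]],      # one row per spring: [spring constant]
    "senders": [0, 1],              # spring k connects particle senders[k] ...
    "receivers": [1, 2],            # ... to particle receivers[k]
    "globals": [9.8],               # a graph-level attribute, e.g. gravity
}
```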

The idea is that graph networks are bigger than any one machine-learning approach. Graphs bring an ability to generalize about structure that the individual neural nets don’t have.

The authors write, “Graphs, generally, are a representation which supports arbitrary (pairwise) relational structure, and computations over graphs afford a strong relational inductive bias beyond that which convolutional and recurrent layers can provide.”
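
To make that concrete, the following is a minimal sketch, in plain Python with NumPy, of the computation a single graph-network block performs as the paper describes it: update each edge, aggregate edges at their receiving nodes, update each node, then update the global attribute. The placeholder sum-based update functions are an assumption for readability; in a real graph network each of them would be a learned function such as a small neural network.

```python
import numpy as np

# Same layout as the toy graph above: node attributes, edge attributes,
# sender/receiver indices per edge, and a graph-level ("global") attribute.
nodes = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # one row per entity
edges = np.array([[0.5, 0.5], [0.2, 0.8]])              # one row per relation
senders = np.array([0, 1])                              # edge k runs from senders[k] ...
receivers = np.array([1, 2])                            # ... to receivers[k]
u = np.array([0.0, 0.0])                                # global attribute

# 1. Edge update: each edge sees its own attribute, its sender and receiver
#    nodes, and the global attribute (placeholder: element-wise sum).
updated_edges = edges + nodes[senders] + nodes[receivers] + u

# 2. Aggregate the updated edges at their receiving nodes.
aggregated = np.zeros_like(nodes)
np.add.at(aggregated, receivers, updated_edges)

# 3. Node update: each node sees its aggregated incoming edges, its own
#    attribute, and the global attribute.
updated_nodes = nodes + aggregated + u

# 4. Global update: the global attribute sees summaries of all edges and nodes.
updated_u = u + updated_edges.sum(axis=0) + updated_nodes.sum(axis=0)

print(updated_nodes)
print(updated_u)
```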

Another apparent benefit of graphs is that they're potentially more "sample efficient," meaning they don't require as much raw data as strict neural-net approaches.

To let you try it out at home, the authors this week released a software toolkit for graph networks, built to work with Google's TensorFlow AI framework and posted on GitHub.
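
The toolkit in question appears to be DeepMind's graph_nets library. As a rough idea of what using it looks like, the snippet below follows the usage pattern in the library's README; treat the exact module names (graph_nets.utils_tf.data_dicts_to_graphs_tuple, graph_nets.modules.GraphNetwork, Sonnet's snt.nets.MLP) and the TensorFlow/Sonnet version requirements as assumptions to verify against the repository, not a definitive recipe.

```python
import numpy as np
import graph_nets as gn   # DeepMind's graph-network toolkit for TensorFlow
import sonnet as snt      # supplies the learnable update functions

# One input graph in the library's dictionary format (float32 attributes,
# int32 sender/receiver indices); the attribute values are random placeholders.
data_dict = {
    "globals": np.zeros([2], dtype=np.float32),
    "nodes": np.random.randn(3, 4).astype(np.float32),
    "edges": np.random.randn(2, 1).astype(np.float32),
    "senders": np.array([0, 1], dtype=np.int32),
    "receivers": np.array([1, 2], dtype=np.int32),
}
input_graphs = gn.utils_tf.data_dicts_to_graphs_tuple([data_dict])

# A full graph-network block with learned edge, node, and global updates.
graph_net = gn.modules.GraphNetwork(
    edge_model_fn=lambda: snt.nets.MLP([32, 32]),
    node_model_fn=lambda: snt.nets.MLP([32, 32]),
    global_model_fn=lambda: snt.nets.MLP([32, 32]),
)

output_graphs = graph_net(input_graphs)  # same structure, updated attributes
```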

Also: Google preps TPU 3.0 for AI, machine learning, model training

Lest you think the authors believe they've got it all figured out, the paper lists some lingering shortcomings. Battaglia & Co. pose the big question: "Where do the graphs come from that graph networks operate over?"

Deep learning, they note, just absorbs lots of unstructured data, such as raw pixel information. That data may not correspond to any particular entities in the world. So they conclude that it’s going to be an “exciting challenge” to find a method that “can reliably extract discrete entities from sensory data.”

They also concede that graphs are not able to express everything: “notions like recursion, control flow, and conditional iteration are not straightforward to represent with graphs, and, minimally, require additional assumptions.”

Other structural forms might be needed, such as, perhaps, imitations of computer-based structures, including “registers, memory I/O controllers, stacks, queues” and others.



BMW iX5 Hydrogen Production Starts, But Don’t Expect To See This Fuel-Cell SUV In Dealerships

The reality, though, is that even with a small number of BMW iX5 Hydrogen SUVs being produced — using individual fuel cells supplied by Toyota, but assembled into a stack by BMW using the automaker's own processes and technologies — the expectation is that hydrogen as a fuel will be of interest predominantly for non-passenger vehicles. It arguably makes the most sense, BMW suggests, for larger vehicles like medium- to heavy-duty trucks, along with the marine and aviation sectors. We've already seen Toyota reveal its plans for such an FCEV truck.

Despite that, and an acknowledgment that battery-electric vehicles will undoubtedly lead in the mainstream, BMW still believes there’s a place for FCEVs. After all, the automaker argues, if the infrastructure is being built to cater for trucks, there’s no reason not to also use it for passenger vehicles like the iX5 Hydrogen.

The results of the small-series production beginning today will be used as technology demonstrators across select regions from spring 2023, BMW says. It’s unclear at this point how many will be built. Depending on the reception and the strengths of the technology, series production of a first model could follow mid-decade, ahead of a potential full portfolio of BMW FCEVs from the 2030s onwards.


Tesla Set To Deliver The First Semi To Pepsi


In October, Tesla’s CEO revealed that the production of the Tesla Semi had begun, and it was bound to be delivered today. Tesla has already started the countdown, and we expect the unveiling event to go down at the Nevada factory. The electric truck will be dispatched to Pepsi, which had ordered 100 units. Investor reports that Tesla’s stock price increased by 7.7% on Wednesday, probably in anticipation of Tesla’s Semi first delivery.

Musk tweeted on Saturday that the "Tesla team just completed a 500-mile drive with a Tesla Semi weighing in at 81,000 lbs!" However, considering Musk has said the company is dealing with supply chain issues and inflation, it's unclear whether Tesla will stick to the $180,000 price it announced for the Semi in 2017. Then again, Tesla offers a cheaper Semi for about $150,000 — but that version can only achieve up to 300 miles at full load capacity. For now, we can only wait until it's on the road to confirm whether the specs match what was promised five years ago.


Coinbase Joins Elon Musk In Slamming The Apple App Store Tax


Coinbase complained that Apple’s insistence on its cut unreasonably interfered with its business.

Coinbase's argument was largely the same as Elon Musk's, and the same as the basis of Epic Games' aforementioned lawsuit. According to all of the above, Apple is half of a duopoly: together with Google, it controls the global app marketplace. The "duopoly" part of the argument is pretty much incontrovertible: as of October 2022, Apple and Google together control 99.43% of the global smartphone market (via StatCounter). Each takes a 30% cut of everyone's action on its own marketplace. From Coinbase's perspective, that took too much money out of too many parts of its business.

Epic sued over that and, as noted above, won with an asterisk. Courts found Apple's restrictions on in-app purchases anticompetitive, but still upheld Apple's right to a 30% cut of the profits, even when the purchases took place in someone else's app. In short, according to The Verge, the court said that if you've found a way to make money using iOS, you owe Apple 30%, period.

Epic thought in-app purchases should be exempted from the tax. Coinbase thinks elements of the NFT development process — in this case, the "gas" fees paid to process the blockchain transactions needed to mint NFTs — should be exempt from Apple's app tax. Apple treats all user spending in an app as in-app purchases and, per the Epic court decision, in-app purchases mean Apple gets a cut.

It’s not a simple problem, and it’s not likely to be solved anytime soon. Stakeholders and regulators have barely begun to integrate cryptocurrency and NFTs into the conventional marketplace. Who gets paid for what is likely to be a conversation for years on end. For now, all that’s certain is that conversation has begun.
