Gadgets

Robots learn to grab and scramble with new levels of agility – TechCrunch

Robots are amazing things, but outside of their specific domains they are incredibly limited. So flexibility — not physical, but mental — is a constant area of research. A trio of new robotic setups demonstrate ways they can evolve to accommodate novel situations: using both “hands,” getting up after a fall, and understanding visual instructions they’ve never seen before.

The robots, all developed independently, are gathered together today in a special issue of the journal Science Robotics dedicated to learning. Each shows an interesting new way in which robots can improve their interactions with the real world.

On the other hand…

First there is the question of using the right tool for a job. As humans with multi-purpose grippers on the ends of our arms, we’re pretty experienced with this. We understand from a lifetime of touching stuff that we need one grip to pick this up and a tool for that, that this will be light and that heavy, and so on.

Robots, of course, have no inherent knowledge of this, which can make things difficult; a robot may not understand that it can’t pick up an object of a given size, shape, or texture. A new system from Berkeley roboticists, called Dex-Net 4.0, acts as a rudimentary decision-making process, classifying objects as able to be grabbed either with an ordinary pincer grip or with a suction cup grip.

A robot, wielding both simultaneously, decides on the fly (using depth-based imagery) what items to grab and with which tool; the result is extremely high reliability even on piles of objects it’s never seen before.

It’s done with a neural network that consumed millions of data points on items, arrangements, and attempts to grab them. If you attempted to pick up a teddy bear with a suction cup and it didn’t work the first ten thousand times, would you keep on trying? This system learned to make that kind of determination, and as you can imagine such a thing is potentially very important for tasks like warehouse picking for which robots are being groomed.
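The decide-per-object pattern described above can be sketched in miniature: score each candidate tool and pick the higher one. This is an illustration only, not Dex-Net’s actual models (those are two learned networks trained on millions of grasp attempts); every object field and scoring rule below is invented for the example.

```python
def suction_score(obj):
    """Toy stand-in for a learned suction-quality model:
    flat, smooth, non-porous surfaces score high."""
    return obj["flatness"] * (0.0 if obj["porous"] else 1.0)

def gripper_score(obj):
    """Toy stand-in for a learned pincer-grasp model:
    objects narrow enough to grip with good antipodal
    contact points score high."""
    return obj["antipodal_quality"] if obj["width_cm"] < 8 else 0.0

def choose_tool(obj):
    """Pick whichever tool the (toy) models rate higher."""
    s, g = suction_score(obj), gripper_score(obj)
    return ("suction", s) if s >= g else ("gripper", g)

teddy_bear = {"flatness": 0.1, "porous": True,
              "antipodal_quality": 0.8, "width_cm": 6}
cardboard_box = {"flatness": 0.9, "porous": False,
                 "antipodal_quality": 0.3, "width_cm": 20}

print(choose_tool(teddy_bear))     # plush and porous -> gripper wins
print(choose_tool(cardboard_box))  # smooth planar surface -> suction wins
```

The toy rules happen to echo what Goldberg describes below: smooth planar surfaces favor suction, good antipodal point pairs favor the gripper.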

Interestingly, because of the “black box” nature of complex neural networks, it’s difficult to tell exactly what Dex-Net 4.0 is basing its choices on, although there are some obvious preferences, explained Berkeley’s Ken Goldberg in an email.

“We can try to infer some intuition but the two networks are inscrutable in that we can’t extract understandable ‘policies,’ ” he wrote. “We empirically find that smooth planar surfaces away from edges generally score well on the suction model and pairs of antipodal points generally score well for the gripper.”

Now that reliability and versatility are high, the next step is speed; Goldberg said that the team is “working on an exciting new approach” to reduce computation time for the network, to be documented, no doubt, in a future paper.

ANYmal’s new tricks

Quadrupedal robots are already flexible in that they can handle all kinds of terrain confidently, even recovering from slips (and of course cruel kicks). But when they fall, they fall hard. And generally speaking they don’t get up.

The way these robots have their legs configured makes it difficult to do things in anything other than an upright position. But ANYmal, a robot developed by ETH Zurich (and which you may recall from its little trip to the sewer recently), has a more versatile setup that gives its legs extra degrees of freedom.

What could you do with that extra movement? All kinds of things. But it’s incredibly difficult to figure out the exact best way for the robot to move in order to maximize speed or stability. So why not use a simulation to test thousands of ANYmals trying different things at once, and use the results from that in the real world?

This simulation-based learning doesn’t always work, because it isn’t possible right now to accurately simulate all the physics involved. But it can produce extremely novel behaviors or streamline ones humans thought were already optimal.

At any rate, that’s what the researchers did here, and not only did they arrive at a faster trot for the bot (above), but they also taught it an amazing new trick: getting up from a fall. Any fall. Watch this:

It’s extraordinary that the robot has come up with essentially a single technique to get on its feet from nearly any likely fall position, as long as it has room and the use of all its legs. Remember, people didn’t design this — the simulation and evolutionary algorithms came up with it by trying thousands of different behaviors over and over and keeping the ones that worked.
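The ETH team’s actual sim-to-real pipeline is far more sophisticated, but the try-thousands-of-behaviors-and-keep-what-works loop described above can be sketched with a toy evolutionary algorithm. Everything here is invented for illustration: the “gait” is just two numbers, and the stand-in fitness function replaces a physics simulation scoring speed and stability.

```python
import random

def simulate(params):
    """Toy 'simulation': a stand-in fitness for a gait described by two
    parameters, peaking at (0.6, -0.2). A real setup would run a physics
    engine and score the resulting speed and stability."""
    x, y = params
    return -((x - 0.6) ** 2 + (y + 0.2) ** 2)

def evolve(pop_size=50, generations=40, mutation=0.1, seed=0):
    rng = random.Random(seed)
    # Start with a population of random behaviors.
    pop = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate, reverse=True)
        elite = pop[: pop_size // 5]  # keep the behaviors that worked
        # Refill the population with mutated copies of the survivors.
        pop = elite + [
            (p[0] + rng.gauss(0, mutation), p[1] + rng.gauss(0, mutation))
            for p in [rng.choice(elite) for _ in range(pop_size - len(elite))]
        ]
    return max(pop, key=simulate)

best = evolve()
print(best)  # converges near the optimum (0.6, -0.2)
```

The same keep-and-mutate structure, scaled up to thousands of simulated ANYmals and real gait parameters, is how behaviors no human designed can emerge.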

Ikea assembly is the killer app

Let’s say you were given three bowls, with red and green balls in the center one. Then you’re given this on a sheet of paper:

As a human with a brain, you take this paper to be instructions, and you understand that the green and red circles represent balls of those colors, and that the red ones need to go to the bowl on the left, while the green ones go to the bowl on the right.

This is one of those things humans do by applying vast amounts of knowledge and intuitive understanding without even realizing it. How did you decide that the circles represent the balls? Because of their shape? Then why don’t the arrows represent “real” arrows? How do you know how far to go to the right or left? How do you know the paper refers to these items at all? These are all questions you resolve in a fraction of a second, and any one of them might stump a robot.

Researchers have taken some baby steps toward connecting abstract representations like the one above with the real world, a task that requires a sort of machine creativity or imagination.

Making the connection between a green dot on a white background in a diagram and a greenish roundish thing on a black background in the real world isn’t obvious, but the “visual cognitive computer” created by Miguel Lázaro-Gredilla and his colleagues at Vicarious AI seems to be doing pretty well at it.
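A crude sketch of just that first step, grounding a diagram symbol to a real-world object, might match on nearest color, as below. The diagram and scene data are invented for the example, and the real visual cognitive computer handles shape, position, and far more than this.

```python
# Toy grounding: match diagram symbols to scene objects by nearest color,
# then attach each symbol's instruction (arrow direction) to its match.

DIAGRAM = [  # hypothetical parsed instruction sheet
    {"color": (255, 0, 0), "move": "left"},
    {"color": (0, 255, 0), "move": "right"},
]

SCENE = [  # hypothetical detected objects; colors differ from the diagram's
    {"name": "reddish ball", "color": (200, 40, 30)},
    {"name": "greenish ball", "color": (30, 180, 60)},
]

def color_distance(a, b):
    """Squared distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def ground(diagram, scene):
    """For each symbol, pick the scene object with the closest color."""
    plan = []
    for symbol in diagram:
        obj = min(scene, key=lambda o: color_distance(o["color"], symbol["color"]))
        plan.append((obj["name"], symbol["move"]))
    return plan

print(ground(DIAGRAM, SCENE))
# [('reddish ball', 'left'), ('greenish ball', 'right')]
```

Note how even this toy version must tolerate the mismatch between the diagram’s pure red and the scene’s “reddish” object — the essence of the grounding problem.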

It’s still very primitive, of course, but in theory it’s the same toolset that one uses to, for example, assemble a piece of Ikea furniture: look at an abstract representation, connect it to real-world objects, then manipulate those objects according to the instructions. We’re years away from that, but it wasn’t long ago that we were years away from a robot getting up from a fall or deciding a suction cup or pincer would work better to pick something up.

The papers and videos demonstrating all the concepts above should be available at the Science Robotics site.

Nvidia’s RTX 3050 brings ray tracing and DLSS to $800 laptops

Nvidia has added two entry-level GPUs—the GeForce RTX 3050 Ti and RTX 3050—to the RTX 30 laptop line. Nvidia says the chips will be available “this summer” in laptops starting at $799.

Like every other product in the RTX 30 line, these cards are based on the Ampere architecture and are capable of ray tracing and Nvidia’s proprietary “Deep Learning Super Sampling” (DLSS) upscaling tech. As you can probably guess from their names, the cards slot in below the existing RTX 3060 GPU, with cuts across the board. You can dive into Nvidia’s comparison table below, but the short version is that these cheaper GPUs have less memory (4GB) and fewer CUDA, Tensor, and ray-tracing cores.

[Image: Nvidia’s comparison of its laptop GPU lineup. Credit: Nvidia]

DLSS lets your GPU render a game at a lower resolution and then uses AI to upscale everything to a higher resolution, helping you hit a higher frame rate than you could at your native resolution. It sounds like AI hocus-pocus, but it actually works—you just need the right Nvidia card and a game that supports it. On a lower-powered laptop, anything that helps boost gaming performance without sacrificing graphical fidelity is welcome.
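The arithmetic behind that trade is easy to sketch: rendering at a fraction of the output resolution cuts the shaded pixel count by the square of the scale factor. The scale factors below are commonly cited approximations for DLSS quality modes, not official per-game values.

```python
# Approximate render-scale factors often cited for DLSS modes (assumed
# values for illustration, not official per-game settings).
MODES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(width, height, mode):
    """Resolution the GPU actually renders before AI upscaling."""
    s = MODES[mode]
    return round(width * s), round(height * s)

def pixel_savings(mode):
    """Fraction of output pixels the GPU actually shades."""
    return MODES[mode] ** 2

w, h = internal_resolution(1920, 1080, "performance")
print(w, h)  # 960 540
print(f"{pixel_savings('performance'):.0%} of native pixels rendered")  # 25%
```

Shading only a quarter of the pixels is where the frame-rate headroom comes from; the learned upscaler then reconstructs the remaining detail.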

Intel’s Tiger Lake-H processors were also announced today, and we should see a lot of devices launching with both chips. That’s if there is sufficient supply of the chips to go around. Nvidia is already facing serious video card shortages, and Intel is being hit by the global chip shortage, too. Maybe partner laptops are getting a higher allocation of chips?

Intel claims its new Tiger Lake-H CPUs for laptops beat AMD’s Ryzen 5000

[Image: Intel’s new Core i9-11980HK leads the 11th-gen laptop CPU lineup.]

Intel today announced 10 new 11th-generation CPUs for high-performance laptops like those made for gamers or content creators. Built on the 10nm SuperFin process, the new chips are in the Core i9, Core i7, Core i5, and Xeon families, and they carry the label “Tiger Lake-H.”

New consumer laptop CPUs include the Core i9-11980HK, Core i9-11900H, Core i7-11800H—all of which have eight cores—plus the Core i5-11400H and Core i5-11260H, which each have six cores.

Naturally, Intel today put the spotlight on the fastest chip, the Core i9-11980HK. The company claims this CPU beats its predecessor in games like Hitman 3 and Rainbow Six Siege by anywhere from 5 percent to 21 percent, depending on the game, according to Intel’s own testing.

Intel also claims that the Core i9-11980HK beats AMD’s Ryzen 9 5900HX by anywhere from 11 to 26 percent. Obviously, reviewers will have to put these claims to the test in the coming weeks.

Other features in the new Tiger Lake-H chips include support for Thunderbolt 4 and Wi-Fi 6E.

As is the custom with new Intel CPU launches, numerous OEMs refreshed their laptop lineups with the new chips, including Dell, HP, Lenovo, MSI, Acer, Asus, and others. You can just about bet that if an OEM offered a portable gaming laptop for which these chips are suitable—like the Dell XPS 15, for example—a new version of that laptop was announced today.

Today was a big day for laptop hardware. By no coincidence at all, Nvidia also announced the new GeForce RTX 3050 Ti GPU, which is offered as a configuration option in some of the same laptops that now feature the new Tiger Lake-H CPUs.

If you’re curious about Intel’s new laptop chips, the company has more details on its website. The chips obviously won’t be sold to consumers on their own, but you’ll likely see them in numerous laptops on the market over the next year.

Samsung and AMD will reportedly take on Apple’s M1 SoC later this year

Samsung is planning big things for the next release of its Exynos system on a chip. The company has already promised that the “next generation” of its Exynos SoC will feature a GPU from AMD, which inked a partnership with Samsung in June 2019. A new report from The Korea Economic Daily provides more details.

The report says that “the South Korean tech giant will unveil a premium Exynos chip that can be used in laptops as well as smartphones in the second half of this year” and that “the new Exynos chip for laptops will use the graphics processing unit (GPU) jointly developed with US semiconductor company Advanced Micro Devices Inc.”

There’s a bit to unpack here. First, a launch this year would be an acceleration of the normal Samsung schedule. The last Exynos flagship was announced in January 2021, so you would normally pencil in the new Exynos for early next year. Second, the report goes out of its way to specify that the laptop chip will have an AMD GPU, so… not the smartphone chip?

It was always questionable whether Samsung would dramatically beef up its Exynos smartphone chips, since the company splits its flagship smartphone lineup between Exynos and Qualcomm, depending on the region. The Exynos versions have consistently been inferior to their Qualcomm counterparts, but Samsung considers the two close enough to sell Exynos- and Qualcomm-based phones as the same product. If Samsung knocked it out of the park with an AMD GPU, where would that leave the Qualcomm phones? Would Samsung ditch Qualcomm? That’s hard to believe, and the easy answer is for the company to simply not change the Exynos smartphone chips dramatically.

For laptops, Samsung has to chase down its favorite rival, Apple, which is jumping into ARM laptops with its M1 chip. If Samsung wants its products to have any hope of being competitive with Apple laptops, it would have to launch its own ARM laptop SoC. Getting AMD onboard for this move makes the most sense (it already makes Windows GPUs), and while that would be a good first step, it still doesn’t seem like it would lead to a complete, competitive product.

What about the CPU?

Even if we suppose everything goes right with Samsung’s AMD partnership and the company gets a top-tier SoC GPU, the kind of chip Samsung seems to be producing is not what you would draw up for use in a great laptop. The three big components in an SoC are the CPU, GPU, and modem. It seems like everyone is investing in SoC design, and some companies are better positioned to produce a competitive chip than others.

Of course, everybody is chasing Apple’s M1 SoC, but Apple’s expertise lines up well with what you would want from a laptop. Apple has a world-beating CPU team thanks to years of iPhone work based on the company’s acquisition of PA Semi. Apple started making its own GPUs with the iPhone X in 2017, and the M1 GPU is pretty good. Apple doesn’t have a modem solution on the market yet (its phones use Qualcomm modems), but it bought Intel’s 5G smartphone business in 2019, and it’s working on in-house modem chips. This is a great situation for a laptop chip. You want a strong, efficient CPU and a decent GPU—and you don’t really need a modem.

An AMD GPU is a start for Samsung, but the company does not have a great ARM CPU solution. ARM licenses the ARM CPU instruction set and ARM CPU designs, a bit like if Intel both licensed the x86 architecture and sold Pentium blueprints. Apple goes the more advanced route of licensing the ARM instruction set and designing its own CPUs, while Samsung licenses ARM’s CPU designs. ARM is a generalist and needs to support many different form factors and companies with its CPU designs, so it will never make a chip design that can compete with Apple’s focused designs. By all accounts, Samsung’s Exynos chip will have an inferior CPU. It will also be pretty hard to make a gaming pitch with the AMD GPU since there aren’t any Windows-on-ARM laptop games.

Qualcomm is trying to get into the ARM laptop game, too. Qualcomm’s biggest strength is its modems, which aren’t really relevant in the laptop space. Qualcomm has been in a similar position to Samsung; the company had a decent GPU division thanks to acquiring ATI’s old mobile GPU division, but it was always behind Apple because it used ARM’s CPU designs. Qualcomm’s current laptop chip is the Snapdragon 8cx gen 2, but that chip is not even a best-effort design from the company. The 8cx gen 2 doesn’t just use an ARM CPU design; it uses one that is two generations old: a Cortex A76-based design instead of the Cortex X1 design that a modern phone would use. It’s also a generation behind when it comes to the manufacturing process—7 nm instead of the 5 nm the Snapdragon 888 uses.

Qualcomm seems like it will get serious about laptop chips soon, as it bought CPU design firm Nuvia in January 2021. Nuvia has never made a product, but it was founded by defectors from Apple’s CPU division, including the chief CPU architect. Qualcomm says that with Nuvia, it will be able to ship internally designed CPUs by 2H 2022.

And then there’s Google, which wants to ship its own phone SoC, called “Whitechapel,” in the Pixel 6. Google does not have CPU, GPU, or modem expertise, so we don’t expect much from the company other than a longer OS support window.

And what about Windows?

With no great ARM laptop CPUs out there for non-Apple companies, there isn’t a huge incentive to break up the Wintel (or maybe Winx64?) monopoly. Getting a non-Apple ARM laptop most likely means running Windows for ARM, with whatever questionable app support that system has. Microsoft has been working on x86 and x64 emulation on ARM for a bit. The project entered its “first preview” in December in the Windows dev channel, but it doesn’t sound like it will be a great option for many apps. Microsoft has already said that games are “outside the target” of the company’s first attempt at x64 emulation.

Native apps are also a possibility, though developers don’t seem as interested in Windows ARM support as they do in macOS ARM support. Google was quickly ready with an ARM-native build of Chrome for macOS, but there still isn’t a build of Chrome for ARM for Windows. Adobe took a few months, but Photoshop for M1 Macs hit in March, while the Windows-on-ARM build of Photoshop is still in beta. You can, of course, run Microsoft Office. You’ll probably be stuck with OneDrive for cloud folders, since Dropbox and Google Drive don’t support Windows on ARM.
