Gadgets

Sense Photonics flashes onto the lidar scene with a new approach and $26M – TechCrunch

Lidar is a critical part of many autonomous cars and robotic systems, but the technology is also evolving quickly. A new company called Sense Photonics just emerged from stealth mode today with a $26M A round, touting a whole new approach that allows for an ultra-wide field of view and (literally) flexible installation.

Still in prototype phase but clearly enough to attract eight figures of investment, Sense Photonics’ lidar doesn’t look dramatically different from others at first, but the changes are both under the hood and, in a way, on both sides of it.

Early popular lidar systems like those from Velodyne use a spinning module that emits and detects infrared laser pulses, finding the range of the surroundings by measuring the light’s time of flight. Subsequent ones have replaced the spinning unit with something less mechanical, like a DLP-type mirror or even metamaterials-based beam steering.

All these systems are “scanning” systems in that they sweep a beam, column, or spot of light across the scene in some structured fashion — faster than we can perceive, but still piece by piece. Few companies, however, have managed to implement what’s called “flash” lidar, which illuminates the whole scene with one giant, well, flash.
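Whatever the scanning scheme, the range math underneath is the same: a pulse travels out and back, so distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the 133 ns example value is illustrative, not from Sense):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range from a time-of-flight measurement: the pulse travels
    out and back, so halve the round trip."""
    return C * round_trip_s / 2.0

# A return after ~133 ns puts the target roughly 20 m away --
# comfortably within the short-range use case described below.
print(tof_range_m(133.4e-9))
```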

That’s what Sense has created, and it claims to have avoided the usual shortcomings of such systems — namely limited resolution and range. Not only that, but by separating the laser emitting part and the sensor that measures the pulses, Sense’s lidar could be simpler to install without redesigning the whole car around it.

I talked with CEO and co-founder Scott Burroughs, a veteran engineer of laser systems, about what makes Sense’s lidar a different animal from the competition.

“It starts with the laser emitter,” he said. “We have some secret sauce that lets us build a massive array of lasers — literally thousands and thousands, spread apart for better thermal performance and eye safety.”

These tiny laser elements are stuck on a flexible backing, meaning the array can be curved — providing a vastly improved field of view. Lidar units (except for the 360-degree ones) tend to be around 120 degrees horizontally, since that’s what you can reliably get from a sensor and emitter on a flat plane, and perhaps 50 or 60 degrees vertically.

“We can go as high as 90 degrees for vert, which I think is unprecedented, and as high as 180 degrees for horizontal,” said Burroughs proudly. “And that’s something auto makers we’ve talked to have been very excited about.”

Here it is worth mentioning that lidar systems have also begun to bifurcate into long-range, forward-facing lidar (like those from Luminar and Lumotive) for detecting things like obstacles or people 200 meters down the road, and more short-range, wider-field lidar for more immediate situational awareness — a dog behind the vehicle as it backs up, or a car pulling out of a parking spot just a few meters away. Sense’s devices are very much geared toward the second use case.

These are just prototype units, but they work and you can see they’re more than just renders.

Particularly because of the second interesting innovation they’ve included: the sensor, normally part and parcel with the lidar unit, can exist totally separately from the emitter, and is little more than a specialized camera. That means that while the emitter can be integrated into a curved surface like the headlight assembly, the tiny detectors can be stuck in places where there are already traditional cameras: side mirrors, bumpers, and so on.

The camera-like architecture is more than convenient for placement; it also fundamentally affects the way the system reconstructs the image of its surroundings. Because the sensor they use is so close to an ordinary RGB camera’s, images from the former can be matched to the latter very easily.

The depth data and traditional camera image correspond pixel-to-pixel right out of the system.

Most lidars output a 3D point cloud, the result of the beam finding millions of points with different ranges. This is a very different form of “image” than a traditional camera, and it can take some work to convert or compare the depths and shapes of a point cloud to a 2D RGB image. Sense’s unit not only outputs a 2D depth map natively, but that data can be synced with a twin camera so the visible light image matches pixel for pixel to the depth map. It saves on computing time and therefore on delay — always a good thing for autonomous platforms.
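To see why a native 2D depth map is convenient, consider the conversion most point-cloud lidars force on you: back-projecting each depth pixel through camera intrinsics to get 3D points. A sketch, assuming a simple pinhole model with made-up intrinsics (not Sense's actual pipeline):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a 2D depth map (meters) into an N x 3 point cloud,
    given pinhole intrinsics: focal lengths fx, fy and principal
    point cx, cy, all in pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# A flat wall 2 m away, seen by a toy 4x4 sensor:
cloud = depth_to_points(np.full((4, 4), 2.0), fx=100, fy=100, cx=1.5, cy=1.5)
```

A pixel-aligned depth map skips this step entirely when all you need is depth per RGB pixel, which is where the computing-time savings come from.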

Sense Photonics’ unit also can output a point cloud, as you see here.

The benefits of Sense’s system are manifest, but of course right now the company is still working on getting the first units to production. To that end it has raised the $26 million A round, “co-led by Acadia Woods and Congruent Ventures, with participation from a number of other investors, including Prelude Ventures, Samsung Ventures and Shell Ventures,” as the press release puts it.

Cash on hand is always good. But it has also partnered with Infineon and others, including an unnamed tier-1 automotive company, which is no doubt helping shape the first commercial Sense Photonics product. The details will have to wait until later this year when that offering solidifies, and production should start a few months after that — no hard timeline yet, but expect this all before the end of the year.

“We are very appreciative of this strong vote of investor confidence in our team and our technology,” Burroughs said in the press release. “The demand we’ve encountered – even while operating in stealth mode – has been extraordinary.”



New RISC-V CPU claims recordbreaking performance per watt


Micro Magic’s new CPU prototype is seen here running on an Odroid board.

Micro Magic Inc.—a small electronic design firm in Sunnyvale, California—has produced a prototype CPU that is several times more efficient than world-leading competitors, while retaining reasonable raw performance.

We first noticed Micro Magic’s claims earlier this week, when EE Times reported on the company’s new prototype CPU, which appears to be the fastest RISC-V CPU in the world. Micro Magic advisor Andy Huang claimed the CPU could produce 13,000 CoreMarks (more on that later) at 5GHz and 1.1V while also putting out 11,000 CoreMarks at 4.25GHz—the latter while consuming only 200mW. Huang demonstrated the CPU—running on an Odroid board—to EE Times at 4.327GHz/0.8V, and 5.19GHz/1.1V.

Later the same week, Micro Magic announced the same CPU could produce over 8,000 CoreMarks at 3GHz while consuming only 69mW of power.

OK, but what’s a CoreMark?

Part of the difficulty in evaluating Micro Magic’s claim for its new CPU lies in figuring out just what a CoreMark is and how many of them are needed to make a fast CPU. It’s a deliberately simplified CPU benchmarking tool released by the Embedded Microprocessor Benchmark Consortium, intended to be as platform-neutral and simple to build and use as possible. CoreMark focuses solely on the core pipeline functions of a CPU, including basic read/write, integer, and control operations. This specifically avoids most effects of system differences in memory, I/O, and so forth.

The Embedded Microprocessor Benchmark Consortium (EMBC) is a group with wide industry representation: Intel, Texas Instruments, ARM, Realtek, and Nokia are a few of its more notable and easily recognizable members.

Now that we understand all that, the next step in evaluating Micro Magic’s claims was to run a few CoreMark benchmarks of our own. All we needed to do was clone the CoreMark GitHub repository, then issue a make command—optionally with the arguments XCFLAGS="-DMULTITHREAD=8 -DUSE_FORK=1" to test on multiple threads/cores at once.
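For the record, those steps look like this (a sketch; the URL is EEMBC's public CoreMark repository, and the thread count should match your CPU):

```shell
# Fetch EEMBC's CoreMark benchmark
git clone https://github.com/eembc/coremark.git
cd coremark

# Single-threaded run: compiles, runs, and reports iterations/sec
make

# Or exercise 8 threads at once via fork()
make XCFLAGS="-DMULTITHREAD=8 -DUSE_FORK=1"
```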

I still have an Apple M1 Mac Mini on hand, as well as a Ryzen 7 4700U-powered Acer Swift 3, so those were my test systems for comparison. Getting the raw performance scores was considerably easier than getting truly comparable power readings. On the Ryzen-powered Linux system, I used the utility turbostat to get both Core and Package power readings while the tests were running.

I don’t have access to anything nearly as fine-grained as turbostat for the Apple M1, so for that platform I took whole-system power draw at the wall and just plain subtracted the reading at desktop idle from the sustained reading while under test. This is extremely crude, and I caution readers not to rely too much on comparing the M1’s efficiency to the Swift 3’s on these numbers alone—but it’s good enough to get some perspective on Micro Magic’s claim for its new RISC-V (pronounced “risk five”) CPU.

On to the tests!

The Micro Magic CPU is, for the moment, single-core and single-threaded—although Huang says it could “easily” be built as a 25-core part. Micro Magic has provided figures—and in one case, a screenshot—for performance at 3GHz, 4.25GHz, and 5GHz. At the maximally power-efficient 3GHz clockrate, the Micro Magic CPU scores about 1/4 the CoreMarks of either the Ryzen 4700u or Apple M1. At the maximally performant 5GHz clock, it manages just over a third of their performance.

This is enough to let us know that the Micro Magic chip in its current form isn’t a world-class competitor for traditional ARM and x86 CPUs in phone or laptop applications—but it’s much closer to them than previous RISC-V implementations have been. At the power-efficient 3GHz clockrate, the Micro Magic CPU is nearly three times faster than, for example, SiFive’s Freedom U540 CPU running single-threaded. At 5GHz, it outruns all four of the SiFive chip’s cores.

We can see the Micro Magic CPU on an Odroid board here, scoring 8,200 iterations/sec over 10 seconds. The multimeter attached to the board is reading 69mW—according to Micro Magic, that’s a measurement taken during the run, not at idle afterward.

At roughly a quarter the performance of world-leading x86 and ARM mobile processors, the Micro Magic CPU doesn’t sound like much yet. But when we factor in power efficiency, things get crazy. I gave my Ryzen and Apple processors the benefit of every possible doubt when generating the above charts—I used core power (not total package power) on the Ryzen 4700U and ran tests with the Gnome3 desktop shut down. For the Apple, I only had access to whole-system power draw, so I subtracted the “desktop idle” power draw from the “under test” power draw.

I tested the Apple and AMD CPUs both single-threaded and multithreaded when checking power efficiency. Unsurprisingly, both parts produced more performance per watt when exercised with one work thread for each available CPU thread. None of this made much of a dent in the Micro Magic’s commanding lead in power efficiency.

At 4.25GHz, the Micro Magic can accomplish the same workload as the Ryzen 4700U with less than one-third the power required. At 3GHz, that figure plummets to less than one-eighth the power required.

What is it good for?

The Linux operating system already supports RISC-V architecture—so for headless or near-headless controllers that simply need to deliver decent performance paired with extreme power efficiency, Micro Magic’s new CPU is likely most of the way there. Things get considerably more complicated once you start talking about entire, consumer-friendly systems, of course. Even aside from hardware considerations like GPU and LTE modem, creating an entire Android phone based on a non-ARM architecture is likely to be a much bigger undertaking.

With that said, it’s worth pointing out that—if we take Micro Magic’s numbers at face value—they’re already beating the performance of some solid mobile phone CPUs. Even at its efficiency-first 3GHz clockrate, the Micro Magic CPU outperformed a Qualcomm Snapdragon 820. The Snapdragon 820 isn’t world-class anymore, but it’s no slouch, either—it was the processor in the US version of Samsung’s Galaxy S7.

If we use the EMBC’s published single-core score for the Snapdragon 820 along with Anandtech’s single-core CPU power test result, we get about 16,000 CoreMarks per watt. That’s triple the efficiency of the Ryzen 4700u running single-threaded, and a little better than par with it when the Ryzen’s running an optimally multithreaded workload.
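Putting the quoted numbers side by side makes the efficiency gap concrete. All figures here are the claims cited in this article, not independent measurements:

```python
def coremarks_per_watt(coremarks: float, watts: float) -> float:
    return coremarks / watts

# Micro Magic's claimed operating points
mm_3ghz = coremarks_per_watt(8_000, 0.069)    # 3 GHz @ 69 mW -> ~116,000/W
mm_425  = coremarks_per_watt(11_000, 0.200)   # 4.25 GHz @ 200 mW -> 55,000/W

# Snapdragon 820 efficiency figure derived in the text
sd820 = 16_000  # CoreMarks per watt

# Even against the Snapdragon, the 3 GHz point is roughly 7x as efficient
print(round(mm_3ghz / sd820, 1))
```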

In other words, Micro Magic’s prototype CPU is both significantly faster and tremendously more power-efficient than a reasonably modern and still very capable smartphone CPU.

Conclusions

All of this sounds very exciting—Micro Magic’s new prototype is delivering solid smartphone-grade performance at a fraction of the power budget, using an instruction set that Linux already runs natively on. Better yet, the company itself isn’t an unknown.

Micro Magic was originally founded in 1995 and was purchased by Juniper Networks for $260 million. In 2004, it was reborn under its original name by the original founders—Mark Santoro and Lee Tavrow, who originally worked at Sun and led the team that developed the 300MHz SPARC microprocessor.

Micro Magic intends to offer its new RISC-V design to customers using an IP licensing model. The simplicity of the design—RISC-V requires roughly one-tenth the opcodes that modern ARM architecture does—further simplifies manufacturing concerns, since RISC-V CPU designs can be built in shuttle runs, sharing space on a wafer with other designs.

With that said, it would be an enormous undertaking to port—for example—an entire smartphone ecosystem, such as commercial Android, to a new architecture. In addition to building the operating system itself—not just the kernel, but drivers for all hardware from GPU to Wi-Fi to LTE modem, and more—third-party app developers would need to recompile their own applications for the new architecture as well.

We’re also still taking a pretty fair amount of Micro Magic’s claims at face value. While we’ve seen a screenshot of an 8,200 CoreMark score, and we’ve seen a 69mW power reading, it’s not entirely clear that the power reading was representative of the entire benchmark run.

Still, this is an exciting development. Not only does the new design appear to perform well while massively breaking efficiency records, it’s doing so with a far more ideologically open design than its competitors. The RISC-V ISA—unlike x86, ARM, and even MIPS—is open and provided under royalty-free licenses.


Android apps with millions of downloads are vulnerable to serious attacks


Android apps with hundreds of millions of downloads are vulnerable to attacks that allow malicious apps to steal contacts, login credentials, private messages, and other sensitive information. Security firm Check Point said that the Edge Browser, the XRecorder video and screen recorder, and the PowerDirector video editor are among those affected.

The vulnerability actually resides in the Google Play Core Library, which is a collection of code made by Google. The library allows apps to streamline the update process by, for instance, receiving new versions during runtime and tailoring updates to an individual app’s specific configuration or a specific phone model the app is running on.

A core vulnerability

In August, security firm Oversecured disclosed a security bug in the Google Play Core Library that allowed one installed app to execute code in the context of any other app that relied on the vulnerable library version.

The vulnerability stemmed from a directory traversal flaw that allowed untrusted sources to copy files to a folder that was supposed to be reserved only for trusted code received from Google Play. The vulnerability undermined a core protection built into the Android operating system that prevents one app from accessing data or code belonging to any other app.
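The general shape of a directory-traversal defense is to resolve the untrusted file name and verify it still lands inside the trusted folder. A minimal, generic sketch of that check (not the actual Play Core fix; the function and folder names are hypothetical):

```python
import os

def safe_join(trusted_base: str, untrusted_name: str) -> str:
    """Resolve untrusted_name under trusted_base, rejecting any
    '../'-style name that would escape the trusted directory."""
    base = os.path.realpath(trusted_base)
    candidate = os.path.realpath(os.path.join(base, untrusted_name))
    # realpath collapses '..' components, so an escape attempt
    # produces a path whose common prefix is no longer base.
    if os.path.commonpath([base, candidate]) != base:
        raise ValueError(f"path traversal attempt: {untrusted_name!r}")
    return candidate
```

The vulnerable library effectively skipped this kind of check, letting a name like `../../victim_app/files/payload` land outside the folder reserved for verified code.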

Here’s an image that illustrates how an attack might work:


Google patched the library bug in April, but for vulnerable apps to be fixed, developers must first download the updated library and then incorporate it into their app code. According to research findings from Check Point, a nontrivial number of developers continued to use the vulnerable library version.

Check Point researchers Aviran Hazum and Jonathan Shimonovich wrote:

When we combine popular applications that utilize the Google Play Core library, and the Local-Code-Execution vulnerability, we can clearly see the risks. If a malicious application exploits this vulnerability, it can gain code execution inside popular applications and have the same access as the vulnerable application.

The possibilities are limited only by our creativity. Here are just a few examples:

  • Inject code into banking applications to grab credentials, and at the same time have SMS permissions to steal the Two-Factor Authentication (2FA) codes.
  • Inject code into Enterprise applications to gain access to corporate resources.
  • Inject code into social media applications to spy on the victim, and use location access to track the device.
  • Inject code into IM apps to grab all messages, and possibly send messages on the victim’s behalf.

Seeing is believing

To demonstrate an exploit, Check Point used a proof-of-concept malicious app to steal an authentication cookie from an old version of Chrome. With possession of the cookie, the attacker is then able to gain unauthorized access to a victim’s Dropbox account.

Account Takeover exploiting vulnerability in Android’s Play Core Library Code – Demo.

Check Point identified 14 apps with combined downloads of almost 850 million that remained vulnerable. Within a few hours of the report’s publication, the security firm said, developers of some of the named apps released updates that fixed the vulnerability.

Apps identified by Check Point included Edge, XRecorder, and PowerDirector, which have combined installations of 160 million. Check Point provided no indication that any of these apps had been fixed. Ars asked the developers of all three apps to comment on the report. This post will be updated if they respond.


Google Maps’ new “Community Feed” is like a social network for food


Google Maps is getting a bunch of new features this week, so it’s time for a roundup! The first feature is definitely one nobody asked for: the new “Community Feed,” which is clearly trying to turn Google Maps into a social network. Google’s blog post says that “Every day, people submit more than 20 million contributions—including recommendations for their favorite spots, updates to business services, fresh reviews and ratings, photos, answers to other people’s questions, updated addresses and more.” So now Google Maps is getting a News Feed full of all these reviews and updates.

Google’s sales pitch reads “The feed shows you the latest reviews, photos and posts added to Google Maps by local experts and people you follow as well as food and drink merchants, and articles from publishers like The Infatuation.” All of these updates are in the style of a social network, with the author at the top, a “follow” link for the author, and the ability to “like” posts. The only thing it’s missing is comments!

To show how serious it is about this Google Maps Social Network thing, Google is putting the community feed front-and-center in the interface. When you open Google Maps, the community feed card is peeking up from the top of the screen, right on the main page of Maps. You just swipe up on it to read the latest updates. If you’re not on the main page of Maps, the community feed lives under the “explore” tab, the first tab on the Google Maps tab bar. This also looks like a great spot for ads.

Besides people you actively follow, it sounds like Google Maps is going to push updates from “local experts” to everyone, which hopefully won’t be abused. Google has to assume that, especially at first, everyone is going to have zero followers, so you’ve got to fill the feed with something. Google also says it will try to figure out your Google Maps interests and will fill the feed with recommendations for similar places. Today these recommendations live in the “updates” tab, which still appears in Google’s new screenshots. Seems redundant.

Building numbers and crosswalks

Left: crosswalk markers on the roads. Right: building numbers!

For a less controversial addition, how about building numbers and crosswalks? Android Police has spotted even more detail being added to certain cities in Google Maps. If you zoom all the way in on places like NYC, you’ll see striped crosswalk paint in some roads and tiny little building numbers letting you know where the exact addresses are. These go great with Google Maps’ other recent detail addition: Traffic lights.

Android Police says this was first spotted in the Android Google Maps Beta, which you can sign up for here. I’m also seeing it on Google Maps on the web.

The “Go” tab: Google Maps Bookmarks

More tab shenanigans: Google Maps’ “Commute” tab (the second one) is turning into the “Go” tab, which sounds a lot more useful. Commute would only list navigation options to your home and work, but the “Go” tab is more of a general bookmark section. Besides your home and work, you can also pin frequently visited places to the “Go” tab and navigate to them with a tap. It looks like this will show suggestions too, which are typically based on things like your travel and search history.

The "Go" tab.

Google says your pinned destinations will show live traffic info and accurate ETAs right from the Go tab, which sounds handy. You can also pin public transit routes, which will show departure and arrival times, an up-to-date ETA, and any service alerts.

Google says “The Go Tab starts rolling out on Android and iOS in the coming weeks.” There’s no word on whether you’ll be able to access these bookmarks from the web. Google didn’t say anything about the social network being on the web either. Someone remind Google that Google Maps has a website.

“Connected Photos” for Street View

Google’s ground-level Street View feature is getting another form of imagery that’s easier to record without special equipment. “Connected Photos” is a new feature that more or less replicates the experience of walking down a street in Street View, but without the hassle of taking a 360-degree photo. You just fire up the new Street View app, walk (or drive) down the street, and some sort of imagery will be created.

The feature requires a phone compatible with ARCore, Google’s 3D-sensing augmented reality framework. It sounds like what is happening is that Google is recording a video with some 3D positional data, and as you move down the street, the best frames will be saved and converted into a series of still images for Street View. These aren’t 360 images, so you won’t be able to turn the camera, but you will be able to press the forward and backward buttons to virtually walk down the street.

Back when this feature was in testing, it used to be called “Driving Mode” for Street View. So I guess Google wanted you to put the phone in a car mount, fire up the app, and let it collect as much data as possible while you drive around. The blog post shows a photo from the middle of a five-lane highway, so it seems like turning yourself into an amateur, 120-degree Street View car is still something Google would like you to do.

Google’s blog post says “Before this feature, you would typically need special 360-degree cameras to capture and publish Street View imagery.” That is …not accurate. Android has been able to capture Street View imagery for years, via the PhotoSphere feature that launched in 2012 on Android 4.2. Google Maps uploads of PhotoSpheres have been supported since 2013. PhotoSpheres are full 360-degree images, and taking one on a phone involves stitching together something like 15 photos. While the wizard walks you through the steps, it takes forever to make one, so Connected Photos is a simpler, higher-bandwidth way for the public to contribute pictures. It sounds like this is only an Android feature again, and you’ll need a new version of the Street View app, not Google Maps.
