
Gadgets

Samsung spilled SmartThings app source code and secret keys – TechCrunch

A development lab used by Samsung engineers was leaking highly sensitive source code, credentials and secret keys for several internal projects — including its SmartThings platform, a security researcher found.

The electronics giant left dozens of internal coding projects on a GitLab instance hosted on a Samsung-owned domain, Vandev Lab. The instance, used by staff to share and contribute code to various Samsung apps, services and projects, was spilling data because the projects were set to “public” and not protected with a password, allowing anyone to view each project and to access and download its source code.

Mossab Hussein, a security researcher at Dubai-based cybersecurity firm SpiderSilk who discovered the exposed files, said one project contained credentials that allowed access to the entire AWS account that was being used, including more than 100 S3 storage buckets that contained logs and analytics data.

Many of the folders, he said, contained logs and analytics data for Samsung’s SmartThings and Bixby services, but also several employees’ exposed private GitLab tokens stored in plaintext, which allowed him to gain additional access from 42 public projects to 135 projects, including many private projects.
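Plaintext credentials sitting in tracked files are exactly what automated secret scanning is designed to catch before code is pushed. As a minimal illustrative sketch (not Samsung's tooling, and far simpler than real scanners such as gitleaks or truffleHog; the `glpat-` token prefix is a later GitLab convention used here purely as an example pattern):

```python
import re

# Illustrative patterns only; production scanners ship hundreds of rules.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "gitlab_pat": re.compile(r"\bglpat-[0-9A-Za-z_\-]{20}\b"),
}

def scan_text(text):
    """Return a list of (rule_name, matched_string) pairs found in `text`."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

# A line like this, committed by mistake, would be flagged immediately:
sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"'
print(scan_text(sample))
```

Running a check like this in CI, or as a pre-commit hook, is a common mitigation for precisely the kind of exposure described here.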

Samsung told him some of the files were for testing but Hussein challenged the claim, saying source code found in the GitLab repository contained the same code as the Android app, published in Google Play on April 10.

The app, which has since been updated, has more than 100 million installs to date.

“I had the private token of a user who had full access to all 135 projects on that GitLab,” he said, which could have allowed him to make code changes using a staffer’s own account.
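GitLab's REST API accepts a personal access token via the `PRIVATE-TOKEN` header, which is why a leaked token is equivalent to the owner's full account access. A sketch of the request such a token holder could build to enumerate every reachable project (the host is hypothetical, standing in for the exposed Samsung instance; no request is actually sent):

```python
from urllib.request import Request

GITLAB_HOST = "https://gitlab.example.com"  # hypothetical stand-in host

def list_projects_request(token, page=1):
    """Build (but do not send) the GitLab API request that enumerates
    all projects visible to the token's owner, 100 per page."""
    url = f"{GITLAB_HOST}/api/v4/projects?per_page=100&page={page}"
    return Request(url, headers={"PRIVATE-TOKEN": token})

req = list_projects_request("glpat-EXAMPLE_TOKEN_ONLY")
print(req.full_url)
```

With write scopes, the same token authenticates pushes and merge requests, which is what makes the code-injection scenario Hussein describes below plausible.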

Hussein shared several screenshots and a video of his findings for TechCrunch to examine and verify.

The exposed GitLab instance also contained private certificates for Samsung’s SmartThings iOS and Android apps.

Hussein also found several internal documents and slideshows among the exposed files.

“The real threat lies in the possibility of someone acquiring this level of access to the application source code, and injecting it with malicious code without the company knowing,” he said.

Through exposed private keys and tokens, Hussein documented a vast amount of access that, if obtained by a malicious actor, could have been “disastrous,” he said.

A screenshot of the exposed AWS credentials, allowing access to buckets with GitLab private tokens (Image: supplied)

Hussein, a white-hat hacker and data breach discoverer, reported the findings to Samsung on April 10. In the days following, Samsung began revoking the AWS credentials, but it’s not known if the remaining secret keys and certificates were revoked.

Samsung still hasn’t closed the case on Hussein’s vulnerability report, close to a month after he first disclosed the issue.

“Recently, an individual security researcher reported a vulnerability through our security rewards program regarding one of our testing platforms,” Samsung spokesperson Zach Dugan told TechCrunch when reached prior to publication. “We quickly revoked all keys and certificates for the reported testing platform and while we have yet to find evidence that any external access occurred, we are currently investigating this further.”

Hussein said Samsung took until April 30 to revoke the GitLab private keys. Samsung also declined to answer specific questions we had and provided no evidence that the Samsung-owned development environment was for testing.

Hussein is no stranger to reporting security vulnerabilities. He recently disclosed a vulnerable back-end database at Blind, an anonymous social networking site popular among Silicon Valley employees — and found a server leaking a rolling list of user passwords for scientific journal giant Elsevier.

Samsung’s data leak, he said, was his biggest find to date.

“I haven’t seen a company this big handle their infrastructure using weird practices like that,” he said.



Apple reveals Vision Pro, an AR/VR headset unlike any other


Apple’s Vision Pro headset

CUPERTINO, Calif.—After years of speculation, leaks, rumors, setbacks, and rumblings of amazing behind-the-scenes demos, Apple has made its plans for a mixed reality platform and headset public. Vision Pro is “the first Apple Product you look through, not at,” Apple’s Tim Cook said, a “new AR platform with a new product” that “augments reality by seamlessly blending the real world with the digital world.”

“I believe augmented reality is a profound technology. Blending digital content with the real world can unlock new experiences,” Cook said.

The headset, which looks like a pair of shiny ski goggles, can be controlled in a “fully 3D interface” without a handheld controller. It solely uses your eyes, hands, and voice as an interface, and the unit lets you “control the system simply by looking.” Icons and other UI elements react to your gaze, and you use natural gestures like tapping your fingers or a gentle flick to select them—no need to hold your hands awkwardly in front of you constantly.

In video demonstrations, Apple showed users walking around and grabbing things from a fridge without taking the headset off. And to further keep you from feeling isolated while wearing the headset, a system called EyeSight will display your eyes when someone is nearby, conveying “a critical indicator of connection and emotion.”

2D apps can be placed to float around your “real world” space, which remains visible through the semi-transparent display. Elements in this interface will cast shadows in the real room around them and respond to light, Apple said. These apps can also “expand fully into your space,” like a pulsating 3D animation in a mindfulness app.

Disney CEO Bob Iger came out to demonstrate a number of customized Vision Pro experiences, from Disney+ support to ESPN sports broadcasts with a wide array of stats filling your room to a virtual Mickey Mouse that walks around your space.

A floating 4K Mac display will appear when users glance at their MacBook display while in the Vision Pro. From there, users can interact using a virtual keyboard or their voice to type, or make use of a physical Magic Trackpad and/or Magic Keyboard.

While watching movies (including 3D movies) on a virtual floating screen, the device will automatically dim your surroundings to be less distracting. The headset can take spatial photos or videos with the click of a button, which you can re-experience as panoramas that you feel like you’re actually inhabiting, Apple said.

Over 100 Apple Arcade titles will be available to play via a floating screen and a handheld controller via Vision Pro “on Day One.”

This is a developing story and will be updated.



Apple’s iOS 17 will focus on “communication, sharing, and intelligent input”



CUPERTINO, Calif.—As has long been a tradition, Apple publicly announced the key features and other details of the next update to the iPhone’s operating system. Apple’s Craig Federighi said the new operating system would focus on “communication, sharing, intelligent input, and new experiences.”

Beginning with communication: as a follow-up to iOS 16’s customizable lock screens, iPhone users can now customize their own “contact poster” that appears on other phones when a call comes in. Posters will appear not just for calls placed via cellular or FaceTime, but also for third-party VoIP services like Zoom or Skype.

Apple is also adding features for people who like to leave voicemails: live transcription can render the message as text on your phone while the other person is speaking, so you can decide whether to pick up without having to listen to it. And FaceTime callers will be able to leave video messages, too.

Some of iOS 17’s new features. (Image: Apple)

The Messages app gets a handful of minor updates, including (blessedly) improved search, audio message transcription, and new organization options for iMessage stickers; all of the iOS emoji will also be available to use as stickers. You can also easily create stickers of people using the same AI features Apple uses to separate the subjects of iPhone photos from the photo’s background.

AirDrop is picking up improvements, too. AirDrop transfers can continue over the Internet if the device you’re sending to moves out of range. iPhones (and Apple Watches) held near each other can automatically share contact information between phones, much as the “Bump” app did years ago.

Moving on to the “intelligent input” features, Apple is trying to make autocorrect in iOS less frustrating. The keyboard will more readily learn custom words as you type them.

A big new feature is a “Standby” mode wherein an iPhone can act as a sort of smart display. When charging and placed horizontally, the iPhone’s lock screen will display information like weather, calendar appointments, and notifications, behaving similarly to an Amazon Echo Show or a Google Nest Hub. Last year, Apple brought a (sort of) Android-style always-on display to the iPhone Pro models for the first time, but it mostly just showed the time of day.

Apple also announced iPadOS 17. The iPad’s operating system is still largely identical to iOS, so most of the new iOS features will also make their way to Apple’s tablets. But iPads will finally get a couple of missing iOS 16 features, including the customizable lock screens and the ability to put interactive widgets on the home screen. The Health app will also migrate from iPhone to iPad for the first time.

Apple’s 2023 WWDC keynote is ongoing. We’ll be filling in this post with more details, or you can follow along live here.



New DirectX 12-to-Metal translation could bring a world of Windows games to macOS


This Diablo II Resurrected screenshot looks pretty unremarkable until you zoom into the top-right and see that it’s running on an Apple M2. (Image: CodeWeavers)

Apple has made a tiny bit of progress in the last year when it comes to getting games running on Macs—titles like Resident Evil Village and a recent No Man’s Sky port don’t exactly make the Mac a gaming destination, but they’re bigger releases than Mac users are normally accustomed to.

For getting the vast majority of PC gaming titles running, though, the most promising solution would be a Steam Deck-esque software layer that translates Microsoft’s DirectX 12 API into something compatible with Apple’s proprietary Metal API. Preliminary support for that kind of translation will be coming to CodeWeavers’ CrossOver software this summer, the company announced in a blog post late last week.

CrossOver is a software package that promises to run Windows apps and games under macOS and Linux without requiring a full virtualized (or emulated) Windows installation. Its developers announced that they were working on DirectX 12 support in late 2021, and now they have a sample screenshot of Diablo II Resurrected running on an Apple M2 chip. This early DirectX 12 support will ship with CrossOver version 23 “later this summer.”

The announcement is simultaneously promising and caveat-filled; getting this single game running required fixing multiple game-specific bugs in upstream software projects. Support will need to be added on a game-by-game basis, at least at first.

“Our team’s investigations concluded that there was no single magic key that unlocked DirectX 12 support on macOS,” CodeWeavers project manager Meredith Johnson wrote in the blog post. “To get just Diablo II Resurrected running, we had to fix a multitude of bugs involving MoltenVK and SPIRV-Cross. We anticipate that this will be the case for other DirectX 12 games: we will need to add support on a per-title basis, and each game will likely involve multiple bugs.”

In other words, don’t expect Steam Deck-esque levels of compatibility with Windows games just yet. There are also still gameplay bugs even in Diablo II Resurrected, though “the fact that it’s running at all is a huge win.”

API translation layers have become increasingly visible and important in recent years as competing low-level APIs with the same basic goals and features have proliferated and as older APIs have aged past the point where it makes sense to spend time maintaining and improving a native implementation. Valve’s Proton compatibility layer is actually a big bundle of different technologies that translate DirectX 9, 10, 11, and 12 API calls into Vulkan ones. Intel is using Microsoft-created DirectX 9-to-12 translation to improve the performance of old games on its Arc graphics cards. The MoltenVK Vulkan-to-Metal translation layer is also used in many prominent software projects, like Google’s Android emulator for developers working in macOS and the Dolphin GameCube and Wii emulator.
