Why did last night’s ‘Game of Thrones’ look so bad? Here comes the science! – TechCrunch

Last night’s episode of “Game of Thrones” was a wild ride and inarguably one of an epic show’s more epic moments — if you could see it through the dark and the blotchy video. It turns out even one of the most expensive and meticulously produced shows in history can fall prey to the scourge of low-quality streaming and bad TV settings.

The good news is this episode is going to look amazing on Blu-ray or potentially in future, better streams and downloads. The bad news is that millions of people already had to see it in a way its creators surely lament. You deserve to know why this was the case. I’ll be simplifying a bit here because this topic is immensely complex, but here’s what you should know.

(By the way, I can’t entirely avoid spoilers, but I’ll try to stay away from anything significant in words or images.)

It was clear from the opening shots in last night’s episode, “The Long Night,” that this was going to be a dark one. The army of the dead faces off against the allied living forces in the darkness, made darker by a bespoke storm brought in by, shall we say, a Mr. N.K., to further demoralize the good guys.

If you squint you can just make out the largest army ever assembled

Thematically and cinematographically, setting this chaotic, sprawling battle at night is a powerful creative choice and a valid one, and I don’t question the showrunners, director, and so on for it. But technically speaking, setting this battle at night, and in fog, is just about the absolute worst case scenario for the medium this show is native to: streaming home video. Here’s why.

Compression factor

Video has to be compressed in order to be sent efficiently over the internet, and although we’ve made enormous strides in video compression and the bandwidth available to most homes, there are still fundamental limits.

The master video that HBO put together from the actual footage, FX, and color work that goes into making a piece of modern media would be huge: hundreds of gigabytes if not terabytes. That’s because the master has to include all the information on every pixel in every frame, no exceptions.

Imagine if you tried to “stream” a terabyte-sized TV episode. You’d have to be able to download upwards of 200 megabytes per second for the full 80 minutes of this one. Few people in the world have that kind of connection — it would basically never stop buffering. Even 20 megabytes per second is asking too much by a long shot. Two megabytes per second is doable — that’s 16 megabits (bits, not bytes… divide by 8 to convert), comfortably under the 25-megabit speed we use to define broadband downloads.
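
For what it’s worth, here’s that back-of-the-envelope arithmetic in a few lines of Python, using a round 1 TB master and an 80-minute runtime as purely illustrative figures:

```python
# Rough bandwidth math for streaming an uncompressed ~1 TB master.
# These are illustrative estimates, not HBO's actual numbers.
master_size_bytes = 1e12       # ~1 terabyte master file
runtime_seconds = 80 * 60      # 80-minute episode

bytes_per_second = master_size_bytes / runtime_seconds
megabytes_per_second = bytes_per_second / 1e6
megabits_per_second = megabytes_per_second * 8   # 8 bits per byte

print(f"{megabytes_per_second:.0f} MB/s ({megabits_per_second:.0f} Mbps)")
# -> roughly 208 MB/s, or about 1,700 Mbps -- far beyond typical home broadband
```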

So how do you turn a large file into a small one? Compression — we’ve been doing it for a long time, and video, though different from other types of data in some ways, is still just a bunch of zeroes and ones. In fact it’s especially susceptible to strong compression because any given frame is usually very similar to the frames before and after it. There are all kinds of shortcuts you can take that reduce the file size immensely without noticeably affecting the quality of the video. These compression and decompression techniques fit into a system called a “codec.”
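
To get a feel for why neighboring frames compress so well, here’s a toy Python sketch (using numpy, and nothing like a real codec) that stores only the pixels that changed from one frame to the next:

```python
import numpy as np

# Toy illustration of inter-frame compression. Real codecs such as H.264 or
# HEVC are far more sophisticated, but the intuition is the same: if a frame
# barely differs from the previous one, store only the difference.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 200, size=(1080, 1920), dtype=np.uint8)

# The next frame is identical except for a small moving region.
frame_b = frame_a.copy()
frame_b[500:540, 900:960] += 10

diff = frame_b.astype(np.int16) - frame_a.astype(np.int16)
changed = np.count_nonzero(diff)
print(f"{changed} of {diff.size} pixels changed ({100 * changed / diff.size:.2f}%)")
# Storing just those changed pixels (and where they are) is vastly smaller
# than storing the entire second frame again.
```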

But there are exceptions, and one of them has to do with how compression handles color and brightness. Basically, when the image is very dark, the codec can’t represent color very well.

The color of winter

Think about it like this: There are only so many ways to describe colors in a few words. If you have one word you can say red, or maybe ochre or vermilion depending on your interlocutor’s vocabulary. But if you have two words you can say dark red, darker red, reddish black, and so on. The codec has a limited vocabulary as well, though its “words” are the numbers of bits it can use to describe a pixel.

This lets it succinctly describe a huge array of colors with very little data: this pixel has this color value, this much brightness, and so on. (I didn’t originally want to get into this, but this is what people are talking about when they say bit depth, or even “highest quality pixels.”)
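
For a concrete sense of how quickly that vocabulary grows, a couple of lines of Python show how many shades each common bit depth allows:

```python
# Distinct levels per channel at common bit depths, and the total palette
# when you combine three channels (red, green, blue).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:,} shades per channel, {levels ** 3:,} total colors")
# 8-bit: 256 shades per channel, 16,777,216 total colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 total colors
# 12-bit: 4,096 shades per channel, 68,719,476,736 total colors
```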

But this also means that there are only so many gradations of color and brightness it can show. Going from a very dark grey to a slightly lighter grey, it might be able to pick 5 intermediate shades. That’s perfectly fine if it’s just on the hem of a dress in the corner of the image. But what if the whole image is limited to that small selection of shades?

Then you get what we saw last night. See how Jon (I think) is made up almost entirely of a handful of different colors (brightnesses of a similar color, really), with big, obvious borders between them?

This issue is called “banding,” and it’s hard not to notice once you see how it works. Images on video can be incredibly detailed, but places where there are subtle changes in color — often a clear sky or some other large but mild gradient — will render in large stripes as the codec goes from “darkest dark blue” to “darker dark blue” to “dark blue,” with no “medium darker dark blue” in between.

Check out this image.

Above is a smooth gradient encoded with high color depth. Below that is the same gradient encoded with lossy JPEG encoding — different from what HBO used, obviously, but you get the idea.
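
If you want to reproduce the effect yourself, here’s a small Python sketch (assuming numpy and Pillow are installed) that renders a smooth gradient and then crushes it down to a handful of levels, which produces the same kind of visible stripes:

```python
import numpy as np
from PIL import Image

# Render a smooth horizontal gradient, then quantize it to a few levels to
# mimic what a bit-starved codec does to dark, low-contrast scenes.
width, height = 1024, 256
gradient = np.tile(np.linspace(0, 255, width), (height, 1)).astype(np.uint8)

levels = 8                 # pretend the codec only has 8 shades to work with
step = 256 // levels
banded = (gradient // step) * step

Image.fromarray(gradient).save("gradient_smooth.png")
Image.fromarray(banded).save("gradient_banded.png")
# Open the two PNGs side by side: the first fades smoothly, while the second
# breaks into obvious stripes -- that's banding.
```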

Banding has plagued streaming video forever, and it’s hard to avoid even in major productions — it’s just a side effect of representing color digitally. It’s especially distracting because obviously our eyes don’t have that limitation. A high-definition screen may actually show more detail than your eyes can discern from couch distance, but color issues? Our visual systems flag them like crazy. You can minimize it in various ways, but it’s always going to be there, until the point when we have as many shades of grey as we have pixels on the screen.

So back to last night’s episode. Practically the entire show took place at night, which removes about 3/4 of the codec’s brightness-color combos right there. It also wasn’t a particularly colorful episode, a directorial or photographic choice that highlighted things like flames and blood, but further limited the ability to digitally represent what was on screen.

It wouldn’t be too bad if the background was black and people were lit well so they popped out, though. The last straw was the introduction of the cloud, fog, or blizzard, whatever you want to call it. This kept the brightness of the background just high enough that the codec had to represent it with one of its handful of dark greys, and the subtle movements of fog and smoke came out as blotchy messes (often called “compression artifacts” as well) as the compression desperately tried to pick what shade was best for a group of pixels.

Just brightening it doesn’t fix things, either — because the detail is already crushed into a narrow range of values, you just get a bandy image that never gets completely black, making it look washed out, as you see here:

(Anyway, the darkness is a stylistic choice. You may not agree with it, but that’s how it’s supposed to look and messing with it beyond making the darkest details visible could be counterproductive.)

Now, it should be said that compression doesn’t have to be this bad. For one thing, the more data it is allowed to use, the more gradations it can describe, and the less severe the banding. It’s also possible (though I’m not sure where it’s actually done) to repurpose the rest of the codec’s “vocabulary” to describe a scene where its other color options are limited. That way the full bandwidth can be used to describe a nearly monochromatic scene even though strictly speaking it should be only using a fraction of it.

But neither of these are likely an option for HBO: Increasing the bandwidth of the stream is costly, since this is being sent out to tens of millions of people — a bitrate increase big enough to change the quality would also massively swell their data costs. When you’re distributing to that many people, that also introduces the risk of hated buffering or errors in playback, which are obviously a big no-no. It’s even possible that HBO lowered the bitrate because of network limitations — “Game of Thrones” really is stretching the limits of digital distribution in some ways.

And using an exotic codec might not be possible because only commonly used commercial ones are really capable of being applied at scale. Kind of like how we try to use standard parts for cars and computers.

This episode almost certainly looked fantastic in the mastering room and FX studios, where they not only had carefully calibrated monitors with which to view it but also were working with brighter footage (it would be darkened to taste by the colorist later) and less or no compression. They might not even have seen the “final” version that fans “enjoyed.”

We’ll see the better copy eventually, but in the meantime the choice of darkness, fog, and furious action meant the episode was going to be a muddy, glitchy mess on home TVs.

And while we’re on the topic…

You mean my TV isn’t the problem?


Well… to be honest, it might be that too. What I can tell you is that simply having a “better” TV by specs, such as 4K or a higher refresh rate or whatever, would make almost no difference in this case. Even built-in de-noising and de-banding algorithms would be hard pressed to make sense of “The Long Night.” And one of the best new display technologies, OLED, might even make it look worse! Its “true blacks” are much darker than an LCD’s backlit blacks, so the jump to the darkest grey could appear more jarring.

That said, it’s certainly possible that your TV is also set up poorly. Those of us sensitive to this kind of thing spend forever fiddling with settings and getting everything just right for exactly this kind of situation. There are dozens of us! And this is our hour.

Usually “calibration” is actually a pretty simple process of making sure your TV isn’t on the absolute worst settings, which unfortunately many are out of the box. Here’s a very basic three-point guide to “calibrating” your TV:

  1. Turn off anything with a special name in the “picture” or “video” menu, like “TrueMotion,” “Dynamic motion,” “Cinema mode,” any stuff like that. Most of these make things look worse, and so-called “smart” features are often anything but. Especially anything that “smooths” motion — turn those off first and never ever turn them on again. Note: Don’t mess with brightness, gamma, color space, pretty much anything with a number you can change.
  2. Figure out light and color by putting on a good, well-shot movie the way you normally do. While it’s playing, click through any color presets your TV has. These are often things like “natural,” “game,” “cinema,” “calibrated,” and so on, and take effect right away. Some may make the image look too green, or too dark, or whatever. Play around and just use whichever one makes it look best. You can always change it again later – I myself switch between a lighter and darker scheme depending on time of day and content.
  3. Don’t worry about HDR, dynamic lighting, and all that stuff for now. There’s a lot of hype about these technologies and they are still in their infancy. Few will work out of the box and the gains may or may not be worth it. The truth is a well shot movie from the ’60s or ’70s can look just as good today as a “high dynamic range” show shot on the latest 8K digital cinema rig. Just focus on making sure the image isn’t being actively interfered with by your TV and you’ll be fine.

Unfortunately none of these things will make “The Long Night” look any better until HBO releases a new version of it. Those ugly bands and artifacts are baked right in. But if you have to blame anyone, blame the streaming infrastructure that wasn’t prepared for a show taking risks in its presentation, risks I would characterize as bold and well executed, unlike the writing in the show lately. Oops, sorry, couldn’t help myself.

If you really want to experience this show the way it was intended, the fanciest TV in the world wouldn’t have helped last night, though when the Blu-ray comes out you’ll be in for a treat. But here’s hoping the next big battle takes place in broad daylight.



Linus Torvalds doubts Linux will get ported to Apple M1 hardware

It would be great to see Linux running and fully operational on Apple M1 hardware like this Mac Mini—but it seems unlikely to happen.

In a recent post on the Real World Technologies forum—one of the few public internet venues Linux founder Linus Torvalds is known to regularly visit—a user named Paul asked Torvalds, “What do you think of the new Apple laptop?”

“I’d absolutely love to have one, if it just ran Linux,” Torvalds replied. “I’ve been waiting for an ARM laptop that can run Linux for a long time. The new [Macbook] Air would be almost perfect, except for the OS.”

Torvalds, of course, can already have an ARM-based Linux laptop if he wants one—for example, the Pinebook Pro. The unspoken part here is that he’d like a high-performance ARM-based laptop, rather than a budget-friendly but extremely performance-constrained design such as one finds in the Pinebook Pro, the Raspberry Pi, or a legion of other inexpensive gadgets.

Apple’s M1 is exactly that—a high-performance, desktop- and laptop-oriented system that delivers world-class performance while retaining the hyperefficient power and thermal characteristics needed in the phone and tablet world. On paper, an M1-powered MacBook Air would make a fantastic laptop for Linux or even Windows users—but it seems unlikely that Apple will share.

In an interview with ZDNet, Torvalds expounded on the problem:

The main problem with the M1 for me is the GPU and other devices around it, because that’s likely what would hold me off using it because it wouldn’t have any Linux support unless Apple opens up… [that] seems unlikely, but hey, you can always hope.

Torvalds is almost certainly correct that Apple won’t be forthcoming with sufficient detail about the M1 System on Chip (SoC) for Linux kernel developers to build first-class support. Even in the much better-understood Intel world, Macs haven’t been a good choice for Linux enthusiasts for several years, and for the same reason. As Apple brings its own hardware stack further and further in-house, open source developers get less and less information to port operating systems and write hardware drivers for the platform.

We strongly suspect that by the time enthusiasts could reverse-engineer the M1 SoC sufficiently for first-class Linux support, other vendors will have seen the value in bringing high performance ARM systems to the laptop market—and it will be considerably easier to work with the more open designs many will use.

Up until now, ARM-based laptops and miniature PCs have attempted to disrupt the market by shooting low on budget rather than high on performance. Examples include, but are not limited to, the $200 Pinebook Pro laptop, the $100 Raspberry Pi 400, and the $99 Nvidia Jetson.

Now that Apple has proven ARM’s value in the performance as well as the budget space, we broadly expect competing systems using high-end Snapdragon and similar processors to enter the market within the next few years. Such systems wouldn’t need to beat—or even match—the M1’s standout performance; they’d simply need to compete strongly with more traditional x86_64 systems on performance and price, while dominating them in power consumption and thermal efficiency.

It’s also worth noting that while the M1 is unabashedly great, it’s not the final word in desktop or laptop System on Chip designs. Torvalds mentions that, given a choice, he’d prefer more and higher-power cores—which is certainly possible and seems a likely request to be granted soon.

Apple’s M1 MacBook Air has that Apple Silicon magic

Hey, my macro lens still works!

Lee Hutchinson

The new M1-powered MacBook Air is hilariously fast, and the battery lasts a long-ass time.

If you stop reading this review immediately after this, then know that unless Windows virtualization is a requirement of your workflow, you should probably just go ahead and sell your old MacBook Air immediately and get this thing instead.

Assuming you’ve got a grand or so lying around that you weren’t going to spend on something else. But hey, if you do, then I can confidently tell you that in spite of what a legion of Doubting Thomases (including me!) might have said about Apple’s freshman effort at its own PC silicon, it is now my studied opinion that there are far, far stupider ways to part with your cash.

A quick caveat on this “review”

Specs at a glance: 2020 MacBook Air (M1)
Screen 2560×1600 at 13.3 inches
OS macOS Big Sur 11.0.1
CPU Apple M1
RAM 16GB
GPU Apple M1 (8 core)
HDD 1TB SSD
Networking 802.11ax Wi-Fi 6; IEEE 802.11a/b/g/n/ac; Bluetooth 5.0
Ports 2x Thunderbolt 3/USB 3.1 Gen 2/DisplayPort, 3.5mm headphone
Size 0.16–0.63×11.97×8.36-inch (0.41–1.61×30.41×21.24cm)
Weight 2.8 lbs (1.29kg)
Warranty 1 year, or 3 years with AppleCare+
Price as reviewed $1,299
Other perks 720p FaceTime HD camera, stereo speakers

Apple provided Ars with a couple of M1 Mac Minis for review. One of those went to Samuel for him to write up, and the other went to Jim for him to do his silicon analysis. Apple declined our request for any model of M1-powered laptop.

The MacBook Air being reviewed here is my personal device, which I bought shortly after the unveiling event. I’ve written this as quickly as possible after receiving it, but I had to wait for the device, which is why you all had to wait for the review. (This is also why it’s in kind of an intermediate configuration, rather than stock or maxed out like most review devices—I bumped the RAM up to 16GB and the internal storage up to 1TB, because that’s what I wanted.)

Because this is my device, I’m coming into this review from a slightly different perspective than some of the other publications doing MBA reviews. I’m not going to tell you why you should buy a MacBook Air, or how it might work for you. But I am going to talk about what it has been like to own it for a few days and how the device fits into my life. I do most of my power-user stuff on the desktop rather than on a portable, but I do occasionally need to leave the office and hit the road—and the M1 MBA is going to be a great traveling companion. You know, once we can hit the road again without worrying about plagues and stuff.

Unboxinating

Approaching a device like this as a reviewer is different from approaching a device as a consumer. When the UPS guy drops it off, you can’t just rip the box open and jump in—there’s stuff you have to do first.

Tripods. Lights. Gotta iron the big white sweep cloth so I’ve got a background for pix. Gotta try to remember where the DSLR battery is.

It’s the oddest part about working for Ars, even after going on eight years. Your technology buying experiences are not always your own—sometimes the Ars readership comes along for the ride.

So after unboxing, I logged on and ran some benchmarks. That’s the first thing you have to do when you’re reviewing—you either do the benchmarks first, or you do them dead last, and I wanted to get them out of the way because this was, you know, my laptop, and I’d actually like to use it for stuff rather than having it be tied up running battery tests for 20 hours at a time.

Only a few days earlier, I had used my living room HTPC—a base-config 2018 Mac mini—to do the entire set of Mac comparison benchmarks for Samuel’s Mac mini review. I had a pretty good feel for how quickly the Intel mini’s hex-core i5 banged through each of the tests, since I’d just seen the numbers, and from talking to Samuel and Jim I was anticipating the new MBA’s M1 would beat the Intel-powered mini.

I just didn’t realize how hard a beatdown it would be.

Getting the benchmarky bits out of the way

So here’s how fast it is in a bunch of charts and graphs.

According to Apple, the MacBook Air’s M1 is voltage-limited in order to function within the fanless design’s thermal envelope. iFixit’s teardown shows in detail that the Air’s M1 cooling setup is an entirely passive affair, with just a heat transfer plate in between the M1 CPU and the aluminum body. I was expecting performance similar to but perhaps a bit lower than the M1-powered Mac mini, and that’s more or less what I got. However, the Air’s M1 is good for at least a few solid minutes of full-bore Firestorm core performance before it throttles back.

The M1 MBA’s passive cooling setup, disassembled over at iFixit (https://www.ifixit.com/News/46884/m1-macbook-teardowns-something-old-something-new).

In benchmarking, I noticed that subsequent runs of the Final Cut Pro export would slow down dramatically—the first export would complete in about 1 minute and 19 seconds, but if I immediately repeated the export it would take a bit under 2.5 minutes—and the Air would be quite warm to the touch. After closing the lid to hibernate until the Air was cool and then repeating the export, the time was once again in the 1:20-ish range.

To create some more sustained load, I cloned the source video three times and then repeated the export process. Starting from a cold startup with the MBA’s chassis at ambient temperature gave a result of 4 minutes, 21 seconds. This time, I opened Activity Monitor’s CPU graph to spy on the core utilization. All eight cores were engaged until about 2:56, at which time half of the cores—presumably the high-performance Firestorm cores—dropped to less than 50-percent usage and stayed there until the run completed.

A second run immediately after that took 7:37—not quite twice as long, but heading in that direction. Activity Monitor’s CPU usage graph showed half of the cores (presumably the high-performance Firestorm cores) at half utilization for the entire run.

Further testing—including several runs after letting the MBA sit powered off for about an hour to make absolutely sure it was cooled to ambient—failed to produce anything resembling a precise, repeatable time interval for when throttling starts. The best I can do is to say that it seems that when you throw a heavy workload at the MBA, it runs at full-bore until the Firestorm cores become too toasty, which seems to take anywhere from 3-ish to 6-ish minutes. Then it backs the Firestorm cores off until they show about 50-percent utilization, and the amount of heat generated at that level seems to be within the sustained thermal capacity of the design.

(These are subjective measurements, taken in whatever indoor ambient conditions happened to be happening in my house as I was doing the testing. Your results may vary.)

I hate USB-C charging, give me back MagSafe

The other major thing for a portable like the MBA is battery life, and we’re going to talk about that. But first, very briefly, the loss of MagSafe sucks.

Yes, I know I’m late to the discussion. I know MagSafe was deleted a few hardware revisions ago, but I’m going from a MacBook Air with it to a MacBook Air without it, and plugging in a USB-C cable feels like going back to the freaking dark ages. I’ve been happy with MagSafe plugs on my laptops for almost an entire decade—that quick one-handed snick into place, that easy no-fuss pull to disengage, and that friendly LED to tell you when you’re all charged up.

Gone but not forgotten. I miss these so damn much.

Jacqui Cheng

Having to shove a connector into a high-friction plug—often requiring two hands, depending on how you’re holding stuff—is stupid. It’s just stupid. This is a customer-hostile regression in functionality. I’m sure there are excellent reasons for it and that it saves Apple money on the MBA’s bill of materials and on warranty support, but I hate it and it’s terrible. This is not the premium Apple experience I feel like I’m paying for.

Battery life

I used the M1 MacBook Air for work all day one day, filling up about 11 hours of on-the-clock time with Slack, emailing, Zoom conferencing, Messages, and Web browsing, and the Air still had 40 percent remaining on the battery meter when the day was done. This is considerably longer than my old 2015 MBA, which throws in the towel around hour five. (Unlike with the official battery test, my unofficial workday usage test was done with adaptive brightness and Night Shift enabled, and there was a fair amount of idling.)

In the official Ars battery test, with the screen locked at our reference brightness of 200 nits, the M1 MBA lasted for 877 minutes—a bit over 14.5 hours. Charge time back from almost dead to full took a bit over two hours with the included 30W adapter, with the device powered off during the charge.

But I don’t usually spend the day working on my laptop—instead, the place where my old MBA most often lets me down is on long flights. Living in Houston means I usually fly United, and United is particularly miserly with power plugs—if you don’t get certain specific seats, you’re out of luck. In my experience, my Intel MBA is good for three, maybe four hours of movie watching before it’s dead as a doornail—so if I’m flying to California or pretty much anywhere that’s more than a couple of hours away and I don’t get a power outlet seat, I know I probably need to bring a book.

The M1 Air laughs at my old MBA. It laughs at it, gives it noogies, and flushes its head down the toilet in the locker room.

Artist’s impression of how I felt about the M1 MacBook Air’s battery life as it continued to play Westworld episodes without running out of juice.

I left the M1 MBA playing 4K Westworld episodes from the UHD Blu-ray box set, full screen and at max brightness, with the sound blaring at max volume. I finally gave up and shut the laptop off after ten hours, at which point it still said it had 13 percent battery remaining. That’s not only long enough to last out any domestic flight—that’s enough to last you an international flight from the US to Europe.

A quick note on resuming from sleep: during the Air’s reveal, Apple showed off how quickly the Air resumes from standby by having Senior VP Craig Federighi lift the lid of a sleeping MacBook Air and peek in, all set to the mellow sounds of Barry White. While I can’t say that Barry White plays when I open up my laptop, I can say that the M1 Air wakes from sleep very quickly. It’s not that it’s faster than my Intel-powered Air, since the 2015 model will sometimes wake up instantly, too—but the 2015 Air also sometimes takes a second or two to blink on when I lift the lid. The M1 Air is much more consistent—I’ve only had the thing for a few days, but every wake-from-sleep has been lightning quick.

WireGuard for Windows 0.3.1 is the release you’ve been waiting for

I heroically resisted the urge to create a “WireGuard for Workgroups 0.3.1” image for this piece.

Jim Salter

This Monday, WireGuard founder and lead developer Jason Donenfeld announced a new WireGuard release for the Windows platform. The release is something of a godsend for administrators hoping to implement WireGuard as a replacement for more traditional end-user VPNs in a business environment, adding several new features that will make their lives easier—or simply make its implementation possible, in environments where it otherwise would not.

If you haven’t heard about WireGuard yet, it’s a relatively new VPN protocol featuring advanced cryptography. It’s implemented from the ground up as an exercise in cleanly written, minimalist, maximally secure and performant code—and it succeeded at those goals well enough to get Linus Torvalds’ own rarely-seen stamp of approval.

Installation

Existing WireGuard users will be prompted with obvious UI hints to download and install the new version, directly from within the application itself.

Jim Salter

Those who are already using WireGuard on Windows will receive an obvious in-app prompting to download and install the new version, which works swimmingly. New users can download WireGuard directly from its website.

The simple “Download Installer” button is aimed at Windows end users; it probes the user’s system to determine which MSI installer to fetch and execute, based on the system architecture. Sysadmin types may also browse the list of MSIs directly, for use with Active Directory Group Policy automated deployments.

WireGuard for Windows currently supports x86_64, x86 (32-bit), ARM, and ARM64 architectures.

Improved tunnel management for Windows users

Probably the most desperately sought feature in WireGuard’s Windows implementation is the ability for unprivileged users to activate and deactivate WireGuard tunnels via the app’s user interface. Until release 0.3.1, WireGuard allowed only members of the Administrators group to open the UI, let alone do anything within it.

As of version 0.3.1, that limitation has finally been removed. Unprivileged users may be added to the built-in Windows group “Network Configuration Operators”—and once they’re members of that group, provided the requisite registry key has been created and its DWORD value set, they can manage their own tunnel into the corporate LAN.

There’s one more step necessary to enable the limited UI—you need to open regedit, create the key HKLM\SOFTWARE\WireGuard, then create a DWORD at HKLM\SOFTWARE\WireGuard\LimitedOperatorUI and set it to 1. (Don’t be confused at the lack of HKLM\SOFTWARE\WireGuard itself—you’ll need to create that, too.)
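
If you’d rather script that step than click through regedit, one possible approach is Python’s standard winreg module, run from an elevated prompt (this is just a convenience sketch on my part; the manual regedit steps above are the documented route):

```python
import winreg

# Create HKLM\SOFTWARE\WireGuard if it doesn't exist yet, then set the
# LimitedOperatorUI DWORD to 1 so members of "Network Configuration
# Operators" get the limited UI. Must be run as Administrator.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\WireGuard",
                        0, winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "LimitedOperatorUI", 0, winreg.REG_DWORD, 1)
```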

Otherwise-unprivileged users who’ve been allowed into the WireGuard club can see the tunnels available and start and stop those tunnels. They cannot see the public keys for the tunnels—and more importantly, they can neither add, remove, nor edit those tunnels.

Unprivileged users also cannot exit the WireGuard application itself—they can close the dialog just fine, but the “exit WireGuard” item is missing from the context menu in the system tray. This is because closing the WireGuard app from the system tray doesn’t just get rid of the icon, or even disable the WireGuard tunnel services—it actually uninstalls those services entirely. (The services are automatically reinstalled the next time an Administrator runs the WireGuard app.)

Also new to WireGuard for Windows 0.3.1, multiple tunnels can be simultaneously activated from the GUI. This feature is also registry-gated for now—to use it, you’ll need to create a DWORD at HKLM\SOFTWARE\WireGuard\MultipleSimultaneousTunnels and set it to 1. Without creating and setting that DWORD, WireGuard for Windows 0.3.1 continues to behave like earlier versions, and activating one tunnel from the GUI will automatically deactivate any others.
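
The same winreg approach sketched above should work for this flag too, again assuming an elevated session:

```python
import winreg

# Enable simultaneous tunnel activation by setting the
# MultipleSimultaneousTunnels DWORD under HKLM\SOFTWARE\WireGuard to 1.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\WireGuard",
                        0, winreg.KEY_WRITE) as key:
    winreg.SetValueEx(key, "MultipleSimultaneousTunnels", 0,
                      winreg.REG_DWORD, 1)
```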
