

Best Microsoft Store Black Friday 2019 deals



(Image: Getty Images/iStockphoto)

While the Microsoft Store hasn’t yet released an actual Black Friday ad (or had one leak), it does have a preview of several deals on its site. 

Also: More Black Friday coverage at CNET

As you might expect, price cuts on the company’s own Surface devices lead the way, along with several sales on Windows laptops. We’ll update this space if new deals emerge, but for now, here are the highlights Microsoft is touting for Black Friday.

Black Friday 2019: Best Microsoft deals

Disclosure: ZDNet may earn a commission from some of the products featured on this page.


Microsoft Surface Go for $299 ($100 off)

The cheapest route to the Surface world, the base model Go is a 10-inch tablet running Windows 10 in S mode on an Intel Pentium 4415Y processor, 4GB of RAM, and a 64GB solid-state drive. Though it’s even cheaper with $100 off for Black Friday, the Surface Go still has premium touches like Corning Gorilla Glass 3 and 1,800×1,200 resolution for the touchscreen display.


Microsoft Platinum Surface Pro 7 with Type Cover Bundle for $649 ($230 off)

With this deal, Microsoft matches the Black Friday deal Best Buy has been advertising on the base Surface Pro 7. It includes an Intel Core i3 CPU, 4GB of RAM, 128GB SSD, and a 12.3-inch 2,736×1,824 touchscreen display, along with a Type Cover that turns it from a tablet into a convertible laptop.


HP 15-dy1731ms Laptop for $299 ($190 off)

You can score a 15.6-inch laptop with the latest 10th-generation Intel Core i3-1005G1 processor for under $300 with this deal. It also includes 8GB of RAM and a 128GB solid-state drive and is set up to run Windows 10 in S mode to extend battery life, though you can switch to the full version of Windows 10 for free.


HP 15-dy1771ms Laptop for $499 ($200 off)

If you need more power than the above HP provides, this deal slashes $200 off the price of a laptop with a 10th-generation Core i7-1065G7 instead of a Core i3 chip. You still get 8GB of memory and a 15.6-inch screen, but the solid-state storage is quadrupled to 512GB.  


Lenovo Legion Y540 15 Gaming Laptop for $789.99 ($400 off)

It seems like nearly every retailer offering Black Friday deals on PCs has a couple of gaming laptops on sale. In Microsoft’s case, it has a pair that come in under $1,000 for gamers watching their budget. The Lenovo Legion Y540 packs an Intel Core i7-9750H CPU, 8GB of RAM, and Nvidia GeForce GTX 1650 graphics, along with a 15.6-inch full HD display and generous storage offerings of both a 512GB SSD and a terabyte hard drive.


Asus ROG Strix G Gaming Laptop for $899.99 ($300 off)

Your other affordable gaming laptop choice from Microsoft’s Black Friday deals has less storage on board (just a 256GB SSD), but it doubles the amount of RAM to 16GB. The Asus otherwise uses the same processor and graphics card as the Lenovo, while providing a similarly sized Full HD screen.


More Microsoft Black Friday 2019 deals

Microsoft has not specified all of its Black Friday specials, including which specific configurations of certain Surface devices will receive savings. Until we know more, here are a few other Black Friday deals Microsoft is promoting:




Lone high-energy neutrino likely came from shredded star in distant galaxy



The remains of a shredded star formed an accretion disk around the black hole whose powerful tidal forces ripped it apart. This created a cosmic particle accelerator spewing out fast subatomic particles.

Roughly 700 million years ago, a tiny subatomic particle was born in a galaxy far, far away and began its journey across the vast expanses of our universe. That neutrino finally reached the Earth’s South Pole last October, setting off detectors buried deep beneath the Antarctic ice. A few months earlier, a telescope in California had recorded a bright glow emanating from that same distant galaxy—evidence of a so-called “tidal disruption event” (TDE), most likely the result of a star being shredded by a supermassive black hole.

According to two new papers (here and here) published in the journal Nature Astronomy, that lone neutrino was likely born from the TDE, which serves as a cosmic-scale particle accelerator near the center of the distant galaxy, spewing out high-energy subatomic particles as the star’s matter is consumed by the black hole. This finding also sheds light on the origin of ultrahigh-energy cosmic rays, a question that has puzzled astronomers for decades.

“The origin of cosmic high-energy neutrinos is unknown, primarily because they are notoriously hard to pin down,” said co-author Sjoert van Velzen, a postdoc at New York University at the time of the discovery. “This result would be only the second time high-energy neutrinos have been traced back to their source.”

Neutrinos travel very near the speed of light. John Updike’s 1959 poem, “Cosmic Gall,” pays tribute to the two most defining features of neutrinos: they have no charge and, for decades, physicists believed they had no mass (they actually have a teeny bit of mass). Neutrinos are the most abundant subatomic particle in the universe, but they very rarely interact with any type of matter. We are constantly being bombarded every second by millions of these tiny particles, yet they pass right through us without our even noticing. That’s why Isaac Asimov dubbed them “ghost particles.”

That low rate of interaction makes neutrinos extremely difficult to detect, but it also lets them escape unimpeded (and thus largely unchanged) by collisions with other particles of matter. This means they can provide valuable clues to astronomers about distant systems, further augmented by what can be learned with telescopes across the electromagnetic spectrum, as well as gravitational waves. Together, these different sources of information have been dubbed “multi-messenger” astronomy.

The majority of neutrinos that reach the Earth come from our own Sun, but every now and then, neutrino detectors pick up the rare neutrino that hails from further afield. Such is the case with this latest detection: a neutrino that began its journey in a faraway, as-yet-unnamed galaxy in the constellation Delphinus, born from the death throes of a shredded star.

A view of the accretion disc around the supermassive black hole, with jet-like structures flowing away from the disc. The extreme mass of the black hole bends spacetime, allowing the far side of the accretion disc to be seen as an image above and below the black hole.

DESY, Science Communication Lab

As we’ve reported previously, it’s a popular misconception that black holes behave like cosmic vacuum cleaners, ravenously sucking up any matter in their surroundings. In reality, only stuff that passes beyond the event horizon—including light—is swallowed up and can’t escape, although black holes are also messy eaters. That means that part of an object’s matter is actually ejected out in a powerful jet. If that object is a star, the process of being shredded (or “spaghettified”) by the powerful gravitational forces of a black hole occurs outside the event horizon, and part of the star’s original mass is ejected violently outward. This in turn can form a rotating ring of matter (aka an accretion disk) around the black hole that emits powerful X-rays and visible light. 

Tidal disruption describes the large forces created when a small body passes very close to a much larger one, like a star that strays too close to a supermassive black hole. “The force of gravity gets stronger and stronger, the closer you get to something. That means the black hole’s gravity pulls the star’s near side more strongly than the star’s far side, leading to a stretching effect,” said co-author Robert Stein of DESY in Germany. “As the star gets closer, this stretching becomes more extreme. Eventually it rips the star apart, and then we call it a tidal disruption event. It’s the same process that leads to ocean tides on Earth, but luckily for us the moon doesn’t pull hard enough to shred the Earth.”
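The stretching Stein describes is simply the difference in gravitational pull between the near and far side of a body. Here is a minimal back-of-the-envelope sketch; the Moon figures are real, but the black-hole mass and the star’s distance below are illustrative assumptions, not numbers from the paper:

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def tidal_stretch(M, r, d):
    """Difference in gravitational acceleration between the near and far
    side of a body of radius d whose center is a distance r from mass M."""
    return G * M * (1 / (r - d) ** 2 - 1 / (r + d) ** 2)

# Moon tugging on Earth: a differential pull of roughly 2e-6 m/s^2,
# utterly negligible next to Earth's own surface gravity (~9.8 m/s^2).
moon = tidal_stretch(7.35e22, 3.84e8, 6.37e6)

# A Sun-sized star near a million-solar-mass black hole, at an assumed
# distance of 1e11 m: hundreds of m/s^2 of stretch, comparable to the
# star's own surface gravity -- enough to tear it apart.
star = tidal_stretch(2e36, 1e11, 7e8)
```

The same formula explains both ocean tides and spaghettification; only the mass and distance scales differ.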

TDEs are likely quite common in our universe, even though only a few have been detected to date. For instance, in 2018, astronomers announced the first direct image of the aftermath of a star being shredded by a black hole 20 million times more massive than our Sun, in a pair of colliding galaxies called Arp 299 about 150 million light years from Earth. And last fall, astronomers recorded the final death throes of a star being shredded by a supermassive black hole, publishing the discovery in Nature Astronomy.

The glow from this most recent TDE was first detected on April 9, 2019 by the Zwicky Transient Facility (ZTF) at California’s Mount Palomar observatory, which has spotted more than 30 such events since it came online in 2018. Nearly six months later, on October 1, 2019, the IceCube neutrino observatory at the South Pole recorded the signal from a highly energetic neutrino originating from the same direction as the TDE. Just how energetic was it? Co-author Anna Franckowiak of DESY pegged the energy at over 100 teraelectronvolts (TeV), 10 times the maximum energy for subatomic particles that can be produced by the Large Hadron Collider.

Artistic rendering of the IceCube lab at the South Pole. A distant source emits neutrinos that are then detected below the ice by IceCube sensors.

Ice Cube/NSF

The likelihood of detecting this solitary high-energy neutrino was just 1 in 500. “This is the first neutrino linked to a tidal disruption event, and it brings us valuable evidence,” said Stein. “Tidal disruption events are not well understood. The detection of the neutrino points to the existence of a central, powerful engine near the accretion disc, spewing out fast particles. And the combined analysis of data from radio, optical and ultraviolet telescopes gives us additional evidence that the TDE acts as a gigantic particle accelerator.”

It’s yet one more example of all the new knowledge to be gained by combining multiple data sources to get different perspectives on the same celestial event. “The combined observations demonstrate the power of multi-messenger astronomy,” said co-author Marek Kowalski of DESY and Humboldt University in Berlin. “Without the detection of the tidal disruption event, the neutrino would be just one of many. And without the neutrino, the observation of the tidal disruption event would be just one of many. Only through the combination could we find the accelerator and learn something new about the processes inside.”

As for the future, “We might only be seeing the tip of the iceberg here. In the future, we expect to find many more associations between high-energy neutrinos and their sources,” said Francis Halzen of the University of Wisconsin-Madison, who was not directly involved in the study. “There is a new generation of telescopes being built that will provide greater sensitivity to TDEs and other prospective neutrino sources. Even more essential is the planned extension of the IceCube neutrino detector, which would increase the number of cosmic neutrino detections at least tenfold.”

Nature Astronomy, 2021. DOI: 10.1038/s41550-020-01295-8

Nature Astronomy, 2021. DOI: 10.1038/s41550-021-01305-3 (About DOIs).



Johnson & Johnson’s vaccine safe and effective, FDA review concludes



A sign at the Johnson & Johnson campus on August 26, 2019 in Irvine, California.

Johnson & Johnson’s single-shot COVID-19 vaccine is effective and has a “favorable safety profile,” according to scientists at the Food and Drug Administration.

The endorsement comes out of a review released by the regulatory agency Wednesday. The FDA has been looking over data on Johnson & Johnson’s vaccine since February 4, when the company applied for Emergency Use Authorization. The agency’s green light is a positive sign ahead of this Friday, February 26, when the FDA will convene an advisory committee to make a recommendation on whether the FDA should grant the EUA. The FDA isn’t obligated to follow the committee’s recommendation, but it usually does.

If Johnson & Johnson’s vaccine is granted an EUA, it will become the third COVID-19 vaccine available for use in the US. The other two vaccines are both two-dose, mRNA-based vaccines, one made by Pfizer and its German partner BioNTech and the other from Moderna, which developed its vaccine in collaboration with researchers at the US National Institutes of Health.

According to data from a Phase III clinical trial involving more than 44,000 participants, Johnson & Johnson’s vaccine is less effective than the two mRNA vaccines, which were both around 95 percent effective at preventing symptomatic COVID-19. Johnson & Johnson’s vaccine was found to be 66 percent effective overall at preventing moderate to severe COVID-19. However, efficacy differed by trial location, with efficacy found to be 72 percent in the United States, 66 percent in Latin America, and 57 percent in South Africa. The differences may be partly explained by the circulation of variants in Latin America and South Africa, which have been found to reduce the efficacy of vaccines.

Favorable review

But overall, Johnson & Johnson’s vaccine was 85 percent effective against severe COVID-19. Even in South Africa, the vaccine was 82 percent effective against severe and critical COVID-19, according to the FDA’s review.

After the shot, six vaccinated participants and 42 participants who received the placebo were hospitalized. When researchers looked out 28 days after vaccination, zero vaccinated participants were hospitalized, compared with 16 in the placebo group. There were seven deaths in the trial, but all were in the placebo group.
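Those hospitalization counts line up roughly with the headline figure for severe disease. Vaccine efficacy is one minus the relative risk between the two arms; a quick sketch, assuming the two trial arms were roughly equal in size (the ~22,000-per-arm split is an assumption for illustration, given the trial’s 44,000-plus participants):

```python
def efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    """Vaccine efficacy = 1 - relative risk (vaccinated vs. placebo)."""
    risk_vax = cases_vax / n_vax
    risk_placebo = cases_placebo / n_placebo
    return 1 - risk_vax / risk_placebo

# Hospitalizations after the shot: 6 vaccinated vs. 42 placebo.
print(round(efficacy(6, 22000, 42, 22000), 3))  # -> 0.857
```

With equal arm sizes the arm counts cancel and the result reduces to 1 − 6/42 ≈ 86 percent, in the same ballpark as the 85 percent efficacy against severe COVID-19 cited above.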

Though the efficacy numbers are lower than the mRNA vaccines, experts spotlight the high efficacy against severe disease and death—the most critical functions of any vaccine. Moreover, Johnson & Johnson’s vaccine has clear logistical advantages over the other vaccines. It is only one shot, rather than two, and it also doesn’t require freezer temperatures during shipping.

In terms of side effects, the FDA found that the vaccine has a favorable safety profile, with no specific safety concerns and the most common effects being mild to moderate pain at the injection site, headache, fatigue, and myalgia.

The fate of the vaccine now moves to the FDA advisory committee, which will dive deeper into all the data. If the FDA grants the EUA, the company would provide 4 million doses immediately, a Johnson & Johnson executive said in congressional testimony this week, with a total of 20 million ready by the end of March and 100 million by the end of June.



D-Wave’s hardware outperforms a classic computer




Early on in D-Wave’s history, the company made bold claims about its quantum annealer outperforming algorithms run on traditional CPUs. Those claims turned out to be premature, as improvements to these algorithms pulled the traditional hardware back in front. Since then, the company has been far more circumspect about its performance claims, even as it brought out newer generations of hardware.

But in the run-up to the latest hardware, the company apparently became a bit more interested in performance again. And it recently got together with Google scientists to demonstrate a significant boost in performance compared to a classical algorithm, with the gap growing as the problem became more complex—although the company’s scientists were very upfront about the prospects of finding a way to boost classical hardware further. Still, there are a lot of caveats even beyond that, so it’s worth taking a detailed look at what the company did.

Magnets, how do they flip?

D-Wave’s system is based on a large collection of quantum devices that are connected to some of their neighbors. Each device can have its state set separately, and the devices are then given the chance to influence their neighbors as the system moves through different states and individual devices change their behavior. These transitions are the equivalent of performing operations. And because of the quantum nature of these devices, the hardware seems to be able to “tunnel” to new states, even if the only route between them involves high-energy states that would otherwise be impossible to reach.

In the end, if the system is operated properly, the final state of the devices can be read out as an answer to the calculation performed by the operations. And because of the quantum effects, it can potentially provide solutions that a classical computer might find difficult to reach.

Validating that idea, however, has proven challenging, as noted above. Where the system has done best is in modeling quantum systems that look a lot like the quantum annealing hardware itself. And that’s what the D-Wave/Google team has done here. The problem can be described as an array of quantum magnets, with the orientation of each magnet influencing that of its neighbors. The system is in the lowest energy state when all of a magnet’s neighbors have the opposite orientation. Depending on the precise configuration of the array, however, that might not be possible to satisfy.

Now, imagine that you start the system in a configuration where the magnets aren’t in a stable state—there are too many cases where neighboring magnets have the same orientation. Magnets will start flipping toward a stable configuration, but in the process, they may cause their neighbors to flip. The whole thing may work through a variety of intermediate configurations to make its way toward stability. Because of the quantum nature of the device’s components, the progression through different states may involve some steps that are, to our non-quantum brains, difficult to understand.

Quantum Monte Carlo

This system is interesting for a couple of reasons: it’s an approachable way to examine complicated quantum behaviors, and other interesting problems can be mapped onto its behavior. So researchers have figured out how to look at its behavior using computer algorithms. The one the research team says shows the highest performance is what’s called Path-Integral Monte Carlo. “Path-integral” simply indicates that there are multiple valid paths between a starting state and a low-energy state, and the software looks at a subset of them, since there are so many. “Monte Carlo” is an indication that the paths it does sample are chosen randomly.
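To make the magnet-array picture concrete, here is a minimal sketch of a classical Metropolis Monte Carlo simulation of an antiferromagnetic Ising-style array, where the lowest-energy state has every neighbor pair anti-aligned. This is a much simpler cousin of the path-integral method used in the study (which samples quantum trajectories, not just random spin flips), and the lattice size, temperature, and step count are arbitrary choices for illustration:

```python
import math
import random

def energy(spins, L):
    """Antiferromagnetic Ising energy: E = +sum over neighbor pairs s_i*s_j,
    so aligned neighbors cost energy and the ground state alternates."""
    E = 0
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            E += s * spins[(i + 1) % L][j]  # right neighbor (periodic wrap)
            E += s * spins[i][(j + 1) % L]  # down neighbor (periodic wrap)
    return E

def metropolis(L=8, steps=20000, T=0.5, seed=1):
    """Flip random spins, accepting moves that lower the energy and
    occasionally (with Boltzmann probability) moves that raise it."""
    rng = random.Random(seed)
    spins = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    E = energy(spins, L)
    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = -2 * spins[i][j] * nb  # energy change if spin (i, j) flips
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
            E += dE
    return E
```

Starting from a random (frustrated) configuration, repeated flips drive the lattice toward the alternating checkerboard ground state, mirroring the relaxation process described above; the quantum hardware, by contrast, can also tunnel between configurations rather than flipping one spin at a time.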

But the D-Wave system looks a lot like an array of quantum magnets, so it’s possible to configure it so that it behaves a lot like what is being modeled. There’s a chance that configuring the D-Wave machine properly can get it to very efficiently recapitulate the behavior of the system being modeled.

This is what the team tried for the paper, but it ran into a problem. With the traditional computing algorithm, it’s easy to essentially stop the system and look at how it’s evolving. With the D-Wave system, things moved so quickly that it ended up carrying on to the final state before it could be sampled. Instead, the researchers had to arrange some fairly tortured configurations to slow the D-Wave hardware down long enough to have a look at what was going on.

The performance measurement the team cared about isn’t the final state; instead, it’s how long a given configuration of magnets takes to reach a stable, equilibrium state.

For generating this measure, the researchers found that the D-Wave hardware could outperform the x86 CPU they were using (a hyperthreading Xeon with 26 cores). And the advantage grew larger as the research team increased the complexity of the magnets’ arrangement, reaching up to 3 million times faster. And while the entire D-Wave system didn’t behave as a single quantum object, there were quantum interactions that were larger than the smallest groups of magnets in the D-Wave hardware (linked groups of four).

The caveats

To start with, the gap in performance is between a single Xeon and a chip that requires a cabinet-sized cooling system with some pretty hefty energy use. Should the classical computer algorithm scale with additional processors, it should be relatively simple to put this on a cluster and take a big chunk out of D-Wave’s speed advantage. But Ars’ own Chris Lee notes that even on the simpler problems, the Xeon (which has 26 cores) was already struggling with any increase in complexity. This might be a sign that there are only limited gains we can expect from throwing more processors at the issue.

That said, D-Wave was also not operating at its full advantage. While it recently introduced a new generation of processors, the work was done on an experimental processor that was part of the development of the new generation. This had the same hardware layout—same number and connections among the quantum devices—as the previous generation of hardware. But it was made with a new manufacturing process that lowered the noise in the system and was put into full use in the latest generation of chips.

In addition, the new generation more than doubles the quantum devices on the chip and boosts the connectivity among them. These advances should allow the system to model larger and more complicated magnet arrays, widening D-Wave’s advantage again.

Finally, the team behind the work emphasizes that there may be ways to optimize the performance of the classical algorithm as well, saying, “Our study does not constitute a demonstration of superiority over all possible classical methods.” How this all shakes out will undoubtedly come with additional work, so we may not have an update on where performance stands for a couple of years.

Still, it’s interesting that D-Wave has become so interested in performance again. The company recently announced that it had adapted its control software so that a specific type of operation (a quadratic unconstrained binary optimization) could be both used by a D-Wave machine and sent to the Qiskit software package that would allow it to run on IBM’s quantum computers. This makes sense for the company’s user base; a large percentage of the base is made up of companies that are simply trying to make sure they’re ready for any disruptive computing technologies, so they are looking at all the quantum hardware on the market. But in the press release announcing the data, the company says this “opens the door to performance comparisons.”

Nature Communications, 2021. DOI: 10.1038/s41467-021-20901-5  (About DOIs).
