Dell launches Alienware Aurora R9, G5 gaming desktops at Gamescom 2019

Alienware Aurora R9

Dell kicked off the Gamescom trade event in Cologne, Germany, with the introduction of a pair of new gaming desktops that should appeal to gamers who can’t afford to spend thousands of dollars on their rigs.

The more eye-catching launch is the Aurora R9 from Dell’s Alienware gaming arm. It’s the first desktop built around Alienware’s Legend industrial design, which was previously limited to laptops like the updated Area-51m. The company claims the Legend design allows the R9 to be slimmer while improving airflow through the chassis, all while retaining the adventurous look Alienware is known for. It will be available in a light or dark case color, and the front features Alienware’s usual animated lighting effects.

Inside, the R9 houses the latest ninth-generation Intel Core processors, though you may want to upgrade from the Core i3-9100 in the base configuration. You also get a wide range of graphics card options from AMD and Nvidia, starting with the AMD Radeon RX 560X and Nvidia GeForce GTX 1650 and going all the way up to dual Radeon RX 570X or GeForce RTX 2080 GPUs. The base model includes 8GB of HyperX Fury DDR4 RAM, but you can add up to 64GB and choose overclocking options. You can equip the R9 with up to a 2TB hard drive or NVMe SSD in single-drive configurations, or double that with a dual-drive option pairing a 2TB hard drive with a 2TB SSD.

The Aurora R9 is due to go on sale this week with a starting price of $969.99, though with upgrades it could quickly lose that mid-range price tag. Gamers on an even tighter budget may want to look at Dell’s new G5 desktop, the first in a G Series that until now consisted only of gaming laptops. Starting at just $629.99, the G5 obviously lacks some of the Aurora R9’s bells and whistles, sticking to a traditional compact tower design, though it does offer a few flourishes, such as an optional clear side panel and optional blue LED lighting touches.

The base G5 comes with the same Core i3-9100 CPU and similar graphics card options as the R9, but you don’t get the premium RAM (or overclocking options). You’re also limited to 1TB solid-state drives instead of 2TB, and your GPU options max out at the Nvidia GeForce RTX 2080 (no dual-card choices). Like the R9, you can get up to 64GB of RAM, and there are dual-drive options up to a 1TB NVMe SSD plus a 2TB hard drive. Of course, at that point you might look for a pricier desktop altogether, but at its sub-$700 starting price, the G5 could appeal to gamers heading back to school who prefer desktop gaming over the laptop variety.


Atlantic currents seem to have started fading last century

The Gulf Stream, as imaged from space.

The major currents in the Atlantic Ocean help control the climate by moving warm surface waters north and south from the equator, with colder deep water pushing back toward the equator from the poles. The presence of that warm surface water plays a key role in moderating the climate in the North Atlantic, giving places like the UK a far milder climate than their latitude—the equivalent of northern Ontario—would otherwise dictate.

But the temperature differences that drive that flow are expected to fade as our climate continues to warm. A bit over a decade ago, measurements of the currents seemed to indicate that their strength was dropping, suggesting that we might be seeing these predictions come to pass. But a few years later, it became clear that there was just too much year-to-year variation for us to tell.

Over time, however, researchers have figured out ways of getting indirect measures of the currents, using material that is influenced by the strength of the water’s flow. These measures have now let us look back on the currents’ behavior over the past several centuries. And the results confirm that the strength of the currents has dropped dramatically over the last century.

On the conveyor

The most famous of the currents at issue is probably the Gulf Stream, which runs up the east coast of the US and Canada, taking warm water from the tropics toward Europe. But the Gulf Stream is just one part of a far larger ocean conveyor system, which redistributes heat in all the major ocean basins outside of the Arctic. And while its reach is global, a lot of the force that drives the system develops in the polar regions. That’s where surface waters cool off, increase in density, sink to the ocean floor, and begin to flow south. It’s that process that helps draw warmer water north to replace what has sunk.

It’s the density of the cold, salty water that is key to the whole process—and that’s where climate change can intervene to slow down or halt the water’s turnover. The Arctic is warming faster than any other area on Earth, which means that the surface waters are starting to take longer to cool off. The Arctic warming is also melting off a lot of the ice, both on land and in the floating ice sheets that have typically covered the Arctic Ocean. This process can form a layer of fresher water over the surface of the ocean nearby that, even after it cools, won’t be as dense as the salt water beneath it.

If this process has kicked in, we should be able to detect it by measuring the strength of the currents flowing north. But that has turned out to be less informative than we might want. While we have detected significant drops in some years, they were often countered by large rises in others. This internal variability in the system is so large that it would take decades for any trend to reach the point of statistical significance.

The alternative would be to extend our records back in time. But since we can’t retroactively place buoys in the North Atlantic early last century, researchers have to identify other ways of figuring out how strong the flow of water was before we had accurate measurements.

Current by proxy

The research community as a whole has identified a number of ways to figure out what was going on in the oceans in the past. Some are pretty direct. For example, stronger ocean currents can keep larger particles of sediment flowing in the water for longer. So examining the average particle size deposited in sediments on the ocean floor can tell us something about the currents that flowed past that site. Other measures are a bit less direct, like nitrogen isotope ratios in corals, which tell us something about the productivity of the ocean in that area.

Overall, there are about a half-dozen different ways of understanding past ocean conditions used in the new study. Each has different levels of uncertainty, and many don’t provide an exact measure of conditions in a single year, instead giving a sense of what the average conditions were over a period of several decades.

Complicating matters further, the measures don’t all come from the same locations. Samples taken from deeper waters will capture the equator-directed cold water flow, while shallow sites will yield data on the warm waters flowing north. The Gulf Stream also breaks up into multiple individual currents in the North Atlantic so that some sites only capture a small part of the total picture.

Given all this, it’s not possible to build a complete picture of the Atlantic currents in the past. But with enough sites covered, it’s possible to get a sense of whether there have been any general changes at any point over the last 1,600 years based on the overlaps of the different records.

To identify any major transitions, a research team did change-point analysis, essentially searching for points in the history where the mean behavior before and after differs significantly. They found two change points that show up consistently in the data from multiple proxies. One occurred in the late 1800s, and the second happened around 1960, when the current period of warming really started to take off.
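
The study’s own statistical pipeline isn’t reproduced here, but the basic idea behind change-point detection can be sketched with a toy example: scan every possible split of a time series and keep the split that most reduces the squared error around the two segment means. Everything below (the synthetic proxy series, the year labels, and the single_change_point helper) is illustrative, not the researchers’ code or data.

```python
import numpy as np

def single_change_point(series):
    """Find the index that best splits a series into two segments with
    different means, by minimizing the summed squared error.
    A bare-bones mean-shift detector; published studies use more robust
    methods and significance testing."""
    n = len(series)
    best_idx, best_cost = None, np.inf
    for k in range(2, n - 2):                      # keep a few points per segment
        left, right = series[:k], series[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_idx, best_cost = k, cost
    return best_idx

# Illustrative proxy record: a stable mean, then a weaker current after "1960".
rng = np.random.default_rng(0)
years = np.arange(1850, 2021)
signal = np.where(years < 1960, 1.0, 0.6) + rng.normal(0, 0.1, years.size)
k = single_change_point(signal)
print("Detected change point near year", years[k])
```

What makes the real result credible, as described above, is not any single detection but the fact that the same change points recur across independent proxy records.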

Of the 11 different records examined in the researchers’ work, 10 show that the current reached its lowest strength within the past century. And that identification is statistically significant in nine of them. “Together, these data consistently show that the modern [current] slowdown is unprecedented in over a thousand years,” the paper’s authors conclude.

Obviously, we’d like to build up better records that more fully capture the dynamics of what has been going on and, if possible, give us more direct measures of the currents’ actual strengths. It’s also important to emphasize that this doesn’t necessarily portend a sudden, radical shift to a completely new climate. Europe might see a little less warming from ocean currents, but it’s also going to be seeing a lot more warming due to rising atmospheric temperatures. However, the drop in this current will have wide-reaching effects, both on the land surrounding the North Atlantic and the ecosystems within it. So getting more data should be a high priority.

Nature Geoscience, 2021. DOI: 10.1038/s41561-021-00699-z

Perseverance’s eyes see a different Mars

Perseverance’s two Mastcam-Z imagers (in the gray boxes) are part of the rover’s remote sensing mast. (Image: NASA)

The seven minutes of terror are over. The parachute deployed; the skycrane rockets fired. Robot truck goes ping! Perseverance, a rover built by humans to do science 128 million miles away, is wheels-down on Mars. Phew.

Percy has now opened its many eyes and taken a look around.

The rover is studded with a couple dozen cameras—25, if you count the two on the drone helicopter. Most of them help the vehicle drive safely. A few peer closely and intensely at ancient Martian rocks and sands, hunting for signs that something once lived there. Some of the cameras see colors and textures almost exactly the way the people who built them do. But they also see more. And less. The rover’s cameras imagine colors beyond the ones that human eyes and brains can come up with. And yet human brains still have to make sense of the pictures they send home.

To find hints of life, you have to go to a place that was once likely livable. In this case that’s Jezero Crater. Three or four billion years ago, it was a shallow lake with sediments streaming down its walls. Today those are cliffs 150 feet tall, striated and multicolored by those sediments spreading and drying across the ancient delta.

Those colors are a geological infographic. They represent time, laid down in layers, stratum after stratum, epoch after epoch. And they represent chemistry. NASA scientists pointing cameras at them—the right kind of cameras—will be able to tell what minerals they’re looking at, and maybe whether wee Martian beasties once called those sediments home. “If there are sedimentary rocks on Mars that preserve evidence of any ancient biosphere, this is where we’re going to find them,” says Jim Bell, a planetary scientist at Arizona State University and the principal investigator on one of the rover’s sets of eyes. “This is where they should be.”

That’s what they’re looking for. But that’s not what they’ll see. Because some of the most interesting colors in that real-life, 50-meter infographic are invisible. At least they would be to you and me, on Earth. Colors are what happens when light bounces off or around or through something and then hits an eye. But the light on Mars is a little different than the light on Earth. And Perseverance’s eyes can see light we humans can’t—light made of reflected x-rays or infrared or ultraviolet. The physics are the same; the perception isn’t.

Bell’s team runs Mastcam-Z, a set of superscience binoculars mounted atop Perseverance’s tower. (The Z is for zoom.) “We developed Mastcam-Z for a rover going to a spot on Mars that hadn’t been selected yet, so we had to design it with all the possibilities in mind—the optimal set of eyes to capture the geology of any spot on Mars,” says Melissa Rice, a planetary scientist at Western Washington University and coinvestigator on Mastcam-Z.

Close-up, Mastcam-Z can see details about 1 millimeter across; from 100 meters out, it’ll pick up a feature just 4 centimeters wide. That’s better than you and me. It also sees color better—or, rather, “multispectrally,” capturing the broadband visible spectrum that human people are used to, but also about a dozen narrow-band not-quite-colors. (Rice co-wrote a very good geek-out about all this stuff.)

Its two cameras pull off this feat of super-vision with standard, off-the-shelf image sensors made by Kodak, charge-coupled devices like the ones in your phone. The filters make them special. Ahead of the CCD sits a layer of filters that lets each pixel pick up red, green, or blue. Imagine a foursquare grid—the top squares are blue and green, the bottom green and red. Now spread that out into a repeating mosaic. That’s called a Bayer pattern, a silicon version of the three color-sensing photoreceptors in your eye.
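
If it helps to see that mosaic spelled out, here’s a minimal sketch of a repeating 2x2 pattern like the one described above (blue and green on top, green and red below) applied to a toy image. The array sizes and the random test image are placeholders, not Mastcam-Z’s actual sensor layout or processing.

```python
import numpy as np

def bayer_mask(height, width):
    """Boolean masks (R, G, B) tiling a 2x2 pattern:
       top row: blue, green; bottom row: green, red."""
    rows = np.arange(height)[:, None]
    cols = np.arange(width)[None, :]
    blue = (rows % 2 == 0) & (cols % 2 == 0)
    red = (rows % 2 == 1) & (cols % 2 == 1)
    green = ~(blue | red)              # the two remaining diagonal cells
    return red, green, blue

# Apply the mosaic to a synthetic scene: each sensor pixel keeps only the
# one color channel its filter lets through.
rgb = np.random.rand(4, 4, 3)          # toy image, values in [0, 1]
red, green, blue = bayer_mask(4, 4)
raw = np.where(red, rgb[..., 0], np.where(green, rgb[..., 1], rgb[..., 2]))
print(raw.round(2))
```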

Mars and Earth bathe in the same sunlight—the same hodgepodge of light at every wavelength. But on Mars there’s less of it, because the planet is farther out. And while Earth has a thick atmosphere full of water vapor to reflect and refract all that light, Mars has only a little atmosphere, and it’s full of reddish dust.

On Mars, that means a lot of red and brown. But seeing them on Mars adds a whole other perceptual filter. “We talk about showing an approximate true color image, essentially close to a raw color image that we take with very minimal processing. That’s one version of what Mars would look like to a human eye,” Rice says. “But the human eye evolved to see landscapes under Earth illumination. If we want to reproduce what Mars would look like to a human eye, we should be simulating Earth illumination conditions onto those Martian landscapes.”

So on the one hand, the image processing team working on Perseverance’s raw feed can adjust Mars colors to Earthish colors. Or the team can simulate the spectra of Martian light hitting objects on Mars. That’d look a little different. No less true, but maybe more like what a human on Mars would actually see. (There’s no telling what a Martian would see, because if it had eyes, those eyes would have evolved to see color under that sky, and their brains would be, well, alien.)

NASA’s Mars Perseverance rover acquired this image using its left Mastcam-Z camera. (Image: NASA | JPL-Caltech | ASU)

But Rice kind of doesn’t care about any of that. “For me, the outcome isn’t even visual, in a sense. The outcome I’m interested in is quantitative,” she says. Rice is looking for how much light at a specific wavelength gets reflected or absorbed by the stuff in the rocks. That “reflectance value” can tell scientists exactly what they’re looking at. The Bayer filter is transparent to light with wavelengths longer than 840 nanometers—which is to say, infrared. In front of that layer is a wheel with another set of filters; block out the colors of light visible to humans and you’ve got an infrared camera. Pick narrower sets of wavelengths and you can identify and distinguish specific kinds of rocks by how they reflect different wavelengths of infrared light.

Before Perseverance left, the Mastcam-Z team had to learn exactly how the cameras saw those differences. They created a “Geo Board,” a design brainstorm meeting’s worth of reference color swatches and also actual square slices of rocks. “We assembled it with rock slabs of all different types of material we knew to be on Mars, things we hoped to find on Mars,” Rice says. For example? On that board were pieces of the minerals bassanite and gypsum. “In the normal color image they both just look like bright-white rocks,” Rice says. Both are mostly calcium and sulfur, but gypsum has more water molecules mixed in, and water reflects more at some wavelengths of IR than others. “When we make a false-color image using longer Mastcam-Z wavelengths, it becomes clear as day which is which,” Rice says.
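
The false-color trick Rice describes boils down to assigning narrow-band filter images to the red, green, and blue channels of an ordinary picture, so that materials with different infrared reflectance end up as visibly different colors. Here’s a minimal sketch under that assumption; the variable names, the simple contrast stretch, and the random placeholder frames are illustrative, not the actual Mastcam-Z calibration pipeline.

```python
import numpy as np

def false_color(band_r, band_g, band_b):
    """Map three co-registered narrow-band images onto RGB channels.
    Putting a longer-wavelength (infrared) band in the red channel means
    materials that reflect IR differently separate visually, even if they
    look like the same bright white in an ordinary color image."""
    def stretch(band):
        lo, hi = np.percentile(band, (2, 98))   # crude contrast stretch
        return np.clip((band - lo) / (hi - lo + 1e-9), 0, 1)
    return np.dstack([stretch(band_r), stretch(band_g), stretch(band_b)])

# Placeholder frames standing in for calibrated narrow-band images.
ir_long = np.random.rand(128, 128)
ir_short = np.random.rand(128, 128)
visible = np.random.rand(128, 128)
composite = false_color(ir_long, ir_short, visible)
print(composite.shape)                          # (128, 128, 3)
```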

For all its multispectral multitasking, Mastcam-Z does have its limits. Its resolution is great for textures—more on that in a bit—but its field of view is only about 15 degrees wide, and its draggy upload bandwidth would make your home router giggle. For all the wonderful images Perseverance is about to send home, it really doesn’t see all that much. At least, not all at once. All those vistas get bottlenecked by technology and distance. “Dude, our job is triage,” Bell says. “We’re using color as a proxy for, ‘Hey, that’s interesting. Maybe there’s something going on chemically there, maybe there’s some different mineral there, some different texture.’ Color is a proxy for something else.”

The narrowness of the rover’s field of view means that scientists by definition can’t see all they might hope. Bell and his team got a taste of those limits during their simulations of the camera-and-robot experience in the Southern California desert. “As a kind of joke, but also as an object lesson, my colleagues in one of those field tests once put a dinosaur bone right along the rover path,” he says. “We drove right past it.”

Mike Kaplinger | Melissa Rice | NASA | JPL | MSSS

For identifying actual elements—and, more importantly, figuring out if they might have once harbored life—you need even more colors. Some of those colors are even more invisible. That’s where x-ray spectroscopy comes in.

Specifically, the team running one of the sensors on Perseverance’s arm—the Planetary Instrument for X-ray Lithochemistry, or PIXL—is looking to combine the elemental recipe for minerals with fine-grained textures. That’s how you find stromatolites, sediment layers with teeny tiny domes and cones that can only come from mats of living microbes. Stromatolites on Earth provide some of the evidence of the earliest living things here; Perseverance’s scientists hope they’ll do the same on Mars.

The PIXL team’s leader, an astrobiologist and field geologist at the Jet Propulsion Laboratory named Abigail Allwood, has done this before. She used that technology in conjunction with high-resolution pictures of sediments to find signs of the earliest known life on Earth in Australia—and to determine that similar sediments in Greenland weren’t evidence of ancient life there. It’s not easy to do in Greenland; it’ll be even tougher on Mars.

X-rays are part of the same electromagnetic spectrum as the light that humans see, but at a much shorter wavelength—even more ultra than ultraviolet. It’s ionizing radiation, only a color if you’re Kryptonian. X-rays cause different kinds of atoms to fluoresce, to give off light, in characteristic ways. “We create the x-rays to bathe the rocks in, and then detect that signal to study the elemental chemistry,” Allwood says. PIXL and the arm also have a bright-white flashlight on the end. “The illumination on the front started out as just a way of making the rocks easier to see, to tie the chemistry to visible textures, which hasn’t been done before on Mars,” Allwood says. The color was a little vexing at first; heat and cold affected the bulbs. “We initially tried white LEDs, but with temperature changes it wasn’t producing the same shade of white,” she says. “So the guys in Denmark who supplied us with the camera, they provided us with colored LEDs.” Those were red, green, and blue—and ultraviolet. That combination of colors added together to make a better and more consistent white light.

That combination might be able to find Martian stromatolites. After locating likely targets—perhaps thanks to Mastcam-Z pans across the crater—the rover will sidle up and extend its arm, and PIXL will start pinging. The tiniest features, grains and veins, can say whether the rock is igneous or sedimentary, melted together like stew or layered like a sandwich. Colors of layers on top of other features will give a clue about the age of each. Ideally, the map of visible colors and textures will line up with the invisible, numbers-only map that the x-ray results generate. When the right structures line up with the right minerals, Allwood can tell whether she’s got Australia-type life signs or a Greenland-type bust. “What we’ve found that’s really interesting with PIXL is that it shows you stuff you don’t see, through the chemistry,” Allwood says. “That would be the key.”

Allwood is hoping PIXL’s tiny scans will yield huge results—an inferred map of 6,000 individual points on the instrument’s postage-stamp-sized field of view, with multiple spectral results for each. She calls this a “hyperspectral datacube.”
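
One way to picture that datacube is as a raster of scan points where every point carries a whole set of spectral or elemental values rather than a single brightness number. The layout below is only an illustration; the raster shape, element list, and random values are invented, not PIXL’s real output format.

```python
import numpy as np

# Illustrative layout: a postage-stamp-sized raster of scan points, each with
# an abundance estimate for several elements (values are random, not real data).
elements = ["Na", "Mg", "Al", "Si", "S", "Ca", "Fe"]
rows, cols = 60, 100                    # hypothetical raster, 60 * 100 = 6,000 points
datacube = np.random.rand(rows, cols, len(elements))

# Slice out an elemental "map" across the scan, or the full set of values
# measured at a single point.
ca_map = datacube[:, :, elements.index("Ca")]
point_spectrum = datacube[10, 25, :]
print(datacube.shape, ca_map.shape, point_spectrum.shape)
```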

Of course, Perseverance has other cameras and instruments, other scanners looking for other hints of meaning in bits of rock and regolith. Adjacent to PIXL is a device that looks at rocks a whole other way, shooting a laser at them to vibrate their molecules—that’s Raman spectroscopy. The data Perseverance collects will be hyperspectral, but also multifaceted—almost philosophically so. That’s what happens when you send a robot to another planet. A human mission or rocks sent home via sample return would produce the best ground-truth data, as one exoplanet researcher told me. Somewhat behind that are x-ray and Raman spectroscopy, then rover cameras, then orbiter cameras. And of course all those things are working together on Mars.

“Finding life on Mars will not be, ‘Such and such an instrument sees something.’ It’ll be, ‘All the instruments saw this, that, and the other thing, and the interpretation makes life reasonable,’” Allwood says. “There’s no smoking gun. It’s a complicated tapestry.” And like a good tapestry, the full image only emerges from a warp and weft of color, carefully threaded together.

This story originally appeared on wired.com.

We’ll likely have a 3rd COVID vaccine soon; J&J vaccine clears last hurdle

A sign at the Johnson & Johnson campus on August 26, 2019 in Irvine, California.

After a day-long meeting Friday, an advisory panel for the US Food and Drug Administration voted 22 to 0 to recommend issuing an Emergency Use Authorization for Johnson & Johnson’s single-shot, refrigerator-stable COVID-19 vaccine.

If the FDA accepts the panel’s recommendation and grants the EUA—which it likely will—the country will have a third COVID-19 vaccine authorized for use. Earlier this week, FDA scientists released their review of the vaccine, endorsing authorization.

Agency watchers expect the FDA to move quickly on the decision, possibly granting the EUA as early as tomorrow, February 27. The FDA moved that fast in granting EUAs for the two previously authorized vaccines, the Moderna and Pfizer/BioNTech mRNA vaccines.

Additionally, an advisory panel for the Centers for Disease Control and Prevention that makes recommendations on vaccine use has scheduled an emergency meeting for this Sunday to discuss the vaccine’s use, further bolstering speculation that the federal government will move quickly to authorize and roll out the vaccine. If all of the pieces fall in line, doses of Johnson & Johnson’s COVID-19 vaccine could begin shipping out to vaccination sites early next week.

The rollout won’t be a big burst of new doses right away, though; it will likely be a slow roll. In congressional testimony this week, a Johnson & Johnson executive said that the company would provide 4 million doses after the EUA, with a total of 20 million ready by the end of March and a total of 100 million by the end of June. Still, with the vaccine requiring only a single shot, those 100 million doses equate to 100 million people protected.

Efficacy

According to a detailed FDA review of Phase III clinical trial data submitted by Johnson & Johnson, the vaccine was 66 percent effective at preventing moderate to severe COVID-19 at 28 days after vaccination. (Johnson & Johnson defined moderate cases to include cases that had two symptoms, such as cough and fever, which would have been classified as simply “symptomatic” infections in other trials.)
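
For context on how a headline figure like “66 percent effective” is calculated: efficacy is one minus the relative risk, comparing the attack rate in the vaccinated arm with the attack rate in the placebo arm. The sketch below uses made-up case counts purely to show the arithmetic; they are not the trial’s actual numbers.

```python
def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    """Point estimate of vaccine efficacy: 1 - relative risk."""
    risk_vax = cases_vax / n_vax
    risk_placebo = cases_placebo / n_placebo
    return 1 - risk_vax / risk_placebo

# Hypothetical counts chosen only to illustrate the calculation.
print(f"{vaccine_efficacy(116, 19500, 348, 19500):.0%}")   # ~67%
```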

The international trial, which involved over 44,000 participants at trial sites on several continents, showed different efficacies in different places. In the US, the overall efficacy was slightly higher, at 72 percent. But in places where variants of concern are widely circulating, the efficacy fell: the vaccine was 64 percent effective in South Africa and 61 percent effective in Latin America.

Reassuringly, the efficacy against severe and critical disease was high across the board, in all the trial locations and across age groups. Efficacy against severe disease was 85 percent overall 28 days after vaccination. By location, efficacy against severe disease was 86 percent in the US, 82 percent in South Africa, and 88 percent in Brazil. In a further analysis, there were zero hospitalizations among vaccinated participants and 16 in the placebo group. As of February 5, there were seven COVID-19-related deaths in the trial, all of which were in the placebo group.

In addition, Johnson & Johnson has a 30,000-person trial in progress testing whether adding a booster shot will further increase efficacy.

Side effects

As for side effects, the vaccine has a “favorable safety profile,” according to the FDA. The most common side effects seen among the 44,000 or so participants were injection site pain (49 percent), headache (39 percent), fatigue (38 percent), and myalgia (33 percent). There were 15 cases of blood-clotting-related conditions among vaccinated participants, compared with 10 in the placebo group. There were also six cases of tinnitus (ringing in the ears) among the vaccinated and zero in the placebo group. It’s unclear if these conditions were related to the vaccine.

While anaphylaxis has been a rare but documented occurrence with the mRNA vaccines, it appears to be less of a risk with Johnson & Johnson’s vaccine. There was a single case of a severe hypersensitivity reaction two days after vaccination that was considered likely related to the vaccine. But the reaction was not classified as anaphylaxis.
