System76 has been producing Linux-based computers for years, succeeding well enough that it could even produce a PC manufactured largely in the United States. Its latest plans are for a refresh of the Darter Pro laptop to answer customers’ requests for improved battery life.
The Darter Pro is a thin and light portable (3.6 pounds, 0.78 inches thick) designed to offer more than just the basics for computing tasks. It will ship with either an Intel Core i5-8265U or i7-8565U quad-core processor, up to 32GB of RAM, up to 2TB of built-in storage, and a full HD 15.6-inch display. System76 claims that the updated Darter Pro will provide a full workday’s worth of battery life so you don’t need to be chained to a wall outlet by noon.
While a laptop like the Dell XPS 13 can ship with Ubuntu Linux if you choose the Developer Edition, the Darter Pro ships only with Linux: Ubuntu 18.04 LTS, or one of two versions of System76’s own distribution, Pop!_OS 18.04 LTS or Pop!_OS 18.10 (64-bit). Pop!_OS is based on Ubuntu but adds features such as full-disk encryption for the company’s systems.
According to Softpedia News, System76 will begin taking orders for the Darter Pro on February 5. Pricing has not been announced, though at least we won’t have to wait too long to find out; the company’s other laptop lines are generally priced around the $1,000 mark.
For applications like robotics, there’s usually a clear division of labor between the processors that control the robot’s body and the actuators that actually produce the physical changes in that body. But a new paper being released today blurs the line between the two, using a magnetic switch in a way that both stores a bit representing the hardware’s state and alters the physical conformation of the hardware. In essence, it merges memory and physical changes.
This particular implementation doesn’t seem to be especially useful—it’s much too big to be a practical form of memory, and the physical changes are fairly limited. But the concept is intriguing, and it’s possible that someone more adept at creative thinking can find ways of modifying the concept to create a useful device.
A magnetic metamaterial?
A metamaterial is generally defined as a material that is structured so that it has properties that aren’t found in bulk mixes of its raw materials. A broad reading of that definition, however, would mean that a car is a metamaterial, which makes the definition nearly meaningless. The researchers behind the new device, based at Switzerland’s École Polytechnique Fédérale de Lausanne, claim their creation is a metamaterial, but it’s fairly large (roughly a cube three centimeters on a side) and has a number of distinct parts. I’d tend to call that a device rather than a material and will use that terminology here.
So what is the device? The part that changes its configuration is a platform supported by a set of four legs that are bent inward (v and iii in the image below). The two different states of the system are read out by registering the amount of force needed to push the platform down. The force needed is raised by pushing a wedge (iv) between the bend of the legs, forcing them outward, and dropped again by sliding the wedge back out. The wedge is indirectly attached to a flexible base that pops between two stable states (i), a bit like the tops of twist-off jar lids that pop out to indicate the jar has been opened.
The whole thing is controlled by part ii, which links the flexible base to the wedges. In this device, it’s made of a polymer that has had magnetic particles embedded in it. This allows the state of the device to be controlled using an external magnetic field. Pull the central magnetic component up and the base pops upward, driving the wedge between the legs and raising the force needed to deform the device. Push the magnetic hardware back down and the wedge drops out of the way, and the force required to push the platform down drops with it.
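The two-state behavior described above can be sketched as a tiny state machine. This is purely an illustrative model, not code from the paper: the class name and the force thresholds are made up, and a real device’s forces depend on its geometry and materials.

```python
# Minimal sketch of one bistable unit. The force values are
# hypothetical placeholders, not measurements from the paper.

class BistableUnit:
    """One platform-and-wedge element with two stable states."""

    FORCE_STIFF = 10.0  # illustrative: wedge driven between the legs
    FORCE_SOFT = 4.0    # illustrative: wedge retracted

    def __init__(self):
        self.wedged = False  # flexible base starts popped down

    def apply_field(self, pull_up: bool):
        """An external magnetic field pops the base up (driving the
        wedge in, stiff state) or down (wedge out, soft state)."""
        self.wedged = pull_up

    def readout_force(self) -> float:
        """'Read' the stored bit as the force needed to push the
        platform down."""
        return self.FORCE_STIFF if self.wedged else self.FORCE_SOFT


unit = BistableUnit()
print(unit.readout_force())  # → 4.0 (soft state, bit = 0)
unit.apply_field(pull_up=True)
print(unit.readout_force())  # → 10.0 (stiff state, bit = 1)
```

The point the sketch captures is the merger the paper describes: the same physical change that stores the bit (the wedge’s position) is what alters the hardware’s mechanical response.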
On the grid
The researchers built a six-by-six array of these devices and showed that the devices could be individually addressed using other magnetic devices positioned above and below the array. The process isn’t quick; it takes nearly a second for a device to change state, and the system has to cool down for four seconds after each switch. But the devices could be switched back and forth over 1,000 times without losing performance.
Using the six-by-six array, the researchers tested 37 different combinations of on/off states in individual devices, measuring the force required to flatten the platform in each of these states. As expected, that force varied based on the configuration, showing that the devices could collectively change the properties of the hardware they were part of.
But in addition to measuring the state of the devices via force, the researchers found they could also read their magnetic state, much like bits on a hard drive. Because of the change in the location of the magnetic part of the device, they found a five-fold difference in its magnetic properties when measured at the hardware’s surface.
In total, the grid of devices enabled three distinct measurements to be made: an overall change in the deformation properties of the surface they supported, a difference in the force required to deform individual elements, and a change in the magnetic properties of individual elements.
Collectively, all of that is pretty neat. It’s not, however, obviously useful. That’s in part due to the device’s size, but it’s also in part because there’s no obvious immediate need for a surface with fine-tuned compressibility, or for reading the state of the parts magnetically instead of simply remembering how they were set. But the researchers say that there should be a variety of ways to shrink the device down. And they argue that the concept could be extended in a lot of directions now that it has been demonstrated. We’ll have to reserve judgement on utility until we see what the rest of the research community does with the concept.
Midday today, January 20, Dr. Rochelle Walensky will take over as director of the Centers for Disease Control and Prevention—and one of her top priorities will be to try to undo all the harms done to the agency by the Trump administration.
“How is it that I make sure that the people who are there—these incredible scientists, these incredible civil servants for their entire career—understand and feel the value that we should be giving them? They have been diminished. I think they’ve been muzzled, that science hasn’t been heard,” Walensky said in a brief, but wide-ranging interview with JAMA Tuesday. “This top-tier agency—world renowned—hasn’t really been appreciated over the last four years and really markedly over the last year. So, I have to fix that.”
Part of her plan to do that is unmuzzling those scientists and getting their science out to the public where it can make a difference. And that blends into the next challenge: “We obviously need to get this country out of COVID and the current pandemic crisis,” she said. And that will also entail increasing communication with the public, as well as state and local health authorities and members of Congress.
Walensky—a professor of medicine at Harvard Medical School and chief of the Infectious Disease division at Massachusetts General Hospital and Brigham and Women’s Hospital—will be new to government health work when she takes over the federal agency of over 10,000 staff today. “I will have all of the benefit of coming in from the outside and being able to look in and say ‘This feels really broken,’” she said. For any institutional knowledge she’ll need, she’ll rely on long-standing career staff, she added.
Like President-elect Biden, Walensky will immediately focus on helping smooth out and speed up the COVID-19 vaccine rollout and on reaching the administration’s goal of getting 100 million shots into arms in the first 100 days. According to CDC data as of January 19, the government has distributed more than 31 million doses, but fewer than 16 million have been administered.
A bit of a cushion
Picking up the pace will involve helping states set the right eligibility conditions for vaccination—conditions that aren’t so restrictive that vaccine doses end up sitting unused in refrigerators, or so loose that there are long lines outside overwhelmed vaccination sites, she said. The Biden team also aims to boost manufacturing for vaccine supplies, increase the number of people who can provide vaccines, and increase the number of places where vaccines are administered.
Addressing one of the most pressing topics in recent days, Walensky also said she and the Biden team have their eye on the concerning coronavirus variants that are popping up in various places around the world—including the US. The team is working to “dramatically” bolster surveillance efforts, including partnering with industry and academic labs, so that they can track any variants that develop in or enter the US and begin to spread.
The main things to worry about with variants are if they spread more easily, if they cause more severe disease and deaths, and if they make therapies and vaccines less effective, she noted. We’ve seen variants that seem to spread from person to person more easily. But we have yet to see evidence that variants are increasing disease and death or that they’re evading vaccines and therapies.
“I think the good news with regard to the variants is that the efficacy of the vaccines is so good and so high that we have a little bit of a cushion,” she said. Even if lab studies show vaccines aren’t quite as effective against a variant as the initial strain, “we’ll probably still end up with quite a good vaccine.” Her point has been echoed by many other experts who predict it will take years before the coronavirus evolves to completely outwit immune responses.
“I just want to remind people: almost no vaccine we have is 95 percent effective” like the COVID-19 vaccines, she added. “So, before we panic and say ‘well, should I really get the vaccine if it’s not going to work against the variant?’—It’s going to work against the variant. Will it be 95 percent? Maybe. Will it be 70 percent? Maybe. But our flu vaccines aren’t 70 percent effective every year and we still get them. So, I’m really optimistic about how these variants are going to go.”
There’s still a lot of uncertainty about how exactly the immune system responds to the SARS-CoV-2 virus. But what’s become clear is that re-infections are still very rare, despite an ever-growing population of people who were exposed in the early days of the pandemic. This suggests that, at least for most people, there is a degree of long-term memory in the immune response to the virus.
But immune memory is complicated and involves a number of distinct immune features. It would be nice to know which ones are engaged by SARS-CoV-2, since that would allow us to better judge the protection offered by vaccines and prior infections, and to better understand whether the memory is at risk of fading. The earliest studies of this sort all involved very small populations, but there are now a couple of studies that have unearthed reasons for optimism, suggesting that immunity will last at least a year, and perhaps longer. But the picture still isn’t as simple as we might like.
Only a memory
The immune response requires the coordinated activity of a number of cell types. There’s an innate immune response that is triggered when cells sense they’re infected. Various cells present pieces of protein to immune cells to alert them to the identity of the invader. B cells produce antibodies, while different types of T cells perform functions like coordinating the response and eliminating infected cells. Throughout this all, a variety of signaling molecules modulate the strength of the immune attack and induce inflammatory responses.
Some of those same pieces get recruited into the system that preserves a memory of the infection. These include different types of T cells that are converted into memory T cells. A similar thing happens to antibody-producing B cells, many of which express specialized subtypes of antibodies. Fortunately, we have the means to identify the presence of each of them.
And that’s the focus of a major study that was published a couple of weeks ago. Nearly 190 people who had had COVID-19 were recruited, and details on all these cells were obtained for periods as long as eight months after infection. Unfortunately, not everyone donated blood samples at every point in time, so many of the populations were quite small; only 43 individuals provided the data for six months after infection, for example. There was also a huge range of ages (age influences immune function) and severity of disease. So the results should be interpreted cautiously.
Months after infection, T cells in this population still recognized at least four different viral proteins, which is good news in light of the many spike-protein variants that have been evolving. T cells that specialize in eliminating infected cells (CD8-expressing T cells) were present but had largely been converted to a memory-maintaining form. The number of these cells declined over time, with a half-life of roughly 125 days.
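That half-life lends itself to a quick back-of-the-envelope calculation using the standard exponential-decay formula. The 125-day figure comes from the study, but the one-year projection below is my own arithmetic, not a result reported in the paper, and it assumes the decay stays exponential rather than flattening out.

```python
# Exponential decay: fraction remaining after t days, given a half-life.
# The 125-day half-life is from the study; extrapolating to a full year
# is back-of-the-envelope arithmetic, not a claim the paper makes.

def fraction_remaining(t_days: float, half_life_days: float) -> float:
    return 0.5 ** (t_days / half_life_days)

# A 125-day half-life leaves roughly 13 percent of the original
# memory CD8 T cells after one year.
print(round(fraction_remaining(365, 125), 2))  # → 0.13
```

Even so, a residual pool of memory cells can still matter, since memory responses expand rapidly on re-exposure; raw cell counts aren’t a direct measure of protection.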
Similar things were seen with T cells that are involved in coordinating immune activities (CD4-expressing T cells). Here, for the general population of these cells, the half-life was about 94 days, and 92 percent of the people who were checked six months after infection had memory cells of this type. A specialized subset that interacts with antibody-producing B cells seemed to be relatively stable, with almost everyone still having memory cells at over six months.
So overall, as far as T cells go, there are clear signs of the establishment of memory. It does decline over time, but not so rapidly that immunity would fade within a year. However, for most of the cell types examined, there are some individuals in whom aspects of the memory seem to be gone at six months.
The B side
Like T cells, antibody-producing B cells can adopt a specialized memory fate; cells can also specialize in producing a variety of antibody subtypes. The first paper tracked both antibodies and memory cells. Overall, the levels of antibodies specific to the viral spike protein dropped after infection with a half-life of 100 days, while the number of memory B cells increased over that time and then stayed at a plateau starting at about 120 days post-infection.
A second paper, published this week, looked at the trajectory of the antibody response in much more detail. Again, it involved a pretty small population of participants (87 in this case), but they were monitored for over six months. A bit under half of them had some long-term symptoms after their initial infections had cleared. As with the earlier study, the levels of antibodies declined in the months following the infection, dropping by anywhere from a quarter to a third, depending on the antibody type. Intriguingly, people with ongoing symptoms tended to have higher levels of antibodies across this period.
But when the team looked at antibody-producing memory cells, they noticed that the antibodies were changing over time. In memory cells, there’s a mechanism by which the parts of the genes that encode the antibody pick up a lot of mutations over time. Continued selection of the cells that produce higher-affinity antibodies can improve the immune response in the future.
That seems to be exactly what is happening in these post-COVID patients. At the first sampling time, the researchers identified the sequences of many of the genes that encode antibodies against coronavirus proteins. At the second check months later, they were unable to find 43 of these initial antibody genes, but 22 new ones were identified, arising from the mutation process. By six months, the typical antibody gene had picked up between two and three times as many mutations as it had initially. In some cases, the authors were able to identify the ancestral antibody gene that picked up mutations to create the one present at six months.
The system seems to be working. One of the early antibodies was unable to bind some of the variants of the spike protein that have evolved in some coronavirus strains. But the replacements with more mutations could, suggesting they had a higher affinity for the spike protein than the earlier version. While the average antibody had similar affinities at the early and late time points, specific antibody lineages saw their ability to neutralize the virus increase.
The immune system has ways of preserving the spike protein to select for improved antibody variants after infections are cleared, and that may be part of what’s going on here. But in a number of participants (under half of those tested), there were still indications of active SARS-CoV-2 infections in the intestine, even though nasal tests came back negative. So it’s possible that at least some of the improved binding comes from continued exposure to the actual virus.
The big picture
Let’s emphasize again: these are both small studies, and we really need to see them replicated with larger populations and more consistent sampling. But at least when it comes to antibodies, the consistency between the two studies is a step toward building confidence in the results. And those results are pretty good: clear signs of long-term memory, and indications that the immune system’s ability to sharpen its defenses is working against SARS-CoV-2.
Beyond that, the T cell results, while more tentative, also seem to hint at long-term immunity. But there, the results aren’t as consistent, with different aspects of T cell immunity persisting in different patients. The researchers divided the different aspects into five categories and found that fewer than half their study population still had all five categories of memory present after five months. But 95 percent of them had at least three categories present, suggesting the persistence of at least some memory. The problem is that, at this point, we don’t really understand what would provide protective immunity, so it’s difficult to judge the meaning of these results.