Arecibo radio telescope’s massive instrument platform has collapsed

The immense instrument platform and the large collection of cables that supported it, all of which are now gone.

On Monday night, the enormous instrument platform that hung over the Arecibo radio telescope’s big dish collapsed due to the failure of the remaining cables supporting it. The risk of this sort of failure was the key motivation behind the National Science Foundation’s recent decision to shut down the observatory, as the potential for collapse made any attempt to repair the battered scope too dangerous for the people who would do the repairs.

Right now, details are sparse. The NSF has confirmed the collapse and says it will provide more information once it becomes available. A Twitter user in Puerto Rico shared an image showing the support towers that used to hold the cables suspending the instrument platform over the dish, now with nothing but empty space between them.

The immense weight of the platform undoubtedly caused significant damage to the dish below. The huge metal cables that had supported it would likely have spread the damage well beyond where the platform landed. It’s safe to say that there is very little left of the instrument that’s in any shape to repair.

It’s precisely this sort of catastrophic event that motivated the NSF to shut down the instrument, a decision made less than two weeks ago. The separate failures of two cables earlier in the year suggested that the support system was in a fragile state, and the risks of another cable snapping in the vicinity of any human inspectors made even evaluating the strength of the remaining cables unacceptably risky. It’s difficult to describe the danger posed by the sudden release of tension in a metal cable that’s well over a hundred meters long and several centimeters thick.

With inspection considered too risky, repair and refurbishment were completely out of the question. The NSF took a lot of criticism from fans of the telescope in response to its decision, but the collapse both justifies the original decision and eliminates any alternatives, as more recent images indicate that portions of the support towers came down as well.

The resistance the NSF faced was understandable. The instrument played an important role in scientific history and was still being used when funding was available, as it provided some capabilities that were difficult to replicate elsewhere. It also played a role as the most important scientific facility in Puerto Rico, drawing scientists from elsewhere who engaged with the local research community and helped inspire students on the island to go into science. And beyond all that, it was iconic—until recently, there was nothing else like it, which made it a feature in popular culture and extended its draw well beyond the island where it was located.

Lots of its fans were sad to contemplate its end and held out hope that some other future could be possible for it. With yesterday’s collapse, the focus will have to shift to whether there’s a way to use its site for something that appropriately honors its legacy.

New metamaterial merges magnetic memory and physical changes

For applications like robotics, there’s usually a clear division of labor between the processors that control the robot’s body and the actuators that actually control the physical changes of that body. But a new paper being released today blurs the lines between the two, using a magnetic switch in a way that both stores a bit representing the hardware’s state and alters the physical conformation of the hardware. In essence, it merges memory and physical changes.

This particular implementation doesn’t seem to be especially useful—it’s much too big to be a practical form of memory, and the physical changes are fairly limited. But the concept is intriguing, and it’s possible that someone more adept at creative thinking can find ways of modifying the concept to create a useful device.

A magnetic metamaterial?

A metamaterial is generally defined as a material that is structured so that it has properties that aren’t found in bulk mixes of its raw materials. A broad reading of that definition, however, would mean that a car is a metamaterial, which makes the definition nearly meaningless. The researchers behind the new device, based at Switzerland’s École Polytechnique Fédérale de Lausanne, claim their creation is a metamaterial, but it’s fairly large (roughly a cube three centimeters on a side) and has a number of distinct parts. I’d tend to call that a device rather than a material and will use that terminology here.

So what is the device? The part that changes its configuration is a platform supported by a set of four legs that are bent inward (v and iii in the image below). The two different states of the system are read out by registering the amount of force needed to push the platform down. The force needed is raised by pushing a wedge (iv) between the bends of the legs, forcing them outward, and lowered again by sliding the wedge back out. The wedge is indirectly attached to a flexible base that pops between two stable states (i), a bit like the tops of twist-off jar lids that pop out to indicate the jar has been opened.

The device in question, with the parts labelled.

Chen et al.

The whole thing is controlled by part ii, which links the flexible base to the wedges. In this device, it’s made of a polymer that has had magnetic particles embedded in it. This allows the state of the device to be controlled using an external magnetic field. Pull the central magnetic component up and the base pops upward, driving the wedge between the legs and raising the force needed to deform the device. Push the magnetic hardware back down and the wedge drops out of the way, and the force required to push the platform down drops with it.

On the grid

The researchers built a six-by-six array of these devices and showed that they could be individually addressed using other magnetic devices positioned above and below the array. The process isn’t quick; it takes nearly a second for a device to change state, and the system has to cool down for four seconds after each switch. But the devices could be switched back and forth over 1,000 times without losing performance.

The researchers tested 37 different combinations of on/off states across the six-by-six array, measuring the force required to flatten the platform in each of these states. As expected, that force varied based on the configuration, showing that the devices could collectively change the properties of the hardware they were part of.
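The collective behavior described above can be captured in a toy model: each element stores one bit, and the force needed to flatten the platform depends on how many wedges are engaged. This is a minimal sketch; the specific force values are illustrative placeholders, not figures from the paper.

```python
# Toy model of the bistable array: True = wedge engaged (stiff), False = retracted (soft).
# Force values are arbitrary illustrative units, not measurements from the paper.
F_SOFT = 1.0   # wedge retracted: legs free to flex
F_STIFF = 2.5  # wedge engaged between the legs

def flatten_force(grid):
    """Total force needed to flatten a grid of bistable elements."""
    return sum(F_STIFF if bit else F_SOFT for row in grid for bit in row)

def write_bit(grid, row, col, state):
    """Set one element's state (standing in for the external magnetic field)."""
    grid[row][col] = state

# A six-by-six array, all elements initially soft
grid = [[False] * 6 for _ in range(6)]
write_bit(grid, 2, 3, True)
write_bit(grid, 2, 4, True)
print(flatten_force(grid))  # 34 soft + 2 stiff elements -> 39.0
```

The point of the model is simply that the readout is collective: the mechanical response of the whole surface is set by the pattern of individual bits.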

In addition to measuring the state of the devices via force, the researchers found they could also read their magnetic state, much like bits on a hard drive. Because the magnetic part of the device changes location between states, its magnetic properties, measured at the hardware’s surface, differed five-fold between the two states.

In total, the grid of devices enabled three distinct measurements to be made: an overall change in the deformation properties of the surface they supported, a difference in the force required to deform individual elements, and a change in the magnetic properties of individual elements.

Collectively, all of that is pretty neat. It’s not, however, obviously useful. That’s in part due to the device’s size, but it’s also in part because there’s no obvious immediate need for a surface with fine-tuned compressibility or to read the state of the parts magnetically instead of simply remembering how they were set. But the researchers say that there should be a variety of means to shrink the device down. And they argue that the concept could be extended in a lot of ways now that it has been demonstrated. We’ll have to reserve judgement on utility until we see what the rest of the research community does with the concept.

Nature, 2021. DOI: 10.1038/s41586-020-03123-5 (About DOIs).

First task for Biden’s CDC director: Fix everything Trump broke

Dr. Rochelle Walensky, President-elect Joe Biden’s pick to head the Centers for Disease Control.

Midday today, January 20, Dr. Rochelle Walensky will take over as director of the Centers for Disease Control and Prevention—and one of her top priorities will be to try to undo all the harms done to the agency by the Trump administration.

“How is it that I make sure that the people who are there—these incredible scientists, these incredible civil servants for their entire career—understand and feel the value that we should be giving them? They have been diminished. I think they’ve been muzzled, that science hasn’t been heard,” Walensky said in a brief but wide-ranging interview with JAMA Tuesday. “This top-tier agency—world renowned—hasn’t really been appreciated over the last four years and really markedly over the last year. So, I have to fix that.”

Part of her plan to do that is unmuzzling those scientists and getting their science out to the public where it can make a difference. And that blends into the next challenge: “We obviously need to get this country out of COVID and the current pandemic crisis,” she said. And that will also entail increasing communication with the public, as well as state and local health authorities and members of Congress.

Walensky—a professor of medicine at Harvard Medical School and chief of the Infectious Disease division at Massachusetts General Hospital and Brigham and Women’s Hospital—will be new to government health work when she takes over the federal agency of over 10,000 staff today. “I will have all of the benefit of coming in from the outside and being able to look in and say ‘This feels really broken,’” she said. For any institutional knowledge she’ll need, she’ll rely on long-standing career staff, she added.

Like President-elect Biden, Walensky will immediately focus on helping smooth out and speed up the COVID-19 vaccine rollout—and reaching the administration’s goal of getting 100 million shots into arms in the first 100 days. According to CDC data as of January 19, the government has distributed more than 31 million doses, but fewer than 16 million have been administered.
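The gap between those figures is worth making concrete. This back-of-the-envelope calculation uses the article’s approximate numbers ("more than 31 million" distributed, "less than 16 million" administered) as round values:

```python
# Approximate figures from the article (CDC data as of January 19)
distributed = 31_000_000    # doses shipped to states
administered = 16_000_000   # doses actually given (upper bound per the article)
goal_shots = 100_000_000    # administration's 100-day goal
goal_days = 100

utilization = administered / distributed   # share of shipped doses used
daily_pace = goal_shots / goal_days        # shots per day to hit the goal

print(f"{utilization:.0%} of distributed doses administered")  # ~52%
print(f"{daily_pace:,.0f} shots per day needed")               # 1,000,000
```

So roughly half of the shipped supply was sitting unused, and the 100-day goal implies averaging a million shots a day.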

A bit of a cushion

Picking up the pace will involve helping states set the right eligibility conditions for vaccination—conditions that aren’t so restrictive that vaccine doses end up sitting unused in refrigerators, or so loose that there are long lines outside overwhelmed vaccination sites, she said. The Biden team also aims to boost manufacturing for vaccine supplies, increase the number of people who can provide vaccines, and increase the number of places where vaccines are administered.

Addressing one of the most pressing topics in recent days, Walensky also said she and the Biden team have their eye on the concerning coronavirus variants that are popping up in various places around the world—including the US. The team is working to “dramatically” bolster surveillance efforts, including partnering with industry and academic labs, so that they can track any variants that develop in or enter the US and begin to spread.

The main things to worry about with variants are if they spread more easily, if they cause more severe disease and deaths, and if they make therapies and vaccines less effective, she noted. We’ve seen variants that seem to spread from person to person more easily. But we have yet to see evidence that variants are increasing disease and death or that they’re evading vaccines and therapies.

“I think the good news with regard to the variants is that the efficacy of the vaccines is so good and so high that we have a little bit of a cushion,” she said. Even if lab studies show vaccines aren’t quite as effective against a variant as the initial strain, “we’ll probably still end up with quite a good vaccine.” Her point has been echoed by many other experts who predict it will take years before the coronavirus evolves to completely outwit immune responses.

“I just want to remind people: almost no vaccine we have is 95 percent effective” like the COVID-19 vaccines, she added. “So, before we panic and say ‘well, should I really get the vaccine if it’s not going to work against the variant?’—It’s going to work against the variant. Will it be 95 percent? Maybe. Will it be 70 percent? Maybe. But our flu vaccines aren’t 70 percent effective every year and we still get them. So, I’m really optimistic about how these variants are going to go.”

The persistence of memory in B cells: Hints of stability in COVID immunity

The immune response involves a lot of moving parts.

There’s still a lot of uncertainty about how exactly the immune system responds to the SARS-CoV-2 virus. But what’s become clear is that re-infections are still very rare, despite an ever-growing population of people who were exposed in the early days of the pandemic. This suggests that, at least for most people, there is a degree of long-term memory in the immune response to the virus.

But immune memory is complicated and involves a number of distinct immune features. It would be nice to know which ones are engaged by SARS-CoV-2, since that would allow us to better judge the protection offered by vaccines and prior infections, and to better understand whether the memory is at risk of fading. The earliest studies of this sort all involved very small populations, but there are now a couple that have unearthed reasons for optimism, suggesting that immunity will last at least a year, and perhaps longer. But the picture still isn’t as simple as we might like.

Only a memory

The immune response requires the coordinated activity of a number of cell types. There’s an innate immune response that is triggered when cells sense they’re infected. Various cells present pieces of protein to immune cells to alert them to the identity of the invader. B cells produce antibodies, while different types of T cells perform functions like coordinating the response and eliminating infected cells. Throughout this all, a variety of signaling molecules modulate the strength of the immune attack and induce inflammatory responses.

Some of those same pieces get recruited into the system that preserves a memory of the infection. These include different types of T cells that are converted into memory T cells. A similar thing happens to antibody-producing B cells, many of which express specialized subtypes of antibodies. Fortunately, we have the means to identify the presence of each of them.

And that’s the focus of a major study that was published a couple of weeks ago. Nearly 190 people who had had COVID-19 were recruited, and details on all these cells were obtained for periods as long as eight months after infection. Unfortunately, not everyone donated blood samples at every point in time, so many of the populations were quite small; only 43 individuals provided the data for six months after infection, for example. There was also a huge range of ages (age influences immune function) and severity of disease. So the results should be interpreted cautiously.

Months after infection, T cells in this population still recognized at least four different viral proteins, which is good news in light of many of the variants in the spike protein that have been evolving. T cells that specialize in eliminating infected cells (CD8-expressing T cells) were present but had largely been converted to a memory-maintaining form. The number of cells declined over time, with a half-life of roughly 125 days.

Similar things were seen with T cells that are involved in coordinating immune activities (CD4-expressing T cells). Here, for the general population of these cells, the half-life was about 94 days, and 92 percent of the people who were checked six months after infection had memory cells of this type. A specialized subset that interacts with antibody-producing B cells seemed to be relatively stable, with almost everyone still having memory cells at over six months.

So overall, as far as T cells go, there are clear signs of the establishment of memory. It does decline over time, but not so rapidly that immunity would fade within a year. However, for most of the cell types examined, there are some individuals in whom some aspects of the memory seem to be gone at six months.
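The half-lives reported above translate directly into how much memory should remain at a given time: exponential decay with half-life h leaves a fraction 0.5^(t/h) after t days. A quick sketch, using the study’s approximate half-lives:

```python
# Fraction of a memory cell population remaining after t days,
# given exponential decay with the stated half-life.
def remaining_fraction(t_days, half_life_days):
    return 0.5 ** (t_days / half_life_days)

# Half-lives reported in the study: ~125 days (CD8 T cells), ~94 days (CD4 T cells)
for label, half_life in [("CD8", 125), ("CD4", 94)]:
    six_months = remaining_fraction(180, half_life)
    one_year = remaining_fraction(365, half_life)
    print(f"{label}: {six_months:.0%} left at 6 months, {one_year:.0%} at 1 year")
```

Even the faster-decaying population retains a meaningful fraction at a year, which is consistent with the article’s point that memory declines but doesn’t vanish on that timescale.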

The B side

Like T cells, antibody-producing B cells can adopt a specialized memory fate; cells can also specialize in producing a variety of antibody subtypes. The first paper tracked both antibodies and memory cells. Overall, the levels of antibodies specific to the viral spike protein dropped after infection with a half-life of 100 days, while the number of memory B cells increased over that time and plateaued starting at about 120 days post-infection.

A second paper, published this week, looked at the trajectory of the antibody response in much more detail. Again, it involved a pretty small population of participants (87 in this case), but monitored for over six months. A bit under half of them had some long-term symptoms after their initial infections had cleared. As with the earlier study, the levels of antibodies declined in the months following the infection, dropping by anywhere from a quarter to a third, depending on the antibody type. Intriguingly, people with ongoing symptoms tended to have higher levels of antibodies across this period.

But when the team looked at antibody-producing memory cells, they noticed that the antibodies were changing over time. In memory cells, there’s a mechanism by which parts of the genes that encode the antibody pick up a lot of mutations over time. By continuing to select those cells that produce antibodies with a higher level of affinity, this can improve the immune response in the future.

That seems to be exactly what is happening in these post-COVID patients. At the first sampling time, the researchers identified the sequences of many of the genes that encode antibodies against coronavirus proteins. At the time of the second check months later, they were unable to find 43 of these initial antibody genes. But 22 new ones were identified, arising from the mutation process—by six months, the typical antibody gene had picked up between two and three times the number of mutations. In some cases, the authors were able to identify the ancestral antibody gene that picked up mutations to create the one present at six months.

The system seems to be working. One of the early antibodies was unable to bind some of the variants of the spike protein that have evolved in some coronavirus strains. But the replacements with more mutations could, suggesting they had a higher affinity for the spike protein than the earlier version. While the average antibody had similar affinities at the early and late time points, specific antibody lineages saw their ability to neutralize the virus increase.

The immune system has ways of preserving the spike protein to select for improved antibody variants after infections are cleared, and that may be part of what’s going on here. But in a number of participants (under half of those tested), there were still indications of active SARS-CoV-2 infections in the intestine, even though nasal tests came back negative. So it’s possible that at least some of the improved binding comes from continued exposure to the actual virus.

The big picture

Let’s emphasize again: these are both small studies, and we really need to see them replicated with larger populations and more consistent sampling. But at least when it comes to antibodies, the consistencies between these two studies are a step toward building confidence in the results. And those results are pretty good: there are clear signs of long-term memory, and the immune system’s ability to sharpen its defenses seems to be working against SARS-CoV-2.

Beyond that, the T cell results, while more tentative, also seem to hint at long-term immunity. But there, the results aren’t as consistent, with different aspects of T cell immunity persisting in different patients. The researchers divided the different aspects into five categories and found that fewer than half their study population still had all five categories of memory present after five months. But 95 percent of them had at least three categories present, suggesting the persistence of at least some memory. The problem is that, at this point, we don’t really understand what would provide protective immunity, so it’s difficult to judge the meaning of these results.

Science, 2021. DOI: 10.1126/science.abf4063
Nature, 2021. DOI: 10.1038/s41586-021-03207-w (About DOIs).
