
Science

The best PC deals from Staples Black Friday ad include $150 Chromebook

Staples Black Friday ad

Potential PC buyers are starting to get more options to consider if they plan to purchase a new system during the upcoming Black Friday shopping weekend. In addition to Dell’s Black Friday ad, we’ve also seen ads leaked from Costco and BJs Wholesale Club — even Walmart — with laptop and desktop deals. Now Staples has released its Black Friday plans, complete with a whole new slate of specials to mull over.

The office superstore makes up in breadth for what it can't match in rock-bottom pricing. For instance, its cheapest Chromebook special is a $149.99 HP with an Intel Celeron processor, 4GB of RAM, 16GB of storage, and an 11.6-inch screen — $50 more than the Samsung Chromebook Walmart is offering. If you want a bigger screen, however, it does have a 14-inch Acer Chromebook on sale for $199.99.

But not everyone wants the lowest-priced system, and for those who need more — be it RAM, storage, processing power, or a better display — Staples has some intriguing options. For instance, for under $400, there's an HP Pavilion laptop with a Core i5, 8GB of RAM, a terabyte hard drive, and a 14-inch display for $369.99, or, for the same price, a Pavilion desktop tower with a Core i5 and a terabyte hard drive but with 12 gigs of memory. For $499.99, you can step up to a Pavilion notebook with a Core i7 CPU, 16GB of Intel Optane memory to complement the terabyte hard drive, and a 15.6-inch full HD display.

Not many laptops have adopted AMD’s Ryzen mobile chips yet, but Staples does have a couple of them as part of its Black Friday plans. There’s a 15.6-inch Acer Aspire with Ryzen 3 processor, 5 gigs of RAM, and terabyte hard drive for $319.99, and a Lenovo 330s that includes a Ryzen 7, 8GB of RAM, 1TB hard drive, and 15.6-inch full HD screen for $469.99.

If you prefer the flexibility of a 2-in-1 portable, there's an Acer 13.3-inch Chromebook with a MediaTek processor, 4 gigs of RAM, and 32 gigs of storage for $309.99, or a Core i3-based Acer Spin 3 with 8GB of RAM, a 256GB solid-state drive, a 15.6-inch full HD touchscreen, and integrated Amazon Alexa for $469.99. For more power (and cash), there's an HP Pavilion x360 with a Core i7, 8GB of memory, a terabyte hard drive, and a 15.6-inch full HD touch-enabled display for $729.99.




How a Thanksgiving Day gag ruffled feathers in Mission Control

Flight Director James M. (Milt) Heflin, in Mission Control during the flight of STS-26 in 1988.

NASA

The phone call from the “Mountain” to Mission Control in Houston came at just about the worst possible time. It was the wee hours of Thanksgiving morning in 1991. Up in space, the crew members on board space shuttle Atlantis were sleeping. Now all of a sudden, Lead Flight Director Milt Heflin faced a crisis.

The flight dynamics officer in Mission Control informed Heflin that the Cheyenne Mountain Air Force Station, which tracked orbital traffic, had called to warn that a dormant Turkish satellite had a potential conjunction with the space shuttle in only 15 minutes. Moreover, this potential debris strike was due to occur in the middle of a communications blackout with the crew, as the spacecraft passed over the southern tip of Africa.

There was no way for Heflin’s engineers to calculate an avoidance maneuver, wake the crew, and communicate with them before the blackout period began. Heflin was livid—why had the Air Force not given more warning about a potential collision? Typically, they provided about 24 hours’ notice. By God, if that satellite hit Atlantis, they could very well lose the astronauts as they slept. The crew of STS-44 might never awaken.

An experienced flight director who had started work at the space agency more than two decades earlier during the Apollo program, conducting oceanic recovery operations after the Moon landings, Heflin was largely unflappable. But now, he grew tense. “When I think about all of my time, I don’t remember ever being so nervous or upset about something as I was then,” he told Ars recently.

What Heflin did not know at the time, however, was that he had been snookered by two of his flight controllers during an otherwise boring overnight shift on a fairly routine shuttle mission to deploy several Air Force payloads. There was no derelict satellite—the allusion to "turkey" on Thanksgiving had gone over his head. But the story did not end there.

Practical jokes

Back in the beginning, NASA was not the buttoned-up space agency it is today. Early on, especially during the Mercury program, NASA's decision makers moved quickly, often flying by the seats of their pants. There was also more room for practical jokes, even within the sanctum of Mission Control.

In his book The Birth of NASA, Manfred “Dutch” von Ehrenfried wrote about a fabled practical joke that took place a few weeks before John Glenn’s first orbital flight, in 1962, atop an Atlas rocket. Chris Kraft, NASA’s legendary first flight director, led his teams through long days and nights of training, simulations, and discussions on mission rules for this critical flight.

At the time, missions were planned and managed out of the Mercury Control Center at Cape Canaveral Air Force Station, and leading up to Glenn's flight there were several scrubs. One night, to break the tedium, Kraft's key lieutenant, Gene Kranz, decided to prank his boss the next day, when two activities were due to occur simultaneously: Kraft would be leading a mission simulation while Kranz led a launch pad test with the Atlas rocket. Kranz knew that Kraft, while running the simulation, would be watching the pad activities on a console television.

Working with John Hatcher, a video support coordinator for the control center, Kranz had an old video of an Atlas launch substituted into Kraft's feed. Moreover, Kranz and Hatcher timed it so that the rocket would appear to lift off immediately after Kraft threw the "Firing Command" switch as part of his simulation.

Here’s how von Ehrenfried characterizes what happened next in Florida:

As the simulation proceeded, Kraft would ask Kranz how the pad test was going and Kranz would give him a quick status check with a straight face and his head down. As the simulation got down to liftoff, at just the same moment Kraft threw the switch, Hatcher started the old Atlas liftoff video on Kraft’s console TV. Kraft’s eyes bulged and his forehead wrinkled as he stared at the TV. He turns to Kranz and says, “Did you see that?” Kranz plays dumb and says, “See what?” Without a pause, Kraft says, “The damned thing lifted off!” Hatcher and Kranz tried to keep a straight face but they both couldn’t hold back the laughter. Kraft says, “Who the hell did this?” He then realized he had been “had” and gave a half-hearted laugh. Kranz and Hatcher pulled Superman’s Cape and survived!



Robots invade the construction site

Theresa Arevalo was in high school when she first tried finishing drywall at her brother’s construction company. “It’s a fine art,” she says of mudding—applying and smoothing drywall. “Like frosting a cake, you have to give the illusion that the wall is flat.”

Fast-forward a few decades: Arevalo now works at Canvas, a company that’s built a robot using artificial intelligence that’s capable of drywalling with almost as much artistry as a skilled human worker.

The robot has been deployed, under Arevalo’s supervision, at several construction sites in recent months, including the new Harvey Milk Terminal at San Francisco International Airport and an office building connected to the Chase Center arena in San Francisco.

About the size of a kitchen stove, the four-wheeled robot navigates an unfinished building carrying laser scanners and a robotic arm fitted to a vertical platform. When placed in a room, the robot scans the unfinished walls using lidar, then gets to work smoothing the surface before applying a near-perfect layer of drywall compound; sensors help it steer clear of human workers.
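As a rough illustration of that scan-smooth-coat loop, here is a toy sketch. Every function name and threshold in it is invented for illustration; this is not Canvas's actual software, which is proprietary and far more sophisticated.

```python
# Toy sketch of the scan -> smooth -> coat loop described above.
# All names and numbers here are made up for illustration.

TOLERANCE_MM = 0.5  # hypothetical max bump height allowed before coating

def lidar_scan(wall):
    """Stand-in for a lidar pass: returns surface heights in mm."""
    return list(wall)

def smooth(heights):
    """Knock down any bump above tolerance, like sanding high spots."""
    return [min(h, TOLERANCE_MM) for h in heights]

def coat(heights):
    """Fill with compound to a uniform plane at the highest remaining point."""
    target = max(heights)
    return [target] * len(heights)

def finish_wall(wall, human_nearby=False):
    """Run one pass; the proximity check mimics the safety sensors."""
    if human_nearby:
        return None  # pause rather than work near a person
    return coat(smooth(lidar_scan(wall)))

print(finish_wall([0.2, 1.4, 0.1, 0.6]))  # [0.5, 0.5, 0.5, 0.5]
```

The point of the sketch is the ordering: map the surface first, remove material that stands too proud, then fill to a flat plane, with the human-proximity check gating everything.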

The Canvas robot can help companies do more drywalling in less time. It requires human oversight, but its operator does not need to be an expert drywaller or roboticist.

It has long been impractical to deploy robots at construction sites, because the environment is so varied, complex, and changing. In the past few years, however, advances including low-cost laser sensors, cheaper robotic arms and grippers, and open source software for navigation and computer vision have made it possible to automate and analyze more construction.

The more advanced machines marching onto construction sites will help make construction less wasteful. According to McKinsey, productivity in construction has improved less than in any other industry over the past couple of decades. The arrival of more automation may also alter demand for labor in a number of building trades.

Kevin Albert, cofounder and CEO of Canvas, previously worked at Boston Dynamics (a company famous for its lifelike walking robots) and in the manufacturing industry. He says there’s great opportunity in construction, which generates about $1.4 trillion annually and accounts for around 7 percent of US GDP but has seen relatively little use of computerization and automation. “We really see construction as mobile manufacturing,” he says. “There’s this natural extension of what machines are now capable of out in the real world.”

Canvas is part of a boom in construction technology, says Alex Schreyer, director of the Building and Construction Technology Program at the University of Massachusetts, Amherst. He says some of the biggest progress is being made in prefabrication of buildings, using robotic processes to construct large parts of buildings that are then assembled on-site. But increasingly, he says, robots and AI are also finding their way onto conventional work sites.

Autonomous vehicles made by Volvo ferry materials and tools around some large sites. Technology from San Francisco startup Built Robotics lets construction machinery such as diggers and dozers operate autonomously. A growing array of robotic equipment can take over specialized construction tasks including welding, drilling, and brick-laying. “There are some really interesting things happening,” Schreyer says.

“So much potential”

An IDC report published in January 2020 forecasts that demand for construction robots will grow about 25 percent annually through 2023.

One big opportunity in construction, Schreyer says, is using computer vision and other sensing technologies to track the movement of materials and workers around a work site. Software can automatically flag if a job is falling behind or if something has been installed in the wrong place. “There is so much potential to do something with that using AI,” Schreyer says. “More companies are going to move into that AI space.”

Doxel, based in Redwood City, California, makes a mobile robot that scans work sites in 3D so that software can calculate how the project is progressing. A four-legged Boston Dynamics robot called Spot is being tested for the same purpose at a number of sites. Several companies sell drones for automated construction site inspection, including Propeller, vHive, ABJ Drones, and DJI.

Buildots, based in Tel Aviv, Israel, sells software that uses cameras fitted to the helmets of site managers, which automatically capture a site and process the images to identify discrepancies between plans and ongoing work. The technology is being used on several large European construction projects.

Roy Danon, Buildots’ cofounder and CEO, says the goal is to use the data collected from work sites to help companies design buildings and plan construction schedules better. “We believe we can have a huge impact on planning,” he says, “if we have enough projects that show how you plan and how things actually turn out.”

“The adoption of technology in construction has lagged behind almost everything except hunting and fishing for the past decades,” says Josh Johnson, a consultant at McKinsey who follows the building industry.

Enter the pandemic

A McKinsey report last month predicted a big shakeout across the construction industry over the next decade, with companies adopting technologies and methodologies from the manufacturing world. Things have already begun to change, thanks to technological progress and an increasingly tech-savvy workforce, Johnson says. The pandemic is accelerating the shift, too, by making it more difficult to bring workers to a site and forcing companies to reevaluate supply lines and processes. “It’s forcing many of these legacy [construction contractors] and large companies to begin investing,” Johnson says.

Arevalo, who oversees deployments of Canvas’ robot, says the drywalling robot cannot tackle corners or angles like a human; she says many apprentices see working with the robot as an opportunity to learn how to use more advanced robotic machinery.

The company also has the backing of the local union. “It’s critical for skilled workers to have great resources in their tool kit, and we are excited to be on the leading edge of technology in our industries by partnering with Canvas,” Robert Williams III, business manager at District Council 16, International Union of Painters and Allied Trades, said in a statement.

But this apparently hasn’t quelled concerns among construction workers who’ve seen the robot in action. “They love the fact that it’s so consistent, that the wall is gorgeous,” Arevalo says. “But then the next question is, ‘When is it going to take my job?’”

This story originally appeared on wired.com.

Listing image by Canvas



Why are nuclear plants so expensive? Safety’s only part of the story

Should any discussion of nuclear power go on long enough, it becomes inevitable that someone will rant that the only reason new plants have become unaffordable is a proliferation of safety regulations. The argument is rarely (if ever) fleshed out—no specific regulation is ever identified as problematic, and no consideration seems to be given to the fact that we might have learned something at, say, Fukushima that merits addressing through regulation.

Now a paper is out that provides some empirical evidence that safety changes have contributed to the cost of building new nuclear reactors. But the study also makes clear that they're only one of a number of factors, accounting for only about a third of the soaring costs. It also finds that, contrary to what those in the industry seem to expect, focusing on standardized designs doesn't really help matters: costs continued to grow as more units of a given reactor design were built.

More of the same

The analysis, done by a team of researchers at MIT, is remarkably comprehensive. For many nuclear plants, they have detailed construction records, broken out by which building each batch of materials and labor went to and how much each cost. There's also a detailed record of safety regulations and when they were instituted relative to construction. Finally, the researchers brought in the patent applications filed by the companies that designed the reactors; these documents describe the motivations for design changes and the problems those changes were intended to solve.

There are limits to how much even this level of detail can provide. You can’t determine, for example, whether the cost of a specific number of workers on a given building should be assigned to implementing safety regulations. And in many instances, design changes were done for multiple reasons, so there’s not simply a safety/non-safety breakdown. Still, the collection of sources they have allows them to make some very direct conclusions about the sources of changing costs and to build very informed models that can infer the reasons for other costs.

The researchers start with a historic analysis of plant construction in the US. The basic numbers are grim: the typical plant built after 1970 had a cost overrun of 241 percent—and that's not counting the financing costs of the construction delays.

Many in the nuclear industry view this as, at least in part, a failure to standardize designs. There’s an extensive literature about the expectation that building additional plants based on a single design will mean lower costs due to the production of standardized parts, as well as management and worker experience with the construction process. That sort of standardization is also a large part of the motivation behind small, modular nuclear designs, which envision a reactor assembly line that then ships finished products to installations.

But many of the US’ nuclear plants were in fact built around the same design, with obvious site-specific aspects like different foundation needs. The researchers track each of the designs used separately, and they calculate a “learning rate”—the drop in cost that’s associated with each successful completion of a plant based on that design. If things went as expected, the learning rate should be positive, with each sequential plant costing less. Instead, it’s -115 percent.
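A learning rate of this sort is conventionally defined as the fractional drop in unit cost per doubling of cumulative output, estimated by fitting costs to a power law in the number of units built. Here is a minimal sketch of that calculation, using made-up plant costs rather than the paper's data:

```python
import math

def learning_rate(costs):
    """Estimate the learning rate from sequential unit costs.

    Fits log(cost) = a + b*log(n) by least squares over units
    n = 1, 2, ...; the learning rate is 1 - 2**b, i.e. the
    fractional cost drop per doubling of cumulative builds
    (negative if each successive unit costs more).
    """
    xs = [math.log(n) for n in range(1, len(costs) + 1)]
    ys = [math.log(c) for c in costs]
    n = len(costs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return 1 - 2 ** b

# Hypothetical, invented costs for four sequential plants of one
# design: each build costs more than the last, so the "learning"
# rate comes out strongly negative, as in the study.
print(round(learning_rate([1000, 1900, 2700, 3600]), 2))  # -0.89
```

With flat costs the function returns 0, and with costs that fall 20 percent per doubling it returns 0.2; feeding it rising costs, as US nuclear construction produced, yields the kind of negative rate the paper reports.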

Where’s that money go?

Figuring out what’s causing those changes involved diving into detailed accounting records on the construction of these nuclear plants; data on that was available for plants built after 1976. The researchers broke out the cost for 60 different aspects of construction, finding that nearly all of them went up, which suggests there wasn’t likely to be a single, unifying cause for the price increases. But the largest increases occurred in what they termed indirect costs: engineering, purchasing, planning, scheduling, supervision, and other factors not directly associated with the process of building the plant.

The increased indirect costs affected nearly every aspect of plant construction. As far as direct costs went, the biggest contributors were simply the largest structures in the plant, such as the steam supply system, the turbine generator, and the containment building.

Some of the changed costs are rather complicated. For example, many reactors shifted to a design that allowed greater passive cooling, which would make the plant more safe in the case of hardware failure. That in turn required separating the reactor vessel from the containment building walls. And that in turn allowed the use of lower-quality steel (which lowered the price), but more of it (which more than offset those savings). All of this also changed the construction process, although it’s difficult to determine exactly how this altered the amount of labor required.

To dive into the details, the researchers tracked material deployment rates—how quickly material brought to the site ended up being incorporated into a finished structure. While those rates declined slightly for construction as a whole over the study period, they plunged for nuclear projects. By the time of the Three Mile Island accident, steel was already being deployed at about one-third the rate of the construction industry at large. Interviews with construction workers indicated that they were spending as much as 75 percent of their time idle.

Regulation

Since many of the researchers are in the Department of Nuclear Engineering at MIT, they are able to go through and connect the cost changes to specific motivations and check these connections by looking at patents and journal papers that describe the ideas driving these changes.

Some of the driving factors are definitely regulatory. After the Three Mile Island accident, for example, regulators "required increased documentation of safety-compliant construction practices, prompting companies to develop quality assurance programs to manage the correct use and testing of safety-related equipment and nuclear construction material." Putting those programs in place and producing that documentation both added costs to the projects.

But those were far from the only costs. The researchers cite a worker survey indicating that about a quarter of unproductive labor time came from workers waiting for tools or materials to become available. In many other cases, construction procedures were changed in the middle of the build, leading to confusion and delays. Finally, there was the general decrease in performance noted above. All told, problems that reduced construction efficiency contributed nearly 70 percent of the increased costs.

In contrast, R&D-related expenses, which included both regulatory changes and things like the identification of better materials or designs, accounted for the other third of the increases. Often a single change met several R&D goals, so assigning that full third to regulatory changes is probably an overestimate.

So, while safety regulations added to the costs, they were far from the primary factor. And deciding whether they were worthwhile costs would require a detailed analysis of every regulatory change in light of accidents like Three Mile Island and Fukushima.

As for the majority of the cost explosion, the obvious question is whether we can do any better. Here, the researchers’ answer is very much a “maybe.” They consider things like the possibility of using a central facility to produce high-performance concrete parts for the plant, as we have shifted to doing for projects like bridge construction. But this concrete is often more expensive than materials poured on site, meaning the higher efficiency of the off-site production would have to more than offset that difference. The material’s performance in the environment of a nuclear plant hasn’t been tested, so it’s not clear whether it’s even a solution.

In the end, the conclusion is that there are no easy answers to making nuclear plant construction more efficient. And until there are, it will continue to be badly undercut by both renewables and fossil fuels.

Joule, 2020. DOI: 10.1016/j.joule.2020.10.001  (About DOIs).
