
‘The Division 2’ is the brain-dead, antipolitical, gun-mongering vigilante simulator we deserve


In The Division 2, the answer to every question is a bullet. That’s not unique in the pervasively violent world of gaming, but in an environment drawn from life and richly decorated with plausible human cost and cruelty, it seems a shame; and in a real world where plentiful assault rifles and government hit squads are the problems, not the solutions, this particular power fantasy feels backwards and cowardly.

Ubisoft’s meticulous avoidance of the real world except for physical likeness was meant to maximize its market and avoid the type of “controversy” that brings furious tweets and ineffectual boycotts down on media that dare to make statements. But the result is a game that panders to “good guy with a gun” advocates, NRA members, everyday carry die-hards, and those who dream of spilling the blood of unsavory interlopers and false patriots upon this great country’s soil.

There are two caveats: that we shouldn’t have expected anything else, from Ubisoft or anyone; and that it’s a pretty good game if you ignore all that stuff. But that’s getting harder to accept every day, and the excuses available to game studios are getting fewer. (Some spoilers ahead, but trust me, it doesn’t matter.)

To put us all on the same page: The Division 2 (properly Tom Clancy’s The Division 2, which just about sums it up right there) is the latest “game as a service” to hit the block, aspiring less toward the bubblegum ubiquity of Fortnite than toward the endless grind of a Destiny 2 or Diablo 3. The less said about Anthem, the better (except Jason Schreier’s great report, of course).

From the bestselling author of literally a hundred other books…

It’s published by Ubisoft, a global gaming power known for creating expansive gaming worlds (like the astonishingly beautiful Assassin’s Creed: Odyssey) with bafflingly uneven gameplay and writing (like the astonishingly lopsided Assassin’s Creed: Odyssey).

So it was perhaps to be expected that The Division 2 would be heavy on atmosphere and light on subtlety. But I didn’t expect to see the President snatch a machine gun from his captors and mow them down — then tell your character that sometimes you can’t do what’s popular, you have to do what’s necessary.

It would be too much even if the game were a parody and not, as it in fact is, deeply and strangely earnest. But I’m getting ahead of myself.

EDC Simulator 2

The game is set in Washington, D.C.; its predecessor was in New York. Both were, like most U.S. cities in this fictitious near future, devastated by a biological attack on Black Friday that used money as a vector for a lethal virus. That’s a great idea, perhaps not practical (who pays in cash?), but a clever mashup of terrorist plots with consumerism. (The writing in the first Division was considerably better than this one.)

Your character is part of a group of sleeper agents seeded throughout the country, intended to activate in the event of a national emergency: surviving and operating alone or with a handful of others, procuring equipment and supplies on the go, taking out the bad guys, and saving the remaining civilians while authority reasserts itself.

You can see how this sets up a great game: exploring the ruins of a major city, rooting out villains, and upgrading your gear as you work your way up the ladder.

And in a way it does make a great game. If you consider the bad guys just types of human-shaped monsters, your various guns and equipment the equivalent of new swords and wands, breastplates and greaves, with your drones and tactical launchers modern spells and auras, it’s really quite a lot like Diablo, the progenitor of the “looter” genre.

Moment-to-moment gameplay has you hiding behind cover, popping out to snap off a few shots at the bad guys, who are usually doing the same thing 10 or 20 yards away, but generally not as well as you. Move on to the next room or intersection, do it again with some more guys, rinse and repeat. It sounds monotonous, and it is, but so is baseball. People like it anyway. (I’d like to give a shout-out to the simple, effective multiplayer that let me join a friend in seconds.)

But the problem with The Division 2 isn’t its gameplay, although I could waste your time (instead) with some nitpicking of the damage systems, the mobs, the inventory screen, and so on. The problem with The Division 2 isn’t even that it venerates guns. Practically every game venerates guns, because as Tim Rogers memorably paraphrased CliffyB once: “games are power fantasies — and it’s easy to make power fantasies, because guns are so powerful, and raycasting is simple, and raycasting is like a gun.” It’s difficult to avoid.
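
To make the raycasting point concrete: a hitscan weapon is little more than a line-intersection test. Below is a minimal, hypothetical Python sketch (all the names and numbers here are mine, nothing comes from the game or from Rogers) of how little code “a gun” actually requires:

```python
import math

def raycast_hit(origin, direction, target_center, target_radius):
    """Return True if a ray from `origin` along `direction` passes
    within `target_radius` of `target_center` (a bare-bones 2D hitscan)."""
    ox, oy = origin
    dx, dy = direction
    length = math.hypot(dx, dy)
    dx, dy = dx / length, dy / length  # normalize the aim direction
    # Project the shooter-to-target vector onto the ray.
    t = (target_center[0] - ox) * dx + (target_center[1] - oy) * dy
    if t < 0:
        return False  # the target is behind the shooter
    # Closest point on the ray to the target's center.
    cx, cy = ox + t * dx, oy + t * dy
    return math.hypot(target_center[0] - cx, target_center[1] - cy) <= target_radius

# One "shot" fired due east from the origin at a man-sized target 20 yards out:
print(raycast_hit((0.0, 0.0), (1.0, 0.0), (20.0, 0.3), 0.5))  # True: a hit
```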

No, the problem with The Division 2 is the breathtaking incongruity between the powerfully visualized human tragedy your character inhabits and the refusal to engage even in an elementary way with the themes to which it is inherently tied: terrorism, guns, government and anti-government forces, and everything else. It’s exploitative, cynical, and absurd.

The Washington, D.C. of the game is a truly amazing setting. Painstakingly detailed block by block and containing many of the area’s most notable landmarks, it’s a fascinating game world to explore, even more so, I imagine, if you are from there or are otherwise familiar with the city.

The marks of a civilization-ending disaster are everywhere. Abandoned cars and security posts with vines and grass creeping up between them, broken and boarded up windows and doors, left luggage and improvised camping spots. Real places form the basis for thrilling setpiece shootouts: museums, famous offices, the White House itself (which you find under limp siege in the first mission). This is a fantasy very much based in reality — but only on the surface. In fact all this incredibly detailed scenery is nothing more than cover for shootouts.

I can’t tell you how many times my friend and I traversed intricately detailed monuments, halls, and other environments, marveling at the realism with which they were decorated (though perhaps there were a few too many gas cans), remarking to one another: “Damn, this place is insane. I can’t believe they made it this detailed just to have us do the same exact combat encounter as the entire rest of the game. How come nobody is talking about the history of this place, or the bodies, or the culture here?”

When fantasy isn’t

Now, to be clear, I don’t expect Ubisoft to make a game where you learn facts about helicopters while you shoot your way through the Air and Space Museum, or where you engage in philosophical conversation with the head of a band of marauders rather than lob grenades and corrosive goo in their general direction. (I kind of like both those ideas, though.)

But the dedication with which the company has avoided any kind of reality whatsoever is troubling.

We live in a time when people are taking what they call justice into their own hands by shooting others with weapons intended for warfare; when paramilitary groups are defending their strongholds with deadly force; when biological agents are being deployed against citizenry; when governments are surveilling and tracking people via controversial AI systems; when the leaders of that government are making unpopular and ethically fraught decisions without the knowledge of their constituency.

Ultimate EDC simulator

This game enthusiastically endorses all of the previous ideas with the naive justification that you’re the good guys. Of course you’re the good guys — everyone claims they’re the good guys! But objectively speaking, you’re a secret government hit squad killing whoever you’re told to, primarily other citizens. Ironically, despite being called an agent, you have no agency — you are a walking gun doing the bidding of a government that has almost entirely dissolved. What could possibly go wrong? The Division 2 certainly makes no effort to explore this.

The superficiality of the story I could excuse if it didn’t rely so strongly on using the real world as set dressing for its paramilitary dress-up-doll fantasy.

Basing your game in a real-world location is, I think, a fabulous idea. But in doing so, especially if, as part of the process, you imply the death of millions, a developer incurs a responsibility to do more than use that location as level geometry.

The Division 2 instead uses these deaths and the most important places in D.C. literally as props. Nothing you do ever has anything to do with what the place is except in the loosest way. While you visit morgues and improvised mass graves piled with body bags, you never see anyone dead or dying… unless you kill them.

It’s hard to explain what I find so distasteful about this. It’s the combination of the obvious emphasis on the death of innocents, a brute-force attempt to create emotional and political relevance, with the utterly vacuous violence you fill that world with. Setting a piece of media so incredibly dumb and mute in a disaster so credible and relevant feels disrespectful to the setting and to the work itself.

This was a deliberate decision, to rob the game of any relevance — a marketing decision. To destroy D.C. — that sells. To write a story or design gameplay that in any way reflects why that destruction resonates — that doesn’t sell. “We cannot be openly political in our games,” said Alf Condelius, the COO of the studio that created the game, in a talk before the game’s release. Doing so, he said, would be “bad for business, unfortunately, if you want the honest truth.” I can’t be the only one who feels this to be a cop-out of pretty grand proportions, with the truth riding on its coattails.

Perhaps you think I’m holding game developers to an unreasonable standard. But I believe they are refusing to raise the bar themselves when they easily could and should. The level of detail in the world is amazing, and it was clearly designed by people who understand what could happen should disaster strike. The bodies piled in labs, the desolation of a city overtaken by nature, the perfect reproductions of landmarks — an enormous amount of effort and money was put into this part of the game.

On the other hand, it’s incredibly obvious from the get-go that very, very little attention was paid to the story and characters, the dialogue, the actual choices you can make as a player (there are none to speak of). There is no way to interact with people except to shoot them, or for them to tell you who to shoot. There is no mention of politics, of parties, of race or religion. I feel sure more time was spent modeling the guns — which, by the way, are real licensed models — than the main “characters,” though it must have been time-consuming to so completely purge those characters of any ideas or opinions that could possibly reflect the real world.

One tragedy please, hold the relevance

This is deliberate. There’s no way this could have happened unless Ubisoft, from the start, made it clear that the game was to be divorced from the real world in every way except those that were deemed marketable.

That this is what they consider marketable is a sad sort of indictment of the people they are selling this game to. The prospect of inserting oneself into a sort of justified vigilante role where you rain hot righteous lead on these generic villains trampling our great flag seems to be a special catnip concoction Ubisoft thought would appeal to millions — millions who (or more importantly, whose wallets) might be chilled by the idea of a story that actually takes on the societal issues that would be at play in a disaster like this one. We got the game we deserved, I suppose.

Say what you will about the narrative quality of the campaigns of Call of Duty and Battlefield, but they at least attempt to engage with the content they are exploiting to sell the game. World War II is marketable because it’s the worst thing that ever happened and destroyed the lives of millions in a violent and dramatic way. Imagine building a photorealistic reproduction of wartime Stalingrad, or Paris, or Berlin, and then filling it not with Axis and Allied forces but simplified and palatable goodies and baddies with no particular ethos or history.

I certainly don’t mean to equate the theoretical destruction of D.C. with the Holocaust and WWII, but as perhaps the most popular period and venue for shooters like this, it’s the obvious comparison to make thematically, and what one finds is that however poor the story of a given WWII game, it inevitably attempts to grapple with the enormity of the events you are experiencing. That’s the kind of responsibility I think you take on when you illustrate your game with the real world — even a fantasy version of the real world.

Furthermore, Ubisoft has accepted that it must take some political stances, such as the inclusion of same-sex player-NPC relationships in Assassin’s Creed: Odyssey — not controversial to me and many others, certainly, but hardly an apolitical inclusion in the present global political landscape. (I applaud them for this, by the way, and many others have as well.) It’s arguable this is not “overt” in that Kassandra and Alexios don’t break the fourth wall to advocate for marriage equality, but I think it is deliberately and unapologetically espousing a stance on a politically and societally charged issue.

It seems it is not that the company cannot be overtly political, but that it decided in this case that to be political on issues of guns, the military, terrorism, and so on was too much of a risk. To me that is in itself a political choice.

I do think Ubisoft is a fantastic company and makes wonderful games — but I also think the decision to completely divorce a game with fundamentally political underpinnings from the real politics and humanitarian conditions that empower it is a sad and spineless decision that makes them look both avaricious and inhumane. I know they can do better because others already have and do.

The Division 2 is a good game as far as games go. But games, like movies, TV, and other media, are very much art now, deserving of criticism as to their ideas as well as their controls and graphics; and as art, The Division 2 is as much a barren wasteland scoured of humanity as the D.C. it depicts.


Animaniacs 2020: Just sit back and relax, it’s nostalgia to the max


Come join the Warner Brothers (and the Warner Sister, Dot).

The Warner Brothers—and the Warner Sister—are back, thanks to Hulu. The streamer has rebooted the Emmy-winning, enormously popular Animaniacs, stalwart of 1990s afternoons, for a new generation and a new era.

Animaniacs first hit the small screen in 1993, part of a cohort of cartoons that tried to reach young audiences in a whole new way. At the highest level, Animaniacs was an animated variety show, with the main plot, such as it was, centered on Yakko, Wakko, and Dot Warner, animated creations from the 1930s who spent most of the 20th century locked up in a water tower until their escape in the 1990s. The show’s artistic DNA seemed to be equal parts Looney Tunes and Laugh-In, with a Dadaist streak and a heavy dose of Mel Brooks-style parody woven through.

Animaniacs was, in the end, a pretty weird show, equal parts absurdist and educational. And that suited me perfectly because I was, frankly, a pretty weird kid.

I was in middle school when Animaniacs began airing, right on the cusp of an exceptionally awkward and uncomfortable adolescence. I was the only child of two classical musicians, one of whom was also a politics junkie and total history buff. I could tell you anything you’d like to know about the Hollywood studio system, the music of Georg Friedrich Handel, or the rise and fall of the Soviet bloc, but I couldn’t have named more than two of 1993’s Top 40 songs if you’d paid me.

And along came Animaniacs, a kids’ show that didn’t talk down to me. It felt, at the time, as though it had infinite layers. Not only did it deliver a daily dose of slapstick (and how), but it also had educational songs that actually stuck, wrapped in layers of slyly referential humor that rewarded you for paying attention—and for being able to get the references. Suddenly all that absolutely useless knowledge in my head about 1930s and 1940s Hollywood was useful. In a show carefully designed for the kids and the adults to laugh in completely different places, I was able to laugh in both, and Animaniacs seemed to relish giving me the opportunity.

But as Yakko, Wakko, and Dot themselves are the first to tell you during the pilot episode of the reboot—in song, of course—the world has changed quite a lot in the past 27 years. Nostalgia is cheap and easy; the adults who were once ’70s, ’80s, and ’90s kids are not above lapping it up at any opportunity. But a reference by itself is neither zany nor amusing, and today’s children have a decidedly different media diet and bar for humor than we did. So with constantly connected computers in our pockets, bringing us the latest and greatest in short-form humor on demand at all times, is there still room in our century for Animaniacs to be, well, funny?

I watched the first four episodes this weekend with my 7-year-old (more about her opinions in a bit) to find out. The answer, luckily, is yes—but it takes some time to get there. Much like Good Idea, Bad Idea, you need to see the positives and the negatives juxtaposed in order to get the most out of what you’re watching.

The nostalgia is the joke

Animaniacs was always a deliberately self-aware show that existed to break the fourth wall and frolic in a meta-referential field. That was one of its charms. The new show, however, lays that on so thick in the first couple of episodes that the charm wears off. Fast.

The word that appears most often in my notes is lampshading: a trope wherein a creator specifically calls out some ludicrous thing they’re doing (i.e., hanging a lampshade on it) to make sure you know they know that you know. Animaniacs is very thorough with its lampshades: there’s a whole song-and-dance number (literally) in the first segment about how this isn’t the 1990s anymore, in which the Warners explain that their job is pop culture and that pop culture has changed.

Unfortunately, as the Warners also explain, the episode was written in 2018. Traditional TV takes time, even when you design it for a streaming service, and time is not on the Warners’ side when it comes to topicality. In 2020, a trend can hit TikTok at breakfast, be all over Twitter by lunchtime, and be yesterday’s news by dinner, and TV just can’t move that fast.

As a result, Animaniacs‘ jokes about Donald Trump feel deeply passé when the show is thinking about the covfefe era and we are now into the lame-duck period, and a Game of Thrones reference landed with an entire thud. Other attempts to stay topical feel almost ghoulish against the 2020 we ended up having: a segment riffing on the Olympics, for example, serves only to remind us that we cancelled the Olympics this year because of a widespread pandemic.

“Bloompf”

When the show leans into its worst impulses, it seems to become all form and forget its function. A Red Scare setup twice removed—told by a generation of adults who heard about it from their grandparents, rather than by a generation of adults who lived through it—feels pointless. Riffing on an idea from the ’50s by way of riffing on the ’90s becomes such a tangled nest of referents that it’s more dull than entertaining, and it borders on the offensive when it leans on ancient, dried-out stereotypes to do so.

But when Animaniacs widens its scope just the tiniest bit more, it works. Gloriously. Where the show finds its feet is not in rehashing everything the Warners already did nearly 30 years ago but, instead, in discovering what the Warners can do now.

The first bit that made me genuinely laugh aloud—a real, hearty laugh—came at the tail end of episode 4, when the Warners don black turtlenecks to advertise a new ultrashort-form video service, “Bloompf.” It is a parody for today broadly, not for a single frozen moment in our fleeting decade, and it’s delivered with impeccable timing and a keen understanding of what, in our current reality, is best lampooned.

To prove their mousey worth, they’ll overthrow the Earth…

I, like many others, was a big fan of the “Pinky and the Brain” segments in the original. The genetically altered, megalomaniac lab mice were so popular that they earned their own spinoff show, which ran from 1995 to 1998. To this day, I can still sing every word of the theme song from their spinoff (which is two verses longer than the version in Animaniacs shorts). I particularly enjoy answering, “I think so, Brain, but how are we going to make pencils that taste like bacon?” in a deliberately atrocious mockery of Pinky’s already-atrocious accent whenever my 7-year-old asks if I know what she’s thinking.

I mention all this to explain how much it pains me to admit that Animaniacs‘ 2020 edition has, in fact, given us far, far too much of a good thing. This is not to say that Pinky and the Brain are unwelcome or that they have run out of creative ideas for taking over the Earth. But the way they feature in every episode, unsparingly, lends a sense of dry, formulaic necessity to their presence.

The opening credits of the whole show, both old and new, ask us to “Meet Pinky and the Brain, who want to rule the universe,” and that’s all well and good. But in lieu of “Goodfeathers flock together / Slappy whacks ’em with her purse,” the rebooted opening credits instead give us the line “our brand-new cast who tested well / in focus group research.” There’s a joke there, but not a personality, and it shows.

Animaniacs is at heart a variety show—which I didn’t fully appreciate until the reboot brought with it a stunning lack of variety. The three-act cartoon format always was pleasantly modular, allowing the show’s creators to put together an episode from the palette of many different recurring characters. I liked Slappy Squirrel, felt largely indifferent to the Goodfeathers, and actively disliked Rita and Runt, Elmyra, and Mindy and Buttons (poor Buttons!)—but their presence was, I think, necessary in a way. I hope that season 2 broadens the show’s scope, the same way season 1 broadens the Warners’.

“This is a kids’ show!”

Animaniacs has always been loaded with double entendre and grown-up-friendly humor. That’s true twice over for the remake, which is counting on its original audience to have returned as the adults in the room. (*raises hand* Present.) But as Yakko, Wakko, or Dot tends to remind us after every sly wink at standards and practices, Animaniacs is, in theory, children’s programming. My opinion, therefore, is only half of what matters. Do actual children, who were not alive in the 1990s, enjoy the show?

Well, my kids do, at any rate. My second-grader seems to have found a kindred soul in Pinky, and I half expect her to start saying “narf!” around the house. And we do, indeed, laugh in completely different places, as the cartoon gods intended. For example, I cackled when a sports announcer said, “High jump: now legal in 12 states!” and she laughed when the character attempting the jump face-planted ingloriously. A perfect division of cartoon labor.

That said, your modern kid is also quite likely to have a YouTube-sized attention span. The traditional-length show segments all felt slightly too long for her, especially when they got mired in stories and jokes she doesn’t quite understand. There is a dearth of the catchy, two-minute song segments for which Animaniacs became famous in the first place—which is a pity, because I think she would both enjoy them and learn from them. (As, for that matter, would I.)

A victim of the binge

Which leads to the real feeling I took away from the show. In the end, you have to take Animaniacs for what it is: an artifact of the 1990s, trying to find its way in a world where we might by now be immune to zany thanks to incessant daily exposure. And the most important feature of broadcast programming in the 1990s was: you couldn’t binge it. You had to watch new episodes when they were delivered to you.

Animaniacs is not a great show to binge-watch, but it has the seeds of a great show in it. Go into the reboot with the spirit of the original, then, and take the episodes one or two at a time. They’ll seem better for it, and you’ll get to spread the laughs out longer.


Review: Synchronic is a time-bending slow burn of a sci-fi thriller


Anthony Mackie and Jamie Dornan star as New Orleans paramedics who encounter a series of bizarre, gruesome accidents in the sci-fi thriller Synchronic.

Chances are you missed Synchronic, the latest sci-fi film written and directed by indie filmmakers Justin Benson and Aaron Moorhead, when it was released in limited theaters and drive-ins last month. Not only were many theaters still shut down because of the pandemic, but the filmmakers themselves made the unusual move of warning potential viewers (via Instagram) of the health risks associated with indoor movie theaters. (“We personally wouldn’t go to an indoor theater, so we can’t encourage you to,” they wrote.)

It was admirably responsible of them, but it did severely limit the audience, especially since the film’s distributor inexplicably opted not to release it simultaneously on VOD—now a common practice in these pandemic times. And that’s a shame, because Synchronic is a smart, inventive, thought-provoking film, featuring standout performances from co-stars Anthony Mackie and Jamie Dornan.

(Mostly mild spoilers below, with a couple of significant plot twists below the gallery. We’ll give you a heads up when we get there.)

Benson and Moorhead are well-known around the film festival circuit, co-directing the 2017 sci-fi cult hit, The Endless, as well as 2014’s Spring (which made a splash at the Toronto International Film Festival that year) and 2012’s Resolution (which takes place in the same fictional universe as The Endless). Over coffee one day, they came up with the idea for Synchronic. “It was brand-new, completely insane, and made an odd sort of real-world sense,” the directors have said, where the past would be the main antagonist—a very different kind of movie monster. “We could also express how we tend to always be looking forward or backward for happiness, rather than right here in the moment.”

Per the official premise:

When New Orleans paramedics and longtime best friends Steve (Anthony Mackie) and Dennis (Jamie Dornan) are called to a series of bizarre, gruesome accidents, they chalk it up to the mysterious new party drug found at the scene. But after Dennis’s oldest daughter suddenly disappears, Steve stumbles upon a terrifying truth about the supposed psychedelic that will challenge everything he knows about reality—and the flow of time itself.

The film opens with the duo responding to a call concerning a couple in a motel. The couple’s drug-induced hallucinations resulted in the male partner somehow plunging several floors down the elevator shaft, while the woman is in shock and unresponsive, staring in horror at something only she can see. She also has a mysterious snake bite. Steve and Dennis also respond to a call involving a burned body in an amusement park and a drug user who has been stabbed by a vintage sword.

The common factor in all these bizarre calls is a new designer drug called “synchronic.” We learn that it’s similar to DMT (the hallucinogen in ayahuasca), with a molecular structure just sufficiently different for it to be technically legal. But this particular batch was rushed to market amid rumors of a pending FDA crackdown and has some pretty severe side effects. Steve manages to buy up the remaining supply at the local Big Chief smoke shop to get the drug off the local market, but not before Dennis’ 18-year-old daughter, Brianna (Ally Ioannides), goes missing after attending a fraternity party that left one young man dead.

In the midst of all this, we learn the results of Steve’s recent MRI, and the news is not good. He’s got an inoperable brain tumor on or near the pineal gland, a tiny pea-shaped region near the center of the brain that secretes the hormone melatonin, which is tied to sleep/wake cycles, among other functions. (Fun fact: the 17th-century philosopher Rene Descartes believed the pineal gland was the seat of the soul.) That turns out to be significant, since synchronic messes with the pineal gland—hence its unusual effects with regard to how users experience time.

(WARNING: a couple of significant spoilers below.)

Both Benson and Moorhead describe themselves as “armchair enthusiasts of astrophysics, philosophy, and futurism,” among other interests, and they particularly liked the idea of a designer drug that causes people to experience past, present, and future simultaneously (or all jumbled up), rather than in a neat linear progression. When Steve encounters the chemist who created synchronic, the chemist draws an analogy to a vinyl record: you drop the needle on whatever track you wish to play, but all those other tracks are still always there. “These tracks are like time, and synchronic is the needle,” the chemist explains. Steve is also something of an armchair physicist, citing a letter by Albert Einstein to a friend whose wife had died: “the distinction between past, present, and future is only a stubbornly persistent illusion.” Synchronic temporarily shatters that temporal illusion.

But there’s a twist: it’s not just one’s perception of the flow of time that is affected. The drug actually makes you physically experience different time periods. And if that happens to involve a Spanish conquistador attacking you because you just appeared out of nowhere in a swamp, you will suffer a very real death if he succeeds in skewering you with his sword. And teenagers whose pineal glands haven’t yet calcified can actually travel to another time period and get stuck there, which Steve realizes is what has happened to Brianna.

Because of his cancer, Steve has the uncalcified pineal gland of a teenager rather than an adult. So he thinks he can rescue Brianna with his limited supply of the remaining synchronic. One of my favorite elements of the film is how Steve conducts a series of videotaped experiments, gradually figuring out the “rules” at play. For instance, where you are standing turns out to determine which time period you end up in (for some reason, it’s always the past, never the future), and you have to return within a short window of time.

But Steve screws up when he decides to take his trusty doggo, Hawking, back in time with him for one experimental run. Suffice to say that dogs will be dogs, and Hawking doesn’t make it back to the present in time. Instead, Steve gets one final glimpse of Hawking whimpering sadly at his beloved master before the vision fades. And Steve only has enough synchronic left to either rescue Hawking or rescue Brianna. He makes the right call (Brianna), but that doesn’t make Hawking’s fate any less heartbreaking.

It’s the most upsetting scene in the film; I’m still kinda mad at Steve for risking Hawking instead of running that experiment with an animal that was not a beloved loyal canine companion. Yet there’s no denying its power. That moment is permanently seared onto my brain, and it’s critical in terms of raising the emotional stakes. So objectively, I have to applaud Benson and Moorhead for not blinking on that score. I can always console myself by imagining Hawking being befriended by a young boy, and they go on to share all kinds of fun adventures. Hawking still gets to live his best life, albeit in a distant past.

The entire structure of this movie is admirably tight, as Benson and Moorhead continuously add extra constraints to further heighten the tension and build genuine suspense. Synchronic is a smoldering slow burn that pays off with a surprisingly moving, bittersweet conclusion. But I still maintain that Hawking was a Very Good Boi who deserved better.

Synchronic should be coming to VOD in the next few months.

Listing image by Well Go USA


Xbox Series X/S vs. PlayStation 5: Our launch-month verdict


L-R: Xbox Series X, PlayStation 5, Xbox Series S. (Image: Sam Machkovech)

Though this year’s newest consoles have been on store shelves for less than two weeks, we’ve already published tens of thousands of words about the Xbox Series X/S and the PlayStation 5. Between months of tech previews, picture-filled unboxings, comprehensive reviews, coverage of some of the biggest launch games, and more, you could spend all day doing nothing but reading our detailed thoughts about Sony and Microsoft’s new consoles.
If you don’t have the time for all that, we understand. That’s why we’ve put together this handy, head-to-head summary comparing the most important features of both systems directly. By the end, we hope you’ll know if it’s time for you to upgrade your console, and which path you should take if it is.

Hardware design

Both the PS5 ($499 with disc drive, $399 without) and the Xbox Series X ($499) are really big. The Series X is a chunky, minimalist cuboid whose smallest dimension is roughly 6 inches, making it a nightmare for an average entertainment center’s shelves. The PS5 gets its narrowest dimension down to 4.25 inches, but that comes at the cost of being roughly 50 percent bigger than the Series X in total volume. Once you find a place to put either, the other differences boil down to your aesthetic preferences: black monolith with mild green accents, or a curvy popped-collar tower?
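
For a sense of scale, a quick back-of-the-envelope check bears out that volume claim. This sketch uses the manufacturers’ published spec-sheet dimensions, which are my numbers, not figures quoted in this article:

```python
# Bounding-box volumes in cubic centimeters, from published spec-sheet
# dimensions (approximate; these figures are mine, not the article's).
series_x = 15.1 * 15.1 * 30.1   # ~6,863 cm^3, about 6.9 liters
ps5_disc = 39.0 * 10.4 * 26.0   # ~10,546 cm^3, about 10.5 liters

print(ps5_disc / series_x)      # ~1.54: roughly 50 percent more volume
```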

Both are quiet (excepting discs spinning in their disc drives), but the PS5 has a slight but discernible fan noise, whereas the Xbox Series X is genuinely whisper-quiet. While we’ve seen reports about “coil whine” affecting certain PS5 customers, we haven’t been able to duplicate that noise issue.

While those two consoles’ cooling systems are not identical, their silicon makeup is similar enough to explain why they draw very similar amounts of power. Each maxes out at roughly 205W in next-gen games, though both run closer to 190W on average.

Xbox Series S ($299), meanwhile, is as quiet as its Series X sibling (owing to, among other things, a fan system identical to the Series X’s), while shrinking to a form factor on par with 2017’s Xbox One X. The longer we’ve sat with it, the more we’ve grown to like its “Bluetooth speaker” design of a black ring on an otherwise white box—especially as slotted into a crowded entertainment center. Its power draw is also phenomenal, never exceeding 90W in the console’s highest-drawing games.

Hardware power

Put aside all the talk of RDNA 2 compute units, Zen 2 CPU cores, and the like. When it comes to running actual games, the Xbox Series X and the PS5 are practically indistinguishable. Third-party titles available on both systems look and run almost identically, and you’d be hard-pressed to pick one from the other in blind tests.

Series X power usage
Rest mode: 16-30.5W
Rest mode (w/ download): 33-55W
Idle on menu: 62W
Netflix: 64W
Playing 4K Blu-ray: 64-76W
Gameplay (Spelunky, X360): 101-104W
Gameplay (Gears 5, XSX): 170-198W
Installing Dark Souls II from disc: 70-71.5W
Playing Dark Souls II (w/ disc in drive): 103-105W

Series S power usage
Rest mode: 8.6-17.5W
Rest mode (w/ download): 16-18W
Idle on menu: 31W
Netflix: 40W
Playing 4K Blu-ray: n/a
Gameplay (Spelunky, X360): 53W
Gameplay (Gears 5, XSS): 50-85W
Installing Dark Souls II from disc: n/a

Only one title proves an exception at this point: Assassin’s Creed Valhalla. While both PS5 and Xbox Series X target identical graphical settings and not-quite-4K resolutions (and they look good doing so), its Series X version currently struggles to lock to 60fps as well as PlayStation 5 does. That’s not enough data to declare PS5 the “stronger” console, and we’ll be coming back to that question as we compare more third-party games in the coming months.

Compared to their predecessors, games on the new consoles do look better, taking advantage of higher resolutions and graphical techniques like ray tracing (which is especially noticeable in reflections). But depending on the game, the increase in fidelity is more marginal than you might expect for a $500 upgrade. The seven-year-old hardware Sony and Microsoft are looking to replace has aged better than you might have expected, and the mid-generation upgrades that came out in 2016 and 2017 continue to hold up quite well.

Where you’ll see a huge jump in 2020’s consoles is in frame rates. Games like Yakuza: Like A Dragon, Spider-Man: Miles Morales, and Assassin’s Creed Valhalla look quite similar when comparing screenshots across “last-gen” and “next-gen” systems. But the bump from 30 fps on older consoles to 60 fps on newer consoles makes a huge difference when seeing these games in motion.

PS5 power usage
Rest mode: 28-32W
Rest mode (w/ download): 42-45W
Idle on menu: 67W
Netflix: 71-73W
Playing 4K Blu-ray: 76-79W
Gameplay (Downwell, PS4): 70-76W
Gameplay (Tony Hawk 1+2, PS4): 116-130W
Gameplay (Miles Morales, PS5): 156-205W
Installing Knack from disc: 124-134W
Playing Knack (w/ disc in drive): 116-127W

In the case of some games, like the PS5-exclusive adventure of Demon’s Souls, that extra 60 fps fluidity contributes to atmosphere in incredible ways. But even that game is mechanically identical to its source material, which dates back to PS3. And another Sony exclusive, the surprisingly charming Sackboy: A Big Adventure, is so similar between its PS4 and PS5 versions that we recommend anyone missing out on new consoles rush to play that family-friendly game on their last-gen machines.

All of the new consoles enjoy blisteringly fast loading times, thanks to the now-standard PCIe 4.0-rated NVMe storage. It’s not quite a return to the “hit power and start playing instantly” days of cartridge gaming, but it’s close.

The PS5 appears to have the loading-time edge in some cases (like the aforementioned Assassin’s Creed Valhalla), but the differences across next-gen consoles are minor at this point. Meanwhile, the Xbox Series consoles enjoy the benefits of Xbox Quick Resume, allowing near-instant swapping from game to game. On the PS5, you have to endure a (quick) load from the main menu when swapping to a new title, rather than resuming directly from where you left off.

As of press time, though, some Series X/S games choke on this Quick Resume feature. We hope Xbox fixes these edge cases soon, because even with faster storage, PS5 feels sluggish in comparison without its own version of Quick Resume.

One important note: Xbox Series S has been advertised as able to play Series X’s up-to-4K games, only pared down to resolutions ranging from 1080p to 1440p. In action, that sales pitch is mildly misleading, as visual downgrades from X to S also include reductions in shadow resolution, level-of-detail (LoD) scaling, ambient occlusion, and other features, depending on the game. For the most part, we’ve seen identical frame rates between Series X and Series S, which is arguably a bigger deal, but Assassin’s Creed Valhalla remains a striking exception: only 30fps on Series S, compared to 60fps on Series X. Until we compare more next-gen Series X/S games, this issue remains a huge asterisk for the $299 Series S.
