The damage of defaults

Apple popped out a new pair of AirPods this week. The design looks exactly like the old pair of AirPods. Which means I’m never going to use them because Apple’s bulbous earbuds don’t fit my ears. Think square peg, round hole.

The only way I could rock AirPods would be to walk around with hands clamped to the sides of my head to stop them from falling out. Which might make a nice cut in a glossy Apple ad for the gizmo — suggesting a feeling of closeness to the music, such that you can’t help but cup; a suggestive visual metaphor for the aural intimacy Apple surely wants its technology to communicate.

But the reality of trying to use earbuds that don’t fit is not that at all. It’s just shit. They fall out at the slightest movement so you either sit and never turn your head or, yes, hold them in with your hands. Oh hai, hands-not-so-free-pods!

The obvious point here is that one size does not fit all — howsoever much Apple’s Jony Ive and his softly spoken design team believe they have devised a universal earbud that pops snugly in every ear and just works. Sorry, nope!

A proportion of iOS users — perhaps other petite women like me, or indeed men with less capacious ear holes — are simply being removed from Apple’s sales equation where earbuds are concerned. Apple is pretending we don’t exist.

Sure we can just buy another brand of more appropriately sized earbuds. The in-ear, noise-canceling kind are my preference. Apple does not make ‘InPods’. But that’s not a huge deal. Well, not yet.

It’s true, the consumer tech giant did also delete the headphone jack from iPhones. Thereby deprecating my existing pair of wired in-ear headphones (if I ever upgrade to a 3.5mm-jack-less iPhone). But I could just shell out for Bluetooth wireless in-ear buds that fit my shell-like ears and carry on as normal.

Universal in-ear headphones have existed for years, of course. A delightful design concept. You get a selection of different sized rubber caps shipped with the product and choose the size that best fits.

Unfortunately Apple isn’t in the ‘InPods’ business though. Possibly for aesthetic reasons. Most likely because — and there’s more than a little irony here — an in-ear design wouldn’t be naturally roomy enough to fit all the stuff Siri needs to, y’know, fake intelligence.

Which means people like me with small ears are being passed over in favor of Apple’s voice assistant. So that’s AI: 1, non-‘standard’-sized human: 0. Which also, unsurprisingly, feels like shit.

I say ‘yet’ because if voice computing does become the next major computing interaction paradigm, as some believe — given how Internet connectivity is set to get baked into everything (and sticking screens everywhere would be a visual and usability nightmare; albeit microphones everywhere is a privacy nightmare… ) — then the minority of humans with petite earholes will be at a disadvantage vs those who can just pop in their smart, sensor-packed earbud and get on with telling their Internet-enabled surroundings to do their bidding.

Will parents of future generations of designer babies select for adequately capacious earholes so their child can pop an AI in? Let’s hope not.

We’re also not at the voice computing singularity yet. Outside the usual tech bubbles it remains a bit of a novel gimmick. Amazon has drummed up some interest with in-home smart speakers housing its own voice AI Alexa (a brand choice that has, incidentally, caused a verbal headache for actual humans called Alexa). Though its Echo smart speakers appear to mostly get used as expensive weather checkers and egg timers. Or else for playing music — a function that a standard speaker or smartphone will happily perform.

Certainly a voice AI is not something you need with you 24/7 yet. Prodding at a touchscreen remains the standard way of tapping into the power and convenience of mobile computing for the majority of consumers in developed markets.

The thing is, though, it still grates to be ignored. To be told — even indirectly — by one of the world’s wealthiest consumer technology companies that it doesn’t believe your ears exist.

Or, well, that it’s weighed up the sales calculations and decided it’s okay to drop a petite-holed minority on the cutting room floor. So that’s ‘ear meet AirPod’. Not ‘AirPod meet ear’ then.

But the underlying issue is much bigger than Apple’s (in my case) oversized earbuds. Its latest shiny set of AirPods is just an ill-fitting reminder of how many technology defaults simply don’t ‘fit’ the world as claimed.

Because if cash-rich Apple’s okay with promoting a universal default (that isn’t), think of all the less well resourced technology firms chasing scale for other single-sized, ill-fitting solutions. And all the problems flowing from attempts to mash ill-mapped technology onto society at large.

When it comes to wrong-sized physical kit I’ve had similar issues with standard office computing equipment and furniture. Products that seem — surprise, surprise! — to have been default designed with a 6ft strapping guy in mind. Keyboards so long they end up gifting the smaller user RSI. Office chairs that deliver chronic back-pain as a service. Chunky mice that quickly wrack the hand with pain. (Apple is a historical offender there too, I’m afraid.)

The fix for such ergonomic design failures is simply not to use the kit. To find a better-sized (often DIY) alternative that does ‘fit’.

But a DIY fix may not be an option when the discrepancy is embedded at the software level — and where a system is being applied to you, rather than you the human wanting to augment yourself with a bit of tech, such as a pair of smart earbuds.

With software, embedded flaws and system design failures may also be harder to spot because it’s not necessarily immediately obvious there’s a problem. Oftentimes algorithmic bias isn’t visible until damage has been done.

And there’s no shortage of stories already about how software defaults configured for a biased median have ended up causing real-world harm. (See for example: ProPublica’s analysis of the COMPAS recidivism tool — software it found incorrectly judging black defendants more likely to reoffend than white ones. So software amplifying existing racial prejudice.)

Of course AI makes this problem so much worse.

Which is why the emphasis must be on catching bias in the datasets — before there is a chance for prejudice or bias to be ‘systematized’ and get baked into algorithms that can do damage at scale.

The algorithms must also be explainable. And outcomes auditable. Transparency as disinfectant; not secret blackboxes stuffed with unknowable code.
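
To make that concrete, here’s a minimal sketch, in Python with pandas, of the kind of outcome audit the paragraph above calls for: comparing a risk score’s false positive rate across groups. The column names and toy data are invented for illustration; this is not ProPublica’s actual dataset or methodology.

```python
import pandas as pd

# Toy data standing in for a risk-scoring tool's output (illustrative only).
df = pd.DataFrame({
    "group":             ["A", "A", "A", "B", "B", "B"],
    "flagged_high_risk": [1,   1,   0,   1,   0,   0],  # model's prediction
    "reoffended":        [1,   0,   0,   1,   1,   0],  # actual outcome
})

# False positive rate per group: flagged high risk despite not reoffending,
# as a share of all non-reoffenders in that group.
non_reoffenders = df[df["reoffended"] == 0]
fpr = non_reoffenders.groupby("group")["flagged_high_risk"].mean()
print(fpr)  # a large gap between groups is exactly the kind of skew to audit for
```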

Doing all this requires huge up-front thought and effort on system design, and an even bigger change of attitude. It also needs massive, massive attention to diversity. An industry-wide championing of humanity’s multifaceted and multi-sized reality — and to making sure that’s reflected in both data and design choices (and therefore the teams doing the design and dev work).

You could say what’s needed is a recognition that there’s never, ever a one-size-fits-all plug.

Indeed, that all algorithmic ‘solutions’ are abstractions that make compromises on accuracy and utility. And that those trade-offs can become viciously cutting knives that exclude, deny, disadvantage, delete and damage people at scale.

Expensive earbuds that won’t stay put are just a handy visual metaphor.

And while discussion about the risks and challenges of algorithmic bias has stepped up in recent years, as AI technologies have proliferated — with mainstream tech conferences actively debating how to “democratize AI” and bake diversity and ethics into system design via a development focus on principles like transparency, explainability, accountability and fairness — the industry has not even begun to fix its diversity problem.

It’s barely moved the needle on diversity. And its products continue to reflect that fundamental flaw.

Many — if not most — of the tech industry’s problems can be traced back to the fact that inadequately diverse teams are chasing scale while lacking the perspective to realize their system design is repurposing human harm as a de facto performance measure. (Although ‘lack of perspective’ is the charitable interpretation in certain cases; moral vacuum may be closer to the mark.)

As WWW creator Sir Tim Berners-Lee has pointed out, system design is now society design. That means engineers, coders, AI technologists are all working at the frontline of ethics. The design choices they make have the potential to impact, influence and shape the lives of millions and even billions of people.

And when you’re designing society a median mindset and limited perspective cannot ever be an acceptable foundation. It’s also a recipe for product failure down the line.

The current backlash against big tech shows that the stakes and the damage are very real when poorly designed technologies get dumped thoughtlessly on people.

Life is messy and complex. People won’t fit a platform that oversimplifies and overlooks. And if your excuse for scaling harm is ‘we just didn’t think of that’ you’ve failed at your job and should really be headed out the door.

Because the consequences for being excluded by flawed system design are also scaling and stepping up as platforms proliferate and more life-impacting decisions get automated. Harm is being squared. Even as the underlying industry drum hasn’t skipped a beat in its prediction that everything will be digitized.

Which means that horribly biased parole systems are just the tip of the ethical iceberg. Think of healthcare, social welfare, law enforcement, education, recruitment, transportation, construction, urban environments, farming, the military — the list of what will be digitized, and of manual or human-overseen processes that will get systematized and automated, goes on.

Software — runs the industry mantra — is eating the world. That means badly designed technology products will harm more and more people.

But responsibility for sociotechnical misfit can’t just be scaled away as so much ‘collateral damage’.

So while an ‘elite’ design team led by a famous white guy might be able to craft a pleasingly curved earbud, such an approach cannot and does not automagically translate into AirPods with perfect, universal fit.

It’s someone’s standard. It’s certainly not mine.

We can posit that a more diverse Apple design team might have been able to rethink the AirPod design so as not to exclude those with smaller ears. Or make a case to convince the powers that be in Cupertino to add another size choice. We can but speculate.

What’s clear is the future of technology design can’t be so stubborn.

It must be radically inclusive and incredibly sensitive. Human-centric. Not locked to damaging defaults in its haste to impose a limited set of ideas.

Above all, it needs a listening ear on the world.

Indifference to difference and a blindspot for diversity will find no future here.



This 22-year-old builds chips in his parents’ garage


Sam Zeloof completed this homemade computer chip with 1,200 transistors, seen under a magnifying glass, in August 2021. (Photo: Sam Kang)

In August, chipmaker Intel revealed new details about its plan to build a “mega-fab” on US soil, a $100 billion factory where 10,000 workers will make a new generation of powerful processors studded with billions of transistors. The same month, 22-year-old Sam Zeloof announced his own semiconductor milestone. It was achieved alone in his family’s New Jersey garage, about 30 miles from where the first transistor was made at Bell Labs in 1947.

With a collection of salvaged and homemade equipment, Zeloof produced a chip with 1,200 transistors. He had sliced up wafers of silicon, patterned them with microscopic designs using ultraviolet light, and dunked them in acid by hand, documenting the process on YouTube and his blog. “Maybe it’s overconfidence, but I have a mentality that another human figured it out, so I can too, even if maybe it takes me longer,” he says.

Zeloof’s chip was his second. He made the first, much smaller one as a high school senior in 2018; he started making individual transistors a year before that. His chips lag Intel’s by technological eons, but Zeloof argues only half-jokingly that he’s making faster progress than the semiconductor industry did in its early days. His second chip has 200 times as many transistors as his first, a growth rate outpacing Moore’s law, the rule of thumb coined by an Intel cofounder that says the number of transistors on a chip doubles roughly every two years.
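
A quick back-of-the-envelope check of that claim, using only the figures above (a 200× jump over the roughly three years between the 2018 chip and the August 2021 one):

```python
import math

growth = 200   # transistor-count ratio between Zeloof's two chips
years = 3      # roughly 2018 to August 2021

doublings = math.log2(growth)                 # ~7.6 doublings
months_per_doubling = years * 12 / doublings  # ~4.7 months
print(f"{months_per_doubling:.1f} months per doubling")

# Moore's law posits a doubling roughly every 24 months, so 200x in three
# years is far ahead of that pace. Albeit from a very small base.
```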

Zeloof now hopes to match the scale of Intel’s breakthrough 4004 chip from 1971, the first commercial microprocessor, which had 2,300 transistors and was used in calculators and other business machines. In December, he started work on an interim circuit design that can perform simple addition.

Zeloof says making it easier to tinker with semiconductors would foster new ideas in tech. (Photo: Sam Kang)

Outside Zeloof’s garage, the pandemic has triggered a global semiconductor shortage, hobbling supplies of products from cars to game consoles. That’s inspired new interest from policymakers in rebuilding the US capacity to produce its own computer chips, after decades of offshoring.

Garage-built chips aren’t about to power your PlayStation, but Zeloof says his unusual hobby has convinced him that society would benefit from chipmaking being more accessible to inventors without multimillion-dollar budgets. “That really high barrier to entry will make you super risk-averse, and that’s bad for innovation,” Zeloof says.

Zeloof started down the path to making his own chips as a high school junior, in 2016. He was impressed by YouTube videos from inventor and entrepreneur Jeri Ellsworth in which she made her own, thumb-sized transistors, in a process that included templates cut from vinyl decals and a bottle of rust stain remover. Zeloof set out to replicate Ellsworth’s project and take what to him seemed a logical next step: going from lone transistors to integrated circuits, a jump that historically took about a decade. “He took it a quantum leap further,” says Ellsworth, now CEO of an augmented-reality startup called Tilt Five. “There’s tremendous value in reminding the world that these industries that seem so far out of reach started somewhere more modest, and you can do that yourself.”

Computer chip fabrication is sometimes described as the world’s most difficult and precise manufacturing process. When Zeloof started blogging about his goals for the project, some industry experts emailed to tell him it was impossible. “The reason for doing it was honestly because I thought it would be funny,” he says. “I wanted to make a statement that we should be more careful when we hear that something’s impossible.”

Zeloof’s family was supportive but also cautious. His father asked a semiconductor engineer he knew to offer some safety advice. “My first reaction was that you couldn’t do it. This is a garage,” says Mark Rothman, who has spent 40 years in chip engineering and now works at a company making technology for OLED screens. Rothman’s initial reaction softened as he saw Zeloof’s progress. “He has done things I would never have thought people could do.”

Zeloof’s project involves history as well as engineering. Modern chip fabrication takes place in facilities whose expensive HVAC systems remove every trace of dust that might trouble their billions of dollars of machinery. Zeloof couldn’t match those techniques, so he read patents and textbooks from the 1960s and ’70s, when engineers at pioneering companies like Fairchild Semiconductor made chips at ordinary workbenches. “They describe methods using X-Acto blades and tape and a few beakers, not ‘We have this $10 million machine the size of a room,’” Zeloof says.

Zeloof had to stock his lab with vintage equipment too. On eBay and other auction sites he found a ready supply of bargain chip gear from the 1970s and ’80s that once belonged to since-shuttered Californian tech companies. Much of the equipment required fixing, but old machines are easier to tinker with than modern lab machinery. One of Zeloof’s best finds was a broken electron microscope that cost $250,000 in the early ’90s; he bought it for $1,000 and repaired it. He uses it to inspect his chips for flaws, as well as the nanostructures on butterfly wings.


Google Labs starts up a blockchain division


Here’s a fun new report from Bloomberg: Google is forming a blockchain division. The news comes hot on the heels of a Bloomberg report from yesterday that quoted Google’s president of commerce as saying, “Crypto is something we pay a lot of attention to.” Web3 is apparently becoming a thing at Google.

Shivakumar Venkataraman, a longtime Googler from the advertising division, is running the blockchain group, which lives under the nascent “Google Labs” division that was started about three months ago under Clay Bavor. Labs is home to “high-potential, long-term projects,” basically making it the new Google X division (X was turned into a less-Google-focused Alphabet division in 2016). Bavor used to be vice president of virtual reality, and Labs contains all of those VR and augmented reality projects, like the “Project Starline” 3D video booth and Google’s AR goggles.

Just like “algorithms,” “AI,” and “5G,” “blockchain” is often used as the go-to buzzword for rudderless tech executives hoping to hype up investors or consumers. A blockchain is really just a distributed, P2P database, sort of like if BitTorrent hosted a database instead of pirated movies and Linux ISOs. The database is chopped up into blocks, and each new block contains a cryptographic hash of the previous block, forming a chain of records that protect each other against alterations. On a traditional database, transactions are verified by the database owner, but on a blockchain, nobody owns the database, so each transaction needs to be verified by many computers. This is the big downside of blockchains: everyone’s constant transaction verifications use a massive amount of electricity and computing power.
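
For the curious, here’s a minimal Python sketch of that hash-chaining idea. It is purely illustrative: a real blockchain adds consensus, mining or staking, and a peer-to-peer network on top, none of which appear here.

```python
import hashlib
import json

def make_block(data, prev_hash):
    # Each block stores its payload plus the hash of the previous block;
    # its own hash is then computed over both, forming the chain of records.
    block = {"data": data, "prev_hash": prev_hash}
    body = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(body).hexdigest()
    return block

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))

# Alter any old record and every later link breaks: the stored prev_hash
# values downstream no longer match the recomputed hashes.
for prev, block in zip(chain, chain[1:]):
    assert block["prev_hash"] == prev["hash"], "chain has been altered!"
```

On a real blockchain that verification is repeated independently by many nodes, which is exactly the redundancy that makes the system both tamper-resistant and, as noted above, power-hungry.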

The decentralized nature of blockchains means nobody can take down your database, which cryptocurrencies like Bitcoin leverage to make a wealth transaction system that no government controls. But it’s not always clear why you would add all the complication and energy usage of a blockchain to your project.

Not much is known about the group, except that it is focused on “blockchain and other next-gen distributed computing and data storage technologies.” Google’s growth into a web giant has made it a pioneer in distributed computing and database development, so maybe it could make some noise in this area as well.


The reviews are in: AMD’s mining-averse RX 6500 XT also isn’t great at gaming


The Sapphire AMD Radeon RX 6500 XT, yet another GPU that you probably won’t be able to buy. (Credit: Sapphire)

When AMD announced its budget-friendly RX 6500 XT graphics card at CES early this month, the company suggested that the product had been designed with limitations that would make it unappealing to the cryptocurrency miners who have been exacerbating the ongoing GPU shortage for over a year now. But now that reviews of the card have started to hit, it’s clear that its gaming performance is the collateral damage of those limitations.

Reviews from Tom’s Hardware, PCGamer, TechSpot, Gamers Nexus, and a litany of other PC gaming YouTube channels are unanimous: The RX 6500 XT is frequently outperformed by previous-generation graphics cards, and it comes with other caveats beyond performance that limit its appeal even further. (Ars hasn’t been provided with a review unit.)

The core of the problem is a 64-bit memory interface that limits the amount of memory bandwidth the card has to work with. Plus, the card has only 4GB of RAM, which is beginning to be a limiting factor in modern games, especially at resolutions above 1080p. Many tests saw the RX 6500 XT outperformed by the 8GB variant of the RX 5500 XT, which launched at the tail end of 2019 for the same $199 (and you could actually find and buy it for that price).
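
The arithmetic behind that complaint is straightforward: peak memory bandwidth is the bus width in bytes multiplied by the per-pin data rate. A rough sketch using the two cards’ commonly reported GDDR6 speeds (18 Gbps and 14 Gbps, figures not stated in this article, so treat them as assumptions):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth = (bus width / 8 bits per byte) * per-pin data rate.
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(64, 18))   # RX 6500 XT:  ~144 GB/s
print(peak_bandwidth_gb_s(128, 14))  # RX 5500 XT:  ~224 GB/s
```

So despite faster memory, the narrower bus would leave the newer card with roughly a third less raw bandwidth than its 2019 predecessor, which lines up with the reviewers’ findings.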
