There was a day when computers and other devices connected happily to accessories, and ease of connection mattered. It appears that those days are gone. When Apple removed the headphone jack from its newest iPhone, leaving only a Lightning connector, and pared its newest MacBook down to nothing but USB-C, it heralded the beginning of the end for something we used to take for granted: ports.
Apple isn't the only vendor to drop ports in favour of skinniness. HP's new Spectre 13, a thing of beauty, has only two USB-C ports and a headphone jack (though HP does include a USB-C to USB 3.1 dongle in the box). Dell's shiny XPS 13 also has a USB-C port, but the company didn't abandon current tech: it includes a pair of USB 3.0 ports as well. For other ports, however, such as RJ-45 (Ethernet), HDMI, VGA, or DisplayPort, you need a dongle; a multi-connection adapter providing HDMI/VGA/Ethernet/USB 3.0 that plugs into the USB-C port costs $99.99.
But why? In our quest for supposedly "sexier" products, we're throwing functionality out the window.
Proponents will claim that it's progress. They will say that one must make some sacrifices to make devices sleek and slim and light and gorgeous, and that USB-C offers the versatility to hook up almost anything.
Maybe so, but it comes at a price. You have to venture into dongle hell to use most existing peripherals. You need a dongle to get standard USB ports, another for VGA, HDMI, or DisplayPort video (depending on which monitors you own), one for Ethernet, and … well, you get the picture. And dongles, as we have found out more than once during past changes of port configurations (remember the old PS/2 mouse and keyboard connectors?), are a royal pain. They're easy to forget or lose, and they can break. Over and over again. And they're not cheap. Apple alone sells over 15 different dongles; prices range from $10 (CDN) to $59.
Furthermore, there are times when I'm convinced that some of the designers never use the products they create. One of the first things iPhone 7 users realized was that, with the single Lightning port, they can only connect one thing at a time. They can't, for example, charge the device and listen to music at the same time without, yes, a specialized dongle – which Apple did not supply (at least it included a Lightning to headphone jack adapter in the box so customers could continue to use their sometimes very expensive headphones). That, granted, provides a lucrative add-on business for vendors and third parties, but it's not friendly to the consumer's wallet.
Technology vendors aren't the only ones whose reality check has apparently bounced. Consider: bread machines make loaves that are taller than commercial loaves. It has always been so. Yet have you ever seen a toaster that can accommodate bread machine slices? I remember precisely one model, now discontinued, that had tall enough slots to toast an entire slice of bread machine bread without leaving an untoasted strip at the top. The silly thing is, many bread machine vendors also make toasters – toasters that can't handle the bread from their own machines.
And let's think about cook stoves. Many of the popular dual oven units have changed from a configuration of one small and one full-sized oven to two equal-sized ovens, neither full-sized. I imagine they're cheaper to build, and they may look more elegant, but their designers, again, didn't think things through. Neither of the equal-sized ovens can accommodate an average turkey, for example. Another dual-oven model actually has a single door with a divider between the two sections. Its designer probably never bakes – when you open the door to check something in one of the ovens, the other half loses heat too, which could be catastrophic to some toothsome treats.
Early dual-oven models also had two timers. That makes sense – one per oven. On current models, manufacturers have cheaped out and provide only one timer, giving standalone timer vendors an opportunity. Sigh.
I'm sure you can think of other examples where form – and a few pennies of cost – trumped function, to users' great annoyance.
Design gurus acknowledge the problem, especially in IT. At the Canadian Innovation Exchange 2016 conference, reports Shane Schick, Microsoft's Bill Buxton said that good design is about reducing what he called the "impedance mismatch" between humans and IT. We should stop trying to create the next big thing, he argued; the next big thing isn't a new thing at all, but a change in the relationship between existing things.
Of course, the next big thing, even if it is a marketing hallucination (think 3D TV), can be very lucrative for vendors if it flies, and an expensive embarrassment if it doesn't. Yet they take comfort in Henry Ford's comment that if he'd asked people what they wanted, they'd have asked for a faster horse, and continue to toss ideas against the consumer wall to see what sticks. Or, worse yet, they "improve" things into unusability, figuring they can make a fix in the next model and people will buy it. And sometimes that works. Apple aficionados may complain bitterly about the missing headphone jack, but they still buy the iPhone 7, and computer users are still snapping up the latest laptops and putting up with the dongles. I have no doubt that once peripherals are equipped with USB-C connections, customers with newer computers will replace perfectly good devices with new ones, just to escape dongle hell, as we have done several times in the past (a lot of printers ended up on the shelf when the parallel port was discontinued).
And then, in all too short a time, someone will come up with yet another connector, and we'll go through it all again.