We hear it mostly from proud CEOs and recruiters, as a sweet nothing designed to tempt candidates to drop their counter-offers, or as a line in a desperate pitch deck. We’re changing the world! All it takes is a few hundred fearsome intellects and laptops. Are you in or out?
It’s a bold claim, and when it crops up in more laughable contexts it’s easy to discount as hubris. But consider the claim more closely and it’s harder to deny. Technology loves to demolish the status quo, and it’s doing it with aplomb.
The world is struggling to come to terms with the implications of such rapid change. So far, specific industries—music, news, film—have had to pick up most of the debris, but now technology is destabilizing some of society’s central pillars: law, finance, education, defense, and politics. We’ve recently seen the rise of a rogue currency outside the global financial system. Crowdsourced vigilantism. The further erosion of the concept of ownership. State-sponsored hacking. Technologists are already making buildings busier than any hospital, and cities ten times the size of Tokyo. We’re hacking around the limitations of space and time. It’s sci-fi stuff, unevenly distributed straight into our inboxes. Today we label these feats “digital,” but before long that qualifier will no longer make much sense.
It’s hardly surprising that work on this scale challenges existing systems and behaviors.
From the inside, it’s exciting to see technology race ahead of the social frameworks that surround it. Our industry is in thrall to disruption, just so long as it happens to someone else. And although we can feign indifference (“It’s not our fault you can’t keep up…”), power is a fresh, intoxicating phenomenon for geeks like us.
Outside our bubble, the change is more worrying. Technology is becoming the lingua franca of the modern elite, but it’s a language the world doesn’t yet fully understand. Today, a tiny clique has disproportionate influence on global culture. This group is largely young, male, white, and concentrated around wealthy urban regions, particularly the San Francisco Bay Area.
Doubtless many readers identify with this group, as do I. But we must admit it’s not a group that’s terribly well versed in the ways of the world. Therefore, those of us in the privileged position of affecting the course of technology have a duty to inform others of our intentions and listen to their feedback.
Grass-roots schemes like the UK’s Code Club encourage digital literacy by introducing the public to the fundamentals of programming. These are important and welcome initiatives. But as well as raw technical knowledge, we need to stir up public debate on the societal implications of technology. Who controls our data? How does DRM affect commerce and the public’s possessions? What will privacy mean in a Glass-wearing era?
The tech industry is well placed to begin these conversations—not because we understand the likely cultural impact (frankly, we’re pretty clueless there) but because we have advanced warning of emerging technologies.
As Bill Buxton has argued, new technologies take roughly 20 years to reach the mainstream. The mouse, the touchscreen, the mobile phone, wearable computers: none was an overnight success. All existed as prototypes in R&D labs and thought experiments in academic papers long before they were commercialized.
Industry insiders can all make a strong guess at what the next generation’s disruptive digital technologies will be. It’s time to talk about the potential impact of 3D printing, voice inputs, pervasive networks, and embedded computing today, not when the products hit the shelves.
We already have the tools required to begin these discussions: blogs, tweets, conference talks, conversations with friends outside the industry. We can also reach further by taking up the issues with government representatives or the media, writing books, and starting public campaigns. But soapboxes and councils aren’t the only way to raise these issues: our products can also speak for us.
One way we can clarify the function and implications of new technology is to design self-disclosure into it. Innovative products should help users form accurate mental models of how they work, and surface consequences the user may not have considered. For example, a voice-operated technology could explain how to prevent others from triggering it, or remind the user to be conscious of her environment before issuing sensitive commands. A networked car telemetry system could tell the user exactly who has access to its data, and why sharing it with the insurance company could lead to lower premiums.
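As a rough sketch of what designed-in self-disclosure might look like, consider a telemetry channel that can present its recipients and their purposes in plain language, and only transmits after the user has seen that disclosure and agreed. This is purely illustrative: the class and recipient names here are invented, not drawn from any real product.

```python
# Hypothetical sketch of a self-disclosing telemetry channel.
# All names (TelemetryChannel, Recipient, "AcmeInsure") are invented.

from dataclasses import dataclass, field


@dataclass
class Recipient:
    name: str      # who receives the data
    purpose: str   # why they receive it, in plain language


@dataclass
class TelemetryChannel:
    recipients: list[Recipient] = field(default_factory=list)
    sent: list[dict] = field(default_factory=list)

    def disclosure(self) -> str:
        """A plain-language summary the UI can show before any sharing."""
        lines = ["This device shares driving data with:"]
        for r in self.recipients:
            lines.append(f"  - {r.name}: {r.purpose}")
        return "\n".join(lines)

    def share(self, reading: dict, consented: bool) -> bool:
        """Transmit only after the user has seen the disclosure and agreed."""
        if not consented:
            return False
        self.sent.append(reading)
        return True


channel = TelemetryChannel(recipients=[
    Recipient("AcmeInsure Ltd.", "may lower premiums based on safe driving"),
    Recipient("Device manufacturer", "anonymous diagnostics"),
])
print(channel.disclosure())
```

The point of the sketch is structural: the disclosure is part of the product’s data model, not an afterthought buried in a privacy policy, so the interface can always answer “who sees this, and why?” before the data leaves the device.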
This notion of self-disclosure doesn’t sit too well with the modern preference for seamlessness. I’ve previously questioned whether the “invisible interface” devalues style and risks homogenizing design. But there are broader questions too.
The black box model — a device or product that hides its mechanics and complexity — can be useful for designing appealing, marketable products. However, it can also act against users’ interests.
First, black boxes are harder to diagnose and debug. Without an entry point (serviceable hardware, an API, or even just some flashing LEDs) and knowledge of how the thing works, a black box can be stubborn and uncommunicative when something goes wrong. Imagine a house full of co-operating devices that all fail because an OS somewhere in that network has crashed. The designer of this system must provide some visibility into the workings of the network, allowing the user to resolve the problem.
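To make that visibility concrete, here is a minimal sketch of the “flashing LED” idea: each device in the home network exposes a structured status report, and the network can surface every unhealthy device instead of failing silently. The names and the crash message are invented for illustration.

```python
# Hypothetical sketch: giving a "black box" device a minimal entry
# point for diagnosis. All names are invented for illustration.

from enum import Enum


class Health(Enum):
    OK = "ok"
    DEGRADED = "degraded"
    FAILED = "failed"


class Device:
    def __init__(self, name: str):
        self.name = name
        self.health = Health.OK
        self.last_error: str | None = None

    def report(self) -> dict:
        """The software equivalent of a flashing LED:
        just enough state to see what went wrong."""
        return {"device": self.name,
                "health": self.health.value,
                "last_error": self.last_error}


class HomeNetwork:
    def __init__(self, devices: list[Device]):
        self.devices = devices

    def diagnose(self) -> list[dict]:
        """Surface every unhealthy device rather than failing silently."""
        return [d.report() for d in self.devices if d.health is not Health.OK]


hub = Device("hub-os")
lamp = Device("lamp")
hub.health, hub.last_error = Health.FAILED, "OS crash: watchdog timeout"
net = HomeNetwork([hub, lamp])
print(net.diagnose())
```

Even this much turns the earlier scenario around: when the house full of cooperating devices fails, the user (or a support technician) can see that the hub’s OS crashed, rather than staring at a row of dead, mute appliances.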
Second, black box devices have the potential to reduce the user’s agency. It’s hard to understand a device that seems to act of its own accord. A seamless device demands trust, but offers no way for the user to decide if that trust is warranted. There’s a risk that invisible interfaces could therefore become breeding grounds for unethical design. Since the user has little insight into the workings of the system, it becomes easier to slip personal data to an unknown IP, connect to a premium-rate phone line, or perform some other hostile act.
So seamlessness may not be the right model for new genres of technology. Perhaps it’s better for the first wave of innovative devices to be explicit about their workings and implications, helping the public to understand and react appropriately. Once people become more familiar with the technology, designers can carefully taper this self-disclosure off.
It may be harder to design a slick user experience if we expose the workings of a device, but advocating transparency is about designing a good experience for humankind, not just a single user. It demonstrates an ethical, holistic mindset that’s becoming ever more important as technology becomes central to people’s lives.
For ethical values to thrive in our field, we can’t let the pace of change seduce us into thinking we’ve no time for them. Designers and engineers alike need to think deeply about the implications of the things we make, and appreciate the value of doing so. We also need role models. I long for our industry to stop fetishizing entrepreneurs and billion-dollar buyouts, and instead to praise technologists who inform the public about new technology, or companies that make tough decisions for the greater good.
Individuals within the tech industry also need the courage to do the right thing. The job market is so strong that the only response to unethical pressure from employers should be a hearty middle-fingered farewell. And where dark patterns do emerge, the industry must highlight these dirty tricks, and explain to the public how they can avoid being taken for a ride.
Finally, the tech community should educate itself about global issues. Our tiny elite needs to understand the world in order to affect it positively. Efforts to travel, to learn about other cultures and contexts, and to consider use cases beyond those of our nearest neighbors will help reduce the risk of technological imperialism. It would be a mistake to assume that a solution that works for a Western techie will work for a North African trader.
These are complex times for the tech industry, and the consequences of taking a wrong step could be severe. Let’s dedicate thoughtful time to ensure the effect we’re having on the world is positive. The results will also be good for our own industry: an informed public means greater trust in, and appetite for, our work.