On Changing the World

We hear it mostly from proud CEOs and recruiters, as a sweet nothing designed to tempt candidates to drop their counter-offers, or as a statement in a desperate pitch deck. We’re changing the world! All it takes is a few hundred fearsome intellects and laptops. Are you in or out?

It’s a bold claim, and when it crops up in more laughable contexts it’s easy to discount as hubris. But consider the claim more closely and it’s harder to deny. Technology loves to demolish the status quo, and it’s doing it with aplomb.

The world is struggling to come to terms with the implications of such rapid change. So far, specific industries—music, news, film—have had to pick up most of the debris, but now technology is destabilizing some of society’s central pillars: law, finance, education, defense, and politics. We’ve recently seen the rise of a rogue currency outside the global financial system. Crowdsourced vigilantism. The further erosion of the concept of ownership. State-sponsored hacking. Technologists are already making buildings busier than any hospital, and cities ten times the size of Tokyo. We’re hacking around the limitations of space and time. It’s sci-fi stuff, unevenly distributed straight into our inboxes. Today we label these feats “digital,” but before long that qualifier will no longer make much sense.

It’s hardly surprising that work on this scale challenges existing systems and behaviors.

Once you have something that grows faster than education grows, you’re always going to get a pop culture.

Alan Kay

From the inside, it’s exciting to see technology race ahead of the social frameworks that surround it. Our industry is in thrall to disruption, just so long as it happens to someone else. And although we can feign indifference (“It’s not our fault you can’t keep up…”), power is a fresh, intoxicating phenomenon for geeks like us.

Outside our bubble, the change is more worrying. Technology is becoming the lingua franca of the modern elite, but it’s a language the world doesn’t yet fully understand. Today, a tiny clique has disproportionate influence on global culture. This group is largely young, male, white, and concentrated around wealthy urban regions, particularly the San Francisco Bay Area.

Doubtless many readers identify with this group, as do I. But we must admit it’s not a group that’s terribly well versed in the ways of the world. Therefore, those of us in the privileged position of affecting the course of technology have a duty to inform others of our intentions and listen to their feedback.

Grass-roots schemes like the UK’s Code Club encourage digital literacy by introducing the public to the fundamentals of programming. These are important and welcome initiatives. But as well as raw technical knowledge, we need to stir up public debate on the societal implications of technology. Who controls our data? How does DRM affect commerce and the public’s possessions? What will privacy mean in a Glass-wearing era?

The tech industry is well placed to begin these conversations—not because we understand the likely cultural impact (frankly, we’re pretty clueless there) but because we have advanced warning of emerging technologies.

As Bill Buxton has argued, new technologies take roughly 20 years to reach the mainstream. The mouse, the touchscreen, the mobile phone, wearable computers: none was an overnight success. All existed as prototypes in R&D labs and thought experiments in academic papers long before they were commercialized.

Industry insiders can all hazard a strong guess at what the next generation’s disruptive digital technologies will be. It’s time to talk about the potential impact of 3D printing, voice input, pervasive networks, and embedded computing today, not when the products hit the shelves.

We already have the tools required to begin these discussions: blogs, tweets, conference talks, conversations with friends outside the industry. We can also reach further by taking up the issues with government representatives or the media, writing books, and starting public campaigns. But soapboxes and councils aren’t the only way to raise these issues: our products can also speak for us.

One way we can clarify the function and implications of new technology is to design self-disclosure into it. Innovative products should help users form accurate mental models of how they work, and discuss consequences the user may not have considered. For example, a voice-operated technology could explain how to prevent others from triggering it, or remind the user to be conscious of her environment before issuing sensitive commands. A networked car telemetry system could tell the user exactly who has access to that data, and why sharing it with the insurance company could lead to lower premiums.

This notion of self-disclosure doesn’t sit too well with the modern preference for seamlessness. I’ve previously questioned whether the “invisible interface” deprecates style and risks homogenizing design. But there are broader questions too.

The black box model — a device or product that hides its mechanics and complexity — can be useful for designing appealing, marketable products. However, it can also act against users’ interests.

First, black boxes are harder to diagnose and debug. Without an entry point (serviceable hardware, an API, or even just some flashing LEDs) and knowledge of how the thing works, a black box can be stubborn and uncommunicative when something goes wrong. Imagine a house full of co-operating devices that all fail because an OS somewhere in that network has crashed. The designer of this system must provide some visibility into the workings of the network, allowing the user to resolve the problem.

Second, black box devices have the potential to reduce the user’s agency. It’s hard to understand a device that seems to act of its own accord. A seamless device demands trust, but offers no way for the user to decide if that trust is warranted. There’s a risk that invisible interfaces could therefore become breeding grounds for unethical design. Since the user has little insight into the workings of the system, it becomes easier to slip personal data to an unknown IP, connect to a premium-rate phone line, or perform some other hostile act.

So seamlessness may not be the right model for new genres of technology. Perhaps it’s better for the first wave of innovative devices to be explicit about their workings and implications, helping the public to understand and react appropriately. Once people become more familiar with the technology, designers can carefully taper this self-disclosure off.

It may be harder to design a slick user experience if we expose the workings of a device, but advocating transparency is about designing a good experience for humankind, not just a single user. It demonstrates an ethical, holistic mindset that’s becoming ever more important as technology becomes central to people’s lives.

For ethical values to thrive in our field, we can’t let the pace of change seduce us into thinking we’ve no time for them. Designers and engineers alike need to think deeply about the implications of the things we make, and appreciate the value of doing so. We also need role models. I long for our industry to stop fetishizing entrepreneurs and billion-dollar buyouts, and instead to praise technologists who inform the public about new technology, or companies that make tough decisions for the greater good.

Individuals within the tech industry also need the courage to do the right thing. The job market is so strong that the only response to unethical pressure from employers should be a hearty middle-fingered farewell. And where dark patterns do emerge, the industry must highlight these dirty tricks, and explain to the public how they can avoid being taken for a ride.

Finally, the tech community should educate itself about global issues. Our tiny elite needs to understand the world in order to affect it positively. Efforts to travel, to learn about other cultures and contexts, and to consider use cases beyond those of our nearest neighbors will help reduce the risk of technological imperialism. It would be a mistake to assume that a solution that works for a Western techie will work for a North African trader.

These are complex times for the tech industry, and the consequences of taking a wrong step could be severe. Let’s dedicate thoughtful time to ensure the effect we’re having on the world is positive. The results will also be good for our own industry: an informed public means greater trust in, and appetite for, our work.

Reader Comments

  1. This is a brilliant observation, though it discreetly evades the elephant in the room: the paradigm here has three major players, not two. Indeed few would deny a relationship—a tradeoff, in this case—between slick user experiences and an advocacy of transparency, but what of the third pillar? Revenue.

    Companies are acutely aware that transparency can affect adoption—which carries implications beyond the sale of the products themselves, extending to the wares into which those products transform their users. “Doing the right thing” is more naturally occurring than I think people realize…until it comes with a price tag…

  2. I disagree, Scotty Z. For one thing, things like DRM actually cost the user money and remove functionality, not the other way around. In fact, giving up rights for a slicker interface almost always comes at a higher cost: Linux is free, but not very popular.

    However, even in cases where you are correct, we cannot disregard our ethics to make money. There needs to be a balance, and if we can improve lives and not be as rich as a result, we need to choose to be good.

  3. Some interesting points raised here. For me, these are problems that occur as the distance between thought, action, and result rapidly shrinks. I think this shrinkage probably accounts for why we have tight, homogeneous clusters in tech communities; they are the continuation of the original founding communities. This has allowed them better (and rapidly improving) access to the tools that enable this kind of work.

    I would add that the focus on solving problems for this extremely narrow demographic has meant that broader access to the tools and knowledge necessary to participate in tech has been limited. For instance, “natural” language coding initiatives often rely on speaking English, and the focus falls on social networking rather than on real and pressing issues to do with the environment, access to food, healthcare, and so on.

    Technologists need to think more globally, and probably think less about the US consumer market. Unfortunately, that’s where the money is (for now).

  4. I really like the point on the honest design aspect (as Rams would probably put it). Another point to highlight, imho, is the need for ethical design. Ethically designed products inspire the people using them to better themselves (as opposed to taking advantage of people’s compulsive tendencies to generate revenue).

  5. Before anything like this can be discussed you need to live where I live. Web design is bad, at best a contorted amalgam of “borrowed” templates, and the content reflects the literacy of the locale and culture. 50% of the county I live in does not have the ‘tubes’, and of those that do, 50% have dial-up access. Limited access to the tubes affects how people see and use the web. They are bound by liturgy that is hundreds of years old. They attempt to understand by importing what you big guys do, but something happens along the way: they adapt visually, but they gut the concepts, and the insides are replaced with the local culture and liturgy. BTW, I am only 180 miles from Chicago and 16 miles from a metro population of 1 million+. You speak of tech imperialism and then you make a comparison between Western culture and N. Africa. There are some common misconceptions that need to be addressed. The issues we face are just outside your door.

  6. Brilliant piece. It highlights something that often worries me – the bubble of fast-moving, exciting technological change has such a special air of grassroots action, bootstrapping, and ‘going against the man’ that it often ignores the wider socioeconomic global context the industry is part of. You’re absolutely right that “efforts to travel, to learn about other cultures and contexts, and to consider use cases beyond those of our nearest neighbors will help reduce the risk of technological imperialism.” These experiences and efforts to broaden our horizons don’t have to come in place of revenue-generating activities. Sometimes, simply having seen a different context of living and thinking can cajole us to design differently, to build things that work differently, and to question decisions that only make sense within our context.

    When our eyes are set toward WWDC and various announcements of new products and fancy technology, it’s easy to forget that “should I switch to the new iPhone” isn’t something the rest of the world is worried about. It doesn’t mean we have to feel guilty and quit our jobs in favor of unpaid volunteer roles, it just means we have to think a little more broadly.

    Thanks for this!
