The A List Apart Blog Presents:

Awkward Cousins


As an industry, we’re historically terrible at drawing lines between things. We try to segment devices based on screen size, but that doesn’t take into account hardware functionality, form factor, and usage context, for starters. The laptop I’m writing this on has the same resolution as a 1080p television. They’d be lumped into the same screen-size–dependent groups, but they are two totally different device classes, so how do we determine what goes together?

That’s a simple example, but it points to a larger issue. We so desperately want to draw lines between things, but there are often too many variables to make those lines clean.

Why, then, do we draw such strict lines between our roles on projects? What does the area of overlap between a designer and front-end developer look like? A front- and back-end developer? A designer and back-end developer? The old thinking of defined roles is certainly loosening up, but we still have a long way to go.

The most concerning chasm is the one between web designers/developers and native application designers/developers. We often choose a camp early on and stick to it, a mindset that may have been fueled by the false “native vs. web” battle of a few years ago. It was positioned as an either-or decision, and hybrid approaches were looked down upon.

The two camps of creators are drifting farther and farther apart, even as the products are getting closer and closer. John Gruber best described the overlap that users see:

When I’m using Tweetbot, for example, much of my time in the app is spent reading web pages rendered in a web browser. Surely that’s true of mobile Facebook users, as well. What should that count as, “app” or “web”?

I publish a website, but tens of thousands of my most loyal readers consume it using RSS apps. What should they count as, “app” or “web”?

The people using the things we build don’t see the divide as harshly as we do, if at all. More importantly, the development environments are becoming more similar, as well. Swift, Apple’s brand-new programming language for iOS and Mac development, has a strong resemblance to the languages we know and love on the web, and that’s no accident. One of Apple’s top targets for Swift, if not the top target, is the web development community. It’s a massive, passionate, and talented pool of developers who, largely, have not done iOS or Mac work—yet.
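
To make that resemblance concrete, here is a minimal, illustrative sketch in present-day Swift syntax (nothing drawn from the article itself), showing the kind of code a JavaScript developer would recognize right away: type inference, string interpolation, and first-class closures.

    // Constants and variables with inferred types, much like const and let in modern JavaScript.
    let site = "A List Apart"        // inferred as String
    var visits = 0                   // inferred as Int
    visits += 1

    // String interpolation, similar in spirit to JavaScript template literals.
    print("Welcome back to \(site), visit number \(visits).")

    // First-class functions and closures, a shape web developers already know well.
    let audiences = ["app", "web", "both"]
    let shouted = audiences.map { $0.uppercased() }
    print(shouted)                   // ["APP", "WEB", "BOTH"]

None of this maps one-to-one to JavaScript, of course, but the surface is far friendlier to a web developer than Objective-C’s square-bracket message syntax ever was.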

As someone who spans the divide regularly, I find it sad to watch these two communities keep each other at arm’s length like awkward cousins at a family reunion. We have so much in common—interests, skills, core values, and a ton of technological ancestry. The difference between the things we build is shrinking in the minds of our shared users, and the ways we build those things are aligning. I dream of the day when we get over our poorly drawn lines and become the big, happy community I know we can be.

At the very least, please start reading each other’s blogs.

5 Reader Comments

  1. Makes me think of a rant I went on about “Retina Display” a couple of years back: http://answerguy.com/2012/03/22/ipad-retina-display-stupid-marketing-trick/ .

    Over time, the people who have either contacted me or commented have fallen into two camps: those who agreed with me based on just the math angle(s), and those who disagreed based on the fact that even on a small screen “text looks way better at higher ppi”.

    Fascinating how the two camps each think they’re right, AND THEREFORE THE OTHER IS WRONG.

  2. I guess I don’t see the similarities between Swift and web development languages. I currently program in Java, Scala, JavaScript, and TypeScript. From my perspective, Swift seems to most resemble Kotlin (sort of Scala-lite) but with features to tie it closer to Objective-C’s data model (i.e. C consts, Objective-C classes, and reference counting– NOT garbage collection.)

    There are a lot of places where Swift could have made it easier to compile to JavaScript, but didn’t. In particular, they wouldn’t have encouraged the use of C-style structs. If they wanted to court web developers, they would have encouraged a programming style that abstracts away from the hardware and the heavyweight Objective-C runtime, thereby leaving them more breathing room to someday compile to JavaScript.

    The point is: there’s a lot of consensus now about how to write a programming language. Things like type inference rather than explicit type declarations. And eliminating or containing nulls. And garbage collection– or at least avoiding having the programmer do memory management.

    In that light, TypeScript is a back-port of modern language conventions to JavaScript, while Swift is a back-port of modern language conventions to Objective-C. (And then there’s Dart, which strikes me as a cross between Java and JavaScript.) Any resemblance of Swift to web languages can be explained entirely as simply modern language conventions.

  3. @dleppik Even so, if the only ties to languages used on the web are modern language conventions, that’s still enough to break down the largest barrier to entry that most web developers encounter in iOS/Mac development—Objective-C.

  4. Great read! Both mediums have value that depends drastically on the goal. We’re in the process of developing Foundation For Apps, and one of the big issues is not trying to do too much. We definitely think there are a lot of native apps that could get better reach by being web apps, but we totally see the need for the technology that native allows.

    Keep up the good work 🙂

  5. Computer technology has created a corresponding boom in animation. Using software, animators can easily produce high-quality, high-artistry animation and mix the aesthetics of traditional cel animation with dazzling 3-D effects. The focus of SAVE is on orienting students to develop creativity and provide personalized career counseling that helps ensure they stay on track for a successful career. During the course, students get multiple opportunities to showcase their skills and to interact with experienced professionals.
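
As a footnote for readers coming from TypeScript or JavaScript, the “modern language conventions” raised in comment 2 above (type inference, contained nulls, no manual memory management) are easy to see in a few lines of present-day Swift. This is an illustrative sketch, not something taken from the article or the comments:

    // Type inference: no explicit type declarations required.
    let title = "Awkward Cousins"        // inferred as String

    // Nulls are contained in optionals; the compiler makes you handle the "no value" case.
    let maybeCount = Int("42")           // inferred as Int? because the initializer can fail
    if let count = maybeCount {
        print("Parsed \(count)")
    } else {
        print("Not a number")
    }

    // Memory is handled by automatic reference counting rather than a garbage collector,
    // so there is still no manual memory management for the programmer to do.

TypeScript expresses the same “contained null” idea with union types such as number | null; Swift bakes an Optional type directly into the language.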

