What Will Save Us from the Dark Side of CSS Pre-Processors?

Writing CSS by hand for a site or app of any considerable size seems quaint these days, in the way that shaping a piece of wood with an adze seems quaint. Admirable, perhaps, but even if it gives you a tangible connection to the exact outcome, the vestigial quirks, limitations, and tedium of that workflow make it feel archaic.


Until a few years ago, this direct method was our only real option. We managed CSS by hand, and it got complicated and crazypants. So when pre-processors started showing up—Sass, LESS, Stylus—we clutched at them, giddy and grateful like sleep-deprived parents.

Pre-processors to the rescue!

Their fans know that pre-processors do a lot of stuff. Variables, functions, nesting, and calculations are part of the pre-processor assortment, but there’s often also support for concatenation, minification, source maps, and output formatting. Sass feels like an authoring tool, framework, configuration manager, transform, and build tool in one. They’ve become very popular—especially Sass, the juggernaut. Huzzah! Such power!
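As a sketch of that assortment, here is what hypothetical Sass source might look like (all the names and values here are invented for the example):

```scss
// Hypothetical Sass source; names and values are invented for the example.
$brand-color: #0b7261;
$gutter: 20px;

@mixin rounded($radius: 4px) {
  border-radius: $radius;
}

.card {
  padding: $gutter / 2;            // arithmetic on a variable
  border: 1px solid $brand-color;
  @include rounded(6px);           // mixin with an argument

  .card-title {                    // nesting: compiles to ".card .card-title"
    color: darken($brand-color, 10%);  // built-in color function
  }
}
```

None of this is valid CSS as written; it only becomes CSS after the pre-processor runs.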

Pre-processors… to the rescue?

Yet as with other powerful things, Sass has a dark side. Its potential malevolence tends to manifest when it’s wielded without attention or deep understanding. In the recent article A Vision for our Sass, Felicity Evans points out some of the ways unmindful use of Sass can result in regrettable CSS.

Pre-processors have a way of keeping us at arm’s length from the CSS we’re building. They saddle us with the cognitive burden of keeping up with what’s evolving in CSS itself, as well as with the tricks specific to our pre-processor. Sure, if we’re intrepid, we can keep on top of what comes out the other end. But not everyone does, and it shows.

Overzealous use of the @extend feature in Sass can create bloated stylesheet files bobbing in a swamp of repeated rules. Immoderate nesting can lead to abstruse, overlong, and unintentionally overspecific selectors. And just because you can use a Sass framework like Compass to easily whip up something artful and shiny, it doesn’t guarantee that you have any sort of grip on how your generated CSS actually works.
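A small sketch of the nesting trap, with invented selectors:

```scss
// Sass source that faithfully mirrors the markup:
.sidebar {
  .widget {
    ul li {
      a { color: #c00; }
    }
  }
}

// Compiled CSS: one needlessly specific selector that is hard to override:
//
//   .sidebar .widget ul li a { color: #c00; }
```

The source looks tidy; the output is specificity you probably never asked for.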

Pre-processors… FTL?

Diabolical output is one risk, and yet there are additional ways pre-processors can trip us up.

Working with a pre-processor means writing source in its Domain-Specific Language (DSL). (You could also pen your source using entirely vanilla CSS, but that would be pretty pointless, as the power of pre-processing comes from operations on variables, mixins, and other features written in their particular syntax.) You feed this source to the pre-processor and out comes CSS ready for browsers. You couldn’t take your source and use it in a browser. It’s not ready yet.

That means that the source is not entirely portable. So choosing a particular pre-processor may be a long-term commitment—Sass and other pre-processors can create a certain amount of lock-in.

On a conceptual level, the breadth of pre-processors’ scope is significant enough that it can insinuate itself into the way we think and design. In that sense, it’s not a tool but a system. And this can get under the skin of people—especially devs—who thrive on separation of concerns.

This is beginning to sound like an argument to ditch Sass and its brethren and return to the homespun world of handcrafted CSS. And yet that is a false dichotomy: pre-processors or nothing. There are other tools for managing your CSS, and I’m especially hopeful for a new(ish) category of tools called post-processors.

Post-processors to the rescue!

In contrast to pre-processors’ distinct syntaxes, post-processors typically feed on actual CSS. They can act like polyfills, letting you write to-spec CSS that will work someday and transforming it into something that will work in browsers today. Ideal CSS in, real-life CSS out.
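For instance, a (hypothetical) variables-polyfilling post-processor might perform a transformation along these lines; the exact output depends on the tool:

```css
/* To-spec source, using CSS custom properties: */
:root { --brand: #0b7261; }
a { color: var(--brand); }

/* What such a post-processor might emit for today's browsers: */
a { color: #0b7261; }
```

Note that both the input and the output are valid CSS; only the browser support required differs.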

You may already be using a post-processor alongside your pre-processor without being aware of it. The popular autoprefixer tool is in fact a post-processor, taking CSS and adding appropriate vendor prefixes to make it work in as many browsers as possible.
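The exact prefixes autoprefixer adds depend on which browsers you tell it to target, but the transformation looks roughly like this:

```css
/* Input: */
.box { transition: transform 0.3s; }

/* One possible output, given an older target-browser list: */
.box {
  -webkit-transition: -webkit-transform 0.3s;
  transition: transform 0.3s;
}
```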

Many post-processors—especially those written with a plugin approach—do only one specific thing. One might polyfill for rem units. Another might autogenerate inline image data. You can pick and choose the modular plugins you need to transform your CSS.
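To make the one-plugin-one-job idea concrete, here is a toy version of a rem-fallback transform in plain JavaScript. A real plugin (one built on postcss, say) would parse the CSS properly rather than lean on a regex, and the 16px base is an assumption:

```javascript
// Toy sketch of a single-purpose post-processor: add a px fallback
// before each rem declaration. Illustrative only; real plugins parse
// the stylesheet instead of pattern-matching it.
const BASE_PX = 16; // assumed root font size

function remFallback(css) {
  return css.replace(
    /([\w-]+)\s*:\s*([\d.]+)rem\s*;/g,
    (match, prop, rem) => `${prop}: ${parseFloat(rem) * BASE_PX}px; ${match}`
  );
}

console.log(remFallback('h1 { font-size: 1.5rem; }'));
// h1 { font-size: 24px; font-size: 1.5rem; }
```

Valid CSS goes in, valid CSS comes out, and the plugin has exactly one concern.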

Post-processors typically edge out their pre- brethren in build-time speediness.

And because they can be used as modular chunks, they can serve as a balm for the aforementioned separation-of-concerns violations.

With this kind of authoring, we have a built-in necessity to stay current on the way new specs express themselves in actual CSS syntax, and that means post-processors’ transformations aren’t as inscrutable. Plugin authoring, too, is pegged to the same specs. Everyone is marching to the same, standards-driven beat.

This is starting to feel like outright post-processor boosterism, isn’t it?


Post-processors… to the rescue?

The case for post-processors isn’t entirely coherent. There isn’t even any consensus about the definition. The way I’m explaining post-processors is my own interpretation. Don’t take it as gospel.

Real-world implementations don’t help to clear the picture, either. Several modules written using postcss, a JavaScript framework for post-processors, involve custom syntax that doesn’t align with the definition I’m outlining here (valid CSS in, valid CSS out). By my definition, myth.io would be a post-processor, but it is described by its maintainers as a pre-processor. Maybe post-processors aren’t even a thing, or only exist in my fevered, idealistic imagination.

Post-processors may hold more appeal for certain members of the web-building audience. Shaving some milliseconds off a build has more clout with some than with others. Modularity is one thing, but pre-processors can do so many things. It’s hard to wean ourselves off something that serves us so well.

Taking a path paved with lean, modular post-processing plugins involves sacrifices. No more nesting. Instead of an endless horizon of mixin possibilities, you may be bound to CSS spec realities, like calc or CSS variables. One promising framework for rolling out post-processing plugins is postcss, but it’s young yet and its documentation is in a correspondingly awkward adolescent phase.
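Staying bound to those spec realities looks something like this: custom properties and calc() in place of pre-processor variables and arithmetic (values invented for the example):

```css
:root {
  --gutter: 20px;
}

.main {
  /* calc() and var() are real CSS, so no special compile syntax is needed */
  width: calc(100% - var(--gutter) * 2);
}
```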

Knowing your craft to the rescue!

Remember that thing I said earlier about false dichotomies? Gotta remember that, because pre- and post-processors aren’t mutually exclusive.

We happily use both in my office. Some of our more dev-y designers have taken a shine to the post-processing philosophy, while other designers remain pleased with the all-in-one, intuitive oomph Sass gives them. Both are right.

Though each camp might have a different approach to tools, the important commonality they share is a deep understanding of the CSS that comes out the other side. Neither set of tools is a crutch for ignorance. Know your craft.

Although there are different ways to get there, a thoughtful understanding of CSS is a prerequisite for continued success in building great things for the web. Whether you’re one to meditatively chip with an adze along a raw CSS stylesheet or you prefer to run it through a high-tech sawmill, you’re always better off understanding where you’re starting from and where you’re trying to go.

24 Reader Comments

  1. I’m just wondering if CSS is not going to rescue us from post-processors that rescue us from pre-processors…

    And I’m not kidding when I say this: sometimes, I prefer writing vanilla CSS, using no tricks that are impossible to maintain, and guess what? It is cool.

  2. Truth be told, I think pre- and post-processors should be installed at the server level along with concatenation, minification, and image-processing services. There’s little need for a company to marry itself to a technology like Sass (just an example, calm down) that’s run on a local machine – I’ve already seen it fail, getting stuck in legacy systems. Especially when those kinds of tech change so much (looking at Sass, Less, and Stylus, AND Grunt, Broccoli, and Gulp as examples).

    I’d love to write vanilla CSS, push to a server, and have an Apache module or server-side framework process and polyfill the code. It would be a lot more future-proof, but that’s just my opinion.

  3. I don’t see why this is such an issue: while it is true that it is possible to write illegible compiled-CSS, it’s also possible to write illegible vanilla CSS (I’ve seen a lot). But a nicely commented, partitioned Sass, Less or whatever code will output readable or at least understandable CSS which will be reusable in the future.

    I’m personally using a combination of Sass and Autoprefixer with certain restrictions (my own set of good practices, if you will): only simple (or exhaustively commented) mixins, limited use of extend, and no more than four levels of nesting. The resulting non-minified CSS output can be reused as-is if needed.

  4. Thanks for mentioning myth.io. Since I started using that, all my needs for LESS, SASS & Co have disappeared. The one thing that makes it perfect (imho) is that it uses plain CSS and just adds the latest features not implemented by all browsers yet (vars, math, color manipulation, …) coupled with taking away the need for vendor-prefixes. So as a programmer I don’t need to learn a special preprocessor-syntax, but just use the things that will be in the next iteration of CSS anyway.

  5. Sounds like you’re saying there are some tools, and if you use them without proper knowledge and precautions, they can cause you pain and make life harder. If that’s the case, this could have been a tweet; am I missing something? Does this unpack to something more controversial or challenging?

  6. The last section is exactly right. Force multiplying technologies amplify everything that goes in, including ineptitude. It’s not just garbage in, garbage out. It’s garbage in, 10x garbage out. But, used wisely, such tools are very valuable.

  7. I agree with Corey. Misuse of a tool due to lack of proper knowledge and precautions does not mean that the tool is broken. In fact, in this case, that assertion can even be dangerous, as it can lead to confusion about, or aversion to, good tools because authoritative voices have called them into question. This is not to say we shouldn’t constantly question the tools we use and keep an eye out for better solutions, to avoid becoming attached to and protective of less-than-ideal technologies.

    However, right now CSS is the problem and CSS pre-processors are what will save us, even if it ultimately results in their abandonment and their better qualities being integrated into some other solution or, ideally, CSS itself. That could take a while, though, and there might be other solutions along the way, but what we should avoid is unnecessarily calling into question the effectiveness of these tools and potentially fragmenting or diminishing the concentrated and collective effort the web community has devoted to making CSS pre-processors such an excellent and unified solution to so many of the fundamental problems with CSS. (Key word: ‘unnecessarily’.)

    When a better solution emerges, we should be ready to put our weight behind it, but even after reading this article a couple of times, I fail to see the problem with CSS pre-processors that we need to be saved from. I know this: after experiencing the glory of working with pre-processors I’ll never (willingly) write vanilla CSS, as it exists now, again. I look forward to the day when it has evolved enough to ditch the tools but, until then… pre-processors are the technology through which we can implement good ideas and best practices while we wait for CSS to catch up.

  8. I guess I’m just stuck in my ways, but I disagree with the whole assertion that managing CSS by hand was a big deal. Or at least crafting the very basics of it seems pretty simple.

    Don’t get me wrong, our system has functionality to perform concatenation, minification, and variables … but that’s it. I haven’t felt the need for much more … but then again, I am the type that loves to keep up with the trade and implement emerging features. For example, I recently updated an admin for a client and included the will-change property. That’s not very well supported at this time but it’s still a good thing to have for future support IMHO.

  9. I’ve found that adopting BEM syntax in my LESS and SASS files means I still have a clue what the output CSS will look like. Preprocessor output quality worries me, so I limit myself to concatenation, minification, variables, and calculations, with no nesting at all except in edge cases. I have now met designers who joined the industry in the last three years at forward-thinking companies and have never written vanilla CSS. Now that’s scary — like a more technical version of those Dreamweaver users who never checked the output HTML of their Design Mode work.

  10. I think that tools like SASS, when used correctly, are amazingly useful to designers and developers. I understand the concern that, when used incorrectly, they cause all sorts of issues, but it reminds me of when JavaScript libraries first became a big thing. Suddenly anyone could use a library and a plugin or two to create all sorts of amazing effects with little or no knowledge of JavaScript. This led to all sorts of the same issues you outline; I would argue you could almost just replace css/pre-processor/SASS with javascript/library/jQuery above and have something that closely resembles how I _still_ feel about these libraries.

    Yet I wouldn’t argue against their use, I would just argue for the people using them to spend the time to get a greater understanding of the underlying technologies so that they use them as a tool to accomplish their goals rather than as a crutch to support a lack of knowledge and understanding.

    In the end I guess it boils down to “I have the same concerns – I suggest a different solution”. Don’t throw pre-processing out, used correctly it can solve a great number of real world issues with CSS and save everyone a whole lot of time.

  11. As someone who spends half of my time at work deciphering other people’s badly written PHP, it occurs to me that it’s not the tool, it’s the author.

    I am very disciplined in how I write my code, making sure it’s easy to read and coherent. This goes for how I write my PHP, my HTML, and my SCSS. My predecessors were not. Occasionally you find someone who was organized in their CSS but their PHP is a jungle. Others I go through and find where they used a ‘div’ instead of an ‘a’ tag. (WHY?!?!)

    If you are disciplined and follow best practices, all of this becomes a moot point. Yes it’s possible to create crazy CSS with pre-processors but it’s also possible to create crazy CSS all on your own, just like it’s possible to create crazy PHP, crazy HTML, or crazy Javascript.

    To go cliché, “With great power comes great responsibility.”

  12. @Alan Moore That Dreamweaver comment is what I have been thinking for a while when hearing about tools and more tools that help speed up development. Yes, if developers just use them and never bother to learn what they’re doing for them, chances are they are just relying on the infamous Design Mode. Well said.

  13. @ powrsurg + @ Alan Moore —still happily crafting CSS by hand with BBEdit these days in a hand crafted post + beam home in Vermont. It may be quaint and vanilla, like Ben & Jerry’s, but to me, it’s all about learning, growing, and knowing what the code is doing, and not about speed and production. In this sense it’s a bit slower, but precise and analogous to the slow food movement.

  14. I would argue that we’ll have to worry less and less about the raw CSS we ship to users as time goes on. Just like we don’t worry about the compiled assembler code when we write JavaScript today.

    As Chris Coyier points out, using Sass just means going one step up the “abstract-o-meter”. Adding that layer of abstraction is a natural response to the growing complexity of what we’re trying to build on the web. I’m happy to use a powerful authoring language like Sass and let my compiler turn that into whatever kind of robot-barf the browser happens to understand.

  15. There is no ‘dark side’ to pre-processors, I find. If anything, I think it’s more a problem of how the CSS is written. What’s more, writing vanilla CSS for a large-scale website is just asking for a maintenance nightmare, in my eyes.

    That’s why I write CSS using a pre-processor and include a post-processor: because I usually work on projects with a wide architecture. Therefore, I need something to keep the scalability under control, as well as to practice a CSS methodology that prevents bad authorship. The post-processor just takes care of prefixing the styles that need it without bloating my stylesheet, nothing more.

    As for maintainability, maybe when pure CSS variables and more pragmatic ways of writing at larger scale become a thing, we won’t need to have this debate, but the reality is we have alternatives for them *now.*

  16. A great article. I’ve seen the best and the worst of both worlds — pre-processors are a mess to set up and often get in the way when things just need to get done. A lot of time gets lost in figuring out the documentation of a mixin. The @extend directive is definitely something not to mess with much, since improper usage often leads to bloated and unreadable code. Autoprefixer now solves vendor-prefixing issues, so there is no need to pre-process them.

    Writing CSS using post processors like Myth or Pleeease is a walk in the park since it’s only CSS. The main issue with post processing CSS is the lack of nesting, proper variables and media query bubbling.

    There are a lot of cases where pre-processing is really valuable, such as automation of .png or .svg sprites, math calculations, and advanced functions. Utilising the best of both worlds is a reality today, but upcoming CSS features such as calc(), variables, color functions, and custom media queries will slowly push out the need for pre-processing.

  17. Mr. Wright and Lain are on point. CSS is not mystical, and I think server-side processing could help. When I wrote the 6,000 lines of CSS for the Lenmar battery central that powers Best Buy’s battery finder three years ago (I don’t maintain that code anymore), it would have been handy, but it felt too new. Three months later I was screaming about how awesome it is. But if you’ve got the SASS to say we should use LESS pre-processing, maybe we should call the next version MORESASSY and get rid of the “dark side” you mentioned, so we can bring the pre-processor suffering to light…

  18. Thanks for the article. However, I do feel that there is a dark side to anything, in very rare cases. It’s ultimately people: programmers.

    We were having this conversation the other day at my organization on how to improve this, that and the other; what new technology to adopt; what new tools to bring in etc.

    At some point, one among us stood up and said, “Nothing, I repeat, nothing can replace good programmers.” We all knew that was true.

  19. Agree with the likes of @Alan Moore’s Dreamweaver comment, @JamesTaylor and @NicolasHoffman… I like writing vanilla CSS. It lets me have an intimate relationship with the code and it means I know everything that’s happening and why. I may be alone here, but I think eventually CSS preprocessors will go away and much of their functionality will be brought to native CSS. There are so many gross websites out there that take so long to load, even on high bandwidth connections. People seem to care less and less about performance, but as we move more and more to mobile it’s going to become even more important. It’s true people can mess up vanilla CSS too, but I think that if people really understand CSS then it’s less likely to happen writing it directly vs using a preprocessor.

