"I simply said, “Um… Ok. I guess. If you think so.”"
This line. It's terrible to admit, but it's so recognizable, and a moment like this can change your whole trajectory.
There are some really great tools out there for showing interaction that don’t require coding skills to use—Flinto for Mac, Principle, Pixate, InVision, After Effects, and others. I like the feel of massaging code as much as the next guy, but I’m not so enamored with the mystique of monospaced green text on a black background that I’m blind to how comparatively inefficient it is.
The thing is, code was never a good idea, it’s a hack. It’s not natural and it requires people to behave like computers, rather than computers adapting to us. We’ve had to do it to get here, and I appreciate that. But I look forward to the day that our tools are so good that the code is completely hidden from us, and programming is officially a dead art.
"if I didn’t already know it then I certainly wouldn’t go to the considerable trouble of learning it."
lol. good luck staying relevant in this industry.
Code is a hack like math is a hack. It's not a normative way of thinking, but with practice comes proficiency and mastery. Technology isn't natural by definition, it has to be discovered and learned.
Also, the different ways of thinking that software development opens you up to are incredible. I'm surprised anyone would want to hide all that behind a GUI for no real reason other than just to do it.
Additionally, a GUI would severely decrease precision & accuracy, which are required during development.
... says the film industry, never.
This sounds like a point from someone who doesn't code. The filmmaker uses this application so he or she doesn't have to code it manually. What do you think the driving force behind the filmmaker's GUI is? What do you think the video is saved as? There will always be a developer who needs to write the application to help those who can't code.
Find me a SaaS that can pass form data to two instances of SF, another CRM, and send that data via email as a fallback. Code isn't going anywhere anytime soon. Sorry.
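For what it's worth, the fallback pattern described here can be sketched in a few lines. Everything below (the sender names, the stub callables) is hypothetical, not a real Salesforce or CRM API; the point is just the shape of the logic a no-code tool would have to replicate:

```python
# Minimal sketch: try to push form data to several CRM endpoints,
# and fall back to email only if every push fails. The senders are
# placeholder callables standing in for real API clients.

def dispatch(form_data, crm_senders, email_sender):
    """Try each (name, send) pair; if all raise, fall back to email.

    Returns the list of sender names that succeeded, or ["email"]
    if the email fallback was used.
    """
    delivered = []
    for name, send in crm_senders:
        try:
            send(form_data)
            delivered.append(name)
        except Exception:
            pass  # this CRM is down or rejected the data; try the others
    if not delivered:
        email_sender(form_data)  # last resort so the lead isn't lost
        return ["email"]
    return delivered
```

In a real integration each sender would wrap an HTTP client for its service, but the branching above is the part that generic SaaS form tools rarely let you express.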
And just like computers have successfully hidden a great deal of math from us, they'll eventually hide a lot of programming from us, if not all. See also: the singularity.
I worry that this will take unnecessarily long, though, because the people best equipped to bring us into a post-code world, i.e. programmers, actually like code too much to want to do it.
Have you ever programmed in an assembly language? Moved bits between CPU registers? What about allocating memory in C? I don't think you're being fair to the amount of math computers are already hiding from us.
In a lot of ways, we're already moving beyond code in the way I think you're envisioning it, too. Apple's toolkit for building iOS apps includes graphical tools for building the interface, where you just drag and drop components the way you want them to appear onscreen. Services like Parse let you build backends without ever writing a line of code. We've actually been down this road with HTML and CSS, too—we had tools like FrontPage and Dreamweaver, but collectively decided that it was better to write our code by hand. We backpedaled away from this code-less future!
Ultimately, I think, we will move away from code for simple things like building apps. But it'll take time to get there… and there will always be something cooler and more complex that you do need code for.
It's not a normative way of thinking, but with practice comes proficiency and mastery. Technology isn't natural by definition…
I disagree. The fact that logic completely dominates our reasoning and decision making, coupled with the phenomenon that all computational expressions can ultimately be modeled as a chain of logical expressions, is one of the biggest reasons we think artificial intelligence may one day fully replicate human reasoning and thought.
The fact that we are such logical beings to begin with is a 'hint' that perhaps our thoughts are computational after all, but we simply possess such natural genius that we think it's 'not natural by definition'.
This is a romantic idea and in my opinion, a pretty accurate summary of things to come. But I think that day is further off than you're suggesting. My guess is that the current situation will prevail for at least the next 10 years. It may never change fully.
As Dan mentioned, math has been around for millennia. It is possible that the limitless possibilities code offers us simply cannot be significantly abstracted without diluting their effectiveness.
But I think that day is further off than you're suggesting. My guess is that the current situation will prevail for at least the next 10 years.
I didn’t actually put forth an estimate, but yours sounds reasonable enough.
The situation I'm referring to is the fact that we have millions of potential designers trying to enter the web development world who feel blocked, because the way websites are currently built is so abstract and so unfamiliar. In a lot of cases it's the exact opposite of what most visual people expect.
I remember feeling that way myself when I was starting out. I persisted and broke through, but countless others don't. So while, as you mentioned, the current situation is a lot better than it was 5 years ago and is improving every day, it's still clearly not optimal.
So many things wrong with the last few statements.
As a practicing programmer who's trying to understand design better, I agree wholeheartedly with this. Just like we're seeing great progress and associated flux in design thinking and tooling (esp. prototyping), front-end programming is also undergoing massive churn, and is an area of specialization that can take you down a rabbit hole as deep as you want it to be. There is language churn (Elm, ES6, TypeScript, Flow, CoffeeScript…), framework revolution (React, Ember, Angular, Mercury, Om, …), flux in CSS methodologies (BEM, SMACSS, SUIT, Atomic), and each of these divisions has many other smaller choices to be made, like testing frameworks, utility libraries, promise libraries, async, and so forth.
The point I want to make is that it will take years of practice, attuning to a different way of thinking, and immersing our already limited mental bandwidth in a different field, for either a programmer to become better at design or for a designer to become better at programming. And a lot of this programming churn, for anyone except a professional programmer, is an unproductive use of our lives. We have to keep up in part because of fashion fads, and in part due to the massive entropy inherent in any changing system. But even for a professional programmer, not all of the churn is worthwhile - if there were a way to figure out only those things that matter, those that add to the essence of our practice, we could simply focus on them and cut out all the noise. This is unfortunately hard in practice.
Very well said, and very down to earth.
"but I’m not so enamored with the mystique of monospaced green text on a black background that I’m blind to how comparatively inefficient it is."
maybe you're just not good at it.
twenty years [ago] I had pixel perfect mockups [..]
Well, there weren't many pixels back then. : )
Doesn't necessarily mean pixel perfect was easy :)
Each one of them was a big decision.
Really great read.
"If there’s any kind of addiction you’ll experience as a designer, I guarantee it’s the rush you’ll get after making something with your own two hands that didn’t exist in the universe until you made it. Getting someone else to make it for you is certainly rewarding, but it’s not nearly the same experience."
Been ruminating on making the leap for quite some time... Hmmm
Great post! Thoroughly enjoyed it.
"These screenshots you have here are plenty. It’s all we’ve ever done before, so there’s really no need to spend this kind of time on a prototype."
“Um… Ok. I guess. If you think so.”
This is the real "lesson" to be taken from this piece. The PM rejected a new way of doing things, because it was different from his usual ways and new methods most often seem like "too much work". And Andrei accepted the PM's rejection, mostly because the PM had the hierarchical authority.
The fact that 20 years ago it was some proprietary Macromedia code is not important. The scene could happen today, and it wouldn't matter if the prototype was pure HTML/CSS/JS, in Framer with CoffeeScript, or in Pixate or Form or Principle.
The final part about "designers should learn to code" is too specific, distracts from this actual point, and sends the whole discussion into "design vs. code" clichés.
And Andrei accepted the PM's rejection, mostly because the PM had the hierarchical authority.
Speaking as a PM today, and having been in the author's shoes before, I just want to point out that over the years I think "hierarchical authority" has been moving toward "do-what-it-takes-to-get-your-job-done".
I agree, "designers should learn to code" as a statement isn't very fair. The author's point and the whole debate should actually be more like "designers should learn to code because ____". He doesn't say it explicitly (and I think he should): learning to code helps build empathy. And maybe one could get away with a career in "this industry" as a designer who doesn't have perfect empathy for the engineers they work with.
But empathy, I would argue, not just for the people consuming your work but also for the people working alongside you, only helps everyone.
Totally agree about the empathy among working peers! That being said, the empathy should go both ways: programmers should learn at least about design principles and the "whys" of design so that they don't feel they're wasting their time when polishing interfaces.
I've always sympathized with Tyler Gaw's statements about why he codes:
Programming is a means to an end. I have ideas and want to see them come to life. I use code to make those ideas reality... I push myself to learn new languages, frameworks, and practices. No language is off limits
I feel like that's what this guy is saying too. And, for what it's worth, that's how I feel about coding myself. It's a way for me to express ideas I have.
The tradeoff of designing in code is that the upfront overhead is bigger, but iterations are fast. Once you've designed a style guide in code, making prototypes is easier and faster than creating mockups in Photoshop or Sketch.
But every designer is different, we should do what works for us. There's no absolute path.
Such an insightful read. It has actually made me rethink the 'should designers code' argument. Whilst you could argue that you could spread yourself too thin (skill-set-wise), I think the positives outweigh the negatives.
In my experience, the closer you can get to the real thing, be it a website or an app, the better. Sure, it takes time, but it saves agonising cycles of flats, hand-holding, and constant explanation.
"It still boggles me to this day how some people in tech find this to be some sort of revelation."
It is probably the youth. It is not their fault; they weren't around then.
Who remembers mbed interactor? I used to do some cool stuff outside of Flash. http://www.prnewswire.com/news-releases/mbed-software-announces-support-for-w3cs-proposed-synchronized-multimedia-integration-language-smil-77364317.html