“I got frustrated one night and started firing rubber bands at my screen to help me think. It was a habit I had back then to shake up my thinking. Probably more practical on a tough glass CRT than on a modern flat screen! After firing a few rubber bands, I was still stuck. So I fired up a doobie to see if that would help. As I flicked my lighter and looked at the fire, it all came together. Fire a rubber band. Fire up a doobie. Fire an event!”
We are overdue as a society for seriously questioning what has become, but what has not always been, the dominant model of “innovation”. Recent weeks have drawn a bold underline beneath what has been clear to many for a long time: that those controlling massive amounts of capital and power in our society are not the smartest, or most level-headed, or most altruistic among us. Venture capital may be the best way to serve the interests of capital, but we need to consider alternative models that prioritize the interests of people.
Every day is science fiction
Science fiction is one of my favorite genres because of its power to make the strange familiar and the familiar strange.
Kim Stanley Robinson wrote an anti-dystopian essay in which he discusses how science fiction works:
For a while now I’ve been saying that science fiction works by a kind of double action, like the glasses people wear when watching 3D movies. One lens of science fiction’s aesthetic machinery portrays some future that might actually come to pass; it’s a kind of proleptic realism. The other lens presents a metaphorical vision of our current moment, like a symbol in a poem. Together the two views combine and pop into a vision of History, extending magically into the future.
I read that and then, a day later, stumbled upon a thought experiment published on the wonderfully quirky website of Ville-Matias Heikkilä.
The thought experiment, titled “Inverted computer culture”, asks the reader to imagine a world where computing is seen “as practice of an ancient and unchanging tradition.”
It is considered essential to be in a properly alert and rested state of mind when using a computer. Even to seasoned users, every session is special, and the purpose of the session must be clear in mind before sitting down. The outer world is often hurried and flashy, but computers provide a “sacred space” for relaxing, slowing down and concentrating on a specific idea without distractions.
What a dream. I encourage you to read the piece which is quite short. It struck me as being exemplary of the aforementioned double action of science fiction—both a vision of the future and a metaphor for the current moment. You can imagine how a fictional immune response to our current culture might drive us toward a world of computing and technology like the one imagined here.
To push it a bit further, I prompted ChatGPT to write a story based on the thought experiment and threw the result into a gist. You can read the story it came up with here.
The story’s alright, but the last paragraph is something else. It captures so many of the feelings I have about computing and the web:
As she sat there, lost in her work, she knew that she would never leave this place, this sacred space where the computers whispered secrets to those who knew how to listen. She would be here always, she thought, a part of this ancient tradition, a keeper of the flame of knowledge. And in that moment, she knew that she had found her true home.
Here’s to all those who know how to listen.
How blogs shaped the web
I have a lot of nostalgia for the era of blogging that I grew up with during the first decade or so of the 2000s.
Of course there was a ton of great content about technology and internet culture, but more importantly to me it was a time of great commentary and experimentation on the form of blogging and publishing.
As social media and smartphones were weaving their way into our lives, there was a group of bloggers constructing their own worlds. Before Twitter apps and podcast clients became the UI playgrounds of most designers, it was personal sites and weblogs that were pioneering the medium.
Looking back, this is probably where my meta-fascination with the web came from. For me the most interesting part has always been the part analyzing and discussing itself.
Robin Sloan puts it well (as he is wont to do):
Back in the 2000s, a lot of blogs were about blogs, about blogging. If that sounds exhaustingly meta, well, yes — but it was also SUPER generative. When the thing can describe itself, when it becomes the natural place to discuss and debate itself, I am telling you: some flywheel gets spinning, and powerful things start to happen.
Design, programming, and writing started for me on the web. I can recall the progression from a plain text editor to the Tumblr theme editor to learning self-hosted WordPress.
All of that was driven by the desire to tinker and experiment with the web’s form. How many ways could you design a simple weblog? What different formats were possible that no one had imagined before?
Earlier this week I listened to Jason Kottke’s recent appearance on John Gruber’s podcast and was delighted to hear them discuss this very topic. Jason is one of the original innovators of the blog form, and I’ve been following his blog, kottke.org, since I was old enough to care about random shit on the internet.
Kottke.org turned 25 years old this week, and Jason has been publishing online for even longer than that. All along the way, he has experimented with the form of content on the web. He’s not alone in that—many bloggers like him have helped to mold the internet into what it is today. The ones that influenced me besides kottke.org are Daring Fireball, Waxy.org, Jim Coudal and Coudal Partners, Shawn Blanc, Rands in Repose, Dave Winer, and more that I’m certainly forgetting.
Jason and John have an interesting conversation during the podcast (starting around 25 minutes in) about how the first few generations of bloggers on the web defined its shape. Moving from print to digital mediums afforded a labyrinth of new avenues to explore.
It’s always important to remind ourselves that many of the things we take for granted today on the web and in digital design had to be invented by someone.
Early weblogs did not immediately arrive at the conclusion of chronological streams—some broke content up into “issues”, some simply changed the content of their homepages entirely.
It wasn’t until later that the reverse-chronological, paginated-or-endless scrolling list of entries was introduced and eventually became the de facto presentation of content on the web. That standard lives on today in the design of Twitter, Instagram, etc., and it’s fascinating to see that tradition fading away as more sites embrace algorithmic feeds.
By the way, I’d be remiss here if I didn’t mention Amy Hoy’s amazing piece How the blog broke the web. Comparing the title of her piece with the title of this one, it’s clear that not everyone sees this shift in form as a positive one, but she does a great job in outlining the history and the role that blogs played in shaping the form of the web. Her particular focus on early content management systems like Movable Type is fascinating.
Another great example that Jason and John discuss on the podcast is the idea of titling blog posts.
They point out that many early sites didn’t use titles for blog posts, a pattern which resembles the future form of Tweets, Facebook posts, text messages, and more. But the rise of RSS readers, many of which assumed that entries have titles and designed their UIs around that assumption, forced many bloggers to add titles to their posts to work well in the environment so popular with their readers.
Jason mentions that this was one of the driving factors for kottke.org to start adding titles to posts!
This is an incredible example of the medium shaping the message, where the UI design of RSS readers heavily influenced the form of content being published. When optimizing for the web, those early bloggers and the social networks of today both arrived at the same conclusion—titles are unnecessary and add an undue burden to publishing content.
This difference is the very reason why sending an email feels heavier than sending a tweet. Bloggers not using titles on their blog posts figured out tweeting long before Twitter did.
When referring to the early bloggers at suck.com, Jason said something that I think describes this entire revolution pretty well.
[…]there was information to be gotten from not only what they linked to, but how they linked to it, which word they decided to make the hyperlink.
It’s not often that you have an entirely new stylistic primitive added to your writing toolbox. For decades you could bold, italicize, underline, uppercase, footnote, etc. and all of a sudden something entirely new—the hyperlink.
With linking out to other sites being such a core part of blogging, it’s no surprise that the interaction design of linking was widely discussed and experimented with. Here’s a post from Shawn Blanc discussing all the ways that various blogs of the time handled posts primarily geared towards linking to and commenting on other sites.
Another similar example is URL slugs—the short string of text at the end of a web address identifying a single post. For many of my favorite bloggers, the URL slug is a small but subtle way to convey a message that may or may not be the same as the message of the post itself. One other stylistic primitive unique to the web.
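For the curious, a slug is usually derived from a post’s title by a small, mechanical transformation. A minimal sketch (real systems like WordPress also handle Unicode, stop words, and collisions between posts):

```python
import re

def slugify(title: str) -> str:
    """Lowercase the title, collapse runs of non-alphanumeric
    characters into hyphens, and trim stray hyphens."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("How Blogs Shaped the Web"))  # how-blogs-shaped-the-web
```

Part of the charm described above is that nothing forces the slug to match the title; a blogger can hand-write it to carry its own small message.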
The different ways in which bloggers designed their sites or linked to words became a part of their unique style, and it gave each of them an entirely new way to express themselves.
It’s hard to communicate how grateful I feel for this era of experimentation on the web, and specifically for Jason Kottke’s influence on me as a designer. The past 25 years have been a special time to experience the internet.
There was a time when I thought my career might curve towards blogging full-time and running my own version of something like kottke.org. Through exploring that I found my way to what I really loved—design and software. My work continues to benefit from what I learned studying bloggers and publishers online.
Whether you care much about writing or not, I encourage you to have a blog. Write about what interests you, take great care of how you present it to the world, and you might be surprised where it takes you. There are new forms around every corner.
I don’t need to be an authority on anything. I don’t need you to agree with my arguments. I know this is probably too long, too broad, and too egotistical for the mass market to read, and you most likely skimmed over it. I wrote this just now, and I’m going to publish it now, even though it’s Sunday and it won’t see peak traffic. I don’t want to write top-list posts 10 times a day. I don’t want to be restricted to my blog’s subject or any advertisers’ target demographic. This site represents me, and I’m random and eccentric and interested in a wide variety of subjects.
Ambient internet
The recent fad of the metaverse is all about digitizing the physical world and moving our shared experiences (even more so) onto the internet.
I wonder what an opposite approach might look like—one where, instead of making the physical digital, we instead attempt to bring the online world into our physical spaces (and no, I don’t remotely mean AR or VR).
The first thing that comes to mind for me is Berg’s now-defunct Little Printer project from back in 2012 or so. Little Printer was a web-connected thermal printer that lived in your home and allowed you to receive print-outs of digital publications, your daily agenda, messages from friends, etc.
Little Printer was an attempt at bridging the physical and digital, essentially creating a social network manifested as a physical object in the home and consumed via paper and ink.
Personal websites are the digital homesteads for many. Those sites live somewhere on a web server, quietly humming away in a warehouse meant to keep them online and secure. For each of us those servers represent empty rooms waiting to be decorated with our thoughts, feelings, interests, and personalities. We then invite strangers from all over the world to step inside and have a look.
Like the Little Printer, I wish that my web server could exist in my home as a physical object that could be touched, observed, and interacted with.
Hosting a web server yourself is surprisingly difficult today given the advances we’ve made in consumer technology over the last few decades. Hosting content on someone else’s server has become as simple as dragging and dropping a folder onto your web browser. There are countless businesses that will happily rent out online space for very cheap (or even free, with the hope that eventually you’ll upgrade and give them money).
We’re all tenants of a digital shopping mall, sharing space controlled by corporate entities who may not share our values or interests.
When someone visits my website, I wish it could feel more like inviting them into my home. What if my website lived in my home with me?
Imagine if having a web server in the home was as common as any other appliance, such as a refrigerator. You might look over and see your friend (or a welcome stranger!) browsing your website. You could see what they’re browsing—look at photos with them, listen to a song together, whatever—and start a conversation about any of it.
I’m certainly not the only one who has imagined this. A while ago I stumbled upon a project by a student named Jeeyoon Hyun called “Personal Pet Pages”, which is a small, personal web server with a friendly screen displaying what’s going on inside the server.
Ever since we’ve decided that servers are something heavy, enigmatic, gigantic black boxes belonging to corporations - not individuals - we have slowly lost agency towards our own small space on the Internet. But actually, servers are just computers. Just as your favorite cassette player or portable game console, they are something that you can possess and understand and enjoy.
Jeeyoon’s idea turns a web server into a sort of virtual pet, one that you can move around and interact with.
Matt Webb has also considered the idea:
It is boundary-violating, to have a website in the corner of your bedroom. Websites are meant to be in the cloud. Eternal, somehow, transcendent, like the voice of code floating down from the sky. But no, there it is. It is real! I can kick it! Argumentum ad lapidem.
Those fixated on the idea of the metaverse are interested in bringing real-world objects into the cloud. I wonder instead how we might try to bring objects from the cloud into the real world and into our homes. How would we design webpages differently if our materials included the servers that they’re hosted on?
Our obstacle is that we live in an attention-deficit culture. We are bombarded with more and more information on television, radio, cell phones, video games, the Internet. The constant supply of stimulus has the potential to turn us into addicts, always hungering for something new and prefabricated to keep us entertained. When nothing exciting is going on, we might get bored, distracted, separated from the moment. So we look for new entertainment, surf channels, flip through magazines. If caught in these rhythms, we are like tiny current-bound surface fish, floating along a two-dimensional world without any sense for the gorgeous abyss below. When these societally induced tendencies translate into the learning process, they have devastating effect.
— Josh Waitzkin, “The Art of Learning”
Remember the Hockney photos? The size of what we’re making is unknown until we know what we’re putting there. So, it’s better to come up with an arrangement of elements and assign them to a size, rather than the other way around. We need to start drawing, then put the box around it.
Simply put, the edgelessness of the web tears down the constructed edges in the company. Everything is so interconnected that nobody has a clear domain of work any longer—the walls are gone, so we’re left to learn how to collaborate in the spaces where things connect.
an edgeless surface of unknown proportions comprised of small, individual, and variable elements from multiple vantages assembled into a readable whole that documents a moment
So this is a good start, but it is only a start. Could those simple sites I showed earlier assist us beyond the page and provide a larger way to think? To put a finer point on it: What would happen if we stopped treating the web like a blank canvas to paint on, and instead like a material to build with?
The web is forcing our hands. And this is fine! Many sites will share design solutions, because we’re using the same materials. The consistencies establish best practices; they are proof of design patterns that play off of the needs of a common medium, and not evidence of a visual monoculture.
I believe every material has a grain, including the web. But this assumption flies in the face of our expectations for technology. Too often, the internet is cast as a wide-open, infinitely malleable material. We expect technology to help us overcome limitations, not produce more of them. In spite of those promises, we typically yield consistent design results.
The awe goes—time takes it.
The speed with which Twitter recedes in your mind will shock you. Like a demon from a folktale, the kind that only gains power when you invite it into your home, the platform melts like mist when that invitation is rescinded.
The amount that Twitter omits is breathtaking. More than any other social platform, it is indifferent to huge swaths of human experience and endeavor. I invite you to imagine this omitted content as a vast, bustling city. Scratching at your timeline, you are huddled in a single small tavern with the journalists, the nihilists, and the chaotic neutrals.
sometimes, looking back at an accounting of it all, i’m disappointed that it doesn’t add up to quite as much as i thought it would by now. it’s not a clean narrative. i had always believed that i was a certain kind of person: decisive, confident, brilliant, glamorous, empathetic. these days, battered by the reality of going up against the world, i’m more tired and less sure.
Where it all began
I remember the first time I saw a Mac in person. I was in middle school, but on the campus of the nearby college because my dad had a gig as a stand-in drummer for a local band.
While hanging out backstage—something I often had the privilege of doing from a young age as the son of a drummer—I saw a girl, sitting on the ground, typing away on a brand new MacBook Air.
The Air had just been introduced to the world, and I remember rewatching the announcement video online. Steve Jobs talked about the computer at Macworld only to reveal that it had been on stage with him the entire time inside a manila envelope. He opened it and pulled out the thinnest computer in the world. I had no idea a computer could even look like that.
After my dad’s show I immediately pointed out the girl and her computer, and I remember him sharing my excitement so much that he asked the girl if we could look at it a bit closer. She was kind and happy to show it off and even let me hold it. From then on, I was hooked. I knew that’s the computer I’d own one day, and sure enough I’d get my first Mac, a MacBook Air, a few years later in high school.
And now Apple has introduced a MacBook Air thinner than the original iPhone. I wonder what middle school me, who coveted but did not own an iPhone at the time, would think about that.
I received the new M2 MacBook Air (in Midnight) a few months ago and I’ve been smitten with it. It is a cool, dark slab of silent compute, and it feels dense and book-ish in the most satisfying way.
The battery life deserves its own mention, and feels like a leap ahead for personal computers in its own right.
In all honesty I thought the time had come when a computer could no longer really excite me in the way that the original MacBook Air did. But this new one takes me right back there. It reminds me how lucky we all are to carry around devices that can conjure up all sorts of magic. And it takes me back to my beginnings in software, when people wrote about the design of new iOS and Mac apps like they were art critics.
My life and friends and relationships and career are all in there, wound up with the electrons.
In setting up and using this new computer for the first time, however, I’ve realized how much devices today are like shells. The real computers, the ones that store our data and perform tasks on our behalf, are behemoths sitting in data centers. Setting up a new computer today is mostly a task of signing into various web applications to access your data, not transferring data onto the machine itself.
Our computers have become internet computers. And that might mean that the physical devices we own will trend towards nothingness—their goal is no longer to impress or inspire, but to be so small and light as to fall away entirely.
There’s something about that which makes me feel a bit melancholy. It feels like the days of computing devices being objects with personality and conviviality are fading. The computer is no longer a centerpiece, it’s an accessory, a thin client for some other machine or machines which are hidden away from us.
As a deeply lonely teenager, I learned that I could earn others’ regard and become valued in a community by “doing cool stuff on the internet.” So, even today, my automatic response to these fears is to switch to an activity which produces some kind of visible output. Make a prototype, write up some notes, sketch a concept. These are appropriate behaviors at times, of course, but not when pursued as fearful substitutes for what I’m actually trying to do.
Why is this so hard? Because you’re utterly habituated to steady progress—to completing things, to producing, to solving. When progress is subtle or slow, when there’s no clear way to proceed, you flinch away. You redirect your attention to something safer, to something you can do. You jump to implementation prematurely; you feel a compulsion to do more background reading; you obsess over tractable but peripheral details. These are all displacement behaviors, ways of not sitting with the problem. Though each instance seems insignificant, the cumulative effect is that your stare rarely rests on the fog long enough to penetrate it. Weeks pass, with apparent motion, yet you’re just spinning in place. You return to the surface with each glance away. You must learn to remain in the depths.
My favorite aspect of websites is their duality: they’re both subject and object at once. In other words, a website creator becomes both author and architect simultaneously. There are endless possibilities as to what a website could be. What kind of room is a website? Or is a website more like a house? A boat? A cloud? A garden? A puddle? Whatever it is, there’s potential for a self-reflexive feedback loop: when you put energy into a website, in turn the website helps form your own identity.
Design is the process of taking the available data and coming up with its representation. The outcome is reasonably well specified and understood.
Discovery is about the transformation (usually expansion) of that input. It’s the evolution of the design. The uncovering of new states and new ideas throughout the process itself.
We used to have a map of a frontier that could be anything. The web isn’t young anymore, though. It’s settled. It’s been prospected and picked through. Increasingly, it feels like we decided to pave the wilderness, turn it into a suburb, and build a mall. And I hate this map of the web, because it only describes a fraction of what it is and what’s possible. We’ve taken an opportunity for connection and distorted it to commodify attention. That’s one of the sleaziest things you can do.
And you know, these little animations look awfully similar to animated GIFs. Seems that any time screens appear, some kind of short, looping animated imagery of animals shows up, as if they were a natural consequence of screens.
Just like any material, screens have affordances. Much like wood, I believe screens have grain: a certain way they’ve grown and matured that describes how they want to be treated. The grain is what gives the material its identity and tells you the best way to use it. Figure out the grain, and you know how to natively design for screens.
The interfaces we build are where we put the padding. You give a user something to grasp onto when you make a metaphor solid. In the case of software on a screen, the metaphors visually explain the functions of an interface, and provide a bridge from a familiar place to a less known area by suggesting a tool’s function and its relationship to others.
Today’s internet is largely shaped by a dialog between two ideas. One position considers personal data as a form of property, the opposing position considers personal data as an extension of the self. The latter grants inalienable rights because a person’s dignity - traditionally manifested in our bodies or certain rights of expression and privacy - cannot be negotiated, bought, or sold.
What remains explicitly clear is the fact that folks are not gathering in the digital equivalent of parks and town squares, they are gathering in online centers of commerce. Our digital public spaces, often called “platforms,” are really purpose-built shopping malls.
I believe this is because the comparison to films misses a central property of interfaces that is so constitutive that it outweighs the other similarities: Agency — it is human action that is indispensable to an interface. Like visitors to a building, users of an interface are given the agency to choose their own path, to move through it at their own speed and discretion: to wander and to linger, to move swiftly and purposefully, or to explore. Another striking similarity is that interfaces are, like buildings, never experienced all at once, but piecemeal: screen by screen, or room by room. Only in the user’s mind are they shaped into a coherent entity, are seen as a uniform whole.
Even traditional user interfaces are fundamentally three-dimensional — the third dimension in this case being time — and in this regard, they are similar to films.
this year, i hope to come to trust myself more. i hope to know when i need to love my work deeply, and when i need to be able to set myself free. i hope to find more balance and more quiet—in the world around me, yes, but especially in the recesses of my mind. i hope to love people for exactly who they are, knowing that a person's strengths and their flaws are often two sides of the same coin. i hope to want more for myself—and not the kind of wants manufactured for me by brands on instagram or thought leaders on twitter or microtrendsetters on tiktok, but the ones for which my soul hungers, the ones that replenish and renew me. little wants and big wants, but my own wants. i hope to think less and do more. i hope to grow stranger. i hope to get better at hoping against hope.
it's hard, living in such persistently unprecedented times, to know what is the natural process of aging and what's the specific peculiarity of aging in this time.
there's a running joke (is joke the word?) on twitter that we're all still stuck in 2020, or that we're about to begin year eight of 2016. in my own life, at least, that has felt true. 2016 is the last year i can recall feeling deeply optimistic about what the new year would bring, for me and for the world at large. since then, the fragile hopes i bore for each new year have been flattened again and again into the formless sameness of a world where time means nothing and yet somehow everything manages to keep getting worse.
This is truly a core guiding methodology to how I approach the web: as a composable, iterable, resilient thing. Something that invites creation, play and generative exploration.
Over the phone, Susan tells me all kinds of things. That she used her social engineering skills to sneak past military checkpoints and into Area 51. That she went dumpster diving with a young Charlie Sheen. That she figured out how to set off US missiles from a phone booth—a feat Kevin Mitnick was once accused, famously, of being capable of pulling off. That she once sprang an accomplice from jail over the phone, posing as a clerk from a different precinct.
“Whether I… perform some kind of ruse to gain access, or whether I just go seduce the guy and blackmail him afterwards… if I want to get into that computer, I’m going to get into it,” Susan said at the conference, as her almost entirely male audience laughed nervously. “That’s one advantage women hackers have over you guys,” she added, “if you’re willing to use it.”
She and her new friends cruised the city at night, searching for unsecured dumpsters outside of phone company offices. The manuals and interoffice memos they pilfered from the trash were maps to the parts of the phone network that were hidden from view. By leveraging the information they found dumpster diving — everything from internal jargon to access codes and employee names — they were able to pull more complex and ambitious scams.
She claims to be one of only three women to have slept with all four Beatles, securing the trickiest, Paul McCartney, through an elaborate pretext that involved having his wife Linda whisked away in a limo for a staged photoshoot. When she was still underage, she hitch-hiked to Vegas with Johnny Thunders (no relation) from the New York Dolls. In a 1979 tabloid tell-all, she’s pictured with Andy Gibb, Donny Osmond, and Ringo Starr. Once, tearing down the Pacific Coast Highway in a convertible Mercedes, “flying on coke” with Mick Ralphs, the guitarist for Bad Company, she decided she must be immortal — a theory she’d test with enough overdoses that she considers herself lucky to be alive today.