Pasting the Web


Short-linking—that act of repackaging ungainly, often ugly strings of letters, numbers, ampersands, and question marks into elegantly tiny URLs—has been around for more than a decade, but only gained mainstream traction with the 2006 launch of Twitter and its capping of tweet-length at 140 characters. While the mechanics are complicated, the short story about the recent techie flare-up over short-linking is that Twitter has moved away from third-party shorteners to its own, the use of which is now mandatory for all links shared directly on the service. For the uninitiated, here's an example:

Long link for this piece:

Short link for this piece:
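At its core, a link shortener is just a lookup table: generate a compact code, store the long URL under it, and redirect whenever the code comes back. Here's a toy sketch in Python; the "sho.rt" domain and the base-62 counter scheme are illustrative assumptions, not a description of how Twitter's service actually works.

```python
import string

# 62 URL-safe characters: a-z, A-Z, 0-9
ALPHABET = string.ascii_letters + string.digits


class Shortener:
    """Toy link shortener: maps long URLs to short base-62 codes."""

    def __init__(self):
        self._db = {}       # code -> long URL
        self._counter = 0   # running ID; real services persist this

    def shorten(self, long_url):
        # Encode a running counter in base 62 to get a compact code.
        n, self._counter = self._counter, self._counter + 1
        code = ""
        while True:
            code = ALPHABET[n % 62] + code
            n //= 62
            if n == 0:
                break
        self._db[code] = long_url
        return "https://sho.rt/" + code  # hypothetical short domain

    def expand(self, short_url):
        # The redirect step: strip the domain and look the code back up.
        return self._db[short_url.rsplit("/", 1)[-1]]


s = Shortener()
long_url = "https://example.com/a/very/long/path?with=query&params=1"
short = s.shorten(long_url)
```

The whole trick, in other words, is a database lookup plus an HTTP redirect, which is also why whoever runs the table sees every click that passes through it.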

It's been a small shift, little noticed by most, but now that trading links on social media—Hey, check out this video of Hillary yelling at Republicans!—is a main feature of the Internet-connected world, short-links are part of the real fabric of the web. Deep in some software-developer circles, not all are pleased with Twitter's "link wrapping."

The rest of us might only think about it, if at all, when it changes the way we expect the Web to work. Take copy-and-paste. Go onto Twitter and use your browser's link-copy feature ("Copy link address" in Chrome, for example) to copy a short URL, then paste it into the address bar. What had been a link through the Prospect's URL shortener is now a "t.co" link. (Another thing about Twitter's short-linking that can make developers' heads burst: the links it produces can end up longer than the ones they replace.)

"I was wondering what I was doing wrong," says Larry Tesler, on a phone call, about copying one link on Twitter and getting another. Tesler would be the one to notice—he's the guy who invented copy-and-paste back at the famed technological playground of Xerox PARC in the 1970s. Tesler had been searching for a way to equip computer users to move text around easily, no matter the mode or program they happened to be in.

Links, as inventor of the World Wide Web Tim Berners-Lee has put it, "turn the Web's content into something of greater value: an interconnected information space," and one of the more concrete objections to modern short-linking is that we're undoing years of self-training in navigating that space. We use long links to tell us where we're headed and to decide whether we want to go there. "If someone changes a URL on you, you're thinking it's a virus," adds Tesler. Indeed, we've spent years learning—and teaching others—that if you mouse over a link and it looks fishy, run away.

That responsibility has now been outsourced to Twitter. In fact, one of the justifications Twitter gives for converting custom short links to its own is that those links often mask spam. Part of the power of "t.co" is that Twitter can match the corresponding long links against a database of malicious URLs and protect users from them.

But beyond that, Twitter has many reasons to want to act as the middleman on short-linking. More than ever, digital data is currency. Knowing who clicks what can be used to power everything from statistics add-ons for premium accounts to targeted advertising and advanced market research.

No one's suggesting marching in the streets over annoying copy-and-paste outcomes. (Well, except maybe on OneWebDay.) But it should serve as a signal of something larger, sort of like how you might realize bigger issues are at play when your pants suddenly don't fit.

Earlier this winter, web veteran Anil Dash caused a stir in some corners of the online world with a post titled "The Web We Lost." In it, Dash took aim at "bullshit turf battles"—e.g., Twitter restricting Tumblr's use of a "friend finder" or Instagram opting out of displaying photos on Twitter. According to Dash, such practices only teach up-and-coming entrepreneurs to "make more narrow-minded, web-hostile products."

The good news, perhaps, is that we the people of the Internet are getting better at noticing and responding to big grabs for control. When, for example, Instagram's new terms of service seemed to suggest that the company could turn its users' photos into uncompensated endorsements, the reaction was fierce. Much attention went to Kim Kardashian's swift, past-tense lament. "I really loved Instagram :-( " she tweeted. "I need to review this new policy. I don't think its [sic] fair." The company scrambled to reassure users that that wasn't what it meant, but either way, its user base shrank.

Kardashian, of course, has built an empire on the compensated endorsement, but there are scores of us who are eager to use the Web, to borrow a phrase from the anthropologist Mary Catherine Bateson, "to compose our own lives." Here's one data point: Bitly, a company that helps users track clicks and other metrics for custom short URLs, reports that it's currently doing so for 30,000 publishers of all shapes and sizes. The Internet is the greatest tool history has ever known for such micro-entrepreneurialism, whether professional or personal. But increasingly, warned Dash, it's being built "to make a small number of people even more wealthy, instead of letting lots of people build innovative new opportunities for themselves on top of the Web itself."

The argument being made is simple: Instagram's big policy changes are important. But if we're interested in whom the Web is being built for, tiny tweaks like Twitter's evolving handling of short links matter, too.

Alas, the evidence suggests that we're not so good at recognizing when things are being designed in ways we might not want: not just Twitter's handling of links, but how handcuffed users are in editing comments on Instagram, or the challenges of extracting old tweets from Twitter, or the rigamarole of having to go through the Mac App Store to download TweetDeck. Little things, yes, but they challenge the values-laden expectations of those of us who remember when the Internet was uphill both ways. The Web was writable. You owned what you posted. The Internet was decentralized, without gatekeepers.

To seemingly minor design choices, we're hardwired to adapt. When we push a door the wrong way, our first thought is rarely, Huh, someone did a bad job designing that door. "Invariably," writes usability expert Don Norman in his classic book The Design of Everyday Things, "people feel guilty and either try to hide the error or blame themselves for 'stupidity' or 'clumsiness.'"

That's all the more true for the technologies—Facebook, Twitter, our iPhones—with which our love is, well, kinda intimate. "We forgive them," said Norman in a recent call on tech design, "their slights and faults."

So, simply put, maybe we should quit doing that. Maybe we should open ourselves up to the possibility that not only is the Web writ large not being designed the best way it could be, but that we're capable of doing it better. Maybe the guy who invented copy-and-paste isn't copying-and-pasting wrong. Maybe we're building the Web wrong.

"Why should you care?" Berners-Lee asked in a 2010 Scientific American piece. "Because the Web is yours. It is a public resource on which you, your business, your community, and your government depend. We create the Web, by designing computer protocols and software; this process is completely under our control. We choose what properties we want it to have and not have." Protocols. Process. Properties. The simple things.

It's a vision of Internet stewardship that demands not just rallying over the egregious; it also calls for us to be thoughtful about the mundane.
