Young people, millennials, gather and let me tell you about a golden era: the early days of the internet. The primary places for meeting new people were chatrooms and newsgroups. You had to use a modem to get online, so every moment was precious. And the greatest virtue of the age – the defining value that is now nearly lost to us – was shutting the hell up.
In the very early days of the internet, it was sometimes considered rude to “lurk” – to listen but not participate – on bulletin board systems (BBSs). But BBSs had very limited resources (some could only support one user at a time), so it made sense that everyone had to take a turn: if you were reading but not participating, other people couldn’t connect.
By the time I was actively using BBS successors like Usenet newsgroups and Internet Relay Chat in the mid- to late 90s, lurking for a little while was practically a requirement. The newsgroups I participated in recommended that newbies hang out for at least a few days and preferably more, absorbing a sense of the culture and personalities involved. Woe betide those who charged in flapping their gums, asking questions that had already been covered multiple times.
The same applied in IRC channels: people could see you join the conversation, so it wasn’t totally possible to slink in unnoticed, but it was best not to come in with your ass on your shoulders. Successful new members stayed quiet, watching the flow and tone of conversation, until they had a reason to pipe up.
I try not to get too Andy Rooney about the state of the internet; the fact that it’s in flux is one of its great benefits, and many of the changes are for the better. (The early days were quieter, but that’s partly because access was more restricted – which is to say, more classist, elitist and US- and European-centric.) But there’s no doubt that social media encourages a “talk first, ask questions later or never” approach, and it drags discourse well below the lowest common denominator. There’s a preponderance of bad-faith arguers launching tiresome straw-man attacks, sure, but even the well-meaning can ruin a conversation by barging in demanding answers to basic questions. Your sincere but uninformed Facebook pal, who truly just wants to understand why it’s not #AllLivesMatter or which pronoun to use for Caitlyn Jenner, is the social media equivalent of the friend who wanders in halfway through the movie and says: “Who’s that? What are they doing?”
It’s easier than ever to inform yourself before you chime in online, but that’s not the culture anymore. Social media is social; it encourages communication and interaction from the get-go, not observation and reflection. Far from lurking to get the lay of the land, you’re expected to tweet to announce that you’re tweeting. (All four Twitter founders posted “just setting up my twttr” within about 10 minutes of each other on the service’s first day of operation.)
Even once you’ve posted your first tweet, there are plenty of opportunities for benevolent lurking – opportunities most people don’t make use of. There are many different Twitters – black Twitter, feminist Twitter, media Twitter, Weird Twitter – with their own personalities and vocabularies and traditions and arguments, and a conscientious user could and should listen carefully to each before participating. The same goes for Facebook groups, Tumblr communities and anywhere else that people gather online. You will always contribute better, learn more, embarrass yourself less if you lurk for a while, if you watch which arguments play out and how and by whom. And yet, in our pipe-up-happy online culture, few people try.
The “social” aspect of social media may feel like it encourages us to be always on, always talking to each other, even when a step back would reveal that our contribution isn’t necessary. But in the offline world, “social life” has a range of meanings, from constant chatter to quiet companionship. It’s important to remember that media can be social without being relentlessly extroverted. Lurk for a while when you can.
This article was corrected on 4 June 2015 to indicate that the author meant Andy Rooney, not Mickey Rooney.