Facebook Is the New AOL

Nilay Patel, for The Verge:

Just think about it for a minute. Of course Facebook is the new AOL.

Honestly, Nilay Patel’s latest op-ed for The Verge has a pretty catchy headline. And while it is, indeed, both fun and trendy to paint Facebook as the social network of our mothers, the point is well made: from the nerdy corner of the internet where sites like The Verge and Gizmodo thrive, Facebook truly does seem to be losing its prowess amongst tweens and 20-somethings.

It wasn’t always that way. In 2006, I was a sophomore in college when my friend recommended I drop Myspace like it was hot in favor of Facebook. What? Eschew the beautify-my-homepage-for-dummies HTML editors for Facebook’s bland setup? Yeah right.

Over the next 6-12 months, Facebook took off. By the time it did, I had hundreds of Facebook friends, ranging from random people whom I had never met, all the way to pop superstars like Phil Collins. Alas, I have not met him either. Anyone who was anyone was using Facebook. Myspace? Pshh. That was so yesterday.

Today, many still argue that Facebook has an enduring presence that could never be supplanted. At least, not anytime soon. Starting in 2010, however, Twitter and Instagram and a host of other social media services started to gain a foothold. And in doing so, the newer generation of trendy services started to swallow up the younger people from Facebook. This target audience, after all, has historically been the group most social media services are determined to command as quickly as possible. Once the slow bleeding of the target audience begins, who is left? The moms. And the grandmas. And police departments fending off terrible PR regarding the unlawful shooting of canines.

Why does this happen? Has this happened before?

Of course it has. It happens all the time.


Rewinding the tape a few more years brings us to the year 2000. I was in the 7th grade. And despite all the Y2K hubbub, my grandparents thought it best to fashion our household with a personal computer, so we children could learn ourselves. Typing was very important, they felt. My grandmother worked for a large firm in Silicon Valley that contracted with government entities to build electronic components destined for F-16s, among other things. Pretty gangster if you ask me. Living and working in San Jose brought with it a certain reverence for The Valley. And as such, her exposure to the personal computer was mostly rooted in the romantic concept of entrepreneurial nerds making revolutionary products. That is, when she wasn’t being personally terrified of the personal computer. The last thing she wanted was for her grandchildren to be as intimidated, if not paralyzed, by the prospect of working with computers as she was. So she got us a computer: the Compaq Presario 5280. Thing had an Iomega Zip drive and everything. And just like everyone else, we never used the Zip drive. Still: so tight.

My parents were about as scared of computers as my grandparents were, and my younger siblings weren’t old enough to use the computer anyways. So it was pretty much all me. When I wasn’t playing N64, I was using the computer. There wasn’t even Windows XP in 2000. It was all about Windows 98. As a tinkerer, I naturally liked navigating Explorer and perusing the intricacies of the Control Panel to figure out the best ways to complete xyz tasks. Sure, we had Encarta, but nothing was cooler than fiddling. I became a master of MSConfig. My parents warned me to be careful. “You might break it,” I can hear them saying. ‘Silly rabbit,’ I thought. (Someday I’ll talk about the greatest sitcom of the ’90s.)

But they were right. I did break it. In fact, I broke a lot of things. Nothing hardware-related, mind you. Trust me: the software ‘things’ I broke were just as devastating. Somehow, I accidentally configured the system to be muted. And it wasn’t something stupid like activating system mute from the taskbar. No, it was much deeper than that. I don’t even remember exactly what it was that I did, but I did something. Perhaps I had disabled the sound card from within the ‘Control Panel --> Devices’ section (or whatever that section was called).

Eventually I fixed it. I fixed a lot of things, actually. My crowning achievement was purging, to the best of my ability, all the stupid bundled software that so clearly was the status quo for Windows machines. Add / Remove Programs style, you see. Nobody liked that junk. (And non-Apple’rs are astonished when we fanbois scoff at the laptops of non-fanbois, adorned as they are with such pretty Intel Inside stickers.)

But not everyone was like me. In fact, for most people, the perception of the personal computer—despite its increasing place in the workforce and, well, everywhere—was still one of fear.

The iMac came out two years before our Compaq, but didn’t sell nearly as many units as the now-HP-owned beast. Without delving into the annotated history of Apple 2.0, suffice it to say that it took several other contributing factors for Apple computers to attain their current level of notoriety and fandom.

One of those factors was the ease of use the Mac OS provided. And if I were in 7th grade today, and my grandparents were still alive, there is no doubt in my mind that our family would have been gifted with a new iMac. Imagine how much better our computer experience would have been without the awkward Windows phase?

It is an interesting idea: what if we could just go back in time and erase all the deprecated ‘things’ in our life, and re-live them with the good stuff we know now? Imagine how much better things would be? Imagine, imagine, imagine.

Interesting, yes, but that notion completely misses the point.

Things change for a reason. Something new and better can only come around if something old and less better was there first. It is the nature of iteration.

Another example: apparently, people go to movie theaters less nowadays. Anecdotally, I can see this being a real thing. Moreover, I can imagine a future when people just don’t go to the movie theaters at all. Why should they? We live in a society where most things are on-demand. Why can’t movies traditionally shown first in theaters instead be shown in the comfort of our living rooms? Why shouldn’t the movie industry adapt to changing media-consumption habits? (That is, in fact, the greatest criticism of the MPAA et al.)

Maybe in ten years I will be writing a similar post, one in which I detail the fall of Snapchat and the rise of whatever the next big thing will be.