What if Facebook never existed?

When the service initially known as Twttr debuted in July 2006, it was intended as a way to use text messaging to send on-the-go updates to friends. As Twitter started to catch on, the fact that Facebook existed may have actually helped its cause: the ways in which it was like Facebook and unlike Facebook both helped shape its identity. But in a world without Facebook, Google might be even more powerful today than it already is.

The conventional wisdom at the time was that the way to compete with Google was to build a better search engine, which companies such as Yahoo and Microsoft tried to do. And unlike other companies Google coveted, such as the advertising giant DoubleClick, Facebook refused to be acquired, by Google or anyone else. Of all the services Facebook has competed with, the one it cribbed the most ideas from was FriendFeed — a startup, founded in 2007, which had a Like button before Facebook did, plus various other features Facebook later borrowed.

Then, in 2009, Facebook simply bought FriendFeed. Very quickly, Facebook insinuated itself into the rest of the web in a way that was unique, though Google, Twitter and others later followed its lead. Some sites even use Facebook to power their comments. The social network is like a thread that stitches together the whole Internet, giving it some cohesiveness it would otherwise lack, and nobody else had anywhere near as much success providing this sort of plumbing for other sites.

In 2006, when Facebook introduced the News Feed — a unified list of what your friends were up to — there were howls of outrage and accusations that it encouraged stalking; before long, it became impossible to imagine Facebook without it. Facebook often stumbles, and sometimes it just goes too far — most famously in the case of Beacon, its feature that auto-posted public updates about what members were buying at third-party shopping sites. Some people will never trust the site.

You see something that annoys you or that you feel strongly about, and you join a Facebook campaign group — problem solved. But think about it: how often do we actually see our friends in the flesh? When was the last time you were invited to a reunion or an old friends' get-together? A strange phenomenon occurred in offices and workplaces everywhere just recently.

But what was it that caused this strange and uncharacteristic behavioural shift?

The Doomsday Machine, a Cold War thought experiment described by the strategist Herman Kahn, was imagined as a computer wired to a vast stockpile of nuclear weapons and a network of sensors. If radiation levels suggest nuclear explosions in, say, three American cities simultaneously, the sensors notify the Doomsday Machine, which is programmed to detonate several nuclear warheads in response.

At that point, there is no going back. The fission chain reaction that produces an atomic explosion is initiated enough times over to extinguish all life on Earth.

There is a terrible flash of light, a great booming sound, then a sustained roar. We have a word for the scale of destruction that the Doomsday Machine would unleash: megadeath. Nobody is pining for megadeath.

But megadeath is not the only thing that makes the Doomsday Machine petrifying. The real terror is in its autonomy, the idea that it would be programmed to detect a series of environmental inputs and then act, without human interference. The concept was to render nuclear war unwinnable, and therefore unthinkable. Yet Kahn concluded that automating the extinction of all life on Earth would be immoral.

Now we need to learn how to survive the social web. People tend to complain about Facebook as if something recently curdled, but the trouble is not new; it is in these platforms' very architecture. I pressed Zuckerberg on this once, and he laughed. No one, not even Mark Zuckerberg, can control the product he made.

The social web is doing exactly what it was built for. Facebook does not exist to seek truth and report it, or to improve civic health, or to hold the powerful to account, or to represent the interests of its users, though these phenomena may be occasional by-products of its existence.

Facebook is an agent of government propaganda, targeted harassment, terrorist recruitment, emotional manipulation, and genocide—a world-historic weapon that lives not underground, but in a Disneyland-inspired campus in Menlo Park, California. Somewhere along the way, Facebook decided that it needed not just a very large user base, but a tremendous one, unprecedented in size.

That decision set Facebook on a path to escape velocity, to a tipping point where it can harm society just by existing. The limitations of the Doomsday Machine comparison are obvious: Facebook cannot in an instant reduce a city to ruins the way a nuclear bomb can. And whereas the Doomsday Machine was conceived as a world-ending device so as to forestall the end of the world, Facebook started because a semi-inebriated Harvard undergrad was bored one night.

But the stakes are still life and death. Megascale is nearly the existential threat that megadeath is. Facebook's business runs on attention and engagement, and those incentives lead to design choices such as reaction buttons that encourage users to engage easily and often, which in turn encourage users to share ideas that will provoke a strong response. Every time you click a reaction button on Facebook, an algorithm records it and sharpens its portrait of who you are.

Facebook has enlisted a corps of approximately 15,000 moderators, people paid to watch unspeakable things — murder, gang rape, and other depictions of graphic violence that wind up on the platform. Even as Facebook has insisted that it is a value-neutral vessel for the material its users choose to publish, moderation is a lever the company has tried to pull again and again. At megascale, this algorithmically warped, personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.

Facebook has conducted social-contagion experiments on its users without telling them. Facebook has acted as a force for digital colonialism, attempting to become the de facto and only experience of the internet for people all over the world. Facebook has bragged about its ability to influence the outcome of elections.

Unlawful militant groups use Facebook to organize. Government officials use Facebook to mislead their own citizens, and to tamper with elections.
