Posts Tagged social media

Faceless in the crowd

As in a good fantasy story, there are parts of the Internet you just shouldn’t go to.

Peruse the comments section of just about any website, and you’re likely to run across vitriol-spewing trolls hurling obscenities (and sometimes even rape or death threats) in arguments about seemingly everything.

In “The Epidemic of Facelessness,” a recent opinion piece for The New York Times, Stephen Marche attributes the rise of casually monstrous behavior on the Internet to the fact that attackers never see their victims’ faces.

Pulling examples from a diverse catalog that includes ancient Roman law, French phenomenology, and neuroscience, Marche argues that actually seeing another person’s face is the key to empathy.

That doesn’t typically happen online, hence the ease with which rape and death threats get thrown around.

It also means people need to work to imbue others with humanity. Attackers need to realize the people they’re threatening are, well, people, and their attacks should be understood in the context of a complex human psyche.

Remembering not to leave our humanity behind when we go online is an admirable and necessary goal to work toward, but it will likely get harder as we rely more on indirect digital communication.

Because while society still shuns Internet trolls, it also continues to devalue human interaction for the sake of performing discrete tasks more efficiently.

That’s what digital technology does. It lets us do everything from shopping to wishing each other “Happy Birthday” quickly, cleanly, and efficiently.

Saving money and time is good, of course, but it’s possible this obsession with digital efficiency is also grooming people to be less tolerant of each other.

The number of situations where strangers are forced to interact candidly in everyday life is diminishing. Does using one of those self-checkout machines really save that much time, or do you just prefer not having to exchange pleasantries with a human cashier?

It’s not that people need to be in the mood to talk to each other all of the time, but with Internet-related technology making it so easy to put each other at a distance, it’s hard to see how the “epidemic of facelessness” can be cured.

Beneath the shiny confidence of Silicon Valley futurism, the way of life being constructed around the Internet is potentially damaging to human empathy, even if it is easier.


The algorithms of progress

After 200 posts, I still have a love/hate relationship with the Internet.

I mean that in the most literal sense: I love the opportunities the Internet has made possible, but I hate most of what comes with using it and interacting with people through it.

Without the Internet, I wouldn’t have a job right now. I certainly wouldn’t be able to cover the car industry from a house in Connecticut.

However, the Internet has also devalued skills.

For many jobs, remote working has opened up a pool of applicants that literally spans the nation. People with job-specific skills are much more interchangeable than they ever have been.

That’s great if, like me, you want to write about cars without moving to Detroit, but it also means that being good at something just doesn’t cut it anymore.

People are expected to bring much more than relevant skills to a job; they’re expected to bring specific training, connections, and name recognition.

Some call this the entrepreneurial spirit; I call it blurring the line between work and life.

Because when people expect less from organizations, organizations expect more from people. So much for punching out at 5:00 p.m.

Those aren’t the only terms the Internet dictates.

We work for it: we design content for it, adapt messages to suit it, alter our language so that both humans and Google will comprehend it.

Then someone invents a new “breakthrough in communications” that must be satiated on its own terms.

Earlier this year I got a Twitter account, because everyone else has one.

As far as I can tell, Twitter is just a forum for anyone who has ever been involved with Star Trek, and a gruesomely effective way to relay information during a disaster.

Every time a celebrity does something, it explodes like a healthcare exchange website on October 1, 2013. I can’t see how this leads to productive discourse.

We shouldn’t feel obligated to make room for new social media in our lives, but we do. That’s what frustrates me the most about living in the shadow of the Internet.

After several generations of continuous technological progress, people seem resigned to the Digital Age being just another part of an inexorable historical movement. Nothing stays the same forever.

When I was in first grade I learned to type on beige Macs and play with floppy disks. The teachers said computers would one day be an important part of my life. It was a self-fulfilling prophecy.

Even if we use a piece of technology, we should still be allowed to evaluate its effect on us, and tailor it to our lives, not the other way around.

The Internet has certainly changed the way people live, but whether “different” really means “better,” and not “worse,” is a determination we need to make. It’s easy to assume we have no agency in the face of progress, but we need to take account of how we use technology.


Defying categorization

“Categorizing people is not something we do here” was the slogan used during my college orientation to teach us impressionable freshmen not to discriminate, generalize, or judge based on a person’s skin color, religion, sexual orientation, etc.

Since embracing diversity is second nature for most New England liberal arts students, that slogan became the punchline of many fine jokes, but what’s really funny is how far some people are taking the act of categorization.

Reading Thought Catalog, one would think introverted people are an oppressed minority. The site recently ran a list article on things you should never say to an introvert, and a POV piece on how the deck of life is stacked against the less talkative, because things like job interviews depend entirely on sociability and charisma.

I’m not going to argue that being outgoing doesn’t make certain parts of life easier, but the whole idea of categorizing people as either “introverted” or “extroverted” is an oversimplification worthy of a “Physics for English Majors” class.

Obviously, when many individuals act a certain way, it’s easy to recognize patterns of behavior. But to extrapolate that and apply one’s observations to every introverted or extroverted person is crazy. We’re not robots, are we?

What’s the threshold for introversion anyway? Should the American Psychiatric Association add some diagnostic criteria to the DSM-V? What if someone doesn’t fit the template of “introvert” or “extrovert,” just as most people don’t fit classic social stereotypes like “jock” or “nerd?”

The answer to all of those questions is the same: human beings are individuals, and their behavior can’t be accounted for by gross generalizations. They are conscious of their actions and capable of changing. Labeling people just obfuscates that fact.

I’ve always thought my generation knew enough about the dangers of generalizations based on race, religion, or sexual orientation, but here we are creating new generalizations based on how much a person talks at parties. One of those Thought Catalog articles was followed by “The Current State of Public Discourse” on the site’s feed. A tad ironic, no?

Everyone wants to make sense of the chaos that is human interaction, but that chaos is the essential fact of it. Individuality makes our actions unpredictable, and it can’t be any other way.

Categorizing people may give the categorizer a sense of serenity, but it also dehumanizes the people being categorized by making it seem like they are not in control of their own actions. That’s why it is not something we do here.


Boston is bombed, one tweet at a time

Since I got a Twitter account recently, I haven’t been sure of what to do with it. On Monday, I found a very good, but very unpleasant, use for it.

As with so many things these days, I found out about the Boston Marathon bombings through a reference on someone’s Facebook profile. Scrolling through the newsfeed, I saw a status from a college classmate:

“Slowly finding out more about what happened during the Boston Marathon,” it read.

I jumped over to Twitter and, sure enough, a photo of the scene of the first explosion had already been retweeted by a friend. Reports of a series of explosions were starting to come in, intermixed with announcements of Pulitzer Prize winners and news that Chris Hardwick would be in Baltimore on May 24.

“Two men had bombs strapped to themselves and they both went off,” a tweet posted 32 minutes before I logged on read, “everyone is scrambling.”

Switching over to the New York Times’ website, I found only a few short lines confirming that explosions had occurred; the story didn’t even use the word “bomb.”

Facebook and the news sites stayed quiet a bit longer, but Twitter was shot through with reports, mostly from the Associated Press and journalists who were already on site. The Boston Globe posted a video of the first explosion, and soon it was possible to see it from nearly every angle by scanning the tweets.

Not everything tweeted that day was accurate (the report of suicide bombers doesn’t jibe with what investigators are learning about the bombs), but the most necessary information was imparted as quickly as possible.

So that, it seems, is what Twitter is for.


Twittering away

I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.

I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers starts using it. Maybe that’s the source of my confusion.

In school, parents, teachers, and Nickelodeon characters were always telling us not to do something just because it’s popular, and to think independently.

That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.

“Because everyone else is doing it” is the primary reason most people join social networks: they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges. It develops a gravitational pull of sorts that attracts more users.

Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.

Being a late adopter is like walking through a conquered land. The hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but remind you that they’re limited, because someone has already figured out how to work the system.

Social media really isn’t a new frontier for human consciousness; it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are using it, and will continue to use it, to create new ways of expressing ideas.

Is that the same as fundamentally changing the way people socialize, though? If not, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?


Pope watching

It’s amazing how the choice of one religion’s leader can still be a worldwide event. As I write this, congratulations and snarky comments are flying on Facebook over the election of Pope Francis I. Why do all of us non-Catholics care?

A few years ago, when Benedict XVI was elected, I was in the midst of the high school crucible known as AP European History, so I was happy to put everything I’d learned about Avignon and Ignatius of Loyola to work as a Pope watcher.

I guess there is an element of glamorous drama involved, the same thing that makes Americans want to watch the Royal wedding. After all, it’s not every day that they elect a new Pope.

This event was also unprecedented in many ways. Benedict XVI was the first Pope in over 600 years to step down and live to see his replacement chosen, and his successor is the first Pope from the Americas and the first Jesuit Pope.

The Pope is more than a celebrity, though, and maybe that’s why the election of a new Pope is still relevant to non-Catholics. After years of heinous scandals and the obvious hypocrisy of a primarily white European governing body ruling over an increasingly diverse religion, people want change.

These are matters that should be the concern of everyone, regardless of their religion. That’s what makes this more than a media spectacle.


Flinching in the face of the future

They used to say “trust no one over 30,” and I guess that means I can’t be trusted. I’m not over 30, but I seem to have the mentality of someone who is beyond their 20s.

Twenty-somethings have run afoul of the New York Times a lot lately. In one post I read recently, the author described the Times as “conservative” and “on the wrong side of history” for criticizing the Millennial lifestyle.

It wasn’t just shocking to hear a paragon of liberalism like the Times referred to as conservative; it was the thought that modern tech is defining who we are.

I’m a 20-something, but I sometimes feel like I’m on the wrong side of history. I use digital tech to work and communicate, but I often wonder if society isn’t paying a price for all of the convenience it offers.

Of course, everyone gets annoyed by the constant barrage of e-mails and Facebook statuses once in a while, but what really bothers me is the feeling that, whatever people think about tech, and whatever legitimate evidence of its flaws comes to light, we’ll continue plunging head-first into a wired future. We don’t have a choice.

I love writing this blog, and I love being able to keep in touch with far-flung friends through social media. However, I also love print books and my flip phone. I don’t love the idea of paying my bills online and risking all sorts of digital skullduggery.

I often read that the Internet and smart devices are creating unheard of opportunities for innovation, that they are tools that can change the world. But for something that can do all of that, it comes with an awful lot of rules.

Living in the Digital World requires a different set of skills; it doesn’t completely level the playing field. As with anything else, some people are better at it. Those who can express themselves in 140 characters, attract followers, and read the data will always succeed. Those who can’t will fail.

I guess this is second nature to some people, but it quickly drains the romance from the digital frontier. Whenever I engage a new medium, it seems like someone’s already figured it out before me. So what’s required is conformity, not innovation.

I’m sure there are others more brilliant and courageous than I who can bend these mediums to their will and truly innovate, but it’s completely false to believe that everyone can automatically do the same because of some inherent quality of the technology.

I sincerely hope that technology leads us to a better future. I hope that someday, our reality is like Star Trek. I just have a hard time seeing how to get from here to there, and I have a hard time imbuing technology with that much significance. Yes, it’s new and popular, but at one time, so was the steam engine.
