Posts Tagged social media
As in a good fantasy story, there are parts of the Internet you just shouldn’t go to.
Peruse the comments section of just about any website, and you're likely to run across vitriol-spewing trolls hurling obscenities, and sometimes even rape or death threats, in arguments about seemingly everything.
In “The Epidemic of Facelessness,” a recent opinion piece for The New York Times, Stephen Marche attributes the rise of casually monstrous behavior on the Internet to the fact that attackers never see their victims’ faces.
Pulling examples from a diverse catalog that includes ancient Roman law, French phenomenology, and neuroscience, Marche argues that actually seeing another person’s face is the key to empathy.
That doesn’t typically happen online, hence the ease with which rape and death threats get thrown around.
It also means people need to work to imbue others with humanity. Attackers need to realize the people they’re threatening are, well, people, and their attacks should be understood in the context of a complex human psyche.
Remembering not to lose our humanity online is an admirable and necessary goal to work towards, but it will likely get harder as we rely more on indirect digital communication.
Because while society still shuns Internet trolls, it also continues to devalue humanity for the sake of performing discrete tasks more efficiently.
That’s what digital technology does. It lets us do everything from shopping to wishing each other “Happy Birthday” quickly, cleanly, and efficiently.
Saving money and time is good, of course, but it’s possible this obsession with digital efficiency is also grooming people to be less tolerant of each other.
The number of situations where strangers are forced to candidly interact in everyday life is diminishing. Does using one of those self-checkout machines really save that much time, or do you just prefer not having to exchange pleasantries with a human cashier?
It’s not that people need to be in the mood to talk to each other all of the time, but with Internet-related technology making it so easy to put each other at a distance, it’s hard to see how the “epidemic of facelessness” can be cured.
Beneath the shiny confidence of Silicon Valley futurism, the way of life being constructed around the Internet is potentially damaging to human empathy, even if it is easier.
After 200 posts, I still have a love/hate relationship with the Internet.
I mean that in the most literal sense: I love the opportunities the Internet has made possible, but I hate most of what comes with using it and interacting with people through it.
Without the Internet, I wouldn’t have a job right now. I certainly wouldn’t be able to cover the car industry from a house in Connecticut.
However, the Internet has also devalued skills.
For many jobs, remote working has opened up a pool of applicants that literally spans the nation. People with job-specific skills are much more interchangeable than they ever have been.
That’s great if, like me, you want to write about cars without moving to Detroit, but it also means that being good at something just doesn’t cut it anymore.
People are expected to bring much more than relevant skills to a job; they’re expected to bring specific training, connections, and name recognition.
Some call this the entrepreneurial spirit; I call it blurring the line between work and life.
Because when people expect less from organizations, organizations expect more from people. So much for punching out at 5:00 p.m.
Those aren’t the only terms the Internet dictates.
We work for it: we design content for it, adapt messages to suit it, alter our language so that both humans and Google will comprehend it.
Then someone invents a new “breakthrough in communications” that must be satiated on its own terms.
Earlier this year I got a Twitter account, because everyone else has one.
As far as I can tell, Twitter is just a forum for anyone who has ever been involved with Star Trek, and a gruesomely effective way to relay information during a disaster.
Every time a celebrity does something, it explodes like a healthcare exchange website on October 1, 2013. I can’t see how this leads to productive discourse.
We shouldn’t feel obligated to make room for new social media in our lives, but we do. That’s what frustrates me the most about living in the shadow of the Internet.
After several generations of continuous technological progress, people seem resigned to the Digital Age being just another part of an inexorable historical movement. Nothing stays the same forever.
When I was in first grade I learned to type on beige Macs and play with floppy disks. The teachers said computers would one day be an important part of my life. It was a self-fulfilling prophecy.
Even if we use a piece of technology, we should still be allowed to evaluate its effect on us, and tailor it to our lives–not the other way around.
The Internet has certainly changed the way people live, but whether "different" really means "better," and not "worse," is a determination we need to make. It's easy to assume we have no agency in the face of progress, but we need to take account of how we use technology.
“Categorizing people is not something we do here” was the slogan used during my college orientation to teach us impressionable freshmen not to discriminate, generalize, or judge based on a person’s skin color, religion, sexual orientation, etc.
Since embracing diversity is second nature for most New England liberal arts students, that slogan became the punchline of many fine jokes, but what’s really funny is how far some people are taking the act of categorization.
Reading Thought Catalog, one would think introverted people are an oppressed minority. The site recently ran a list article on things you should never say to an introvert, and a POV piece on how the deck of life is stacked against the less-talkative, because things like job interviews are dependent entirely on sociability and charisma.
I'm not going to argue that being outgoing doesn't make certain parts of life easier, but the whole idea of categorizing people as either "introverted" or "extroverted" is an oversimplification worthy of a "Physics for English Majors" class.
Obviously, when many individuals act a certain way, it’s easy to recognize patterns of behavior. But to extrapolate that and apply one’s observations to every introverted or extroverted person is crazy. We’re not robots, are we?
What’s the threshold for introversion anyway? Should the American Psychiatric Association add some diagnostic criteria to the DSM-V? What if someone doesn’t fit the template of “introvert” or “extrovert,” just as most people don’t fit classic social stereotypes like “jock” or “nerd?”
The answer to all of those questions is the same: human beings are individuals, and their behavior can’t be accounted for by gross generalizations. They are conscious of their actions and capable of changing. Labeling people just obfuscates that fact.
I’ve always thought my generation knew enough about the dangers of generalizations based on race, religion, or sexual orientation, but here we are creating new generalizations based on how much a person talks at parties. One of those Thought Catalog articles was followed by “The Current State of Public Discourse” on the site’s feed. A tad ironic, no?
Everyone wants to make sense of the chaos that is human interaction, but that chaos is the essential fact of it. Individuality makes our actions unpredictable, and it can’t be any other way.
Categorizing people may give the categorizer a sense of serenity, but it also dehumanizes the people being categorized by making it seem like they are not in control of their own actions. That’s why it is not something we do here.
Since I got a Twitter account recently, I haven’t been sure of what to do with it. On Monday, I found a very good, but very unpleasant, use for it.
As with so many things these days, I found out about the Boston Marathon bombings through a reference on someone’s Facebook profile. Scrolling through the newsfeed, I saw a status from a college classmate:
“Slowly finding out more about what happened during the Boston Marathon,” it read.
I jumped over to Twitter and, sure enough, a photo of the scene of the first explosion had already been retweeted by a friend. Reports of a series of explosions were starting to come in, intermixed with Pulitzer Prize winners and the announcement that Chris Hardwick will be in Baltimore on May 24.
“Two men had bombs strapped to themselves and they both went off,” a tweet posted 32 minutes before I logged on read, “everyone is scrambling.”
When I switched over to the New York Times' website, there were only a few short lines confirming that explosions had occurred, without even using the word "bomb."
Facebook and the news sites stayed quiet a bit longer, but Twitter was shot through with reports, mostly from the Associated Press and journalists who were already on site. The Boston Globe posted a video of the first explosion, and soon it was possible to see it from nearly every angle by scanning the tweets.
Not everything tweeted that day was accurate (the report of suicide bombers doesn’t jibe with what investigators are learning about the bombs) but the most necessary information was imparted as quickly as possible.
So that, it seems, is what Twitter is for.
I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.
I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers start using it. Maybe that’s the source of my confusion.
In school, parents, teachers, and Nickelodeon characters were always saying not to do something just because it’s popular, to think independently.
That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.
“Because everyone else is doing it” is the primary reason most people join social networks, because they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges. It develops a gravitational pull of sorts that attracts more users.
Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.
Being a late adopter is like walking through a conquered land. The hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but they also remind you that those possibilities are limited, because someone has already figured out how to work the system.
Social media really isn’t a new frontier for human consciousness, it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are and will continue to use it to create new ways of expressing ideas.
Is that the same as fundamentally changing the way people socialize, though? If not, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?
It's amazing how the choice of one religion's leader can still be a worldwide event. As I write this, congratulations and snarky comments are flying on Facebook over the election of Pope Francis I. Why do all of us non-Catholics care?
A few years ago, when Benedict XVI was elected, I was in the midst of the high school crucible known as AP European History, so I was happy to put everything I’d learned about Avignon and Ignatius of Loyola to work as a Pope watcher.
I guess there is an element of glamorous drama involved, the same thing that makes Americans want to watch the Royal wedding. After all, it’s not every day that they elect a new Pope.
This event was also unprecedented in many ways. Benedict XVI was the first Pope to step down with a waiting replacement in over 600 years, and his successor is the first Pope from the Americas and only the third Jesuit Pope.
The Pope is more than a celebrity, though, and maybe that’s why the election of a new Pope is still relevant to non-Catholics. After years of heinous scandals and the obvious hypocrisy of a primarily white European governing body ruling over an increasingly diverse religion, people want change.
These are matters that should be the concern of everyone, regardless of their religion. That’s what makes this more than a media spectacle.
They used to say “trust no one over 30,” and I guess that means I can’t be trusted. I’m not over 30, but I seem to have the mentality of someone who is beyond their 20s.
Twenty-somethings have run afoul of the New York Times a lot lately. In one post I read recently, the author described the Times as "conservative" and "on the wrong side of history" for criticizing the Millennial lifestyle.
What shocked me wasn't just hearing a paragon of liberalism like the Times called conservative; it was the thought that modern tech is defining who we are.
I’m a 20-something, but I sometimes feel like I’m on the wrong side of history. I use digital tech to work and communicate, but I often wonder if society isn’t paying a price for all of the convenience it offers.
Of course, everyone gets annoyed by the constant barrage of e-mails and Facebook statuses once in a while, but what really bothers me is the feeling that, whatever people think about tech, and whatever legitimate evidence of its flaws comes to light, we'll continue plunging head-first into a wired future. We don't have a choice.
I love writing this blog, and I love being able to keep in touch with far-flung friends through social media. However, I also love print books and my flip phone. I don't love the idea of paying my bills online and risking all sorts of digital skullduggery.
I often read that the Internet and smart devices are creating unheard of opportunities for innovation, that they are tools that can change the world. But for something that can do all of that, it comes with an awful lot of rules.
Living in the Digital World requires a different set of skills; it doesn’t completely level the playing field. As with anything else, some people are better at it. Those who can express themselves in 140 characters, attract followers, and read the data will always succeed. Those who can’t will fail.
I guess this is second nature to some people, but it quickly drains the romance from the digital frontier. Whenever I engage a new medium, it seems like someone’s already figured it out before me. So what’s required is conformity, not innovation.
I’m sure there are others more brilliant and courageous than I who can bend these mediums to their will and truly innovate, but it’s completely false to believe that everyone can automatically do the same because of some inherent quality of the technology.
I sincerely hope that technology leads us to a better future. I hope that someday, our reality is like Star Trek. I just have a hard time seeing how to get from here to there, and I have a hard time imbuing technology with that much significance. Yes, it’s new and popular but at one time, so was the steam engine.
I never thought I’d see the day when World War II became a source of irony. It was the definition of “good fight,” a time when the nation harnessed all of its resources to defeat what one of my high school history teachers called “made to order bad guys.”
Yet here we are. Barbasol is running a commercial featuring the viewer's "great grandfather" on the ground in a French village, perhaps Sainte-Mère-Église or St.-Lô, laconically comparing his attempt to stop Hitler with the current young generation's obsession with tweeting and Facebooking.
Like "first world problems," this is another example of a perverted form of thought. It's as if people think that, by noting their shortcomings in an ironic way, they don't have to actually do anything about them.
It’s also a silly comparison. I’m not saying that my generation is perfect, but it’s not really fair to compare us to the “Greatest Generation.” We’ll never know how the social media-savvy would deal with a Great Depression or a World War, because we lived through a Great Recession and a pseudo-War on Terror.
Twitter and Facebook can lead to some shallowness, but we’ll also never know what our grandparents’ generation would have done if they grew up with these luxuries. I recently ate lunch at a restaurant packed with senior citizens, and most of them had smartphones.
Maybe we should cut back on the irony before we lose track of what we’re making ironic jokes about. This reminds me of a New York Times blog post I read recently called “How to Live Without Irony.” The author argued that too many people are using irony to avoid honest (and sometimes painful) emotional commitments.
That seems like what’s going on here. People need to accept the fact that they’re better off than others, including their own grandparents and great grandparents. That’s what those World War II soldiers were fighting for, after all.
In this age of irony and constant self-investigation, it’s easy to lose track of the reasons why people do things. That’s especially true when it comes to the media (I still don’t understand why we have 24-hour news networks). Still, we all know why reporters publish stories on things they observe, right?
As a member of the media (sort of) I guess I sometimes fall into the trap of assuming what readers will think of an article. That’s why I was surprised by some of the reactions to a recent piece on dating in the New York Times magazine.
“The End of Courtship?” was controversial to begin with. It focuses on 20-somethings’ use of texting, social media, and online dating sites, saying that technology has ruined romance. The author claims that social media have taken the risk out of asking a person out, and prevent one-on-one dates from happening by making it too easy to bring friends along.
Having your entire generation described as gutless and emotionally stunted obviously stirs up some strong opinions. In a rebuttal on RoleReboot, Niki Fritz criticized the story’s assumption that women only want old fashioned dates where the man picks the wine and pays the bill. She said there is nothing wrong with having casual dates, group outings, or hookups as options.
I completely agree, but I didn’t expect Fritz to attack the article’s negative tone along with the specific points it made. I’m getting a little meta here, so bear with me.
“All these articles do is scare young women into thinking we are in some hopeless, relationship-less era devoid of love and romance,” Fritz said.
This sounded similar to a comment I saw on a friend’s Facebook page: “I’m just sick to my stomach of article like this complaining with no resolution in sight,” the disgruntled reader said.
They say no news is good news, and maybe that’s becoming too much for people to handle. I could be wrong, but I’ve always assumed that articles like the Times piece are written to identify negative trends so they can be corrected.
People should read articles like this, realize how lame their dating lives are and try to change. But I guess, in the real world, even the people that agree that text-based dating is a problem respond with a simple “I don’t want to hear this.”
There are a lot of unpleasant things in the world, and this isn’t even really one of them. Everyone deserves to be happy, but these 20-somethings are much closer to happy than most people in the world.
Arguing an article’s specific points is one thing, but criticizing it just because it is negative is completely different. Journalists need to report what they see, good and bad, and while they shouldn’t exaggerate or misinterpret the facts, they definitely have a license to be negative.
Much criticism of the media is warranted, but have we really been reduced to this? I hope the New York Times doesn’t pick up this story; too much criticism of criticism might break the universe.
Social media is evolving. The basement geeks that built the first social networks have moved into the niche market. Facebook is still essential for any digital identity, but now other platforms like Instagram and Loopster are being designed to share specific types of information. Which got me thinking: why stop with photos and videos?
Here are five possible social media platforms for sharing other vital aspects of one’s life:
Crassfone: For sharing inappropriate thoughts you know should be kept to yourself yet feel the irrational need to blurt out in a crowded room.
Triv-o-gram: For sharing random bits of trivia.
Noisss: For sharing non-music sound files.
Aro-matic: Until smell-o-vision is invented, this platform will allow users to share descriptions of their favorite smells.
Splice: For sharing the sequence of a person’s genes.
Social networks allow us to share every aspect of our lives: the good, the bad, and the boring. Who cares if no one wants (or shouldn't be allowed) to know every thought that pops into our heads and every one of our actions?