Archive for category Language

Faceless in the crowd

As in a good fantasy story, there are parts of the Internet you just shouldn’t go to.

Peruse the comments section of just about any website, and you’re likely to run across vitriol-spewing trolls hurling obscenities, and sometimes even rape or death threats, in arguments about seemingly everything.

In “The Epidemic of Facelessness,” a recent opinion piece for The New York Times, Stephen Marche attributes the rise of casually monstrous behavior on the Internet to the fact that attackers never see their victims’ faces.

Pulling examples from a diverse catalog that includes ancient Roman law, French phenomenology, and neuroscience, Marche argues that actually seeing another person’s face is the key to empathy.

That doesn’t typically happen online, hence the ease with which rape and death threats get thrown around.

It also means people need to work to imbue others with humanity. Attackers need to realize the people they’re threatening are, well, people, and their attacks should be understood in the context of a complex human psyche.

Remembering to hold on to our humanity online is an admirable and necessary goal to work toward, but it will likely get harder to do as we rely more on indirect digital communication.

Because while society still shuns Internet trolls, it also continues to devalue human interaction in favor of performing discrete tasks more efficiently.

That’s what digital technology does. It lets us do everything from shopping to wishing each other “Happy Birthday” quickly, cleanly, and efficiently.

Saving money and time is good, of course, but it’s possible this obsession with digital efficiency is also grooming people to be less tolerant of each other.

The number of situations where strangers are forced to candidly interact in everyday life is diminishing. Does using one of those self-checkout machines really save that much time, or do you just prefer not having to exchange pleasantries with a human cashier?

It’s not that people need to be in the mood to talk to each other all of the time, but with Internet-related technology making it so easy to put each other at a distance, it’s hard to see how the “epidemic of facelessness” can be cured.

Beneath the shiny confidence of Silicon Valley futurism, the way of life being constructed around the Internet is potentially damaging to human empathy, even if it is easier.


Tech glossary

I love learning jargon, and at a recent tech conference in New York City I got to add a few pieces to my collection. Here’s what technologists use to describe what they do when normal words simply aren’t enough.

Tech (n.) Any device incorporating digital technology, and the digital technology itself.

Exp: “This new iPhone is a great piece of tech.”

This is quickly devolving from shorthand for technology into a word exclusively denoting smartphones, tablets, and the bits and pieces that make them work. We say that our society has faith in technology, but many of us seem to actually mean the kinds of technology that come with plenty of silicon.

Unlike a lot of jargon, “tech” is actually a bit non-specific. A smartphone is a piece of tech, but so are the hardware and software that comprise it. Sometimes tech entrepreneurs need to be more specific, which brings us to our next term…

Solution (n.) A product proffered by a tech company for a specific application.

Exp: “Our company provides innovative solutions for in-car infotainment.”

Since a lot of what tech companies produce is non-corporeal software, someone obviously thought it was a good idea to ditch the word “product,” which implies something more substantial; it’s basically the opposite of what the finance industry did when it started calling intangible financial instruments “products.” It’s a suitable term for a technology that seeks to insert itself into all kinds of situations, from glasses to car dashboards.

Innovate (v.) To create something new, specifically a new piece of tech.

Exp: “To solve society’s problems, people need to be free to innovate.”

I miss the days of Dexter’s Laboratory and middle school history lessons about Thomas Edison, when scientists and engineers invented things instead of just innovating. Being innovative is great, but shouldn’t there be a specific goal behind the innovation? A carbon fiber toothbrush would be incredibly innovative, but there wouldn’t be much of a point to it.

Space (n.) A subject, an area of expertise, a topic.

Exp: “Milled aluminum knobs are very important in the home audio space.”

This, admittedly, has more to do with the people writing about the tech industry than the people in it. For some reason, when it comes to technology, there aren’t topics or beats; there are spaces.

Maybe it has to do with the way tech takes on different forms to infiltrate different physical spaces, morphing into intelligent flat screens and TFT speedometers.

Got any tech terms of your own? Post them in the comments below.


Defying categorization

“Categorizing people is not something we do here” was the slogan used during my college orientation to teach us impressionable freshmen not to discriminate, generalize, or judge based on a person’s skin color, religion, sexual orientation, etc.

Since embracing diversity is second nature for most New England liberal arts students, that slogan became the punchline of many fine jokes, but what’s really funny is how far some people are taking the act of categorization.

Reading Thought Catalog, one would think introverted people are an oppressed minority. The site recently ran a list article on things you should never say to an introvert, and a POV piece on how the deck of life is stacked against the less-talkative, because things like job interviews are dependent entirely on sociability and charisma.

I’m not going to argue that being outgoing doesn’t make certain parts of life easier, but the whole idea of categorizing people as either “introverted” or “extroverted” is an oversimplification worthy of a “Physics for English Majors” class.

Obviously, when many individuals act a certain way, it’s easy to recognize patterns of behavior. But to extrapolate that and apply one’s observations to every introverted or extroverted person is crazy. We’re not robots, are we?

What’s the threshold for introversion anyway? Should the American Psychiatric Association add some diagnostic criteria to the DSM-5? What if someone doesn’t fit the template of “introvert” or “extrovert,” just as most people don’t fit classic social stereotypes like “jock” or “nerd”?

The answer to all of those questions is the same: human beings are individuals, and their behavior can’t be accounted for by gross generalizations. They are conscious of their actions and capable of changing. Labeling people just obfuscates that fact.

I’ve always thought my generation knew enough about the dangers of generalizations based on race, religion, or sexual orientation, but here we are creating new generalizations based on how much a person talks at parties. One of those Thought Catalog articles was followed by “The Current State of Public Discourse” on the site’s feed. A tad ironic, no?

Everyone wants to make sense of the chaos that is human interaction, but that chaos is the essential fact of it. Individuality makes our actions unpredictable, and it can’t be any other way.

Categorizing people may give the categorizer a sense of serenity, but it also dehumanizes the people being categorized by making it seem like they are not in control of their own actions. That’s why it is not something we do here.


Twittering away

I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.

I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers start using it. Maybe that’s the source of my confusion.

In school, parents, teachers, and Nickelodeon characters were always telling us not to do something just because it’s popular, but to think independently.

That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.

“Because everyone else is doing it” is the primary reason most people join social networks, because they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges. It develops a gravitational pull of sorts that attracts more users.

Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.

Being a late adopter is like walking through a conquered land. Hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but remind you that those possibilities are limited, because someone has already figured out how to work the system.

Social media really isn’t a new frontier for human consciousness; it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are using it, and will continue to use it, to create new ways of expressing ideas.

Is that the same as fundamentally changing the way people socialize, though? If not, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?


Fighting ironic battles

I never thought I’d see the day when World War II became a source of irony. It was the definition of a “good fight,” a time when the nation harnessed all of its resources to defeat what one of my high school history teachers called “made-to-order bad guys.”

Yet here we are. Barbasol is running a commercial featuring the viewer’s “great-grandfather” on the ground in a French village, perhaps Ste.-Mère-Église or St.-Lô, laconically comparing his attempt to stop Hitler with the current young generation’s obsession with tweeting and Facebooking.

Like “first world problems,” this is another example of a perverted form of thought. It’s as if people think that, by noting their shortcomings in an ironic way, they don’t have to actually do anything about them.

It’s also a silly comparison. I’m not saying that my generation is perfect, but it’s not really fair to compare us to the “Greatest Generation.” We’ll never know how the social media savvy would deal with a Great Depression or a World War, because what we lived through instead was a Great Recession and a pseudo-War on Terror.

Twitter and Facebook can lead to some shallowness, but we’ll also never know what our grandparents’ generation would have done if they had grown up with these luxuries. I recently ate lunch at a restaurant packed with senior citizens, and most of them had smartphones.

Maybe we should cut back on the irony before we lose track of what we’re making ironic jokes about. This reminds me of a New York Times blog post I read recently called “How to Live Without Irony.” The author argued that too many people are using irony to avoid honest (and sometimes painful) emotional commitments.

That seems like what’s going on here. People need to accept the fact that they’re better off than others, including their own grandparents and great-grandparents. That’s what those World War II soldiers were fighting for, after all.


First world problems

So I’ve encountered a new phrase: “first world problems.” I have a problem with this phrase.

It seems to mean something that really isn’t a big deal, like having to prepare a presentation or being peeved that the barista put cream in your Starbucks concoction instead of milk. You know, things that don’t have to do with subsistence.

I see what people are getting at here. We all get wrapped up in our lives, make mountains out of molehills, and forget how lucky we are to live the way we do. That’s fine.

Checking your whining with a phrase like “first world problems” is a little obnoxious, though. It sounds like the person is saying “I know I shouldn’t be complaining about this trivial thing, but I will,” or “See how conscious I am of other people’s suffering?”

Both are very “first world” things to do. I’m a huge fan of irony, but too much of a good thing is still a problem. Drawing an implied comparison between oneself and a starving African child or a smog-choked Chinese factory worker doesn’t make a person sound smart or sensitive, it just makes them sound like they are trying to license their whining.

The phrase “first world problems” is also etymologically dubious. Have you ever noticed that people never talk about the second world? That’s because the terms “first world” and “second world” were coined during the Cold War to describe the United States and its NATO allies and the Soviet Union and its Warsaw Pact allies, respectively. Any country outside either the U.S. or Soviet sphere was referred to as the third world.

So maybe we should stop using outdated political terms to label our trivial complaints. It’s perfectly fine to complain, even if you know that someone else would be happy to be in your position. It’s not a big deal, and certainly doesn’t merit a snarky term like “first world problems.”


An opinion on opinions

Saying what’s on your mind can have unfortunate consequences, but there is a way to avoid them. I get into a lot of political debates/cage matches with people, some of whom say things that are flat out wrong. How do they maintain their credibility? They use a magic phrase.

Saying “Open Says Me” can open doors, and saying “This is just my opinion” apparently allows someone to make any stupid remark they want with impunity. I’ve had people tell me that, on average, conservatives are smarter than liberals, and that President Obama will raise more money than Mitt Romney because of his Hollywood connections. These seem like claims that need to be backed up with evidence, but since each person qualified them as “just my opinion,” they didn’t feel the need to.

In the cinematic triumph that is Talladega Nights: The Ballad of Ricky Bobby, a redneck NASCAR driver played by Will Ferrell uses the same tactic. Ricky tells his boss that “With all due respect, I didn’t know you had experimental surgery to have your balls removed.” That sounds inappropriate, but he did say “with all due respect.” Most people who debate politics think they are smarter than Will Ferrell’s character, but I’m not so sure.

People don’t need to be reminded about the First Amendment, but they do need to be reminded about responsible use. Saying whatever you want and using the right to free speech as an excuse is not responsible; it just makes the speaker look dumb and makes rational discussion more difficult. Everyone has an opinion, but an opinion can still be wrong.

Still, being able to say whatever I want by using one simple phrase sounds like fun. I’m going to give it a try. This is just my opinion, but:

Mitt Romney is an alien sent to conquer Earth with an army of dancing horses.

John Boehner is an Oompa-Loompa who took steroids.

On average, conservatives are most likely to be cannibals.

After being defeated by the Light Side, Emperor Palpatine fled to Earth, starting a new life under the pseudonym “Dick Cheney.”

Ronald Reagan did not end the Cold War.
