Archive for category Language

Faceless in the crowd

As in a good fantasy story, there are parts of the Internet you just shouldn’t go to.

Peruse the comments section of just about any website, and you’re likely to run across vitriol-spewing trolls, hurling obscenities (and sometimes even rape or death threats) in arguments about seemingly everything.

In “The Epidemic of Facelessness,” a recent opinion piece for The New York Times, Stephen Marche attributes the rise of casually monstrous behavior on the Internet to the fact that attackers never see their victims’ faces.

Pulling examples from a diverse catalog that includes ancient Roman law, French phenomenology, and neuroscience, Marche argues that actually seeing another person’s face is the key to empathy.

That doesn’t typically happen online, hence the ease with which rape and death threats get thrown around.

It also means people need to work to imbue others with humanity. Attackers need to realize the people they’re threatening are, well, people, and their attacks should be understood in the context of a complex human psyche.

Remembering not to leave our humanity behind when we go online is an admirable and necessary goal, but it will likely get harder to achieve as we rely more on indirect digital communication.

Because while society still shuns Internet trolls, it also continues to devalue humanity in favor of performing discrete tasks more efficiently.

That’s what digital technology does. It lets us do everything from shopping to wishing each other “Happy Birthday” quickly, cleanly, and efficiently.

Saving money and time is good, of course, but it’s possible this obsession with digital efficiency is also grooming people to be less tolerant of each other.

The number of situations where strangers are forced to candidly interact in everyday life is diminishing. Does using one of those self-checkout machines really save that much time, or do you just prefer not having to exchange pleasantries with a human cashier?

It’s not that people need to be in the mood to talk to each other all of the time, but with Internet-related technology making it so easy to put each other at a distance, it’s hard to see how the “epidemic of facelessness” can be cured.

Beneath the shiny confidence of Silicon Valley futurism, the way of life being constructed around the Internet is potentially damaging to human empathy, even if it is easier.

Tech glossary

I love learning jargon, and at a recent tech conference in New York City I got to add a few pieces to my collection. Here are the words technologists use to describe what they do when normal words simply aren’t enough.

Tech (n.) Any device incorporating digital technology, and the digital technology itself.

Ex.: “This new iPhone is a great piece of tech.”

This is quickly devolving from shorthand for technology into a word exclusively denoting smartphones, tablets, and the bits and pieces that make them work. We say that our society has faith in technology, but many of us seem to actually mean the kinds of technology that come with plenty of silicon.

Unlike a lot of jargon, “tech” is actually a bit non-specific. A smartphone is a piece of tech, but so are the hardware and software that comprise it. Sometimes tech entrepreneurs need to be more specific, which brings us to our next term…

Solution (n.) A product proffered by a tech company for a specific application.

Ex.: “Our company provides innovative solutions for in-car infotainment.”

Since a lot of what tech companies produce is non-corporeal software, someone obviously thought it was a good idea to ditch the word “product,” which implies something more substantial; it’s basically the opposite of what the finance industry did. Either way, it’s a suitable term for a technology that seeks to insert itself into all kinds of situations, from glasses to car dashboards.

Innovate (v.) To create something new, specifically a new piece of tech.

Ex.: “To solve society’s problems, people need to be free to innovate.”

I miss the days of Dexter’s Laboratory and middle school history lessons about Thomas Edison, when scientists and engineers invented things instead of just innovating. Being innovative is great, but shouldn’t there be a specific goal behind the innovation? A carbon fiber toothbrush would be incredibly innovative, but there wouldn’t be much of a point to it.

Space (n.) A subject, an area of expertise, a topic.

Ex.: “Milled aluminum knobs are very important in the home audio space.”

This, admittedly, has more to do with the people writing about the tech industry than the people in it. For some reason, when it comes to technology, there aren’t topics or beats; there are spaces.

Maybe it has to do with the way tech takes on different forms to infiltrate different physical spaces, morphing into intelligent flat screens and TFT speedometers.

Got any tech terms of your own? Post them in the comments below.

Defying categorization

“Categorizing people is not something we do here” was the slogan used during my college orientation to teach us impressionable freshmen not to discriminate, generalize, or judge based on a person’s skin color, religion, sexual orientation, etc.

Since embracing diversity is second nature for most New England liberal arts students, that slogan became the punchline of many fine jokes, but what’s really funny is how far some people are taking the act of categorization.

Reading Thought Catalog, one would think introverted people are an oppressed minority. The site recently ran a list article on things you should never say to an introvert, and a POV piece on how the deck of life is stacked against the less talkative, because things like job interviews depend entirely on sociability and charisma.

I’m not going to argue that being outgoing doesn’t make certain parts of life easier, but the whole idea of categorizing people as either “introverted” or “extroverted” is an oversimplification worthy of a “Physics for English Majors” class.

Obviously, when many individuals act a certain way, it’s easy to recognize patterns of behavior. But to extrapolate that and apply one’s observations to every introverted or extroverted person is crazy. We’re not robots, are we?

What’s the threshold for introversion anyway? Should the American Psychiatric Association add some diagnostic criteria to the DSM-5? What if someone doesn’t fit the template of “introvert” or “extrovert,” just as most people don’t fit classic social stereotypes like “jock” or “nerd”?

The answer to all of those questions is the same: human beings are individuals, and their behavior can’t be accounted for by gross generalizations. They are conscious of their actions and capable of changing. Labeling people just obfuscates that fact.

I’ve always thought my generation knew enough about the dangers of generalizations based on race, religion, or sexual orientation, but here we are creating new generalizations based on how much a person talks at parties. One of those Thought Catalog articles was followed by “The Current State of Public Discourse” on the site’s feed. A tad ironic, no?

Everyone wants to make sense of the chaos that is human interaction, but that chaos is the essential fact of it. Individuality makes our actions unpredictable, and it can’t be any other way.

Categorizing people may give the categorizer a sense of serenity, but it also dehumanizes the people being categorized by making it seem like they are not in control of their own actions. That’s why it is not something we do here.

Twittering away

I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.

I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers starts using it. Maybe that’s the source of my confusion.

In school, parents, teachers, and Nickelodeon characters were always telling us not to do things just because they’re popular, and to think independently.

That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.

“Because everyone else is doing it” is the primary reason most people join social networks: they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges. It develops a gravitational pull of sorts that attracts more users.

Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.

Being a late adopter is like walking through a conquered land. The hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but they also remind you that those possibilities are limited, because someone has already figured out how to work the system.

Social media really isn’t a new frontier for human consciousness; it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are using it, and will continue to use it, to create new ways of expressing ideas.

Is that the same as fundamentally changing the way people socialize, though? If not, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?

Fighting ironic battles

I never thought I’d see the day when World War II became a source of irony. It was the definition of the “good fight,” a time when the nation harnessed all of its resources to defeat what one of my high school history teachers called “made to order bad guys.”

Yet here we are. Barbasol is running a commercial featuring the viewer’s “great-grandfather” on the ground in a French village, perhaps Sainte-Mère-Église or St.-Lô, laconically comparing his attempt to stop Hitler with the current young generation’s obsession with tweeting and Facebooking.

Like “first world problems,” this is another example of a perverted form of thought. It’s as if people think that, by noting their shortcomings in an ironic way, they don’t have to actually do anything about them.

It’s also a silly comparison. I’m not saying that my generation is perfect, but it’s not really fair to compare us to the “Greatest Generation.” We’ll never know how the social media-savvy would deal with a Great Depression or a World War, because all we’ve lived through is a Great Recession and a pseudo-War on Terror.

Twitter and Facebook can lead to some shallowness, but we’ll also never know what our grandparents’ generation would have done if they had grown up with these luxuries. I recently ate lunch at a restaurant packed with senior citizens, and most of them had smartphones.

Maybe we should cut back on the irony before we lose track of what we’re making ironic jokes about. This reminds me of a New York Times blog post I read recently called “How to Live Without Irony.” The author argued that too many people are using irony to avoid honest (and sometimes painful) emotional commitments.

That seems like what’s going on here. People need to accept the fact that they’re better off than others, including their own grandparents and great grandparents. That’s what those World War II soldiers were fighting for, after all.

First world problems

So I’ve encountered a new phrase: “first world problems.” I have a problem with this phrase.

It seems to mean a problem that really isn’t a big deal, like having to prepare a presentation or being peeved that the barista put cream in your Starbucks concoction instead of milk. You know, things that don’t have to do with subsistence.

I see what people are getting at here. We all get wrapped up in our lives, make mountains out of molehills, and forget how lucky we are to live the way we do. That’s fine.

Checking your whining with a phrase like “first world problems” is a little obnoxious, though. It sounds like the person is saying “I know I shouldn’t be complaining about this trivial thing, but I will,” or “See how conscious I am of other people’s suffering?”

Both are very “first world” things to do. I’m a huge fan of irony, but too much of a good thing is still a problem. Drawing an implied comparison between oneself and a starving African child or a smog-choked Chinese factory worker doesn’t make a person sound smart or sensitive; it just makes them sound like they’re trying to license their whining.

The phrase “first world problems” is also etymologically dubious. Ever notice how people never talk about the second world? That’s because the terms first world and second world were coined during the Cold War to describe the United States and its NATO allies and the Soviet Union and its Warsaw Pact allies, respectively. Any countries not within either the U.S. or Soviet sphere were referred to as the third world.

So maybe we should stop using outdated political terms to label our trivial complaints. It’s perfectly fine to complain, even if you know that someone else would be happy to be in your position. It’s not a big deal, and certainly doesn’t merit a snarky term like “first world problems.”

An opinion on opinions

Saying what’s on your mind can have unfortunate consequences, but there is a way to avoid them. I get into a lot of political debates/cage matches with people, some of whom say things that are flat out wrong. How do they maintain their credibility? They use a magic phrase.

Saying “Open Says Me” can open doors, and saying “This is just my opinion” apparently allows someone to make any stupid remark they want with impunity. I’ve had people tell me that, on average, conservatives are smarter than liberals, and that President Obama will raise more money than Mitt Romney because of his Hollywood connections. These seem like things that need to be backed up with evidence, but since each person qualified it as “their opinion,” they didn’t feel the need to.

In the cinematic triumph that is Talladega Nights: The Ballad of Ricky Bobby, a redneck NASCAR driver played by Will Ferrell uses the same tactic. Ricky tells his boss that “With all due respect, I didn’t know you had experimental surgery to have your balls removed.” That sounds inappropriate, but he did say “with all due respect.” Most people who debate politics think they are smarter than Will Ferrell’s character, but I’m not so sure.

People don’t need to be reminded about the First Amendment, but they do need to be reminded about responsible use. Saying whatever you want and using the right to free speech as an excuse is not responsible; it just makes the speaker look dumb, and makes rational discussion more difficult. Everyone has an opinion, but they can still be wrong.

Still, being able to say whatever I want by using one simple phrase sounds like fun. I’m going to give it a try. This is just my opinion, but:

Mitt Romney is an alien sent to conquer Earth with an army of dancing horses.

John Boehner is an Oompa-Loompa who took steroids.

On average, conservatives are more likely to be cannibals.

After being defeated by the Light Side, Emperor Palpatine fled to Earth, starting a new life under the pseudonym “Dick Cheney.”

Ronald Reagan did not end the Cold War.

Comic adults

Once upon a time, if you were an adult and you read comic books, people thought there was something wrong with you. Until Marvel revolutionized comic book storytelling in the 1960s, comics were seen exclusively as kid stuff. After all, what adult would take a story about a guy in tights and a cape seriously?

Apparently, a lot. I wouldn’t be surprised if it turned out that more adults read comics than children. Many comic-reading kids grew up but didn’t want to give up their books (who could blame them?), and comics have grown more sophisticated to appease these mature readers. That’s great, but some of these so-called “grown-ups” can act pretty childish when it comes to their favorite reading material.

Wired.com recently ran a short review of the new television show Comic Book Men. It’s a reality show about Kevin Smith’s comic book store, sort of like Pawn Stars for the nerd set. Take a minute to read the comments.

It’s amazing how much anger can be stirred up by a reality show about silly middle-aged men running a comic shop. The reviewer didn’t like it, saying that it reinforced negative stereotypes with its all-male cast and their tendency to make typically male jokes about women and gay men.

Luckily, Kevin Smith and company have some staunch defenders. One commenter called the author a “douche,” another said she was “pretty lame,” and a third said she shouldn’t be allowed to write professionally.

When a female commenter (Mary 229) came to the author’s defense, she was labeled an “angry fangirl” and taunted. “Mary’s turn on’s [sic] include whipped men, spreading inflammatory lies and invective about Rags Morales, and crying misogynist every ten seconds to invalidate the other persons [sic] point. It’s ‘angry fangirl 101,’” said commenter “John.”

I’m not taking sides on this one, but I think some of the comments were pretty ridiculous (once again, feel free to follow the link and decide for yourself). Since this is the Internet, I have no idea how old these people are or what their life stories are, but I can’t imagine any circumstances where statements like that would be acceptable in public. Everyone is entitled to their opinion, but how about a little civility?

These comic fans should really listen more closely to their favorite characters. Has Superman ever called anyone a “douche” because they disagreed with him? Does Captain America angrily stereotype people when he disagrees with a government policy? Spider-Man is constantly being hunted down by the police and press; does he ever respond with anything besides witty banter?

When comics were read exclusively by kids, superheroes were role models. The morality and emphasis on good citizenship that started out as a way to educate children became an integral part of most heroes’ characterizations. Even in today’s age of moral ambiguity, a lot of it remains.

It’s kind of funny that a bunch of adults reading the same books can’t pick up on those lessons. These characters treat everyone with dignity, even their enemies. That seems like a pretty easy thing to understand. Superheroes are super because of their extraordinary abilities; I don’t want to live in a world where having manners is an extraordinary ability.

Characters Not Welcome

Without individuality, life would be incredibly boring. Luckily, all human beings have a unique personality, some more unique than others. Everyone has at least one “character” in their lives, someone with a personality so strong you would think they were sent from central casting.

These people transcend the social niceties most of us get hung up on. Whether it’s the conductor singing “City of New Orleans” as he punches tickets on a commuter train, or the unofficial mayor of a small town or neighborhood, who seems to be friends with everyone, they are easy to spot.

It’s a role that few people have the audacity and, well, character to pull off without looking like schmucks, and that is the way it should be. Society can only handle so many characters, but almost everybody tries to be one.

People try to write off their irascibility and anti-social behavior by adopting the character facade: it’s not their fault that they offended you; they tell it like it is, and that’s just the way they are. You need to get over it.

Curmudgeons aren’t the only culprits. Younger people have their own archetypal characters, like the “partier” or “drama queen.” Instead of tailoring responses to different situations like human beings, they react the same way every time and expect the rest of the world to accommodate them.

The world needs characters; what it doesn’t need is narcissists. Everyone has personality flaws or moments of indiscretion, and no one should be crucified for the occasional bad mood or inappropriate reaction. Still, people need to own up to their mistakes: not everything can be blamed on one’s immutable character.

If everyone were a character, there would be no point in being a character. The genre would become overplayed and passé, like superhero comics or vampire romances. To have character, a person has to be unique. Everyone wants to escape the pressure of social mores, but that doesn’t mean they can pull off this kind of performance.

Real-life characters know they live in the real world, but they interact with it in a different way than everyone else. Wannabe characters just want to escape the rules of the real world with shallow play-acting, and that is why the whole notion of characters has gone too far. Not everyone can be Groucho; someone has to be Zeppo.

What I’ve Learned

For this blog’s 50th post, I decided to write about writing. Since finishing grad school last May, I’ve been trying to get a job as a newspaper journalist; here’s what I’ve learned so far about finding employment as a writer. I haven’t been terribly successful yet, so don’t take this as a “How to Be a Writer” guide.

1: Newspapers want clips

My quest to become a journalist began when I started college. I wasn’t sure what I wanted to do in life, but I did know that I liked to write and argue about politics. In an attempt to be social, I went to a meeting of the school newspaper, The Scarlet, and they assigned me an op-ed piece on gas prices. The rest, as they say, is history.

During senior year, I took a journalism class that included visits from local journalists. The first question they always asked was “Who works for the school paper?” That’s also how I got my first writing job (blogging for the Worcester Telegram & Gazette’s WorcesterU site): the editor saw that I worked for The Scarlet, and thought I knew what I was doing.

Published writing is a prerequisite for any newspaper or writing job. Editors want to see that a person can write; just telling them that you can without proof is not going to work. If you’re in high school or college, don’t put off writing for your school paper or any other publications.

2: They want more than clips

Good writing skills are the bare minimum for employment (employers won’t even consider someone who misspells things in a cover letter), but landing a job requires more than that. Newspapers want their reporters to have local knowledge, to know everything about the area they cover so said reporter can cultivate sources and stories.

Consequently, the best place to start looking for a job might be the place you’ve lived the longest. Having a working knowledge of the major issues of your hometown shows employers that you already know what to write about.

3: Expand your definition of “writer” and “employed”

If you can’t find steady employment, why not freelance? If you have an idea for a story, pitch it to your local newspaper. If you have a hobby, remember that the majority of content in enthusiast magazines is bought from freelancers. The New York Times also accepts op-ed submissions every week.

The problem with freelancing is that it’s hard to live off the approval of editors. So, like any good superhero, you should get a day job while you freelance. If you get the right job, it can contribute to your ultimate goal. I work at a not-for-profit agency, where I produce a newsletter and write press releases and articles for publication in local papers. In other words, I’m writing. It may not be a staff job at the Times, but it’s better than flipping burgers.

4: Work for free

This can feel exploitative and fulfilling at the same time. On the one hand, news organizations from CNN to Patch are broadcasting user-generated content. Aside from not getting paid, accepting a free blogging gig gives you some perks: the public (and potential employers) can see your work, and your name is attached to a reputable organization. Thanks to the Internet, writing is one of the only professions where people are expected to work for free. Until payment systems catch up, we’ll just have to deal with that.

On the other hand, this could be an opportunity to do some important work. Volunteer organizations are always looking for people to write grant applications or press releases, or edit newsletters and websites. You still don’t get paid, but you do get to show off your skills for a good cause.

5: Keep Writing

No matter what you do, the important thing is to keep writing. It is, after all, a skill that can only be maintained and improved with practice. You may not have a job, but that doesn’t mean you have to stop observing the world and putting words together in an aesthetically pleasing manner. Even if you can’t think of something that’s fit for public consumption, keep a notebook. Write a blog, even if you don’t think anyone will read it. After all, if you really want to be a writer, how could you stop?
