Twittering away

I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.

I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers start using it. Maybe that’s the source of my confusion.

In school, parents, teachers, and Nickelodeon characters were always telling us not to do something just because it’s popular, and to think independently instead.

That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.

“Because everyone else is doing it” is the main reason people join social networks: they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges; the site develops a gravitational pull of sorts that attracts still more users.

Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.

Being a late adopter is like walking through a conquered land. The hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but they also remind you that those possibilities are limited, because someone has already figured out how to work the system.

Social media really isn’t a new frontier for human consciousness; it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are using it, and will continue to use it, to create new ways of expressing ideas.

Is that the same as fundamentally changing the way people socialize, though? And if it is, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?

Social media finds its niche(s)

Social media is evolving. The basement geeks who built the first social networks have moved into the niche market. Facebook is still essential for any digital identity, but now other platforms, like Instagram and Loopster, are being designed to share specific types of information. Which got me thinking: why stop with photos and videos?

Here are five possible social media platforms for sharing other vital aspects of one’s life:

Crassfone: For sharing inappropriate thoughts you know should be kept to yourself yet feel the irrational need to blurt out in a crowded room.

Triv-o-gram: For sharing random bits of trivia.

Noisss: For sharing non-music sound files.

Aro-matic: Until smell-o-vision is invented, this platform will allow users to share descriptions of their favorite smells.

Splice: For sharing the sequence of a person’s genes.

Social networks allow us to share every aspect of our lives: the good, the bad, and the boring. Who cares if no one wants (or should be allowed) to know every thought that pops into our heads and every one of our actions?

Independent society

“Everyone’s the good guy in their own story.” It’s funny how perspective works: we get so focused on living our own lives that we sometimes forget that everyone around us is trying to do exactly the same thing. Marketing departments and Tea Partiers want us to be our own unique selves, but how can we do that without getting in each other’s way?

The U.S. Constitution guarantees the fundamental freedoms necessary for each citizen to be their own person, but until now, citizens haven’t had access to the sheer amount of esoterica from which a unique persona can be crafted. The rise of social media changed that, which is why so many suburban white kids now have a taste for kimchi.

Expanding cultural horizons is always a good thing, but sometimes it smacks of desperation. It is possible to spend too much effort on introspection, to examine oneself so closely that one inevitably finds an excuse to continue a self-aggrandizing search for happiness.

The more time we spend looking at ourselves, the less we see of other people. That makes social interactions more difficult, because everyone else starts to seem like an obstacle, or a pawn. We deserve to make ourselves happy, but we need to remember that everyone else is trying to do the same thing.

We need to teach our parents some manners

Technology is a wonderful thing (this blog wouldn’t be possible without it), but it does come with some drawbacks. The combined heat of the world’s iPad 3s is probably contributing to global warming, and some say the Internet is just one big distraction. Regardless, one thing is certain: high tech gadgets make people incredibly rude.

People seem to think that smart phones and other devices excuse them from behaving properly. They let them ring at the most inappropriate times, and they discuss things in elevated cell phone voices that no one needs to hear. Suits with earpieces look like they are talking to themselves. People carry on conversations with friends while texting other friends.

A few years ago, this type of behavior would have been unthinkable. Now, people are so engrossed in what is happening on their tiny LCD screens that they ignore the people around them. Is this the future of human interactions? Perhaps not.

I may sound like an altacocker, but I’m actually part of the young, tech-savvy generation ad men dream about. This isn’t the 1950s, when an older generation decried youth’s supposed lack of morals; this time, the parents (and grandparents) are listening to rock ’n’ roll too. Unlike past cultural phenomena, the technological revolution is not generation-specific.

A common stereotype is that all young people are very good with computers, while their Baby Boomer parents just can’t figure them out. That’s often true, but that doesn’t mean older people are not using computers, smart phones, or tablets. In fact, that’s the problem.

A lot of older people have smart phones, but they may not be comfortable using them. They see other people being obnoxious, and assume it is part of the brave new smart phone culture. These people probably don’t even know how to set their iPhones to “silent.”

Consequently, it’s up to teenagers and 20-somethings to teach their parents some manners. This generation has grown up with the annoyances of technology abuse, so they know how to use their devices without making everyone within a 15-foot radius want to kill them. Youth is also much better for marketing: no one takes an old person complaining about manners seriously, but what about someone in their 20s? For once, parents should listen to their children.

Connotative dissonance

Shakespeare famously asked, “What’s in a name?” Actually, a lot. Modern language is about more than aptly describing the world; it’s about describing the way we want to view the world. With a little clever diction, we can turn the mundane and the pathetic into something more. The British have “news readers”; Americans have “news anchors.” The position of secretary has been replaced by the “executive assistant.” People don’t have problems; they have “issues.” It is the opposite of Newspeak: instead of removing meaning from words, it imbues them with more meaning than the things they describe.

However, these liberally defined connotations can backfire. Words carry multiple meanings, and when we try to give them new ones, an ironic contrast can arise. One example is the title of “mayor” on Foursquare. This social networking site tracks members’ locations; the person who checks in most often at a specific location, like a favorite coffee shop or bar, becomes its “mayor.”

Foursquare did not invent this term. City dwellers have been electing “mayors” at their neighborhood bars for decades, but the term carried a little less reverence than it gets on Foursquare. Pre-internet “mayors” were usually senile old men, the kind of people who had nothing better to do than sit in a bar all day. It was a sarcastic title for someone who, like it or not, was a fixture of a certain establishment.

Foursquare users like to think that being the “mayor” makes them popular, but they are really continuing a proud tradition started by a bunch of irascible old coots. Either way, one person loiters in a bar because they have nothing better to do.

Another example of connotative backfire is the frequent use of the word “consumer.” This word describes the buying public: “we are a nation of consumers,” a recent credit card ad declared. Non-professional electronics are known as “consumer electronics,” and the people buying them are advised by Consumer Reports.

“Consumer” is an accurate term, and it sounds more neutral than “buyer” or “sucker.” Still, it has another use that could spoil many ad campaigns. Organizations that serve developmentally disabled people have a problem. Cases like Willowbrook (the New York mental institution whose abuse of patients was famously exposed by Geraldo Rivera) have made the public more sensitive to the mistreatment of the mentally ill and the need to view them as autonomous human beings. Due to the diversity of services these agencies provide, a single term like “patient” won’t cut it, and “client” was thought to imply too much dependence. Consequently, the people these agencies serve are known as consumers.

A term meant to elevate life’s endless stream of financial transactions into a lifestyle and an economic system was borrowed to help the developmentally disabled feel better about themselves and their relationship with caregivers. The label may benefit the people it describes, but other American consumers might not appreciate the comparison.

The Bard was right: words are only descriptors of what we encounter, and their impermanence is heightened by the constant process of redefinition. Modern wordsmiths seem most interested in using words to cover reality in shiny, attractive packaging. That is how language evolves, but people attempting to make the world seem classier through creative connotation should be wary of the “mayors” and “consumers” that came before them.
