Posts Tagged Twitter

The algorithms of progress

After 200 posts, I still have a love/hate relationship with the Internet.

I mean that in the most literal sense: I love the opportunities the Internet has made possible, but I hate most of what comes with using it and interacting with people through it.

Without the Internet, I wouldn’t have a job right now. I certainly wouldn’t be able to cover the car industry from a house in Connecticut.

However, the Internet has also devalued skills.

For many jobs, remote working has opened up a pool of applicants that literally spans the nation. People with job-specific skills are much more interchangeable than they ever have been.

That’s great if, like me, you want to write about cars without moving to Detroit, but it also means that being good at something just doesn’t cut it anymore.

People are expected to bring much more than relevant skills to a job; they’re expected to bring specific training, connections, and name recognition.

Some call this the entrepreneurial spirit; I call it blurring the line between work and life.

Because when people expect less from organizations, organizations expect more from people. So much for punching out at 5:00 p.m.

Those aren’t the only terms the Internet dictates.

We work for it: we design content for it, adapt messages to suit it, alter our language so that both humans and Google will comprehend it.

Then someone invents a new “breakthrough in communications” that demands to be fed on its own terms.

Earlier this year I got a Twitter account, because everyone else has one.

As far as I can tell, Twitter is just a forum for anyone who has ever been involved with Star Trek, and a gruesomely effective way to relay information during a disaster.

Every time a celebrity does something, it explodes like a healthcare exchange website on October 1, 2013. I can’t see how this leads to productive discourse.

We shouldn’t feel obligated to make room for new social media in our lives, but we do. That’s what frustrates me the most about living in the shadow of the Internet.

After several generations of continuous technological progress, people seem resigned to the Digital Age being just another part of an inexorable historical movement. Nothing stays the same forever.

When I was in first grade I learned to type on beige Macs and play with floppy disks. The teachers said computers would one day be an important part of my life. It was a self-fulfilling prophecy.

Even if we use a piece of technology, we should still be allowed to evaluate its effect on us, and tailor it to our lives, not the other way around.

The Internet has certainly changed the way people live, but whether “different” really means “better,” and not “worse,” is a determination we need to make. It’s easy to assume we have no agency in the face of progress, but we need to take stock of how we use technology.


Boston is bombed, one tweet at a time

Since I got a Twitter account recently, I haven’t been sure of what to do with it. On Monday, I found a very good, but very unpleasant, use for it.

As with so many things these days, I found out about the Boston Marathon bombings through a reference on someone’s Facebook profile. Scrolling through the newsfeed, I saw a status from a college classmate:

“Slowly finding out more about what happened during the Boston Marathon,” it read.

I jumped over to Twitter and, sure enough, a photo of the scene of the first explosion had already been retweeted by a friend. Reports of a series of explosions were starting to come in, intermixed with tweets about Pulitzer Prize winners and the announcement that Chris Hardwick would be in Baltimore on May 24.

“Two men had bombs strapped to themselves and they both went off,” a tweet posted 32 minutes before I logged on read, “everyone is scrambling.”

When I switched over to the New York Times’ website, there were only a few short lines confirming that explosions had occurred; they didn’t even use the word “bomb.”

Facebook and the news sites stayed quiet a bit longer, but Twitter was shot through with reports, mostly from the Associated Press and journalists who were already on site. The Boston Globe posted a video of the first explosion, and soon it was possible to see it from nearly every angle by scanning the tweets.

Not everything tweeted that day was accurate (the report of suicide bombers doesn’t jibe with what investigators are learning about the bombs) but the most necessary information was imparted as quickly as possible.

So that, it seems, is what Twitter is for.


Twittering away

I finally got a Twitter account and I’m not really sure why. You should definitely follow me (@SAEdelstein) while I figure that out. I promise it will be entertaining.

I’ve never been an early adopter of social media; I usually start by asking “What the hell is this for?” before caving when a critical mass of friends and potential employers start using it. Maybe that’s the source of my confusion.

In school, parents, teachers, and Nickelodeon characters were always telling us not to do something just because it’s popular, and to think independently.

That’s hard to do when it comes to joining a social network, because the network isn’t just an activity, it’s a space where social interactions (however artificial) happen. Things were less complicated when work and school were people’s only avenues for socialization.

“Because everyone else is doing it” is the primary reason most people join social networks, because they have to go where other people are. If a site is popular enough, it doesn’t matter whether the medium is 140-character messages or protein exchanges. It develops a gravitational pull of sorts that attracts more users.

Of course, it’s important not to put too much emphasis on Twitter or any other social media site. Users can post as much or as little as they want, but there is a difference between using a site and getting something out of it.

Being a late adopter is like walking through a conquered land. The hordes of discordant posts, given the barest sense of order by warlord-like influencers with their thousands of followers, hint at the possibilities, but they also remind you that those possibilities are limited, because someone has already figured out how to work the system.

Social media really isn’t a new frontier for human consciousness; it’s just the same routine as ever, digitized and compressed. The medium itself is where the innovation is: people are using it, and will continue to use it, to create new ways of expressing ideas.

Is that the same as fundamentally changing the way people socialize, though? If not, do we still have a choice to opt out, or will we be obligated to join the next new network, and the one after that?


Fighting ironic battles

Pearl Harbor poster

I never thought I’d see the day when World War II became a source of irony. It was the definition of a “good fight,” a time when the nation harnessed all of its resources to defeat what one of my high school history teachers called “made to order bad guys.”

Yet here we are. Barbasol is running a commercial featuring the viewer’s “great grandfather” on the ground in a French village, perhaps Ste.-Mère-Église or St.-Lô, laconically comparing his attempt to stop Hitler with the current young generation’s obsession with tweeting and Facebooking.

Like “first world problems,” this is another example of a perverted form of thought. It’s as if people think that, by noting their shortcomings in an ironic way, they don’t have to actually do anything about them.

It’s also a silly comparison. I’m not saying that my generation is perfect, but it’s not really fair to compare us to the “Greatest Generation.” We’ll never know how the social media-savvy would deal with a Great Depression or a World War, because we lived through a Great Recession and a pseudo-War on Terror.

Twitter and Facebook can lead to some shallowness, but we’ll also never know what our grandparents’ generation would have done if they grew up with these luxuries. I recently ate lunch at a restaurant packed with senior citizens, and most of them had smartphones.

Maybe we should cut back on the irony before we lose track of what we’re making ironic jokes about. This reminds me of a New York Times blog post I read recently called “How to Live Without Irony.” The author argued that too many people are using irony to avoid honest (and sometimes painful) emotional commitments.

That seems like what’s going on here. People need to accept the fact that they’re better off than others, including their own grandparents and great grandparents. That’s what those World War II soldiers were fighting for, after all.


Is your iPad all it can be?

In one of its recent iPad commercials, Apple tried to show consumers all the wonderful things they could do with Steve Jobs’ little black tablet. From reading classic books to learning a new language, the iPad looks like the key to enlightenment. But is that really what iPad users do with their devices?

You probably have an Internet-connected electronic device at home. What do you use it for? Do you go on Facebook a lot? Do you watch other people make fools of themselves on YouTube? Do you read random blogs written by curmudgeon-y writers?

The Internet brought the world to our fingertips, and devices like the iPad and iPhone make that interface even easier. However, traditional, pre-Internet goals like learning can’t compete with distractions that were designed for the Internet.

There is considerable debate about whether it is better to read a physical book or a digital one, but you can only play Angry Birds on a screen. In The Shallows, Nicholas Carr described how the Internet encourages the brain to skim through materials instead of examining them closely. That makes it very hard to learn a new language, but very easy to scan the latest tweets.

If Apple’s own commercials are the benchmark, the iPad may not be living up to its potential. Or maybe Apple needs to reassess. Its device is the perfect platform for all the Internet frivolity we know and love. That is its true function, although that may not be the best ad material.


What does Weiner’s resignation accomplish?

Congressman Anthony Weiner has announced that he plans to resign, and hopefully that means we will not have to look at his genitalia anymore. That might be why so many people, including President Barack Obama and Nancy Pelosi, have called for Weiner’s resignation. That would be more reasonable than what they have actually been saying.

Weiner’s actions are reprehensible and completely devastating on a personal level. However, given the immoral and perverted things our public officials have been caught doing, it should barely register as a political issue. When the head of the IMF and the governor of California are attacking women and getting them pregnant, who cares about sexting?

When people hear about one of their leaders acting immorally, their natural response is to cry for blood. They feel their trust has been betrayed and are justifiably angry. Americans have a long history of turning random scandals into major news stories. In his Autobiography, Mark Twain describes the “Morris Incident,” in which a Mrs. Morris waited outside President Theodore Roosevelt’s office until she was dragged out of the White House kicking and screaming. This “incident,” Twain claims, overshadowed international news such as the Decembrist Revolt in Russia and conflicts between France and Germany that would eventually lead to World War I.

This mob mentality is what the Founding Fathers feared. Popular fascination over a sensational yet inconsequential event can be very harmful to a democratic republic. In this case, government was brought to a standstill because one member of Congress couldn’t keep it in his pants. People complain that the government is not getting enough done, but how can they expect it to do anything when Congress and the media are paralyzed by a perverse fascination with sex scandals?

It’s not like there is nothing happening, either. The revolution in Syria is heating up, al Qaeda is reestablishing itself in Yemen, and the economy is still a wreck. Our government is hamstrung by partisan politics and special interests, but some of the responsibility for its lethargy rests on the citizens. If enough people were as committed to job creation as they are to finding out the details of Weiner’s escapades, we would have a solution sooner.

Sometimes, knee-jerk outrage is the easiest response to the actions of members of Congress. However, that reaction needs to be put in perspective. As the puppets of Avenue Q say, “the Internet is for porn.” Our society is highly sexualized, and Weiner’s actions may be a side effect of that. Instead of crucifying every official who does not live up to our high (but rarely maintained) moral standards, Americans should weigh the effect their outrage has on the issues that really matter, and perhaps stop Twittering.
