The Effects of Social Media on the Journalistic and Political Landscapes

The political and journalistic landscapes are changing in the United States. Online media has provided a means for the two fields to become increasingly intertwined throughout this era of change in the digital age. Journalistically, social media has provided a means for news to travel in smaller, more digestible forms. Politically, social media has become increasingly important in keeping constituents tuned in to political news and happenings and in providing cheap, efficient public relations for politicians. The development of online social media outlets like Facebook and Twitter is altering the ways in which the public consumes information and news about politics and elections. Social media is transforming the journalistic and political fields by changing the way political news is consumed and increasing the speed at which information is gathered and released. With the 2012 presidential election in the nation's rearview mirror, journalists and politicians are eager to see how social media will continue to transform the democratic process.

Obama wins the 2012 election. Photo courtesy of HelloBeautiful.com.

Perhaps the most noticeable difference between present-day presidential elections and those of years past is the use of social media websites, such as Facebook and Twitter, by both candidates and constituents to communicate and consume political news. Although the United States witnessed the role social media can play in elections in 2008, the 2012 election served as a true test of the medium's power to influence voters. Some even argue that the result of the election was predicted first on social media, where President Obama was clearly ahead. In previous interviews, Ryan Adams, CEO of PME 360, a company specializing in local Internet marketing, has explained that social media analytics, particularly Facebook's, are an underused resource in politics: "While further studies are needed, it appears that social media analytics may be an underutilized resource for predicting presidential election results. It can be argued that social media engagement signifies an informal vote for a candidate." Essentially, he argues that when an individual "likes" a candidate on Facebook, that "like" signifies that person's vote, thus making Facebook an integral part of future election predictions.

Facebook has gained more than 96 million users in the United States since 2009; that means over 96 million more people now have access to the Facebook pages of politicians and news outlets, along with the ability to share, "like," and post links to these pages on their own personal pages for all of their friends to see and interact with. Twitter has also grown rapidly since the last presidential election. Twitter's compound annual growth rate from 2006 to 2010 was 478 percent, and in 2008 the site grew at an astounding rate of 752 percent, adding one million users in December alone. With the immense growth in the number of United States citizens using social media and the increasing participation of both large and small news outlets on Facebook and Twitter, social media is becoming a far more powerful player in the political game than it was in previous elections, including in 2008.

Although social media is a relatively new part of politics, this kind of transformation of the democratic process brought on by a new form of media has been seen before. On September 26, 1960, the democratic process in the United States was fundamentally altered when John F. Kennedy, a senator from Massachusetts, went head-to-head with Vice President Richard Nixon in a 60-minute televised debate. Until then, politics had not been televised in a meaningful way. This televised debate transformed "political campaigns, television media, and America's political history" from that point on. What was it about that debate that so deeply affected the democratic process and the way political news is covered by journalists? Essentially, "after the debate, how you presented yourself, what you looked like, how you sounded and whether you connected directly with audiences mattered." Furthermore, 88 percent of American households had televisions at that time, a 77 percentage-point increase from just a decade before. According to Nielsen television ratings of the day, approximately 74 million viewers were tuned in when a calm, confident Kennedy beat the nervous, sickly looking Nixon in the September debate. The debates drastically transformed the way presidential candidates approach the election process by increasing the importance of personal appearance, charm, and public relations. Much as television had in 1960, social media experienced a short period of rapid growth immediately prior to the 2012 presidential election.

Yet one of the weightiest concerns about the incorporation of social media into the political process is the emphasis on speed and instant gratification embedded in the culture of the Internet. Journalistically, social media has provided a means for news to travel in smaller, more digestible forms. For instance, Twitter limits users' posts to 140 characters or fewer, essentially allowing Twitter to act as a headline service. If a user is intrigued by the 140-character snippet, then they'll often click on the link to the related story and read the full article. Twitter also allows for a considerable amount of citizen journalism to take place. Due to the nature of the site, users are able to Tweet to millions of followers in a fraction of a second. Because of this, websites like Twitter have forced journalists and news outlets to gather and release news faster than ever before. When journalists are kept on such tight deadlines, it can be difficult to go through the proper fact-checking processes that are expected to occur when newspapers or television stations publish or air a piece of news. By overlooking this integral step in the journalistic process, journalists and news outlets expose themselves to a plethora of factual errors that might not have occurred if they had taken the proper amount of time to fact-check their information and sources. Unfortunately, modern-day journalists are pressured to compete with citizen journalists who may have witnessed a piece of news firsthand and Tweeted about it only seconds later.

Although this form of amateur journalism has become integral to the development of journalism in the online world, it poses a serious threat to the integrity of professional journalists. Citizen journalists usually do not have experience or education in the field of journalism, and therefore they are not held to the same code of ethics that professional journalists are. This can cause a great many problems for professional journalists because the public has unlimited access to the information these sources put out, which is highly susceptible to errors and heavy bias. If the public is fed false or heavily biased information from these amateur journalists, then it becomes more difficult for professional journalists to convince readers that the information they are presenting is verified. Even so, the political philosopher John Stuart Mill asserts in his theory of the marketplace of ideas that the truth will inevitably arise, even out of false information. As a new media landscape evolves, journalists, those who consume the news, and those who are in the news can only hope that Mill's theory possesses some veracity.

However, the truth may be especially difficult to come by when it comes to political news obtained from biased sources. Social media has become increasingly important in keeping constituents tuned in to political news and happenings, both in everyday life and during election seasons. Social media websites provide a means of cheap, efficient public relations for politicians. Unfortunately, these websites have also made it easier for readers and viewers to be manipulated by an inundation of biased information from politicians themselves, political pundits, opinionated constituents and citizens, and even a growing number of unprofessional journalists who lack the ethics that professional journalists are trained to have. Furthermore, social media websites like Facebook and Twitter are extremely easy to update and can be even more effective in reaching mass audiences than other forms of modern media, such as television or newspaper advertisements. This makes social media an especially promising platform for public relations because politicians and public relations professionals are able to interact instantly with millions of people at the click of a mouse. Broadcast, print, and other traditional news sources are then forced to keep up with consumers in order to stay relevant. This phenomenon ties back to the idea of instant gratification: by speeding up the process of news gathering and publishing in order to instantly gratify those who want to hear news as it happens, traditional news outlets become increasingly prone to error and the inclusion of unconfirmed facts simply because they want to be the first to release a story.

Although getting the story out first is an important element of the social media landscape, the idea is certainly not new to the field of journalism. Being the first publication or news source to get "the scoop" on a newsworthy story has been ingrained in the foundation of journalism for decades. One notable example is the scoop Bob Schieffer obtained as a fledgling police reporter at the Fort Worth Star-Telegram in Texas. Schieffer is known for getting Lee Harvey Oswald's mother to Dallas after Oswald was accused of assassinating President John F. Kennedy. When Oswald's mother called the Fort Worth Star-Telegram asking for a ride to Dallas, Schieffer initially reacted with condescension, saying, "Well lady, you know, we don't run a taxi service, besides the president has been shot." Once the woman explained her relationship to Oswald, Schieffer was able to find her a ride into Dallas. When Oswald's mother arrived in Dallas, where floods of reporters were prowling around looking for any tidbit of news about Kennedy's murderer to report, Schieffer pretended to be a police detective in order to interview the woman first at the scene. The incident helped shape Schieffer's career.

Now imagine an event like the Kennedy assassination happening today. Would the news coverage look different on a social media platform? Probably. Inevitably, there would be a small number of political extremists touting the assassination as a step forward for the nation. Although these kinds of fringe political extremists undoubtedly existed in 1963, it is likely that their opinions would not have reached even a fraction of the people they could reach today on a social media website. Good news travels fast, but bad news travels faster. If someone were to Tweet something inflammatory about an assassination, it's possible that the Tweet could go viral and reach millions of people by means of Retweets, Facebook shares, et cetera. Essentially, the speed at which people are able to obtain and share information on the web may have drastically changed how the assassination was covered in the news and how the public perceived the event.

With social media growing in popularity and allowing people more opportunities to customize which news outlets they subscribe to, there may never again be a news event that promotes the kind of cohesiveness the nation experienced when Kennedy was assassinated. According to Nielsen statistics, a point was reached during the funeral on Monday afternoon when 41,553,000 television sets were in use, believed to be an all-time high for the era. With Americans obtaining their news from so many different niches, it would be difficult to draw the same amount of public attention to one medium at the same time. For this reason, television may remain the best medium for reaching millions of people simultaneously, and consequently uniting them by allowing them to watch the same coverage concurrently all over the nation. Social media, by contrast, appears to be dividing the American population further by promoting the use of niche marketing and targeted news strategies that pursue a specific audience rather than a broad one. This tactic is also referred to as narrowcasting, the reverse of broadcasting. It is possible that the narrowcasting tactics social media websites use could create additional conflict between people of opposing views and prevent Americans from behaving and thinking cohesively. If this occurs, then the boundaries between journalism and politics may become increasingly blurred and distorted, fundamentally changing both fields.

The Effects of “Nichification” on American Audiences

As websites like Hulu and Netflix rise in popularity, American television and movie viewers are becoming increasingly polarized in their tastes in everything from entertainment to politics. As more people turn away from cable television service, cutting the cord in favor of a more personalized viewing schedule, individuals are simultaneously becoming pickier about what they do and do not watch. The opportunity for viewers to watch what they want, when they want, can have deeper effects on their open- or closed-mindedness about political and social issues, as well as on how critically they evaluate those same issues.

Prior to the cord-cutting revolution, people chose whether to get their news from one cable network or another. Today, those who wish to hear only the most conservative perspectives can watch Bill O'Reilly's show, "The O'Reilly Factor," exclusively for all of their news, rather than being exposed to more balanced news sources. This allows people to fall into a few select niches, or general categories, in terms of their television viewership. This "nichification" of television viewers allows individuals to develop a more closed-minded view of political and social issues by limiting their exposure to diverse opinions and worldviews.

Heightened political staunchness and augmented partisanship among Americans can have consequences that reach far beyond the television remote. One symptom of this growing political nichification is the recurring gridlock within the United States Congress. Congressional gridlock has dominated news headlines for years, yet despite constant scholarly analysis and criticism, it continues to plague the American political system. In January, the 112th Congress spent weeks deliberating how best to avoid the impending fiscal cliff. In the end, the conflict came down to (far too many) late-night and early-morning votes, which, unsurprisingly, remained heavily partisan. The entire ordeal earned Congress some of the lowest national approval ratings it has seen in years. While many critics assert that the key to overcoming Congressional gridlock is to limit the use of the ever-problematic filibuster, many of them overlook its deeper causes and implications.

Congressional gridlock causes numerous harms throughout the nation, but it also reveals some deeper social issues at play within the American population. Members of Congress, who fall into niches by way of partisanship, aren't the only ones falling prey to the negative consequences of nichification in the nation's constantly evolving technological landscape.

Americans are becoming increasingly nichified in all areas of their lives, much to the pleasure of advertisers and media outlets. When individuals fall into certain niches, marketing to key audiences becomes cheaper, easier, and more effective. This type of niche-driven marketing is evident on social media platforms like Facebook, where users' search histories are reported to companies advertising on the website in order to serve more personalized advertisements to each user. While the nichification of consumers is invaluable to marketing firms and advertisers, it can have unintended consequences for individuals who close themselves off from other niches and choose to expose themselves exclusively to information from within their own.
Nichification can also be especially crippling to the creative process. By cutting oneself off from diverse opinions and perspectives in favor of a single one, an individual is put at a creative disadvantage because he or she is effectively limiting his or her exposure to ideas that might otherwise have been enriching in some way.

Furthermore, this kind of nichification can impair one's ability to objectively consider certain political and social issues. For example, an impressionable individual may succumb to the opinions of those within his or her specific niche without properly evaluating those views, rather than seek out the viewpoints of another group, simply because doing so is easier than forming a unique opinion that may cause tension within that niche. This can also affect an individual's ability to critically evaluate information, whether for academic or entertainment purposes. If people lock themselves into a certain niche, then they forfeit the critical skills they might have gained from appraising new information. When one is presented with opinions different from one's own, considering those opinions requires a certain level of critical thinking and evaluation. This brand of critical thinking is especially vital as the nation faces weighty political and social issues, like the current state of the economy and the legality of gay marriage.

Yet as audiences become increasingly nichified, they simultaneously become more rigid in their beliefs and values, making political and social decision-making more difficult. The nichification of American audiences, as evidenced on social media websites, in Congress, and in the entertainment industry, is extremely detrimental to creative development, critical thinking, and the encouragement of thoughtful debate surrounding political and social issues. While it is helpful to the marketing and entertainment industries, it is dangerous to independent thought in the American population. And while the outrageous niche identifiers on Netflix might be grounds for a good laugh—Feel Good Con-Game Musicals and Controversial Fight-the-System Documentaries, to name a couple of the more obnoxious ones—the effects of nichification are far more serious than they appear on a Netflix queue.

Mass Shootings and the Media: Which is the Cause/Effect?

On the morning of April 2, 2012, seven people were killed “execution style” at Oikos University in Oakland, California.

On the afternoon of September 27, 2012, five employees of a company called Accent Signage Systems were shot to death in Minneapolis, Minnesota.

On the morning of December 14, 2012, twenty-six individuals were shot and killed at Sandy Hook Elementary School in Newtown, Connecticut.

The combined number of deaths from these events is 38. In addition to these 38 individuals, 50 others were shot to death in other mass shootings last year. The most notable information missing from the descriptions given above, however, is the names of the individuals responsible for the shootings: the alleged murderers.

In the wake of the horrific, highly publicized mass killings of 2012, a number of debates have arisen concerning the media coverage of these events. One of the most prominent concerns the media's glorification of the mass murderers themselves—the sort of glorification that may cause similarly distraught individuals to follow in the footsteps of the cold-blooded killers of yesteryear. Of the 25 worst mass shootings in history, 15 have occurred in the United States, according to The Washington Post. Furthermore, of the 11 deadliest shootings, five have occurred from 2007 onward. This information suggests that mass shootings like the ones at Oikos University, Sandy Hook, and Accent Signage Systems are increasing in number and severity.

In his analysis of the downsides of intense media scrutiny of mass shootings, Brian Levin argues that this sort of coverage has three chief negative effects. It may assist or inspire other violent individuals to commit similar acts, and it can traumatize the victims of these shootings before they are able to obtain the necessary psychological counseling. Most significant, he argues, is the effect this sort of coverage can have on the public's widespread fear of "brutal but unusual" mass shooting situations. By saturating the media with coverage of these shootings, Levin contends, media gatekeepers give the American public the impression that the risk of these events is higher than it may actually be.

Each of the above arguments is valid, but each overlooks many of the main reasons why gatekeepers choose to cover these mass shootings as aggressively as they do.

While the media is responsible for the dissemination of details regarding shooters' backgrounds, motives, and methods, it is not responsible for how the public interprets that information. Admittedly, following these tragic events, a newsgathering frenzy of sorts occurs to find any and all information about the killer responsible. This frenzy is usually driven not simply by the media's "first to report it" mentality, but by the gatekeepers' own curiosity about the event. It is part of human nature to be interested in the details of tragic events—after all, traffic still slows on the highway long after a car accident has been cleared to the side of the road, thanks to those dubbed "rubberneckers." Gatekeepers are just as curious about the disturbed individuals behind these tragedies as the public is; if one media outlet chooses not to publish information about a shooter, another outlet inevitably will to satisfy its curious audience. This is, of course, a darker side of human nature, but it isn't something the media can be blamed for.

The idea that the media is not responsible for how people interpret the information it distributes extends into each of Levin's other arguments as well. It is up to the victims to overcome the trauma they endure during these events, not the media. While the media should remain as respectful of the victims as possible, it is not accountable for their mental health. Even in terms of the American public's warped perception of the frequency of mass shootings, the media isn't entirely to blame. In general, members of the media focus heavily on statistics and relevant historical information to tell stories and outline the news. This means that the media does its best to communicate the level of risk the public should feel regarding these shootings based on the statistics available. Any level-headed risk assessment may be offset by the sheer number of media outlets covering the events, thus bombarding the public with an overwhelming sense of anxiety about the risk of shootings. The media is essentially trapped by these events—forced to cover them for the sake of reverence for those affected and for the sake of newsgathering, but also criticized for its meticulous coverage. Thus, the media is forced to walk an impossibly narrow line.

Mass shootings are, undeniably, becoming more commonplace. The media's reach into the lives of everyday Americans is growing as well. As more media platforms become available, the din of the media rings ever louder in the public's ear. The sheer number of media outlets is overwhelming. When a tragedy that piques the public's interest is thrown into the mix, it is inevitable that the media will become saturated with information about the event. Thus, it is important to scrutinize the media for what is within its control—the validity of the information it releases, the relevance of that information, and the most effective way to present it to the public—rather than what is out of its control: how the public interprets and uses that information.

Beating the Post Travel Blues

Boat moored off the coast of Caye Caulker.

As you may already know, I spent my Spring Break this year backpacking through Belize. It was, truly, one of the best experiences of my life. The plane ride home, while entertaining thanks to my friends' and my slap-happy, sleep-deprived state, was extremely depressing. I kept thinking to myself, "How am I supposed to go back to school later today and pretend this trip never happened?" This feeling is, I suspect, not uncommon for travelers upon their return home. Yet the fact that world travelers throughout the ages have successfully coped with it doesn't take the sting away. It's never easy to return to a seemingly mundane and ordinary daily life after spending an extraordinary time away from home, meeting new people and experiencing new things in some exotic locale. Those first few days at home can be a struggle as you unpack, wash (or, in my case, probably burn) your clothes, and upload all of your glorious vacation photos. (Which, by the way, I'll be sure to share along with this post.)

We went camping overnight at Actun Tunichil Muknal, a three-mile-long cave once used for Mayan sacrifices. Francisco, pictured here holding a MASSIVE iguana, was our guide.

But once all of that's done, how do you get out of your post-travel funk once and for all? Though I haven't officially escaped my current funk, I have dealt with the post-travel blues before, so I know what to expect. Here are a few common symptoms of the ever-dreaded Post Travel Blues–oh, and a few cures too.

Symptom: Excessive Boredom

Cure: Become a tourist in your own town. Try that new restaurant down the street with a friend you haven't seen in a while. Hike that trail you've been meaning to try for years.
Throwing yourself into your work is another great cure for intense boredom. It might not sound fun compared to what you were doing on your trip, but it'll help you get back into the swing of things.

Symptom: Loneliness

Cure: Call up some old friends, especially ones you haven't seen in a while, so you can tell them all about your recent adventures abroad. Honestly, though, spending time with the people who traveled with you is a better option. Chances are they're feeling just as strange about being back and could use some time reminiscing with you as well. You might also have the strange feeling that travel has, in some way, changed you so profoundly that you don't mesh as well with your inner circle anymore. Spending time with your travel companions will help all of you reintegrate into daily life and figure out ways to incorporate your new outlooks into your everyday lives without alienating your peers.

Symptom: Erratic Sleeping Patterns

I was this close to a jaguar. No explanation needed.

Cure: Oh, jetlag–half as common as the common cold and twice as annoying. A bad case of jetlag can put you out of commission for weeks as your body struggles to relearn a normal sleeping schedule. There are two ways to go about resetting your body clock during a bad case of jetlag:

1. Shock your system. Sometimes the best way to force your body back into normal sleeping habits is to do just that–force it. Go to bed whenever you feel like it, but set your alarm earlier. After a few sluggish mornings, your body will get the message.

2. Ease into it. For some, it’s far less exhausting to gradually start going to bed earlier and waking up earlier. This is definitely the better option for those who absolutely cannot function on only a few hours of sleep.

Symptom: Future Looks Bleak

This is the Crystal Maiden, the remains of a Mayan sacrifice found in the Actun Tunichil Muknal cave that we trekked through. Cameras aren’t permitted in the cave, so this photo was obtained through Ambergris Daily.

After an amazing trip, it's easy to fall into the trap of thinking that nothing exciting lies ahead–that from here on out everything is going to be mundane and predictable. The best way to overcome this sense of bleakness is to plan some sort of future outing you can look forward to. If you're on a tight budget after your previous excursion, then plan a trip to visit a friend or relative instead of an extravagant overseas vacation.

Hopefully anyone suffering from the Post Travel Blues will be able to conquer the pesky ailment with some or all of these tips. If not, then I suppose you’ll just have to hop on a plane somewhere new and try to rid yourself of those blues the same way you found them (no complaints here).

To Be or Not to Be: American Identity Abroad

In two short days, I will be boarding a plane bound for San Salvador, El Salvador, and then another headed for Belize City, Belize. It's all a little surreal, to be honest. I'll be backpacking through jungles, cities, and beaches across Central America for all of my university's Spring Break (March 15-24). The past week or so has been a blur of infectious disease immunizations, trips to my local R.E.I. store, and frantic reservation scheduling. But the hope is that it'll all be worth it when I arrive.

Map of Central America. Image courtesy of Geology.com.

Now, I've traveled abroad before, so this isn't my first rodeo (seven passport stamps and counting!). Nonetheless, there's always a lingering fear in my mind that people will judge me for being American. (Sidenote: I also have an irrational fear of being "Locked Up Abroad"–if you haven't seen the show on National Geographic, you're missing out.) Admittedly, I'm not just an American citizen; I'm also in the process of obtaining dual citizenship in Greece. But don't be fooled–despite my split heritage, I am one hundred percent American. I was born and raised in this nation. Moreover, I think like an American–and that's what counts.

Now that I’ve established how thoroughly American I am, for better or for worse, let’s examine why this might be a good or bad thing when traveling abroad. I’m a big believer in a good ol’ fashioned pro/con list, so let’s stick with that method. I’ll just list a few of the most obvious/important advantages and disadvantages.

PROS

1. If you get in trouble (you know, locked up abroad–you never know what kind of random law you might accidentally break, right?), you'll always have a dedicated U.S. Embassy around.

2. You speak English already, so that’s a huge bonus when traveling abroad. Many people (depending on where you go, of course) speak some English.

CONS

1. Stigma. People will assume that you’re a stereotypical American: unsophisticated, arrogant, uneducated, and lazy.

2. Because English is so prevalent in the U.S., you might not know many other languages.

3. Traveling to some places (e.g., Cuba, North Korea, and some nations in the Middle East and Africa) is more difficult for Americans than for citizens of other countries.

Common stereotypes of Americans. Image courtesy of Chin Chat Comics.

When it comes down to it, though, the advantages and disadvantages of being an American overseas are roughly equal. While some travelers may be ashamed of being American for fear of being judged by natives of other countries, others may be overly proud and obnoxious about their American heritage. Much like the entries on our pro/con list, the numbers of individuals who fall into these two groups are roughly equal as well.

There’s no right or wrong way to feel about being an American abroad. People will probably make assumptions about you regardless of whether you’re ashamed or proud to be a citizen of the United States.

In the end, the very act of evaluating the merit of one’s national identity is, in itself, beneficial to any globetrotting American. In her research on the effects of studying abroad on undergraduate university students, Nadine Dolby explains the phenomenon:

National identity shifts from a passive to an active identity in the global context. Just as whiteness studies have argued that “white” identities are often invisible in contexts where whiteness is accepted as the “norm,” an American identity is only invigorated in a situation where students become “other,” and are thus compelled to interrogate their national location.

Dolby’s research reveals that this kind of introspective thinking can ultimately alter the way these individuals process global issues.

What is possible, if not fully realized through these students' experiences, is a postnational American identity, one that encounters and confronts itself in the context of the world, as part of a conversation, and as a participant in the human village. How students understand "America" has implications for future practices of citizenship. Citizens with an exclusionary, closed notion of the relationship between nation and state (Berlant's infantile citizenship) may seek to create one type of world, while citizens who have a more open, inclusive, sense of citizenship may struggle to create another. Thus, the perspectives that students bring back with them are part of public discourse in the United States and have implications for the future of American democracy, the public good, and the constant renegotiation of the material and imaginative space that is America.

As I have not yet studied abroad (I'm still five months away from my semester in Madrid) and have only visited foreign countries for weeks at a time, I can't say for certain whether simply vacationing in another country for a prolonged amount of time brings about the same level of introspection that studying or working abroad would. However, I can attest that my experiences abroad have dramatically altered my worldview.

As I prepare for my excursion through Central America in the coming days, I look forward to the insight the experience will bring me about my own national identity–or, I suppose, identities.

*A subscription may be required to view Dolby’s research on the JSTOR database.

The Internet: Friend or Foe of Critical Reasoning?

How Does the Internet Affect Critical Reasoning?

In this day and age, a single form of media has successfully challenged all the forms that preceded it: the Internet. The Internet is a curious thing; people sit for hours in front of glowing screens, aimlessly researching, reading, and watching the various websites, applications, and videos it has to offer. But what does the Internet provide that has allowed it to give more traditional forms of media a run for their money? One may even argue that conventional media, such as television and print sources, are being phased out as the Internet evolves and gains more followers. However, the implications of the Internet's widespread popularity extend far beyond what most people stop to consider amid their ceaseless commenting, Tweeting, and Stumbling. The Internet allows people to be instantly gratified: when people seek a piece of information, they can obtain it almost instantaneously. Furthermore, with the evolution of digital media throughout the twentieth and twenty-first centuries, people have not only gained access to boundless sources of information, but they have also been provided with a means of organizing and accessing this data. However, the popular notion that all information found on the Internet comes from credible sources leads Internet users to believe inaccurate information and to develop faulty reasoning based on it. In many cases, people will apply this defective reasoning to important social issues in order to defend their own standpoint on a particular matter, or to attack someone of an opposing view, thus muddying the waters for many other Internet users attempting to widen their own knowledge and perspective on the subject.

Wikipedia

Wikipedia, one of the most prominent and readily available sources of information on the Internet, has recently drawn a great deal of disapproval from media critics for its role in feeding the public false information. Although Wikipedia serves as an easily accessible source of information on a multitude of subjects, the open-source nature of the website allows ignorant or intentionally deceptive individuals to insert false facts into its articles. These inaccurate or misleading claims are often taken as fact by those who read the articles, allowing them to spread further into mainstream culture. Students often use Wikipedia articles as sources for research papers and similar school assignments. In fact, the incorporation of false information from Wikipedia into academic papers has led to widespread bans on Wikipedia as an academic source throughout the nation. In her article on Wikipedia's lack of credibility, Nora Miller explains,

“On campuses across the country, the debate over the move has mounted, with arguments ranging from ‘no encyclopedia qualifies as an adequate source for a research paper, and neither should Wikipedia’ to ‘this move is the beginning of censorship.’” 

If students continue to use Wikipedia as a source for academic research, they will develop opinions based on inaccurate information. The development of these flawed ideas will eventually lead to faulty critical reasoning among the youth population in the United States and could potentially harm their ability to analyze real-world social, political, and environmental issues.

Effects on Critical Reasoning

In fact, the Internet's ability to damage one's critical reasoning creates a seemingly endless cycle: by reducing one's capacity to reason critically, it consequently worsens one's ability to tell which information on the Internet is credible. The Internet has become a battlefield for opposing thoughts, ideas, and perceptions; nasty comments are scrawled across Facebook news feeds, angry Tweets are endlessly re-Tweeted, and open-source Wikipedia articles have become war zones for people with strong opinions on political and social issues. While this battlefield exposes people to a variety of viewpoints on myriad issues, it saturates social media websites with excessive data for Internet users to filter through. Before the reign of digital media began in the twentieth century, individuals had to actively search for information in more conventional ways. Today, people no longer have to seek out this data in libraries and encyclopedias because it is readily available at their fingertips. In fact, people are inundated with information every day in the form of status updates, blog entries, and search engine pop-ups. Although this data gives people the power to enrich their lives with new knowledge, and to expand their critical reasoning skills through research and interaction with their peers, it also adversely impacts their ability to generate quality discussions within their respective communities about social issues based on the analysis of known facts.

Striking a Balance

In order to harness the beneficial effects of the Internet while avoiding the negative consequences of overexposure to false information and the faulty reasoning that develops from it, Internet users must learn to critically analyze the media they interact with and to consider the circumstances from which the information they receive comes. If Internet users can develop a healthy skepticism toward the information they find in blog posts, unknown news sources, and social media websites, then the Internet will have the potential to truly surpass the media of the past, and its users will be able to take their first steps into a dominantly digital age.

 

You can read Miller's article, "Wikipedia Revisited," through the ProQuest database with a subscription.

What’s Wrong with Children’s “Rights”

The other day I was watching an episode of Mad Men when I had the sudden urge to Google a rather strange term: "why do I hate children." One might think I'm a horrible child-hater simply looking to track down some entertaining online rants concocted by my child-hating cronies, but that's not necessarily the case. Allow me to explain: in this particular episode of Mad Men, Don and Betty Draper's newborn baby is screaming at the top of its lungs in the middle of the night (the kid seriously sounds hysterical) and Betty is forced to wake up and check on the newborn. At this point, simply witnessing this kid's hysteria is driving me absolutely nuts, even through my tinny MacBook speakers. Before entering the baby's room, Betty stops outside the door, taking a breath and preparing herself to deal with the screeching infant—that's when my "I hate children" moment came along.

I’d love to tell you that these moments are few and far between, but that’s not the case. Every screaming baby, loudmouth kid, and snarky adolescent sets my teeth on edge and brings that same odious thought to the forefront of my mind. Though, what’s strange to me is the fact that, at some point in time, I took on each of those roles in my youth, undoubtedly annoying the hell out of all the adults around. So how, then, can I feel so much hatred toward a group that I was once a proud member of? Blogger “Roy” from Feministe expounds upon my (and others’) proclaimed hatred of children.

 “Can any of us imagine someone posting ‘I hate women. How is it disrespectful that I don’t find bitches awesome?’ and there being less than serious outrage over it? Replace ‘women’ with any number of other groups, and I think that the result is the same. Children, though, are generally seen as a group that it’s okay to hate, in some ways.”

The problem with this disgust with children concerns the severe lack of children's rights. Essentially, children are products of their environments, and every aspect of those environments is tightly controlled and restricted by the adults in their lives. If any other group, race, or ethnicity were oppressed the way children are oppressed today, there would be an uproar about rights, liberties, and equality. Unfortunately, adults see children as immature, stupid and, at times, delusional, which prevents them from earning a voice in society.

Roy explains:

“An adult who isn’t feeling well can call in sick and avoid interacting with other people, in many cases. Children don’t have that option…Children are one of the most easily victimized groups on the planet. They’re targeted for rape/sexual abuse, kidnapping, forced prostitution, slave labor… and they have little to no means of fighting back or escaping from these situations. Millions upon millions of children go without any health insurance in the United States, through circumstances completely beyond their control.”

All of these factors combine to form the perfect conditions for oppression. By targeting children as punching bags for our aggression, we’re essentially beating down a social group that is already as low on the social ladder as one can go.

Because of this, I am taking a stand. From today on, I vow not to express my annoyance with poorly behaved children with such odiousness. I will be an advocate for the equality of all people, young and old, and I urge you to do the same.

The Effect of Apocalyptic Media on the Modern American Attitude

The zombie apocalypse—it's a terrifying theme that now pervades countless popular books, movies, and television programs. Though the setting and circumstance of each representation may differ, the enemies are alike: undead men and women with vacant eyes, decaying skin, and an insatiable hunger for human flesh. Zombies embody mankind's greatest, most defining anxieties: death, and what comes after it.

“Zombieland.” Photo courtesy of IMP Awards.

The idea of the Earth being overrun by the undead is both chilling and threatening to many. So much so, in fact, that many Americans are choosing to arm and prepare themselves for this (rather unlikely) kind of apocalypse. According to the Federal Bureau of Investigation (FBI), Black Friday background checks for firearms last year rose 20 percent from the previous year. Dave Workman, a gun rights advocate, explains that part of this increase is due to an elevated fear of a zombie apocalypse. Essentially, fictional representations of these nonexistent monsters are beginning to warp the minds of men and women across the nation, causing them to modify their lives around the possibility of a zombie apocalypse. This suggests that fictional depictions of an apocalypse are capable of affecting the human mind so deeply that individuals exposed to these tales begin to adopt apocalyptic thinking in all aspects of their lives. In an era when the public is inundated with apocalyptic films, literature, and television shows, the doomsday mindset is spreading rampantly throughout the American population. This alteration in perspective is changing the way American men and women interact with one another and how they think about politics, society, and their futures.

Movie poster for “Contagion.” Photo courtesy of Wikimedia.

If Americans believe that the nation has no future, then it stands to reason that their optimism would decline, and with it their concern for the country's political future. Since 1960, ten percent fewer Americans have shown up at the polls during presidential election years. This low voter turnout coincides with recent poll results (2011) showing that Americans are, for the first time in recorded history, more pessimistic than optimistic about the nation's future. Interestingly, when the same question was asked of Americans just two years prior, 56 percent of American men and women were still optimistic about the future of the United States. So what changed American attitudes in that two-year span? Could it have been the impending Mayan apocalypse, set to occur in December of the following year? Maybe Americans felt less optimistic as they marked the 10th anniversary of the September 11th attacks, or perhaps they were simply feeling the emotional effects of the seemingly endless Great Recession. Such an increase in American pessimism isn't likely to be attributable to just one factor, but it is quite possible that overexposure to apocalyptic ideas and the belief that the nation has no foreseeable future play a part in this phenomenon. Apocalyptic thinking breeds pessimism. In The Psychology of Cyberspace, John Suler asserts that catastrophic thinking is a sizeable factor in the psychology of depression:

One component of depression is the tendency to engage in the style of faulty thinking called “catastrophizing” – i.e., predicting and anticipating crisis, often based on little or no evidence … Reminds me of the concept of the tragic flaw in classic Greek literature. The hero has a weakness—a secret, hidden vulnerability that he himself may not realize, an Achilles heel. At the peak of his triumph, it comes back to haunt him. It triggers his downfall.

In effect, as filmmakers and writers continue to play upon this “tragic flaw” of mankind by producing an excess of end-of-the-world films to draw in audiences, they are also unconsciously propagating pessimism and figuratively kicking the American Achilles heel. This, in turn, psychologically affects those who consume these doomsday stories by creating in them a greater sense of impending doom. This sense of inevitable downfall then causes potential voters to shy away from politics simply because they believe that there may be no bright future ahead to be concerned about.

The notion that there is no future to be concerned about can also stretch into the social aspects of society once the public has absorbed it. Movies like Contagion (2011), a film about a deadly viral epidemic that wreaks havoc on human populations across the planet, and Deep Impact (1998), in which mankind must decide who is worth saving when a comet is set to collide with Earth, illustrate the distrust humans can develop of their peers once their livelihoods and safety are threatened. As soon as humans begin to believe that no bright future is foreseeable, faith and trust in their peers is likely to decrease. This could be due to the "every man for himself" mentality that is often evident in doomsday stories. Once men and women lose confidence in their fellow man, it is inevitable that their relationships with one another will suffer soon after. The social consequences of apocalyptic thinking can cause a great deal of interpersonal anxiety, in addition to deepening the chasms between classes, races, and ethnicities in society once the barrier of politeness has been breached.

While many end-of-the-world plots pit humans against one another as they face potential termination, others pit the human race against some foreign race or phenomenon.

"Zombieland" movie poster. Photo courtesy of Wikimedia.

This theme is evident in films like Battle Los Angeles (2011), H.G. Wells' classic science fiction thriller War of the Worlds, which was adapted into a screenplay for the identically named film released in 2005, and Zombieland (2009), in which mankind fights to survive a zombie apocalypse. Instead of setting humans in opposition to one another, these stories pit mankind against a common enemy—usually some species of aggressive aliens or monsters. Though this theme is psychologically beneficial due to its promotion of cohesiveness and mutual trust, it can have perilous effects on how Americans process the idea of outsiders. The consequences of this type of thinking are visible in the way the American public thinks about foreigners. This variety of thinking is hazardous to the American psyche, as it promotes hostile behavior toward foreigners, who come to be considered the enemy. The notion that foreigners are the enemy is dangerous to the formulation of foreign policy and increases the hardships foreigners must endure to assimilate into American society. At the end of many of these us-against-the-enemy stories, mankind prevails by "banishing, symbolically obliterating whatever the apocalyptic writer deems unacceptable, evil, or alien." In the real world, however, obliterating those whom one doesn't understand or accept can have severe consequences, as seen in tragic events like the Holocaust in Europe and the creation of Japanese internment camps in the United States, both of which were the result of fear associated with a certain subset of the population during the twentieth century. Ultimately, the kind of in-group thinking promoted by us-against-the-enemy apocalyptic stories can be more detrimental to the American psyche than expected.

Movie poster for “The Day After Tomorrow.” Photo courtesy of IMP Awards.

Another aspect of apocalyptic plots that can be detrimental to American society is the prevalence of the idea that one should live each day as though it were one's last. Today, this idea circulates through society under the term "YOLO"—you only live once. Despite its recent popularity, the idea has existed throughout history under other names: in Latin as carpe diem, in Spanish as que sera sera. Lately, however, the term YOLO has taken on a rather surprising connotation, one associated with rebellious activities and a blatant disregard for the consequences of one's actions. When people adopt the idea that their actions and choices are unimportant in the grand scheme of things because there is no future to look forward to, society can unravel as people begin to act in rash and thoughtless ways. In his article for the Journal of Religion and Film, Conrad Ostwalt explains:

Hollywood has discovered and tapped into a secular, popular apocalyptic imagination that is prevalent in our contemporary culture. We are inundated with this sense of an impending doom.

It is this sense of impending doom that causes people to make snap decisions about choices that would normally require further consideration, like whether to quit one’s job or spend a large sum of money on something luxurious. Fabiola Carletti, a reporter for CBS News, explains this phenomenon in an article about what doomsday films reveal about the American people:

Instead of panicking, most people stumble about their daily lives as if in shock. And amid the requisite looting and stockpiling of food, some people attend damn-it-all dance parties or kiss strangers.

Though Carletti is referring to the effects of doomsday thinking in the face of an actual apocalypse, the same thought process occurs in the minds of those who have adopted apocalyptic thinking in their everyday lives. Once hope for the future is lost, any reverence for rules, convention, and social norms is usually thrown out with it, bringing immense consequences to the personal lives of those consumed with this sort of thinking. In terms of personal relationships, carpe diem thinking can cause people to act selfishly and to disregard the feelings of others. By bombarding the public with doomsday stories, the media is deepening the influence of apocalyptic thinking in a social context and ultimately corroding the interpersonal relationships Americans have with one another.

Cover of “The Road.” Photo courtesy of SF Site.

Doomsday thinking and extreme carpe diem philosophy can affect not only the way people interact with one another, but also the way Americans interact with the government and process large-scale social issues. There are many subsets of the apocalyptic narrative genre—the mankind-versus-monster variety, the end-of-the-world-by-natural-disaster sort, and the apocalypse-by-plague type, to name a few—but all of them have profound effects on the way viewers process the world around them. These narratives can change the mindset of those who adopt an apocalyptic philosophy, leading them to alter their lives in some way to adapt to the possibility of an apocalypse. Furthermore, this attitude reveals a great deal about the anxieties inherent to mankind, and about how those anxieties can warp the minds of Americans who are overexposed to apocalyptic stories that highlight them. Regardless of the medium, apocalyptic narratives have great power over the collective American psyche, and ultimately, the fate of the nation.

Internships: Friend or Foe?

As a "starving college student," I'm almost always overjoyed at the thought of obtaining any sort of job or internship to improve my lot in life. However, I've recently noticed that nearly every internship opportunity is unpaid and requires students to have class units available so that they can be credited in some way for their labor. Whether or not employers realize it, offering only school credit as payment for an unpaid internship is more of a burden than a benefit to students. Oftentimes, interns are forced to pay hundreds, if not thousands, of dollars to purchase units for internship credit from their universities or colleges in order to work for no pay whatsoever. This causes students who do not have the financial means to support themselves while working to lose out on valuable work experience. In order to offer students the work experience they need to succeed in the workplace in a more useful and effective way, companies over a certain size should be required to pay their interns at least minimum wage. This would allow smaller companies to keep unpaid interns in order to save money, while more established companies, with larger amounts of revenue to expend, would have to pay their interns for their work.

Admittedly, this might lead many students to pass over an internship at a smaller company for one at a larger, better-known one. However, incentives could be created and implemented (either publicly through the government or privately by the companies themselves) to encourage well-qualified students to work at smaller companies, where the work of interns is integral to company growth.

While some students feel that pay isn’t necessary for them to gain experience in their desired professions, payment is necessary for those who need financial help to support themselves during their internships. This would level the playing field for students pursuing certain professions and allow them to fully commit to their internships without worrying about finding a part-time job on the side. It would also eliminate the need to purchase class credits for internships when students are already taking a full course load. Instead of punishing over-achieving students who enroll in the maximum number of class units each semester while also interning off-campus by forcing them to purchase additional units, universities should encourage students to take their futures into their own hands and get as much experience as possible throughout their education.

This can only be achieved if internships are in some way financially subsidized so that students can focus on learning, not earning, until they are sufficiently trained and prepared to do so. Internships are a valuable part of the higher education process and should be available to all students who wish to gain hands-on experience in their given professions, regardless of their financial well-being.

For the Love of Rock—A Review of “Sound City”

Documentary Film as Journalism

I believe I’ve said this countless times before, but journalism comes in many different forms. One form that is especially near and dear to my heart is documentary journalism. Documentary films, arguably the most artistic style of long-form journalism, have the ability to draw viewers completely into a topic and immerse them in information about it in an entertaining and visually interesting way. Documentaries can evoke a great deal of emotion from their enraptured viewers. The film “Sound City”, directed by Dave Grohl, gives viewers an in-depth look at the rock music industry in Los Angeles over the past few decades. Moreover, the film unites diverse viewers and brings them together to appreciate a dying form of music—rock and roll.

The Review

When was the last time you hung out with Paul McCartney, Tom Petty, and Stevie Nicks for an hour and 48 minutes? Unless you happen to have some serious connections in the rock music world, the answer is probably never. Those who had the good sense to go out and see “Sound City”, however, would probably respond with something like this: “Oh, yeah. I just sat in on a jam session of theirs last weekend. Kick-ass stuff.” And, yes, they would describe it as kick-ass, because that’s exactly what “Sound City” did: it kicked ass.

The crowd at the chic Hollywood theater where I had the pleasure of viewing the film (the only theater showing it in Los Angeles) was chock-full of viewers of all ages, many of them clad in some breed of Converse sneakers, leather jackets, and vintage band shirts. By the end of the film, nearly everyone in the theater had collectively sighed, cried, or guffawed at one point or another. It appears the old adage rings true—music brings people together, even when the tune is in film form. In the words of singer/songwriter John Denver, “No matter what language we speak, what color we are, the form of our politics or the expression of our love and our faith, music proves: We are the same.” In the case of “Sound City”, one gets the feeling that this notion remains true even today, in an age where everyone seems to belong to a particular musical niche. The picture’s ability to bring viewers together is ultimately achieved through brilliant editing, stunning cinematography, and a never-ending soundtrack of some of the greatest rock and roll chart-toppers of all time.

"Sound City" movie poster. Image obtained from Ace Show Biz.

“Sound City” movie poster. Image obtained from Ace Show Biz.

One of the most useful tools the filmmakers behind “Sound City” had to work with is nostalgia. The first half of the film is jam-packed with archival video of live performances, long-forgotten photos of everyone’s favorite rock stars, and their original music videos. The film essentially begins at the inception of Sound City, a recording studio in Van Nuys, California that served as the launch pad for the careers of numerous award-winning bands and musicians like Rick Springfield (“Jessie’s Girl”, “I’ve Done Everything for You”), Nirvana (“Smells Like Teen Spirit”, “Heart-Shaped Box”), and Tom Petty and the Heartbreakers (“I Won’t Back Down”, “Here Comes My Girl”). The historic recording studio’s claim to fame was a massive, 1970s-era soundboard. “Sound City” uses the soundboard as the anchor for the multitude of stories relayed in the film; every musician and studio employee had some memorable interaction with it during their career. As the film progresses, the viewer gets the sense that the soundboard is a metaphor for the overall well-being of the rock genre itself. The soundboard develops its own personality; it becomes the physical embodiment of decades of unforgettable rock music. And let me tell you, it takes a lot of personality to capture the essence of rock and roll. Ultimately, the soundboard serves as the primary tool for transitioning from the thriving rock scene of yesteryear to the fading rock landscape of today. Though the soundboard is the centerpiece of the film, it’s what’s going on around it that makes the film truly magical.

The involvement of director Dave Grohl (lead vocalist of the Foo Fighters) gives viewers an unprecedented look into the musical development of incredible musicians like Tom Petty and Rick Springfield. It is this astonishing access to these outrageous personalities that gives the film an enchanting quality, one that ropes viewers in and ties their emotions to that iconic Sound City soundboard and all of the people whose lives it affected. Once the film captures the hearts of its viewers, it eagerly provides a new beat for those adoring hearts to thump along to: the deep, soulful melody of a bass drum. Truly, there is hardly a single moment in the film where music is absent. The constant presence of music, combined with deeply revealing and stunningly honest close-up interviews, gives viewers the sensation that they’re sitting in on a low-key jam session with some of the most iconic rock stars of the past few decades.

“Sound City” documents the musical process in a candid, profoundly fascinating way that all viewers, musically inclined or otherwise, can appreciate. In fact, I can nearly guarantee that, regardless of your musical preferences, you’ll find yourself blasting Nirvana and Springfield the entire drive home from the theater. So buy yourself a one-way ticket to “Sound City” and please, for the love of rock, enjoy the ride there.

Links of the Week:

These 25 quotes about rock music.

This article published in The Guardian in 2011 on the death of rock music.

And this hilarious collection of the 25 most ridiculous rock band names in history.