Pleasure & Decay: A Hunger for Decadence

December 2, 2017 § 3 Comments

The Hunger Games film series is a post-apocalyptic dystopian saga based on the trilogy written by Suzanne Collins, the first installment of which was directed by Gary Ross. The first film begins with the “reaping” ceremony, in which one boy and one girl between 12 and 18 years of age are selected from each of the twelve districts to train and later fight to the death in the 74th annual Hunger Games, leaving only one survivor, the victor. The annual Games take place in an outdoor arena controlled by the Capitol of Panem, a city overflowing with wealth and luxury, while the rest of the nation lives in extreme poverty, oftentimes unable even to find sustainable sources of food. The event is televised for the entertainment of the Capitol’s citizens, who take pleasure in watching the contestants brutally kill one another in a survival of the fittest for the chance to live.

The residents of the decadent Capitol are pleasure-driven, and the city features all the luxuries one could want. They prize aesthetic values and therefore alter and adorn their bodies in extravagant ways. Undergoing surgical augmentation, they reshape their bodies, dye their hair and skin in vivid colors, tattoo themselves with striking designs, and have jewels implanted into their skin. They also don distinctive clothing, which often takes on an altered vintage appearance: richly embellished, puffy-sleeved dresses and brightly colored suits.


While the inhabitants of the twelve districts are impoverished and severely lacking in food and supplies, the citizens of the Capitol feast on delicate breads and pastries, meats, and fine wines. Gluttonous to the extreme, they even drink a liquid that makes them vomit so that they can consume more food.

Citizens of the dystopian Capitol in Collins’s The Hunger Games trilogy (2008-2010) bear an uncanny resemblance to the protagonist of (and even the author of) a work of nineteenth-century Victorian fiction. Perhaps the “new hedonism that was to re-create life” that Lord Henry predicts in Oscar Wilde’s The Picture of Dorian Gray found its way into Panem. Like the Capitol’s residents, Dorian Gray seeks a “new spirituality, of which a fine instinct for beauty [is] to be the dominant characteristic.” Dressed in extravagance, with an eye for decadence and a degeneration into monstrosity, Dorian seems as though he would feel quite at home in the Capitol.


Oscar Wilde (1882)


Ben Barnes as Dorian Gray (2009 Film)

A dandy in excessive dress, embellished with jewelry, Dorian goes so far as to don “a dress covered with five hundred and sixty pearls” at a costume party. He also hosts dinner parties that feature “exquisite taste shown in the decoration of the table, with its subtle symphonic arrangements of exotic flowers, and embroidered cloths, and antique plates of gold and silver” and “the most celebrated musicians of the day to charm his guests.” To Dorian’s guests, “he seem[s] to belong to those whom Dante describes as having sought to ‘make themselves perfect by the worship of beauty.’”

Just as the “beautiful” people of the Capitol witness and even cheer for the deaths of the participants in the Hunger Games, the monstrosity hidden under the surface of Dorian’s beauty is revealed when he commits murder.


Illustration by Colominas

In a society that increasingly seeks pleasure and luxury at the expense of the impoverished, The Hunger Games and The Picture of Dorian Gray, as novels and films, are not necessarily, as Oscar Wilde might say, a case of art imitating life. As Suzanne Collins reveals in an interview, her inspiration for the trilogy derived from “lying in bed…channel surfing between reality TV programs and actual war coverage” when “the lines between these stories started to blur in a very unsettling way.” Meanwhile, the wealthy feast on $25,000 tacos “wrapped in tortillas containing gold flakes” and wash them down with bottles of $150,000 white gold and platinum tequila at the Grand Velas Los Cabos, and the richest five percent seek pleasure in activities such as touring the ocean in a $4.5 billion yacht “plated with 100,000 kg of platinum” featuring “statues made of a T-Rex’s bone,” while “Almost half the world […] live on less than $2.50 a day,” starving without adequate food, water, and medicine. It is not difficult to see how our world is dissolving into decadence. And with the election of a president who, within a single year in office, has spent more than $84,554,589 in American taxpayer funds just to play golf, I am afraid to see where the state of this nation alone is headed.

–MRC


Sources:

Campbell-Schmitt, Adam. “Behold The $25,000 Taco.” Food & Wine, 2 Mar. 2017.

Germain, Sophie. “Trump Golf Count.” Trump Golf Count, trumpgolfcount.com.

Margolis, Rick. “A Killer Story: An Interview with Suzanne Collins, Author of ‘The Hunger Games’ Under Cover.” School Library Journal, 1 Sept. 2008.

Mukherjee, Tatsam. “17 Most Expensive Things On This Planet.” ScoopWhoop, 15 June 2015.

Shah, Anup. “Poverty Facts and Stats.” Global Issues, 7 Jan. 2013.

Wilde, Oscar. The Picture of Dorian Gray. e-artnow, 1890.



Metamorphosis and Mirrors

September 17, 2017 § 1 Comment

[Please Note: This text contains minor spoilers for the 2017 television series “Twin Peaks: The Return.”]

The season finale of “Twin Peaks: The Return” earlier this month created a seismic ripple among David Lynch devotees across the Internet. The proliferation of detail-obsessed fan theories, wikis in at least six languages, and thoughtful analytic pieces speaks to the twisted depths of Lynch’s vision in his reboot of the cult 1990s television series. While the show’s terrain is undoubtedly multidimensional, its intricacies depend on a foundational, age-old motif: dual identity.

While doppelgängers have always been important to the “Twin Peaks” universe, Lynch takes the motif a step further in “The Return” with the introduction of Tulpas: manufactured alternate identities. A Tulpa takes on the exact appearance (with shifts in minute physical details) of a character, but is actually a construct unknowingly advancing evil while the “real” person is trapped somewhere, in another body or an alternate dimension. This play on identity undergoes a number of interesting permutations with the show’s central character, FBI Special Agent Dale Cooper (Kyle MacLachlan). There is 1) the “real” Dale Cooper, known and adored from the original series; 2) his evil doppelgänger, “Bad Cooper”; 3) a manufactured Tulpa, Dougie Jones; 4) Dougie Jones’s evacuated body, which is reinhabited by the “real” Dale Cooper in a dormant, barely verbal state; and, eventually, 5) the reawakened Dale Cooper who, after entering an alternate dimension, becomes someone named Richard. In this shifting landscape, no one can ever be sure who is real and who is a Tulpa: not the “real” characters, not the Tulpas themselves, and oftentimes not even the viewers.


Bad Cooper, Dale Cooper, and Dougie Jones (Showtime)

Indeed, any fan of Lynch will recognize doubles as a long-standing interest of the surrealist filmmaker; Mulholland Drive and Lost Highway wholly depend on structures of duality and split existence. But this preoccupation with multiple identities seems to have particular resonance in the world of contemporary television. Jill Soloway’s award-winning series “Transparent,” for instance, revolves around the story of Maura Pfefferman (Jeffrey Tambor), a transitioning transgender woman who, while exploring the complex (and increasingly unlikely) process of sex reassignment surgery, must make peace with her hybrid identity.

The FX original series “The Americans” offers a more politically oriented site for thinking about metamorphosis: two Soviet spies passing as Americans (as well as happily married) near the end of the Cold War. Additional examples are not hard to find: Walter White/Heisenberg (“Breaking Bad”), Don Draper/Dick Whitman (“Mad Men”), and Tony Soprano (“The Sopranos”), who struggles to reconcile his public role as brutal Mafioso with his inner sense of morality and humanity.

That the theme of metamorphosis is central to so many recent television shows is significant, I think. TV has emerged as a medium that not only offers easily digestible entertainment, but also produces serious art, in some conversations even rivaling film as today’s cinematic experience of choice. And perhaps it is no surprise that in an increasingly digital and fragmented culture, creatives have taken up questions of refracted identity, code-switching, and constructed worlds through a medium that is itself somewhat paradoxical: consisting of isolated parts while also sustaining a long-form narrative whole, a combination that has produced a telling term that yokes the consumptive and temporal—binge-watching.

Perhaps, though, it is not merely technology that is fueling inquiry into questions of identity and self-definition in today’s cultural mainstream. As we become an increasingly global society, boundaries separating countries and cultures are less and less defined (or else more fervently reinforced). It may be productive to consider an earlier era that blurred geographic borders through an increase in travel: the 19th Century. In 1859, Charles Darwin’s On the Origin of Species spawned a crisis of faith in the Western world, exploding people’s notions of what it means to be human. The rise of expeditions into new territories and confrontations with indigenous cultures exposed a seemingly infinite variety among human beings and the natural order.

Identity crisis may indeed be one impulse behind the contemporary revival of the 19th Century as a site for artistic inquiry. A particularly inspired example of such bridging of periods can be found in A. S. Byatt’s 1992 novella Morpho Eugenia, which presents Victorian-era social critique and romance through a modern, hybrid form. The story’s protagonist, William Adamson, admits to being “doomed to a kind of double consciousness” after returning to England from a voyage in the Amazon (28). Throughout the novella, William is jockeyed between a host of tensions and dualities: settling into domesticated married life vs. pursuing his life’s work in the jungle; writing a book on natural science vs. editing his father-in-law’s book arguing for the existence of a divine creator; lusting for a woman who is physically alluring vs. one who is intellectually stimulating.

In his book on the behavior of ants, William lays out “some more abstract, questioning chapters” according to a series of possible headings: “Instinct or Intelligence,” “Design or Hasard,” “The Individual and the Commonwealth,” “What is an Individual?” (126). These questions share DNA with cultural artifacts as recent as the television shows discussed above and as distant as Homer’s Odyssey (a source text that is taken up in Morpho Eugenia and, appropriately, adapted to suit the novella’s own ends). After sketching out William’s chapters, the text shifts into the actual pages of his book, where he considers “the utility to men of other living things.” He writes, “one of the uses we make of them is to try to use them as magical mirrors to reflect back to us our own faces with a difference” (127).

In “Twin Peaks: The Return,” Lynch ends one of the final episodes with a brief, unsettling shot of Audrey Horne (Sherilyn Fenn), in an unspecified location, staring aghast at her own mirror-image.


Audrey Horne (Showtime)

Where is she? What year is it? Design or Hasard? What is an individual?

We look to science and religion and art. We look to others and, ultimately, to ourselves. But perhaps the longer and harder we look, the farther away we are from knowing, and the more we demand from the image reflected back to us.

Jennifer Gutman

Source:

Byatt, A. S. “Morpho Eugenia.” Angels & Insects. Vintage International, 1994.

Money Speaks Louder than Human Voices

March 28, 2017 § 3 Comments

“Everything has a price.” This phrase in Margaret Atwood’s Oryx and Crake is not new, but it takes on new meaning in the context of her novel (139). In today’s world, corporations dominate every sphere, from the economy to religion and politics. While Atwood’s world, in which corporations have absolute control, is unsettling, her ideas are merely an extrapolation of the present into the future. And, as Atwood shows, commercialism and commodification come at a high price to society and the humans who are part of it.

Early in the novel we learn that Jimmy (later Snowman) lives on a company compound called OrganInc. The corporation controls everything in Jimmy’s life, including his school and the rules he must abide by, enforced through the CorpSeCorps. Later, Jimmy and Crake attend institutions resembling universities. These “universities,” particularly Crake’s Watson-Crick Institute, aim to generate profits as well, encouraging their very bright students to innovate and develop new technology, carefully securing their facilities, and minimizing interaction with the outside world. In Jimmy’s world, corporations control everything, and their motives clearly dominate.

The corporation-developed compounds seem absurd; however, in reality, they already exist. Massive companies like Amazon and Google have “campuses” that contain everything one needs to live. They include restaurants, gyms, childcare facilities, and even sleeping pods, all designed to keep you inside and focused on doing everything possible for the company. Beyond company campuses, universities today mimic those in Atwood’s story. As Vandy students, we even say that we live in a “Vandy Bubble.” Our lives exist within the confines of our campus as we strive to learn and make new developments in all fields. We are not far off from the fictitious world that Atwood describes.

Images are renderings of future campuses for Google, Amazon, and Apple (from left to right). 

Why does it matter that corporations and technological research centers have such a wide sphere of influence? In a world where profit governs, everything becomes a commodity. This is easily seen in Oryx and Crake through the story of Oryx. Not only is Oryx commodified by the pimps who earn money from her sexual acts and pornography; she is also commodified by every viewer who watches the child pornography, including Snowman. In discussing her experience, Oryx has clearly been influenced by the corporate mentality surrounding her, as she states:

“They had no more love…but they had money value: they represented a cash profit to others. They must have sensed that – sensed they were worth something.” (126)

Do we only value human beings for the monetary value they provide? I hope not. Atwood shows a disturbing reality if corporate power continues on its current trajectory. The power of corporations to influence politics and culture even today has implications for cloning and other advanced technologies. It is unsettling to think of human clones developed by companies driven by their own bottom line. Morality does not seem to have a place in this kind of world.

If we do consider these clones to be “human,” how do we prevent corporate developers from treating the clones like commodities rather than humans, especially when humans today are already commodified? In the novel, Snowman compares the children in the pornography to “digital clones,” as they did not feel real to him (90). With this statement, Atwood warns of the commodification of both existing humans and potential human clones in the future. If corporations both govern and profit, we cannot prevent abuse and exploitation.

Atwood is not far off in her portrayal of the commodification of human clones. Human cloning has often been criticized for turning human organs into commodities, given their monetary value in treating cancer and other diseases. President Bush famously rejected all human cloning, stating, “Life is a creation, not a commodity.” He is not alone in this concern; scientists, philosophers, and policy-makers have discussed the implications of human cloning for decades. The President’s Council on Bioethics put it this way:

“When the ‘products’ are human beings, the ‘market’ could become a profoundly dehumanizing force.” (The President’s Council on Bioethics, 2002)

When corporate greed becomes entangled with the morality of health remedies, the potential for commodification of humans and human clones is endless. Although Atwood’s fictitious world seems distant, the reality is that it is much closer to the present day than one might first think. From humans to clones to our independence and our value, Atwood shows that everything has a price, and the costs to society are high.

Sources:

Atwood, Margaret. Oryx and Crake. New York: Anchor, 2004. Print.
Arter, Melanie. “Bush: ‘Life Is A Creation, Not A Commodity’.” CNS News. CFC, 07 July 2008. Web. 28 Mar. 2017. http://www.cnsnews.com/news/article/bush-life-creation-not-commodity.
The President’s Council on Bioethics. “Human Cloning and Human Dignity: An Ethical Inquiry.” Georgetown University, July 2002. Web. 28 Mar. 2017. https://bioethicsarchive.georgetown.edu/pcbe/reports/cloningreport/children.html.
Cambria, Nancy. “Our 21st-century Prophet, Margaret Atwood.” St. Louis Post-Dispatch. STLtoday, 26 Sept. 2015. Web. 28 Mar. 2017. http://www.stltoday.com/entertainment/books-and-literature/reviews/our-st-century-prophet-margaret-atwood/article_242b5f9b-3ac6-51e3-9024-e858d178f6e2.html.

Images source: http://www.geekwire.com/2013/4-tech-titans-building-campus/

A New Age of Artifice

January 23, 2017 § 5 Comments

In the fall of 2011, Duke University’s undergraduate literary journal published a rather unassuming poem entitled “For the Bristlecone Snag” (“The Archive”). To the journal’s poetry editors, the poem appeared to be a typical undergraduate work, composed of several unfulfilled metaphors and awkward turns of phrase. What the editors did not know at the time of publication, however, was that this poem was not written by a human. It was written by a computer program (Merchant).
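The post doesn’t explain how a program might write passable verse, but one common approach is to fill grammatical templates from hand-built word lists. Below is a toy sketch of that idea in Python; the vocabulary, templates, and function names are my own invention, not the actual generator behind “For the Bristlecone Snag”:

```python
import random

# Toy grammar-based poem generator: an illustration of the idea that
# surface-plausible verse can be assembled from templates, NOT the
# program that produced "For the Bristlecone Snag".
NOUNS = ["snag", "ember", "harbor", "winter", "lung"]
ADJECTIVES = ["bristlecone", "hollow", "patient", "salt-worn"]
VERBS = ["gathers", "forgets", "outlives", "swallows"]

TEMPLATES = [
    "the {adj} {noun} {verb} the {noun}",
    "{adj}, {adj}: my {noun} {verb}",
    "what {noun} {verb} in the {adj} {noun}?",
]

def generate_line():
    """Fill one template, one placeholder at a time, so repeated
    placeholders can receive different words."""
    line = random.choice(TEMPLATES)
    for tag, pool in (("{adj}", ADJECTIVES),
                      ("{noun}", NOUNS),
                      ("{verb}", VERBS)):
        while tag in line:
            line = line.replace(tag, random.choice(pool), 1)
    return line

def generate_poem(n_lines=4):
    return "\n".join(generate_line() for _ in range(n_lines))

print(generate_poem())
```

Even this trivial sketch yields lines with “unfulfilled metaphors and awkward turns of phrase”; the reader supplies the coherence.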

When I first learned about “For the Bristlecone Snag,” I was reminded of the writings of Alan Turing, the renowned English computer scientist of the mid-20th century. In his seminal article on the subject of artificial intelligence (A.I.), Turing declares that the question, “can machines think?”, is “too meaningless to deserve discussion” (Turing 442). After all, he claims, we have no direct evidence that other humans can think; we merely assume that they do based on their behavior. Turing argues that this “polite convention that everyone thinks” should apply to all beings that can demonstrate human behavior (Turing 446). It is from this line of thought that Turing conceptualized the Turing Test, an experiment in which a computer tries to convince a human of its humanity. According to Turing, if an A.I. can convince a human judge that it is human, then we must assume that the A.I. can think.
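Turing’s imitation game is concrete enough to state as a protocol: a judge interrogates two hidden respondents and then guesses which one is the machine. Here is a minimal sketch in Python; the session structure and function names are my own illustration, not anything specified in Turing’s paper:

```python
import random

def imitation_game(ask, guess, human, machine, rounds=5):
    """One session of the imitation game.

    ask(label, transcript) -> the judge's next question for respondent `label`
    guess(transcripts)     -> the label ("A" or "B") the judge thinks is the machine
    human(q) / machine(q)  -> answer strings
    Returns True if the machine fooled the judge.
    """
    labels = ["A", "B"]
    random.shuffle(labels)                        # hide who sits where
    respondents = dict(zip(labels, (human, machine)))
    machine_label = labels[1]                     # `machine` was zipped second

    transcripts = {"A": [], "B": []}
    for _ in range(rounds):
        for label in ("A", "B"):
            q = ask(label, transcripts[label])
            transcripts[label].append((q, respondents[label](q)))

    # Misidentification is what counts as success for the machine.
    return guess(transcripts) != machine_label
```

Everything interesting hides inside `machine`; the sketch only formalizes Turing’s point that the judge ever sees behavior, never the mechanism producing it.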

While the program that produced “For the Bristlecone Snag” did not complete an extensive and proper Turing Test, it did convince human judges that it was human. At the very least, the poem’s acceptance into an undergraduate literary journal reveals that literate machines can, and will, exist in the near future. The way is paved for more professional and accomplished artificial authors.

Indeed, even in the half decade since “For the Bristlecone Snag” was published, the technology behind artificial intelligence has improved rapidly. Watson, IBM’s “cognitive computing platform,” is a prime example of this progress (Captain). In 2011, Watson defeated two reigning champions on Jeopardy!, successfully interpreting and answering the game show’s questions. While this feat alone was a remarkable step in cognitive computing, Watson’s analytical abilities have since contributed to over thirty separate industries, including marketing, finance, and medicine (Captain). For example, the machine can read and understand millions of medical research papers in a matter of minutes (Captain). As intelligent as Watson is, however, he was never designed to pretend to be human. The chief innovation officer at IBM, Bernie Meyerson, believes “it’s not about the damn Turing Test”; his team is more interested in accomplishing distinctly inhuman tasks, such as big data analysis (Captain).

While IBM may not be interested in the Turing Test, other artificial intelligence companies have been working specifically toward that goal. In 2014, a program by the name of Eugene Goostman passed a version of the Turing Test using machine learning strategies similar to those that drive Watson (“TURING TEST SUCCESS”). The chatbot, a program that specializes in human conversation, was able to convince several human judges that it was a thirteen-year-old boy (“TURING TEST SUCCESS”). Given the success of Eugene Goostman, and the intelligent accomplishments of Watson, the Turing Test can be, and arguably has been, passed. Artificial intelligence is a reality. Machines can think.

As an aspiring writer and computer scientist, I can’t help but fixate on the implications that A.I. has for literature. It is entirely possible, even likely, that “For the Bristlecone Snag” foreshadows an era in which the most successful and prolific authors will be machines, an era in which the Pulitzer Prize and Nobel Prize in Literature are no longer given to humans, an era in which humanity no longer writes its own stories.

Yet this era of artifice should not be greeted with worry or anxiety. Art has always been artificial, a constructed medium for human expression. In the coming decades, we will author the next authors, create the new creators; we will mold the hand that holds the brush. Artificial intelligence should not be feared as an end to art, but welcomed as a new medium, a new age of artifice.

– Zach Gospe

References

Captain, Sean. “Can IBM’s Watson Do It All?” Fast Company. N.p., 05 Jan. 2017. Web. 20 Jan. 2017.

Merchant, Brian. “The Poem That Passed the Turing Test.” Motherboard. N.p., 5 Feb. 2015. Web. 20 Jan. 2017.

“The Archive, Fall 2011.” Issuu. N.p., n.d. Web. 20 Jan. 2017. <https://issuu.com/dukeupb/docs/thearchive_fall2011>.

Turing, A. M. “Computing Machinery and Intelligence.” Mind, vol. 59, no. 236, 1950, pp. 433–460. www.jstor.org/stable/2251299.

“TURING TEST SUCCESS MARKS MILESTONE IN COMPUTING HISTORY.” University of Reading. N.p., 8 June 2014. Web. 21 Jan. 2017.

Craving Attention in the 21st Century: The Significance of Social Media and Science Fiction.

November 5, 2015 § Leave a comment

My newest insta pic only has 66 likes. Do you SEE that creative angle? I was lying on the grass (I was reading in the sun so I was not totally pathetic and desperate) for that angle. Do you SEE that slight color adjustment? Not too much, but just enough for you to truly appreciate that blue sky and green lawn. I had creative hashtags that would draw people in while also using some broader hashtags to appeal to a larger audience (#sky #scenic #nature). I even made sure to delete those hashtags after a few days (I definitely don’t want people to think I am desperate for attention). I might as well have put #desperate if I was being truly honest with myself.

** Username, location, and caption have been edited out for my own privacy. **

My dad calls me pretty regularly, approximately 75% of the time for help with his phone, iPad, or really anything that plugs into an outlet. My father is many things: an electrician, a carpenter, a chef, a jack-of-all-trades. Unfortunately, he is anything but technologically literate. Born long enough ago to have been drafted for the Vietnam War (he would probably be mad if I ever posted his true age for the world to see), he frequently recounts what he thought the future would look like and how hard it was growing up in the ancient days (he would probably claim to have walked six miles in the snow to school, like any other stereotypically old person, if he hadn’t grown up in Miami).

His dreams of the future saw technology revolutionizing the world, not supporting the rise of social media platforms such as Instagram and Facebook that feed on the new generation’s narcissism and dependence on social approval for value.

For most of its existence, science fiction was a genre outside mainstream consumption that depended largely on pulp magazines to cheaply circulate its stories. Unfortunately, the terms “science fiction” and “sci-fi” often conjure up judgmental images of prepubescent boys on the outskirts of society. In a way, science fiction was for the socially inept or awkward. It was literature for cheap paper and obscure magazines. A positive change in the 21st century has been a wider acceptance of science fiction, largely due to blockbuster hits. Science fiction is becoming exciting, dramatic, and mainstream.

But becoming more popular and widely circulated opens the genre up to more influence from this new, technologically narcissistic generation. I freely admit that I have been guilty of exactly this. When I started writing for this blog, I wrote from a creative part of my heart that simply wanted to entertain anyone willing to give my stories attention. I really didn’t use too many hashtags, just enough to convey what my story was about. I wanted creative titles that would confuse or intrigue the reader.

And then I realized I could see how much traffic my posts were generating.

Every week I checked to see if my stories, articles, and posts had new views and comments. My hashtags started to get more specific and numerous. My titles started to reflect what someone would enter into a search engine. I noticed that the most “successful” post had a title along the lines of “the significance of . . .” and realized that people had a reason to type that into Google. People cared about that answer.

When I think of science fiction, I think of it as a sort of social commentary that incorporates real issues with a focus on technology, space exploration, and the like. That is what I had wanted to do, but not necessarily what I ended up doing. I had let my own egoism, and my dependence on attention for self-esteem and value, pull me away from my creative goals and aspirations. I was a sellout.

This is not to say that all of my entries have fallen into this social media pitfall. I poured my heart and soul into a blog post about the dangers of genetic engineering. I spent hours writing a story about evolution. But I am human, and I make mistakes. I look back on my blogging career and see those lapses in judgment where I sought attention for my articles.

Much of science fiction is about technology and yet I let the technology control me. I was no longer the one reinventing technology or predicting the social implications of existing technology. I was letting technology reinvent my creative process.

So maybe some of you will have some sort of homework assignment that asks you to discuss the significance of social media. Or maybe the significance of science fiction as an “emerging” genre. You hop on the computer, search “the significance of . . .” in a search engine, and my article pops up. I hope you all click on the article and read it. Not for views, comments, and likes, but maybe to learn from my own experience and realize that technology is a tool that you use, not a tool that uses you.

– S. Jamison

What If Our Spaceships Can Make It, But Our People Can’t?

October 19, 2015 § 3 Comments

Matt Damon. Need I say more? Some say that love is moving past physical attraction and toward a gradual love of the person’s personality and quirks. I sat in my movie seat smiling when he smiled, laughing when he laughed, and really worrying for him when his potatoes froze. The Martian might as well be a love story up there with The Notebook and Titanic. I wouldn’t mind living on Mars if it were with him.

Ahem. Anyways. Once I overcame my beating heart, my brain finally got enough blood to do some actual thinking and processing. I heard prior to seeing the film that producers collaborated with NASA to make the film more realistic, and with that in mind, I spent the movie scrupulously analyzing and critiquing every little detail. Was Mars’ atmosphere really thin enough for Mr. Damon to cover the nose of his ship with a tarp and blast off the planet? Could the soil on Mars, when enhanced with a few human contributions, really support plant growth? Could Mars have such violent storms if it has a thin atmosphere?

And then it hit me. Say all of the scientific plot points were plausible and accurate with sufficient scientific developments. Say everything I doubted, questioned, and critiqued was suddenly true without a scientific doubt. Would Matt Damon’s character have the psychological health and mental endurance to thrive through such an ordeal?

Researchers at Georgetown University, among other institutions, have investigated that concern and found that a combination of alienation from relationships on Earth, cultural differences, language barriers, differences in personal values, restriction to the small confines of a spacecraft, and other psychologically influential variables can lead to the gradual psychological deterioration of those on board. And in a series of studies conducted by both government and independent space exploration organizations, researchers often found negative consequences of long-term space travel, including suicidal thoughts and tendencies, decreased group cohesion, sleep disorders, irritability, and changes in appetite.

So what does all of this mean for Mars and the future of long-term space exploration? It means that human development may not keep up with scientific development. I say “may” because, for all I know, there could be incredible advances in psychology and medicine that overcome the negative consequences of extended space exploration. But from where we stand right now, Matt Damon probably wouldn’t be so positive and clear-minded stranded on Mars.

And for our relationship’s sake, I really hope science can figure a way around human psychology. I can’t spend my life with a negative and depressed person, so I guess time will tell if we make it or not. I mean that both for marrying Matt Damon and for society making it to Mars without killing each other.

– S. Jamison

TimeEscape

October 9, 2015 § 1 Comment

A line of characters flooded the screen, alternating as my friend shifted her position on my keyboard. My book report was pulled up in Microsoft Word, the cursor blinking frantically as it tried to keep up with my friend’s sabotage. I just laughed as I pushed her off, then entered my newest discovery into the keyboard, taught to us only a week before in 5th grade computer class:

Ctrl+Z.

I was fascinated by the concept of that combination of keys. I could make the most impulsive of edits, write the most ridiculous statement, and delete entire chunks of my paper without worrying about any long-term consequences. If I didn’t like the result of my action, I could just push those two keys, Ctrl+Z, and everything would be as it should. A fresh start. Slate cleared. Back to whatever square I chose to resume work from. I didn’t have to worry about reconstructing any past reality or losing anything to time and effort, because with those two keys in my hand, I could take myself back to any foundation, given I had built that foundation before.

Ctrl+Z.

What a tool.
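And a mundane one, under the hood: the editor keeps a stack of snapshots, and Ctrl+Z simply pops the most recent one. A minimal sketch, with the class and method names my own rather than any real word processor’s:

```python
class ToyEditor:
    """A toy document where every edit snapshots state first,
    so undo can restore any earlier foundation."""

    def __init__(self):
        self.text = ""
        self._undo_stack = []                  # past states, newest on top

    def insert(self, s):
        self._undo_stack.append(self.text)     # snapshot before the edit
        self.text += s

    def delete_end(self, n):
        self._undo_stack.append(self.text)     # even deletions are reversible
        self.text = self.text[:-n] if n else self.text

    def undo(self):                            # Ctrl+Z
        if self._undo_stack:
            self.text = self._undo_stack.pop()

ed = ToyEditor()
ed.insert("the most ridiculous statement")
ed.undo()                                      # slate cleared
assert ed.text == ""
```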

Hooked as I was on the thrill of “Edit -> Undo,” I was a little taken aback when I realized that this handy shortcut didn’t apply to social interactions. It was irrational, I know, but after a week of riding the high of Ctrl+Z, I had somehow assumed that the same rules that applied to my word processor could apply to real life. And when they didn’t, I was not so much alarmed as unsettled.

I always knew Ctrl+Z was a function of the digital realm. But nonetheless, when my confession of a crush to the boy I liked was met with a blank stare, I found my thumb and forefinger twitching, pushing at the keys that weren’t there:

Ctrl+Z.

I couldn’t edit this unfortunate moment out of my past, couldn’t insert myself back into an earlier version of my life’s document, the one where he didn’t avoid eye contact every time we passed in the hallway. Just like everything else in the real world, I was bound by time: that immutable, stubborn dimension that refuses to yield to all of human ingenuity, that force that turns into bold, permanent marker the marks we’d rather keep in pencil. There is always the possibility that you can cover up the Sharpie, mask it with the paint of reconciliation or hide it underneath the tarp of loaded silence.

But no matter what you throw over it, the Sharpie always remains, bleeding through the medium to remind you that yes, this happened. You messed up. You will have always messed up this moment. There’s nothing you can do about it.

Science fiction’s answer to this kick in the brain, this blow of helplessness?

Time travel.

Novels like Timescape take our worst fears, that we might irreparably damage our world, whether that world be the world of individual humans or the literal world of humanity, and put a band-aid over them, then tuck us in to say goodnight and tell us that everything is going to be okay, because somebody will fix it. Somebody will hit the undo button. The irreparable will become repairable, and we can throw away our tarnished slates and start again.

Time travel grants us control over the fourth dimension and releases us from the chains of time, thereby releasing us from our mistakes. We are fascinated by it because we so deeply want it to be true: to imagine that we can go back and make things right before they ever went wrong.

But at the same time, it’s these wrongs that make us who we are. All of those “character building moments” would be lost if we indulged in easily accessible time travel; we would never learn anything, because there would be no significant consequences for our actions. Perhaps more importantly, all of the good, unintended consequences of mistakes would be lost. The world would stagnate, because the rich innovation that arises out of failure would be lost.

We can’t predict the long term consequences of our actions. Our mistakes can be our biggest triumphs.

However, as Timescape notes, sometimes our triumphs, such as chemical developments and more efficient methods of manufacturing, can be our biggest mistakes, leading to our downfall: the dismal world Benford describes. And it is this possibility, that we could, as a species, ruin the world, that is most terrifying to us, because it means we would tarnish every blank slate born into our mistakes.

Furthermore, it is this possibility that is terrifyingly real.

Gregory Benford might not have the means to time travel in real life, but his fingers are desperately twitching at Ctrl+Z anyway, and out of this twitching he has typed a great novel of warning. This book is Benford’s best version of a tachyon, a message to the present urging change and a greater consideration of the future,

because the future will soon become the present, and when it does, we can’t just hit
Ctrl+Z.


Celeste Graves
