A New Age of Artifice

January 23, 2017 § 5 Comments

In the fall of 2011, Duke University’s undergraduate literary journal published a rather unassuming poem entitled “For the Bristlecone Snag” (“The Archive”). To the journal’s poetry editors, the poem appeared to be a typical undergraduate work, composed of several unfulfilled metaphors and awkward turns of phrase. What the editors did not know at the time of publication, however, was that the poem was not written by a human at all. It was written by a computer program (Merchant).
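(A brief aside for the technically curious: one common, very simple way a program can produce poem-like lines is a word-level Markov chain that recombines an existing corpus. The sketch below is purely illustrative and entirely my own assumption; neither Merchant nor “The Archive” describes the actual program behind “For the Bristlecone Snag.”)

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the source text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate_line(chain, length=8):
    """Start from a random word and walk the chain to produce one line."""
    word = random.choice(list(chain))
    line = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: the word never appears mid-sentence
            break
        word = random.choice(followers)
        line.append(word)
    return " ".join(line)

if __name__ == "__main__":
    # Toy corpus for illustration only; a real generator would train on far more text.
    corpus = ("the bristlecone stands alone on the ridge "
              "and the wind remembers the ridge and the snag")
    chain = build_chain(corpus)
    for _ in range(4):
        print(generate_line(chain))
```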

When I first learned about “For the Bristlecone Snag”, I was reminded of the writings of Alan Turing, a renowned English computer scientist of the mid-20th century. In his seminal article on the subject of artificial intelligence (A.I.), Turing contends that the question, “can machines think?”, is “too meaningless to deserve discussion” (Turing 442). After all, he claims, we have no direct evidence that other humans can think; we merely assume that they do based on their behavior. Turing argues that this “polite convention that everyone thinks” should extend to any being that can demonstrate human behavior (Turing 446). It is from this line of thought that Turing conceived the Turing Test, an experiment in which a computer tries to convince a human of its humanity. According to Turing, if an A.I. can convince a human judge that it is human, then we must assume that the A.I. can think.

While the program that produced “For the Bristlecone Snag” never sat for an extensive, formal Turing Test, it did convince human judges that its writing was human. At the very least, the poem’s acceptance into an undergraduate literary journal suggests that literate machines can, and will, exist in the near future. The way is paved for more professional and accomplished artificial authors.

Indeed, even in the half decade since “For the Bristlecone Snag” was published, the technology behind artificial intelligence has improved rapidly. Watson, IBM’s “cognitive computing platform”, is a prime example of this progress (Captain). In 2011, Watson defeated two former champions on Jeopardy!, successfully interpreting and answering the game show’s questions. While this feat alone was a remarkable step in cognitive computing, Watson’s analytical abilities have since been applied in over thirty separate industries, including marketing, finance, and medicine (Captain). For example, the machine can read and understand millions of medical research papers in a matter of minutes (Captain). As intelligent as Watson is, however, he was never designed to pretend to be human. IBM’s chief innovation officer, Bernie Meyerson, believes “it’s not about the damn Turing Test”; his team is more interested in accomplishing distinctly inhuman tasks, such as big data analysis (Captain).

While IBM may not be interested in the Turing Test, other artificial intelligence companies have been working specifically toward that goal. In 2014, a program by the name of Eugene Goostman passed the Turing Test using machine learning strategies similar to those that drive Watson (“TURING TEST SUCCESS”). The chatbot, a program that specializes in human conversation, was able to convince several human judges that it was a thirteen-year-old boy (“TURING TEST SUCCESS”). Given the success of Eugene Goostman and the accomplishments of Watson, it is indisputable that the Turing Test can be, and has been, passed. Artificial intelligence is a reality. Machines can think.

As an aspiring writer and computer scientist, I can’t help but fixate on the implications that A.I. has for literature. It is entirely possible, even likely, that “For the Bristlecone Snag” foreshadows an era in which the most successful and prolific authors will be machines, an era in which the Pulitzer Prize and Nobel Prize in Literature are no longer given to humans, an era in which humanity no longer writes its own stories.

Yet this era of artifice should not be greeted with worry or anxiety. Art has always been artificial, a constructed medium for human expression. In the coming decades, we will author the next authors, create the new creators, and mold the hand that holds the brush. Artificial intelligence should not be feared as the end of art, but embraced as a new medium, a new age of artifice.

– Zach Gospe

References

Captain, Sean. “Can IBM’s Watson Do It All?” Fast Company. N.p., 05 Jan. 2017. Web. 20 Jan. 2017.

Merchant, Brian. “The Poem That Passed the Turing Test.” Motherboard. N.p., 5 Feb. 2015. Web. 20 Jan. 2017.

“The Archive, Fall 2011.” Issuu. N.p., n.d. Web. 20 Jan. 2017. <https://issuu.com/dukeupb/docs/thearchive_fall2011>.

Turing, A. M. “Computing Machinery and Intelligence.” Mind, vol. 59, no. 236, 1950, pp. 433–460. www.jstor.org/stable/2251299.

“TURING TEST SUCCESS MARKS MILESTONE IN COMPUTING HISTORY.” University of Reading. N.p., 8 June 2014. Web. 21 Jan. 2017.

On Physics Tests and Roses

September 24, 2015 § Leave a comment

I was miffed.

“The names of the scientists are going to be on the test?”

My honors physics teacher, whom I regarded as a generally reasonable man, had lost touch with reality and was resorting to the lowest of low testing methods: rote memorization without purpose.

Memorizing formulas was one thing–those were tools, mental shortcuts we employed to cut through a problem in a timely manner. P=mv, F=ma, W=mg–these facts were full, bloated with the theorems my teacher wrote out on the board when introducing the concepts to us.

But Rutherford, Hooke, Bernoulli–these were empty signifiers, a collection of letters that did no more to better my understanding of how things move about in space than watching Jeopardy did. In my mind, these names were trivia, and nothing more.

Rutherford, Hooke, and Bernoulli may have lived rich lives–to their contemporaries, peers, family, and friends, their names must have been loaded with connotation, each utterance of “Rutherford”–or perhaps “Ernest”–conjuring up memories and feelings. But to the high school science student, “Rutherford” was associated with one thing: the discovery of the nucleus. We did not have the privilege of knowing Rutherford as a person, only the privilege of knowing his discovery.

And if that was all “Rutherford” boiled down to in our heads, what was the use of knowing that name? His life could have been interesting, but to feign the resurrection of his existence, to pretend that we were paying homage to him by remembering those empty letters when all we understood of them was the discovery, not the man, attributed to them, seemed superfluous, almost irreverent. I had a deep appreciation for Rutherford’s gold foil experiment, not for him. I couldn’t have. I didn’t know the guy. I had a lot to thank the guy for–we all did, as students standing on this giant’s shoulders–but the rote memorization of his name didn’t do anything for him.

“These men devoted their lives to these discoveries,” my formerly reasonable physics teacher droned on. “We owe it to them to remember their names.”

But did we? The man was long gone–what did it matter to him if, 75-odd years after his death, students drew a line from his name to the gold-foil experiment on the page of a test?

Hypothetical #1: Suppose Rutherford’s last wish was to have his name go down in history for making a meaningful contribution to science–which element of that wish, the remembrance of his name or of his contribution to science, really matters?

History is important in that it allows us to learn from past mistakes, to build on the knowledge gathered by our predecessors.  We learn nothing from the man’s name.

Hypothetical #2: Suppose Rutherford made his meaningful contribution to science in order for his name to be remembered, and that was his true last wish.

As unlikely as this scenario is, it raises the question: why do last wishes matter?

Last wishes matter only insofar as some actor in the present derives utility from them. That actor may be the person making the last wish, comforted by the notion that they have the power to make an impact after they expire. It may be a relative or close confidant of the deceased, comforted by the notion that by honoring the last wish of their loved one, they are salvaging a part of that person, keeping a bit of their corporeal desires alive.

Or it might be a high school science teacher, seven decades after the last wish was made by a man he never met, comforted by the notion that it is important to remember the name of the deceased. He draws reassurance from this via a loosely drawn syllogism, buried within the depths of his subconscious: if people consider remembering names important, then people might remember his name–and through every utterance of his name that occurs after his death, he might live a little longer.

This syllogism is buried in the back of most brains.

When I was thirteen, someone asked me the name of my great grandfather. With a shock, I realized that I didn’t know it–and I was terrified. Had he lived 80 years to only be forgotten by his great granddaughter, his existence fading into nothingness? Would I be doomed to a similar fate, forgotten by descendants, my life fading into

meaninglessness?

For that is the most grim notion of all–the notion that the sum of our actions, struggles, relationships, passions, and toil could amount to nothing. Welcome to life, the zero-sum game. Prepare to be overtaken by suffocating depression.

So we erect monuments to individuals, we pepper college campuses with statues, we catalogue gratuitous details of our fellow humans’ lives that far exceed the amount needed to learn anything significant from their experiences. We honor last wishes and memorize Rutherford’s name. We fight the idea, voiced by Andres in Mayflower II, that we exist merely to produce replacements: we strive to prove that we, as individuals, matter–to exert some semblance of control over that great leveling force, the eraser of our identities, the zero-sum despot known as death.

Because when death strikes, our voices are silenced. We can no longer fight to preserve our individuality. Thus, we preserve the individuality of others, rewarding innovation, breakthroughs, and fame–and, in turn, striving to accomplish something notable enough that our successors will do the same for us.

Tooth and nail, blood and sweat, we claw our way to meaning.

But this logic is flawed. It’s driven by a desire to be more than just a blink in the timespan of the world–to last beyond the 80-odd years granted to us by nature. But even if we accomplish something notable enough to warrant a statue, that statue is just a pebble thrown into a canyon. After thousands of years, stone erodes. Names are forgotten. And thousands of years are mere blinks in the timeline of the universe.

Ultimately, we all fade.

But not to meaninglessness.

Our lifespans may be naught but flashes from the perspective of the universe, but from our own perspective, they are everything. They are our own universes. They matter, if only to us and our contemporaries. And they will leave legacies, if only ones that are accepted as assumed, attributeless characteristics of the future world. Our very state of existence implies this.

Those who do not know Rutherford’s name still benefit from his breakthrough. Perhaps more significantly, they benefit from all of the minuscule steps taken by nameless shadows of the past that enabled Rutherford to make his breakthrough. The tiniest fraction is still part of the sum–and, as Bradbury notes in “A Sound of Thunder”, the smallest butterfly is a factor in the future of the world.

And it’s okay that we are ultimately all fameless fractions, our limits approaching zero. Long after we die, we’ll–you know–still be dead. And it won’t matter if there’s a hunk of rock out there with your name on it, or a group of students reciting your name to their teacher. With the eclipse of our personal universes comes the snuffing out of our pride and vanity. We’re gone. Dust in the wind.

And that thought is empowering.

If we narrow our perspective to the timeline of our own lives, instead of trying to grasp the strands of infinity, every action becomes more meaningful. With the recalibration of our viewpoints comes the revitalization of the “small things.” Stop and smell the roses. Appreciate the feeling of the sun on your face. Write a thank-you card. Hug someone.

Accept that all of these actions will inevitably be forgotten. But in the present moment, they are
everything.


Celeste Graves

Travel Through Space and Time is Lonely for Everyone (Except Andre 3000)

September 18, 2015 § 1 Comment

It isn’t easy being a protagonist in science-fiction literature about time or space travel; loneliness often seems to be a precondition of the role.

Sometimes, this loneliness is unavoidable: a solitary ten-year-long spaceship ride will probably make you miss other people, and it’s always difficult to develop a robust social rapport with people thousands of years in the future, with their unrecognizable languages and inexplicable habits. Frequently, however, these lonely types are surrounded by people like them as they hurtle through space and/or time – they feel the way they do because of some distinct characteristic or internal bent, something that sets them apart from the others.

Now, these lonely characters aren’t a distinctive trait of science fiction; fiction writ large is full of lonesome brooding adventurers – Ishmael immediately comes to mind, from what many consider to be the Great American Novel. So don’t take this acknowledgement as a critique or judgment of value. Besides, prose fiction naturally invites a certain degree of internality and solitariness – the very act of reading is about silently constructing an internal world to which someone who is sitting a few feet from you would have no access.

However, science fiction doesn’t just exist as prose literature, so we can look at other artistic forms of science fiction to determine whether the genre is actually more lonesome. To my mind, the most obvious alternative form to consider is popular music: not only does a fruitful history of science fiction music exist, the artistic form of music is fundamentally oriented toward communal interaction in a way that literature isn’t. Music almost begs to be heard alongside others, which is one of the reasons why many concerts can outdraw even the most popular book readings.

But, upon a quick glance at some of the “classics” of the science fiction musical genre, the sense of loneliness found in the literature is still present. David Bowie’s “Space Oddity” tells the unnerving story of an actual severing of connection between an astronaut and ground control, and, in doing so, actively creates a scene of inescapable solitude, soundtracked by distant instrumentation that reiterates that lonely void. The narrator of Elton John’s “Rocket Man” explicitly talks about missing his wife and kids while on a “long, long” trip to Mars. Black Sabbath’s “Iron Man” tells of a time traveler who, because of his inability to communicate with other people, brings about mass-scale destruction.

So, perhaps the genre of science fiction simply invites this solitude – this wouldn’t necessarily be that surprising. Space and time are both inconceivably massive expanses that can’t help but make a person feel somewhat insignificant and alone. Additionally, the core demographic for science fiction is often thought to be primarily composed of the socially uncomfortable – regardless of the accuracy of this notion, it has at least been a prevalent stereotype.

However, a notable exception to the loneliness of science fiction music can be found in Outkast’s “Prototype” and its corresponding (and hilarious) music video. The song and its video depict an alien version of Andre 3000, who looks just like the regular ‘Dre 3K except with a terrible blond wig, landing on what seems to be Earth with his crew of shipmates; within minutes, Andre and an earthling woman have seemingly fallen into passionate love. While the video is ridiculous and only really enjoyable for its absurd kitschiness, “Prototype” serves as an interesting departure from the loneliness of space travel. Everything in the video is utopian; Andre, his crew, and the Earth woman exist peacefully and lovingly as a community only moments after arrival, a significant departure from the paranoid loneliness seen elsewhere. This apparent love is the opposite of that traditional loneliness.

While extrapolating from a single case is dangerous, I think it is worth pointing out that Outkast is a hip-hop duo, while the previously cited musicians were classic rockers. Arguably, more diverse participation in science fiction could allow for such shifts in tone and subject from the classical models to new iterations. This theory, that diverse participation is propelling a shift away from lonely, exceptional protagonists, is further supported by artists such as Janelle Monae, an R&B singer who has produced some fascinating love songs within a science fiction framework.

Therefore, by enabling more diverse participation within the genre, the music of science fiction is perhaps finally being used to imagine communal connection across space and time. Whether the literature of science fiction has enacted (or should enact) such a broad shift away from loneliness, however, I’ll leave up for debate.

— Lucas Hilliard

Is Station Eleven science fiction? Does it matter?

September 4, 2015 § Leave a comment

Last year, Emily St. John Mandel’s Station Eleven was published, a novel which follows a small cast of interconnected figures through various spatial and temporal settings, most notably the upper Midwest approximately twenty years after a pandemic has wiped out the bulk of the world’s human population. Largely due to the incredible vividness of the future Mandel presents, Station Eleven won the Arthur C. Clarke Award for best science fiction novel, and genre fiction titan George R.R. Martin has praised the novel on his blog, even suggesting it should be named best novel at this year’s Hugo Awards. Given that scores of science fiction writers, both excitable young adults and grizzled veterans, likely wish they could somehow travel to an alternate universe wherein their work received such acclaim, one might imagine that these are the proudest moments of Mandel’s burgeoning science fiction career. But there’s a catch: some people (Mandel among them) aren’t quite sure if she has a science fiction career at all.

You see, even Mandel doesn’t quite think of the novel as science fiction; rather, she imagines Station Eleven as literary fiction, and the book’s status as a finalist for the National Book Award, one of the most prestigious literary awards in the United States, indicates that much of the literary community agrees with her assessment. For many, Station Eleven deserves the label of “Serious Literature.”

I can almost hear some of your groans from across the computer screen, science fiction fans. “Here it is,” you’re thinking, “another author writing science fiction but claiming it’s something else so the literati will indulge her. Look at the apocalyptic wasteland; notice all the allusions to Star Trek; for God’s sake, the book shares its title with the pulpy comic books about heroic adventures on a planet-sized spaceship that one of Station Eleven’s own characters writes.” And, to a certain degree, you’re probably right; genres matter economically, and, had the novel been marketed as pure sci-fi, the book probably wouldn’t have enticed the Oprah’s-Book-Club-type readers who have allowed it to soar up the bestseller list, and it certainly would have been ignored by the major literary prizes.

Mandel probably knows the functional value of avoiding a genre fiction label as well as anyone; oddly enough, she has stated that part of the impetus for writing the novel was to escape the “crime fiction author” label she had begun to acquire from her first three novels. No need to take Mandel’s denial of the sci-fi genre title too harshly, then; you can commiserate with your crime fiction peers. Go ahead and call Mandel a traitor – I’m sure there will be others to echo your complaints.

However, if you can temper your anger for a moment, I’ll try to explain to you why Mandel’s supposed denial of the genre might not be as simple as it seems right now. As much as a short list of the book’s events and allusions might lead one to believe Station Eleven sits directly within the lineage of science fiction, when one actually reads the book, that lineage does not seem quite so apparent. Sure, a third of the book takes place in a dystopian future, but two-thirds of it does not; the novel cares about how its characters try to live and construct meaning in the horrific future, but it also cares about how they do so in mid-1990s Toronto and early 21st-century Los Angeles, each of which occupies a length of the novel comparable to the dystopian future. While some chapters detail dangerous cultists stalking our band of heroes, other chapters focus on the disintegration of a famous couple’s marriage or the death of a middle-aged actor on stage while performing as King Lear. Most of the book reads as a purely realist meditation on the trials and anxieties of celebrity culture, a classic example of literary fiction.

Even in the dystopian future, Shakespeare looms large. His plays keep appearing throughout every portion of the text, including the future; not even dear Captain Picard means as much to the novel as the Bard. For all of the novel’s interest in science fiction, its primary literary figure is not Asimov or Clarke – rather, it is the man who serves as the cornerstone for countless authors who know nothing of metaphysical sciences but can recite endless lines from metaphysical poets.

So, if one is to be fair, Station Eleven can be read as a work of literary fiction just as – if not more – convincingly than as a work of pure science fiction. But is fighting over what the novel “is” truly productive? Both communities have recognized it with major awards; the book now belongs to both genres, each of which could stand to learn the value of sharing from the genre of children’s picture books. And perhaps, if we all can learn the importance of sharing, Station Eleven can serve as the rare book that proves real-world literary fiction and science fiction can coexist within one text, that most books aren’t as simple as their shelving designation.

In the harsh future of Station Eleven, a band of musicians and actors comes together to form a traveling troupe, dedicated to performing the works of Shakespeare in order to maintain some semblance of pre-collapse culture. This group, known as the Traveling Symphony, has a slogan, which they have written on their caravan: “Because survival is insufficient.” The source of this quotation, so vital to this group of Shakespeareans that they immediately advertise it to anyone who might see them? Star Trek. In Mandel’s future, literary culture has, by necessity, been stripped of its silly biases and pointless designations. Star Trek and Shakespeare can exist within the same space, in tandem. Maybe our future can also feature such a productive coexistence of literary and science fiction, and maybe Station Eleven can serve as a guide. I hope so, as long as we don’t have to go through our own devastating pandemic in order to get there.

–Lucas Hilliard

The Art of Sci-Fi

September 2, 2011 § Leave a comment

“HOW HAVE YOU NOT SEEN STAR WARS?!??”

I asked Her as we pulled into the Sonic drive-through, our bellies roaring with hunger after the four-hour hike up and down a spectacular bluff in Tazewell, Virginia. Our photographic venture had left us both amazed and starving.

“Haha, I don’t know,” she retorted, “you know I’m not really crazy about Sci-Fi stuff and all.”

Over the next ten minutes, while utterly destroying the cheeseburgers, onion rings, and frothy milkshakes that lay before us, I explained the depth of those three movies as best I could: how starships and lightsabers only set the stage to tell an epic and moving story about courage, loyalty, friendship, love, sacrifice, and especially redemption (if you, like I do, consider the three films a single story). I’ve witnessed many a Facebook status to the effect of “everybody deserves a second chance,” but have seen few events that better exemplify redemption than Vader’s sacrifice at the end of the third film.

“Seriously?” she added at the conclusion of my spiel. “I always thought it was just an action movie, with lasers and robots and stuff.”

“Well, I mean, it has that too,” I responded with a rather large grin on my face.

“Haha, boys will be boys.”

“I don’t think science fiction has to do with gender. I think more than anything it’s because we really want to believe that anything is possible,” I said to Her, “that our experiences are only limited by our imaginations. I mean, take Blade Runner for example…” She laughed and gave me a look as if I were talking about some futuristic weapon of mass destruction.

“It’s a movie, haha,” I quickly added. I recounted for Her the scene in which Deckard repeatedly asks his computer to “enhance” an image to give him a lead in his investigation. “I wish getting a great shot was as easy for me as it is for Rick Deckard. We just hiked up and down a mountain, and I might have gotten nothing at all.”

“Oh my gosh, hush.  I loved the shot you took of the owl in Warner Park.  You’ve got a great skill,” she said.

“Umm, not exactly,” I countered. “Had I not driven by at that exact moment, OR not had my camera, OR not brought my long lens, OR not had such a perfect subject who, for God only knows what reason, stayed on that branch despite my frantic (and rather loud) attempts to open the trunk of my car and put my gear together, that shot would have never occurred. That was more luck than art!”

“Well, you consider Star Wars art even though science says it shouldn’t have happened either, right? I think I’ve picked out a movie for us to watch tonight.”

I smiled to myself as we pulled out of the parking lot.  After all, everything has an art to it.

-A.M.A.

Where Am I?

You are currently browsing entries tagged with Art at Science/Fiction.
