A New Age of Artifice

January 23, 2017 § 5 Comments

In the fall of 2011, Duke University’s undergraduate literary journal published a rather unassuming poem entitled “For the Bristlecone Snag” (“The Archive”). To the journal’s poetry editors, the poem appeared to be a typical undergraduate work, composed of several unfulfilled metaphors and awkward turns of phrase. What the editors did not know at the time of publication, however, was that this poem was not written by a human. Instead, it was written by a computer program (Merchant).

When I first learned about “For the Bristlecone Snag”, I was reminded of the writings of Alan Turing, a renowned English computer scientist of the mid-20th century. In his seminal article on the subject of artificial intelligence (A.I.), Turing contends that the question, “can machines think?”, is “too meaningless to deserve discussion” (Turing 442). After all, he claims, we have no direct evidence that other humans can think, and we merely assume that they do based on their behavior. Turing argues that this “polite convention that everyone thinks” should apply to all beings that can demonstrate human behavior (Turing 446). It is from this line of thought that Turing conceptualized the Turing Test, an experiment in which a computer tries to convince a human of its humanity. According to Turing, if an A.I. can convince a human judge that it is human, then we must assume that the A.I. can think.
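
Turing’s setup is concrete enough to sketch in a few lines of code. The toy Python below is only an illustration of the imitation game as he describes it, not any real chatbot; the judge and the two respond functions are hypothetical stand-ins for a human interrogator, a human player, and a machine player.

    import random

    def imitation_game(judge, human_respond, machine_respond, questions):
        # Hide the two players behind anonymous labels, as Turing's setup requires.
        players = {"A": human_respond, "B": machine_respond}
        if random.random() < 0.5:
            players = {"A": machine_respond, "B": human_respond}

        # The judge questions both hidden players and reads their answers.
        transcript = [(label, q, players[label](q))
                      for q in questions for label in ("A", "B")]

        # The judge then names the label it believes is the human.
        guess = judge(transcript)
        actually_human = "A" if players["A"] is human_respond else "B"
        return guess == actually_human   # False means the machine fooled the judge

By Turing’s criterion, a machine passes when, over many such rounds, judges misidentify it about as often as chance would predict.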

While the program that produced “For the Bristlecone Snag” did not complete an extensive and proper Turing Test, it did convince human judges that it was human. At the very least, the poem’s acceptance into an undergraduate literary journal suggests that literate machines can, and soon will, exist. The way is paved for more professional and accomplished artificial authors.

Indeed, even in the half decade since “For the Bristlecone Snag” was published, the technology behind artificial intelligence has improved rapidly. Watson, IBM’s “cognitive computing platform,” is a great example of this progress (Captain). In 2011, Watson defeated two reigning champions on Jeopardy!, successfully interpreting and answering the game show’s questions. While this feat alone was a remarkable step in cognitive computing, Watson’s analytical abilities have since contributed to over thirty separate industries, including marketing, finance, and medicine (Captain). For example, the machine can read and understand millions of medical research papers in just a matter of minutes (Captain). As intelligent as Watson is, however, he was never designed to pretend to be human. The chief innovation officer at IBM, Bernie Meyerson, believes “it’s not about the damn Turing Test”; his team is more interested in accomplishing distinctly inhuman tasks, such as big data analysis (Captain).

While IBM may not be interested in the Turing Test, other artificial intelligence companies have been working specifically toward that goal. In 2014, a program by the name of Eugene Goostman passed the Turing Test using machine learning strategies similar to those that drive Watson (“TURING TEST SUCCESS”). The chatbot, or program that specializes in human conversation, was able to convince several human judges that it was a thirteen-year-old boy (“TURING TEST SUCCESS”). Given the success of Eugene Goostman, and the intelligent accomplishments of Watson, it is indisputable that the Turing Test can be, and has been, passed. Artificial intelligence is a reality. Machines can think.

As an aspiring writer and computer scientist, I can’t help but fixate on the implications that A.I. has for literature. It is entirely possible, even likely, that “For the Bristlecone Snag” foreshadows an era in which the most successful and prolific authors will be machines, an era in which the Pulitzer Prize and Nobel Prize in Literature are no longer given to humans, an era in which humanity no longer writes its own stories.

Yet this era of artifice should not be greeted with worry or anxiety. Art has always been artificial, a constructed medium for human expression. In the coming decades, we will author the next authors, create the new creators, mold the hand that holds the brush. Artificial intelligence should not be feared as an end to art, but rather embraced as a new medium, a new age of artifice.

– Zach Gospe

References

Captain, Sean. “Can IBM’s Watson Do It All?” Fast Company. N.p., 5 Jan. 2017. Web. 20 Jan. 2017.

Merchant, Brian. “The Poem That Passed the Turing Test.” Motherboard. N.p., 5 Feb. 2015. Web. 20 Jan. 2017.

“The Archive, Fall 2011.” Issuu. N.p., n.d. Web. 20 Jan. 2017. <https://issuu.com/dukeupb/docs/thearchive_fall2011>.

Turing, A. M. “Computing Machinery and Intelligence.” Mind, vol. 59, no. 236, 1950, pp. 433–460. www.jstor.org/stable/2251299.

“TURING TEST SUCCESS MARKS MILESTONE IN COMPUTING HISTORY.” University of Reading. N.p., 8 June 2014. Web. 21 Jan. 2017.


Criticism of Cli-Fi: A Global Warning Gone Too Far?

November 13, 2015 § 1 Comment

Meteorologists and climatologists around the world are sounding the alarm about the role human actions have played in bringing about “global warming.” And apparently, filmmakers are, too. The rise in ocean temperatures is being met with a rise in the demand for movies that explore the potential doom of humanity as a result of the deteriorating environment–movies like Avatar, The Day After Tomorrow, Interstellar, and even 2012. So what makes viewers so responsive to “climate fiction” and ecological disaster?

It’s no secret that climate change is real. Computer models and one meteorological report after another document the rapid melting of glaciers and the rising concentration of greenhouse gases, especially carbon dioxide, in the atmosphere. These phenomena are observable consequences of what scientists believe is our own tinkering with natural processes and resources. With effects so tangible, it’s difficult to deny that there is something happening, even if we don’t know what it is yet. The mysterious nature of global warming only makes it scarier for the average citizen and cli-fi viewer. To make matters worse, the data seem to show that this is all the fault of the human race, all our fault. And because we have contributed so much to the destruction of our home, we feel a responsibility to fix this and return Earth to its original, untainted glory.

So that’s why cli-fi hits so close to home. The scenarios may not be real, but they are realistic enough to allow the audience to recognize and visualize the potential repercussions of unchecked industrial and economic growth. Climate change is a global event, affecting everyone and everything from the poles to the equator. There is no escape; the only safety may come from jetting off into outer space, as films like Interstellar and Wall-E suggest, or from restoring our planet.

But Noah Gittell, a film critic who writes for The Atlantic, believes the choice is not so black-and-white, and that such films do more to instill paranoia about improbable situations than to actually bring about change. Though climate fiction may lead us to believe that ecological disaster is no longer just a calamity of the future, Gittell feels that the sub-genre dramatizes the effects to an inconceivable extreme. And this is not at all helpful in promoting awareness of the actually manageable, and maybe even reversible, issues possibly caused by human actions.

Even if this new genre of media misrepresents the scientific aspects of climate change, it certainly holds considerable merit. Literature and film are critical in providing momentum for any movement representing worldwide issues. Novels and films have played pivotal roles in beginning global conversations about the sustainability of Earth and initiatives to decrease greenhouse gas emissions; this has made activism more accessible to the general public, especially since cli-fi films target a younger audience more receptive to the presentation of global problems in this format.

So what do you think? Is cli-fi only valuable as entertainment? Or can it serve a higher purpose and force us to address environmental issues more seriously?

Related Article: http://www.theatlantic.com/entertainment/archive/2014/11/why-interstellar-ignores-climate-change/382788/

-Bushra Rahman

Craving Attention in the 21st Century: The Significance of Social Media and Science Fiction.

November 5, 2015 § Leave a comment

My newest insta pic only has 66 likes. Do you SEE that creative angle? I was lying on the grass (I was reading in the sun so I was not totally pathetic and desperate) for that angle. Do you SEE that slight color adjustment? Not too much, but just enough for you to truly appreciate that blue sky and green lawn. I had creative hashtags that would draw people in while also using some broader hashtags to appeal to a larger audience (#sky #scenic #nature). I even made sure to delete those hashtags after a few days (I definitely don’t want people to think I am desperate for attention). I might as well have put #desperate if I was being truly honest with myself.

** Username, location and caption have been edited out for my own privacy. **


My dad calls me pretty regularly, approximately 75% of the time for help with his phone, iPad or really anything that plugs into an outlet. My father is many things: an electrician, a carpenter, a chef, a jack-of-all-trades. Unfortunately, he is anything but technologically literate. Born long enough ago to have been drafted for the Vietnam War (He would probably be mad if I ever posted his true age for the world to see), he frequently recounts what he thought the future would look like and how hard it was growing up in the ancient days (He would probably claim to have walked six miles in the snow to school like any other stereotypical old person if he hadn’t grown up in Miami).

His dreams of the future saw technology revolutionizing the world, not supporting the rise of social media platforms such as Instagram and Facebook that feed on the new generation’s narcissism and dependence on social approval for value.

For most of its existence, science fiction was a genre outside of mainstream consumption that depended largely on pulp magazines to cheaply circulate its stories. Unfortunately, the terms “science fiction” and “sci-fi” often conjure up judgmental images of prepubescent boys on the outskirts of society. In a way, science fiction was for the socially inept or awkward. It was literature for cheap paper and obscure magazines. A positive change in the 21st century has been a broader acceptance of science fiction, largely due to blockbuster hits. Science fiction is becoming exciting, dramatic and mainstream.

But becoming more popular and more widely circulated also opens the genre up to greater influence from this new, technologically narcissistic generation. I freely admit that I have been guilty of this exact issue. When I started writing for this blog I wrote from a creative part of my heart that simply wanted to entertain anyone willing to give my stories attention. I really didn’t use too many hashtags, just enough to convey what my story was about. I wanted creative titles that would confuse or intrigue the reader.

And then I realized I could see how much traffic my posts were generating.

Every week I checked to see if my stories, articles and posts had new views and comments. My hashtags started to get more specific and numerous. My titles started to reflect more closely what someone would enter into a search engine. I noticed that the most “successful” post had a title along the lines of “the significance of . . .” and realized that people had a reason to type that into Google. People cared about that answer.

When I think of science fiction, I think of it as a sort of social commentary that incorporates real issues with a focus on technology, space exploration and the like. That is what I had wanted to do, but not necessarily what I ended up doing. I had let my own egoism, my dependence on attention for self-esteem and value, pull me so far that I strayed from my creative goals and aspirations. I was a sellout.

This is not to say that all of my entries have fallen for this social media pitfall. I poured my heart and soul into a blog post about the dangers of genetic engineering. I spent hours writing a story about evolution. But, I am human and I make mistakes. I look back on my blogging career and see those lapses in judgment where I sought attention for my articles.

Much of science fiction is about technology and yet I let the technology control me. I was no longer the one reinventing technology or predicting the social implications of existing technology. I was letting technology reinvent my creative process.

So, maybe some of you will have some sort of homework assignment that asks you to discuss the significance of social media. Or, maybe the significance of science fiction as an “emerging” genre. So, you guys hop on the computer and search “the significance of . . .” in a search engine and my article pops up. I hope you all click on the article and read it. Not for views, comments and likes, but maybe to learn from my own experience and realize that technology is a tool that you use, not a tool that uses you.

– S. Jamison

Changing the Public Perception of “Mad” Science

October 9, 2015 § 4 Comments


Scientists aren’t always depicted in the most flattering of lights. The stereotype that those who commit much of their time to experiments and scientific investigation are “mad scientists” is a trope that has been overused in literature, comics, and film, and it paints quite an unsettling picture of experimenters as extremely eccentric or insane hermits devoid of emotions or social skills who live their lives in laboratories tucked away from civilization. While the nature of scientific research no doubt requires one to spend hours performing experiments, analyzing data, and writing papers, this is far from a projection of a scientist’s personality and is simply the nature of any extended learning process.

This “process,” however, seems to be what keeps the general public away from the science scene. While scientific journals and documentaries about laboratory work may try to engage the common person in the discoveries being made daily in laboratories around the world, the public is not having it. The jargon is intimidating and the theories are complex. So what can be done to create a mutual understanding between those who do science and those who do not?

There are dozens of STEM programs—including everything from NASA Robotics Internships to Destination Science summer camps—seeking to involve kids and young adults in scientific activities. So why do kids participating in the “Draw-A-Scientist” Test (DAST) still create the same image of an unkempt, unsmiling, lab coat-clad man when asked to draw a picture of a typical “scientist”? Some disconnect between the actual experiments being conducted and the information relayed to the general public prevents children from illustrating a scientist as a happy, tidy, female inventor or explorer. This clearly demonstrates the need to emphasize that there are not two different groups at odds here. Scientists are not “them.” They are “us.” We are all, in a way, engaging in this systematic methodology in our everyday lives. For some reason, people seem to have an inherent suspicion of scientists, believing that they will use the knowledge they acquire against us when, in reality, most research is done for the greater good. Although we might not be fully knowledgeable about the research being done in nearby labs and universities, the unknown always creates opportunities to learn.

And this is where science fiction can come into play. Science fiction can be the bridge that spans the gap in appreciation between the public and the scientific community. Sci-fi is not written to educate. It is created and imagined to explore, redefine, and inspire. Scientific journals might not be the most alluring magazines to read while unwinding on a Friday night. But novels, on the other hand, can present the same ideas and similar material in a more attractive and accessible fashion, generating a greater interest in scientific discovery. As more and more science fiction films and novels have been released in the past decade, public interest in this genre has increased as well. Hopefully, these forms of media will prompt more people to regard scientific work, and those who perform it, as valuable to them and society.

References and Further Reading:

  1. The Draw-A-Scientist Test: http://onlinelibrary.wiley.com/doi/10.1002/sce.3730670213/abstract;jsessionid=D380FD12DDF9ED2CDB0F4B4EE07CA6F1.f02t03
  2. Images:

http://ed.fnal.gov/projects/scientists/index.html

-Bushra Rahman

TimeEscape

October 9, 2015 § 1 Comment

A line of characters flooded the screen, alternating as my friend shifted her position on my keyboard. My book report was pulled up in Microsoft Word, the cursor blinking frantically as it tried to keep up with my friend’s sabotage. I just laughed as I pushed her off, then entered my newest discovery into the keyboard, taught to us only a week before in 5th grade computer class:

Ctrl+Z.

I was fascinated by the concept of that combination of keys. I could make the most impulsive of edits, write the most ridiculous statement, and delete entire chunks of my paper without worrying about any long-term consequences. If I didn’t like the result of my action, I could just push those two keys–Ctrl+Z–and everything would be as it should. A fresh start. Slate cleared. Back to whatever square I wanted to resume work from. I didn’t have to worry about reconstructing any past reality or losing anything to time and effort, because with those two keys in my hand, I could take myself back to any foundation, given I had built the foundation before.

Ctrl+Z.

What a tool.
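
That caveat about foundations is exactly how an undo history behaves: Ctrl+Z can only pop you back to a state you have already saved. Here is a minimal sketch in Python, assuming a made-up UndoHistory class rather than any real editor’s internals:

    class UndoHistory:
        """A stack of saved states; undo only returns you to foundations already built."""

        def __init__(self, initial_text=""):
            self._states = [initial_text]      # oldest foundation first

        def edit(self, new_text):
            self._states.append(new_text)      # lay down a new foundation

        def undo(self):
            if len(self._states) > 1:          # can't go back past square one
                self._states.pop()
            return self._states[-1]            # the slate you return to

    # A fifth grader's book report, before and after sabotage:
    report = UndoHistory("My book report.")
    report.edit("My book report. asdfjkl;")    # a friend sits on the keyboard
    print(report.undo())                       # -> "My book report."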

Hooked as I was on the thrill of “Edit-> Undo,” I was a little taken aback when I realized that this handy shortcut didn’t apply to social interactions. It was irrational, I know- but after a week of riding the high of Ctrl+Z, I had somehow assumed that the same rules that applied to my word processor could apply to real life. And when they didn’t, I was not so much alarmed as unsettled.

I always knew Ctrl+Z was a function of the digital realm. But nonetheless, when my confession of a crush to the boy I liked was met with a blank stare, I found my thumb and forefinger twitching, pushing at the keys that weren’t there:

Ctrl+Z.

I couldn’t edit this unfortunate moment out of my past, couldn’t insert myself back into an earlier version of my life’s document, the one where he didn’t avoid eye contact every time we passed in the hallway. Just like everything else in the real world, I was bound by time–that immutable, stubborn dimension that refuses to yield to all of human ingenuity, that force that turns into bold, permanent marker the marks that we’d rather be in pencil. There is always the possibility that you can cover up the Sharpie–mask it with the paint of reconciliation, or hide it underneath the tarp of loaded silence.

But no matter what you throw over it, the Sharpie always remains, bleeding through the medium to remind you that yes, this happened. You messed up. You will have always messed up this moment. There’s nothing you can do about it.

Science fiction’s answer to this kick in the brain, this blow of helplessness?

Time travel.

Novels like Timescape take our worst fears–that we might irreparably damage our world, whether that world be the world of individual humans or the literal world of humanity–and put a band-aid over them, then tuck us in to say goodnight and tell us that everything is going to be okay, because somebody will fix it. Somebody will hit the undo button. The irreparable will become repairable, and we can throw away our tarnished slates and start again.

Time travel grants us control over the fourth dimension and releases us from the chains of time, thereby releasing us from our mistakes. We are fascinated by it because we so deeply want it to be true–to imagine that we can go back and make things right before they ever went wrong.

But at the same time, it’s these wrongs that make us who we are. All of those “character-building moments” would be lost if we indulged in easily accessible time travel–we would never learn anything, because there would be no significant consequences for our actions. Perhaps more importantly, all of the good, unintended consequences of mistakes would be lost. The world would stagnate, because all of the rich innovation that arises out of failure would be lost.

We can’t predict the long term consequences of our actions. Our mistakes can be our biggest triumphs.

However, as Timescape notes, sometimes our triumphs–chemical developments and more efficient methods of manufacturing–can be our biggest mistakes, leading to our downfall–the dismal world Benford describes. And it is this possibility–that we could, as a species, ruin the world–that is the most terrifying to us, because it means that we would tarnish every blank slate born into our mistakes.

Furthermore, it is this possibility that is terrifyingly real.

Gregory Benford might not have the means to time travel in real life, but his fingers are desperately twitching at Ctrl+Z anyway–and as a result of this twitching, typing out a great novel of warning. This book is Benford’s best version of a tachyon, a message to the present urging change and a greater consideration of the future–

because the future will soon become the present, and when it does, we can’t just hit
Ctrl+Z.


Celeste Graves

What Do We Sacrifice For “Perfection”?

September 29, 2015 § 1 Comment

It looked like any other hospital waiting room. Well, any other hospital waiting room in the year 2050. I’ve been told that you weren’t kept behind bars like a common criminal. I’ve been told the doors didn’t always have locks on the outside. Hell, I’ve even been told the rooms had chairs to sit in. . . . I’m pretty sure most of those are myths though. I guess it really doesn’t matter how the rooms might have been though. I’m here now and that is all that really matters.

Eventually they come for you. You don’t know when it will be. You just don’t know . . . but they come eventually. People leave one by one. Where they go I can only imagine, but I guess I will find out soon enough. No one really seems to be nervous and I guess they don’t really have a reason to be. We aren’t here voluntarily. We don’t have a choice, an escape, an alternative. You just accept your fate as it comes.

There was a general trend in the room. None of us looked older than five or six, and most of us had obvious defects. You were snatched up as soon as something seemed off with you. Some were born lucky. Infants with a clear disorder were treated on the spot. They won’t even remember the treatment. Not all of us were so lucky. The boy across from me sat drooped over in his wheelchair. His legs looked frail and thin.

He will walk soon enough. Everything will be fixed soon.

How I found myself in this mess was entirely different. You can’t tell something is wrong with me just by looking at me. The moment I was born my parents could sigh in relief that they would never have to turn their child over to the state. I am more sorry for them than I am for myself as I sit here. I’m an anomaly. It all started when I was three or four and I insisted I was a girl. “But Michael, you’re my baby boy,” my mother would insist. She would force trucks and army men into my hands to play with. She dressed me exclusively in blue. She put me in karate and never let me have girl friends. I was defective.

But medicine can fix all of that now. I am told that after the surgery I won’t even remember wanting to be a girl. I will be my mommy’s strong little man after all.

And with that, they came for me.

——————————————————————————

I deliberately chose something that would offend or shock. Being transgender is not a defect. It is not something inherently wrong with the person. It is not something to treat. So why did I choose the issue of being transgender as the main driving force of my narrative?

To make you think and question.

Medicine and genetic research have come leaps and bounds from where they began. Thus far, the progress has been something that I support wholeheartedly. Stem cells have incredible potential to change the world. Finding a foolproof cure for cancer would revolutionize the world. But where do we draw the line?

Something I think about is where genetic engineering must stop. My fear is not so much what we as human beings can create, but rather how we choose to use that technology. My greatest fear is that we find ways to change things that are simply hard to understand or not the “norm.”

I have incredibly strong friends who have a wide variety of sexual orientations, gender identities, disabilities, you name it. They are the strongest people I know, and I know that they wouldn’t change who they are if given the chance. Nothing is wrong with them. They are unique and beautiful. But as someone who loves them all deeply and unconditionally, I fear that less tolerant people will try to change them. I used to paint my friend’s nails at sleepovers just to take the polish off in the morning before she returned home. Thankfully, Claire* was able to transition to the person I had always accepted her to be when she moved out of her house. I know her parents would have changed her to accept being male, the sex she had been born with, if given the chance.

I hope I did not offend you, but I do hope I shocked you. We need to think about the limits to genetic engineering. Not the scientific limits, but the moral and ethical limits. Just because something may be possible, doesn’t mean we should necessarily do it. I don’t really know where we should draw the line in the sand, but I hope we can start the dialogue.

Maybe this story can be the pebble that creates waves.

*Name changed for privacy. 

– S. Jamison

On Physics Tests and Roses

September 24, 2015 § Leave a comment

I was miffed.

“The names of the scientists are going to be on the test?”

My honors physics teacher, who I regarded as a generally reasonable man, had lost touch with reality and was resorting to the lowest of low testing methods: rote memorization without purpose.

Memorizing formulas was one thing–those were tools, mental shortcuts we employed to cut through a problem in a timely manner. P=mv, F=ma, W=mg–these facts were full, bloated with the theorems my teacher wrote out on the board when introducing the concepts to us.

But Rutherford, Hooke, Bernoulli–these were empty signifiers, a collection of letters that did no more to better my understanding of how things move about in space than watching Jeopardy did. In my mind, these names were trivia, and nothing more.

Rutherford, Hooke, and Bernoulli may have lived rich lives–to their contemporaries, peers, family, and friends, their names must have been loaded with connotation, each utterance of “Rutherford”–or perhaps, “Ernest”–conjuring up memories and feelings. But to the high school science student, “Rutherford” was associated with one thing: the discovery of the nucleus. We did not have the privilege of knowing Rutherford as a person, only the privilege of knowing his discovery.

And if that was all “Rutherford” boiled down to in our heads, what was the use of knowing that name? His life could have been interesting, but to feign the resurrection of his existence, to pretend that we were paying homage to him by remembering those empty letters when all we understood of them was the discovery, not the man, attributed to them, seemed superfluous, almost irreverent. I had a deep appreciation for Rutherford’s gold foil experiment, not for him. I couldn’t have. I didn’t know the guy. I had a lot to thank the guy for–we all did, as students standing on this giant’s shoulders–but the rote memorization of his name didn’t do anything for him.

“These men devoted their lives to these discoveries,” formerly-reasonable physics teacher droned on. “We owe it to them to remember their names.”

But did we? The man was long gone–what did it matter to him if, 75-odd years after his death, students drew a line from his name to the gold-foil experiment on the page of a test?

Hypothetical #1: Suppose Rutherford’s last wish was to have his name go down in history for making a meaningful contribution to science–which element of that wish, the remembrance of his name or of his contribution to science, really matters?

History is important in that it allows us to learn from past mistakes, to build on the knowledge gathered by our predecessors.  We learn nothing from the man’s name.

Hypothetical #2: Suppose Rutherford made his meaningful contribution to science in order for his name to remembered, and that was his true last wish.

As unlikely as this scenario is, it raises the question: why do last wishes matter?

Last wishes matter only insofar as some actor in the present derives utility from them. That actor may be the person making the last wish, comforted by the notion that they have the power to make an impact after they expire. It may be a relative or close confidant of the deceased, comforted by the notion that by honoring the last wish of their loved one, they’re salvaging a part of that person, keeping a bit of their corporeal desires alive.

Or it might be a high school science teacher, seven decades after the last wish was made by a man he never met, comforted by the notion that it is important to remember the name of the deceased. He draws reassurance from this via a loosely-drawn syllogism, buried within the depths of his subconscious: if people considered remembering names important, then people might remember his name–and through every utterance of his name that occurs after his death, he might live a little longer.

This syllogism is buried in the back of most brains.

When I was thirteen, someone asked me the name of my great grandfather. With a shock, I realized that I didn’t know it–and I was terrified. Had he lived 80 years to only be forgotten by his great granddaughter, his existence fading into nothingness? Would I be doomed to a similar fate, forgotten by descendants, my life fading into

meaninglessness?

For that is the most grim notion of all–the notion that the sum of our actions, struggles, relationships, passions, and toil could amount to nothing. Welcome to life, the zero-sum game. Prepare to be overtaken by suffocating depression.

So we erect monuments to individuals, we pepper college campuses with statues, we catalogue gratuitous details of our fellow humans’ lives that far exceed the amount needed to learn anything significant from their experiences. We honor last wishes and memorize Rutherford’s name. We fight the idea, voiced by Andres in Mayflower II, that we exist merely to produce replacements: we strive to prove that we, as individuals, matter–to exert some semblance of control over that great leveling force, the eraser of our identities, the zero-sum despot known as death.

Because when death strikes, our voices are silenced. We can no longer fight to preserve our individuality. Thus, we preserve the individuality of others, rewarding innovation, breakthroughs and fame–and, in turn, striving to accomplish something notable enough that our successors will do the same for us.

Tooth and nail, blood and sweat, we claw our way to meaning.

But this logic is flawed. It’s driven by a desire to be more than just a blink in the timespan of the world–to last beyond the 80-odd years granted to us by nature. But even if we accomplish something notable enough to warrant a statue, that statue is just a pebble thrown into a canyon. After thousands of years, stone erodes. Names are forgotten. And thousands of years are mere blinks in the timeline of the universe.

Ultimately, we all fade.

But not to meaninglessness.

Our lifespans may be naught but flashes in the perspective of the universe, but in our perspective, they are everything. They are our own universes. They matter, if only to us and our contemporaries. And they will leave legacies, if only ones that are accepted as assumed, attributeless characteristics of the future world. Our very state of existence implies this.

Those who do not know Rutherford’s name still benefit from his breakthrough. Perhaps more significantly, they benefit from all of the miniscule steps taken by nameless shadows of the past that enabled Rutherford to make his breakthrough. The tiniest fraction is still part of the sum–and, as Bradbury notes in A Sound of Thunder–the smallest butterfly a factor in the future of the world.

And it’s okay that we are ultimately all fameless fractions, our limits approaching zero. Long after we die, we’ll–you know–still be dead. And it won’t matter if there’s a hunk of rock out there with your name on it, or a group of students reciting your name to their teacher. With the eclipse of our personal universes comes the snuffing out of our pride and vanity. We’re gone. Dust in the wind.

And that thought is empowering.

If we narrow our perspective to the timeline of our own lives, instead of trying to grasp the strands of infinity, every action becomes more meaningful. With the recalibration of our viewpoints comes the revitalization of the “small things.” Stop and smell the roses. Appreciate the feeling of the sun on your face. Write a thank-you card. Hug someone.

Accept that all of these actions will inevitably be forgotten. But in the present moment, they are
everything.


Celeste Graves
