Money Speaks Louder than Human Voices

March 28, 2017 § 3 Comments

“Everything has a price.” This phrase in Margaret Atwood’s Oryx and Crake is not new, but it takes on new meaning in the context of her novel (139). In today’s world, corporations dominate every sphere, from the economy to religion and politics. While Atwood’s world of absolute corporate control is unsettling, her ideas are merely an extrapolation of current trends into the future. However, as Atwood shows, commercialism and commodification come at a high price to society and the humans who are part of it.

Early on in the novel, we learn that Jimmy (later Snowman) lives on a company compound called OrganInc. The corporation controls everything in Jimmy’s life, including his school and the rules he must abide by, enforced through the CorpSeCorps. Later, we learn that Jimmy and Crake attend institutions similar to universities. These “universities,” particularly Crake’s Watson-Crick Institute, also aim to generate profits, encouraging their very bright students to innovate and develop new technology, carefully securing their facilities, and minimizing interaction with the outside world. In Jimmy’s world, corporations control everything, and their motives clearly dominate.

The corporation-developed compounds seem absurd; in reality, however, they already exist. Massive companies like Amazon and Google have “campuses” that contain everything one needs to live: restaurants, gyms, childcare facilities, even sleeping pods – all designed to keep you inside and focused on doing everything possible for the company. Beyond company campuses, universities today mimic those in Atwood’s story. As Vandy students, we even say that we live in a “Vandy Bubble.” Our lives exist within the confines of our campus as we strive to learn and make new developments in all fields. We are not far off from the fictitious world that Atwood describes.

Images are renderings of future campuses for Google, Amazon, and Apple (from left to right). 

Why does it matter that corporations and technological research centers have such a wide sphere of influence? In a world where profit governs, everything becomes a commodity. This can easily be seen in Oryx and Crake with the story of Oryx. Not only is Oryx commoditized by the pimps who earn money from her sexual acts and pornography, but she is also commoditized by every viewer who watches the child pornography, including Snowman. In discussing her experience, Oryx has clearly been influenced by the corporate mentality surrounding her, as she states:

“They had no more love…but they had money value: they represented a cash profit to others. They must have sensed that – sensed they were worth something.” (126)

Do we only value human beings for the monetary value they provide? I hope not. Atwood shows a disturbing reality if corporate power continues on its current trajectory. The power of corporations to influence politics and culture even today has implications for cloning and other advanced technology. It is unsettling to think of the development of human clones by companies driven by their own bottom-line. Morality does not seem to have a place in this kind of world.

If we do consider these clones to be “human,” how do we prevent the corporate developers from treating the clones like commodities and not humans, especially when humans today are already commoditized? In the novel, Snowman compares the children in the pornography to “digital clones,” as they did not feel real to him (90). With this statement, Atwood warns of the commodification of both existing humans and potential human clones in the future. If corporations both govern and profit, we cannot prevent abuse and exploitation.

Atwood is not far off in her portrayal of the commodification of human clones. Human cloning has often been criticized for turning human organs into commodities, given their monetary value in treating cancer and other diseases. President Bush famously rejected all human cloning, stating, “Life is a creation, not a commodity.” He is not alone in his concern, as scientists, philosophers, and policymakers have discussed the implications of human cloning for decades. The President’s Council on Bioethics expressed the following:

“When the ‘products’ are human beings, the ‘market’ could become a profoundly dehumanizing force.” (The President’s Council on Bioethics, 2002)

When corporate greed becomes entangled with the morality of health remedies, the potential commodification of humans and human clones is endless. Although Atwood’s fictitious world seems so distant, the reality is that it is much closer to present day than one would first think. From humans to clones to our independence and our value, Atwood shows that everything has a price, and the costs to society are high.

Sources:

Atwood, Margaret. Oryx and Crake. New York: Anchor, 2004. Print.
Arter, Melanie. “Bush: ‘Life Is A Creation, Not A Commodity’.” CNS News. CFC, 07 July 2008. Web. 28 Mar. 2017. http://www.cnsnews.com/news/article/bush-life-creation-not-commodity.
The President’s Council on Bioethics. “Human Cloning and Human Dignity: An Ethical Inquiry.” Georgetown University, July 2002. Web. 28 Mar. 2017. https://bioethicsarchive.georgetown.edu/pcbe/reports/cloningreport/children.html.
Cambria, Nancy. “Our 21st-century Prophet, Margaret Atwood.” St. Louis Post-Dispatch. STLtoday, 26 Sept. 2015. Web. 28 Mar. 2017. http://www.stltoday.com/entertainment/books-and-literature/reviews/our-st-century-prophet-margaret-atwood/article_242b5f9b-3ac6-51e3-9024-e858d178f6e2.html.

Images source: http://www.geekwire.com/2013/4-tech-titans-building-campus/

A New Age of Artifice

January 23, 2017 § 5 Comments

In the fall of 2011, Duke University’s undergraduate literary journal published a rather unassuming poem entitled “For the Bristlecone Snag” (“The Archive”). To the journal’s poetry editors, the poem appeared to be a typical undergraduate work, composed of several unfulfilled metaphors and awkward turns of phrase. What the editors did not know at the time of publication, however, was that this poem was not written by a human. It was written by a computer program (Merchant).

When I first learned about “For the Bristlecone Snag”, I was reminded of the writings of Alan Turing, a renowned English computer scientist of the mid-20th century. In his seminal article on the subject of artificial intelligence (A.I.), Turing contends that the question, “can machines think?”, is “too meaningless to deserve discussion” (Turing 442). After all, he claims, we have no direct evidence that other humans can think; we merely assume that they do based on their behavior. Turing argues that this “polite convention that everyone thinks” should apply to all beings that can demonstrate human behavior (Turing 446). From this line of thought Turing conceptualized the Turing Test, an experiment in which a computer tries to convince a human of its humanity. According to Turing, if an A.I. can convince a human judge that it is human, then we must assume that the A.I. can think.

While the program that produced “For the Bristlecone Snag” did not complete an extensive and proper Turing Test, it did convince human judges that it was human. At the very least, the poem’s acceptance into an undergraduate literary journal reveals that literate machines can, and will, exist in the near future. The way is paved for more professional and accomplished artificial authors.

Indeed, even in the half decade since “For the Bristlecone Snag” was published, the technology behind artificial intelligence has improved rapidly. Watson, IBM’s “cognitive computing platform”, is a great example of this progress (Captain). In 2011, Watson defeated two reigning champions on Jeopardy!, successfully interpreting and answering the game show’s clues. While this feat alone was a remarkable step in cognitive computing, Watson’s analytical abilities have since contributed to over thirty separate industries, including marketing, finance, and medicine (Captain). For example, the machine can read and understand millions of medical research papers in just a matter of minutes (Captain). As intelligent as Watson is, however, he was never designed to pretend to be human. The chief innovation officer at IBM, Bernie Meyerson, believes “it’s not about the damn Turing Test”; his team is more interested in accomplishing distinctly inhuman tasks, such as big data analysis (Captain).

While IBM may not be interested in the Turing Test, other artificial intelligence companies have been working specifically toward that goal. In 2014, a program by the name of Eugene Goostman passed the Turing Test using machine learning strategies similar to those that drive Watson (“TURING TEST SUCCESS”). The chatbot, or program that specializes in human conversation, was able to convince several human judges that it was a thirteen-year-old boy (“TURING TEST SUCCESS”). Given the success of Eugene Goostman and the intelligent accomplishments of Watson, it is indisputable that the Turing Test can be, and has been, passed. Artificial intelligence is a reality. Machines can think.

As an aspiring writer and computer scientist, I can’t help but fixate on the implications that A.I. has for literature. It is entirely possible, even likely, that “For the Bristlecone Snag” foreshadows an era in which the most successful and prolific authors will be machines, an era in which the Pulitzer Prize and Nobel Prize in Literature are no longer given to humans, an era in which humanity no longer writes its own stories.

Yet, this era of artifice should not be greeted with worry or anxiety. Art has always been artificial, a constructed medium for human expression. In the coming decades, we will author the next authors, create the new creators, we will mold the hand that holds the brush. Artificial intelligence should not be feared as an end to art, but rather a new medium, a new age of artifice.

– Zach Gospe

References

Captain, Sean. “Can IBM’s Watson Do It All?” Fast Company. N.p., 05 Jan. 2017. Web. 20 Jan. 2017.

Merchant, Brian. “The Poem That Passed the Turing Test.” Motherboard. N.p., 5 Feb. 2015. Web. 20 Jan. 2017.

“The Archive, Fall 2011.” Issuu. N.p., n.d. Web. 20 Jan. 2017. <https://issuu.com/dukeupb/docs/thearchive_fall2011>.

Turing, A. M. “Computing Machinery and Intelligence.” Mind, vol. 59, no. 236, 1950, pp. 433–460. www.jstor.org/stable/2251299.

“TURING TEST SUCCESS MARKS MILESTONE IN COMPUTING HISTORY.” University of Reading. N.p., 8 June 2014. Web. 21 Jan. 2017.

Where Does The Carbon Go?

November 13, 2015 § 1 Comment

Signing off with your fortnightly dose of science news.

The concentration of carbon dioxide in the atmosphere recently surpassed 400 parts per million, higher than at any time in the past 400,000 years. But only half of the human-produced carbon stays in the atmosphere. This has scientists wondering how the other half is absorbed on land and at sea, and what this will mean for climate change in the future. Because they don’t fully understand the absorption process over land and oceans, they don’t know whether oversaturation can occur. If oversaturation is possible, it could force a higher percentage of carbon dioxide into the atmosphere.

It turns out NASA is working on more than just sending a manned mission to Mars. In July 2014, NASA launched the Orbiting Carbon Observatory-2 (OCO-2), a satellite that measures carbon dioxide levels from the top of the Earth’s atmosphere to the surface – the first satellite designed to do so. This provides scientists with near-global data on how CO2 behaves in the atmosphere, with 100,000 new measurements each day. While the new data is helpful in showing how carbon dioxide behaves near the surface of the planet, it does not provide concrete information about how CO2 is absorbed on land or in water. To better understand the process, NASA will use a combination of satellite data, field experiments, and computer models.

The effect on land and water is important to determine because of concern that the ability to absorb carbon will change as temperatures rise. On land, trees and various other vegetation take in carbon dioxide through photosynthesis, some of which gets left behind in the soil once they die. But various factors such as droughts, fires, and deforestation may drastically change the amount of carbon that vegetation can store. Some forests are already releasing more carbon than they are storing.

In the ocean, some carbon dioxide is absorbed directly, while some is taken in by phytoplankton. Changes in phytoplankton behavior over the past few years are also cause for concern. If not eaten by zooplankton, phytoplankton normally sink to the bottom of the ocean when they die, taking with them all the carbon they have absorbed. Scientists worry that changes in ocean chemistry might alter this process and trigger a release of the carbon that has sunk to the bottom of the ocean. Last week, NASA launched a ship and airborne study in the North Atlantic to study these changes.

As our world grapples with a growing climate change problem, it becomes increasingly important to understand the processes that affect how it will play out. Otherwise, we may end up facing an unstoppable algae bloom such as the one in Gregory Benford’s Timescape. Climate science fiction touches on an issue very much relevant to today’s world. Hopefully, unlike some of the technology predictions, none of the frightening climate scenarios will ever become a part of our reality.

Confused Vulcan

Changing the Public Perception of “Mad” Science

October 9, 2015 § 4 Comments


Scientists aren’t always depicted in the most flattering of lights. The stereotype that those who devote much of their time to experiments and scientific investigation are “mad scientists” is a trope overused in literature, comics, and film, and it paints quite an unsettling picture: experimenters as extremely eccentric or insane hermits, devoid of emotions or social skills, who live their lives in laboratories tucked away from civilization. While the nature of scientific research no doubt requires one to spend hours performing experiments, analyzing data, and writing papers, this is the nature of any extended learning process, not a projection of a scientist’s personality.

This “process,” however, seems to be what keeps the general public away from the science scene. While scientific journals and documentaries about laboratory work may try to engage the common person in the discoveries being made daily in laboratories around the world, the public is not having it. The jargon is intimidating and the theories are complex. So what can be done to create a mutual understanding between those who do science and those who do not?

There are dozens of STEM programs—including everything from NASA Robotics Internships to Destination Science summer camps—seeking to involve kids and young adults in scientific activities. So why do kids participating in the “Draw-A-Scientist” Test (DAST) still produce the same image of an unkempt, unsmiling, lab coat-clad man when asked to draw a typical “scientist”? Some disconnect between the actual experiments being conducted and the information relayed to the general public prevents children from illustrating a scientist as a happy, tidy, female inventor or explorer. This clearly demonstrates the need to emphasize that there are not two different groups at odds here. Scientists are not “them.” They are “us.” We are all, in a way, engaging in this systematic methodology in our everyday lives. For some reason, people seem to have an inherent suspicion of scientists, believing that they will use the knowledge they acquire against us, when in reality most research is done for the greater good. Although we might not be fully knowledgeable about the research being done in nearby labs and universities, the unknown always creates opportunities to learn.

And this is where science fiction can come into play. Science fiction can be the bridge that spans the gap in appreciation between the public and the scientific community. Sci-fi is not written to educate. It is created and imagined to explore, redefine, and inspire. Scientific journals might not be the most alluring magazines to read while unwinding on a Friday night. But novels, on the other hand, can present the same ideas and similar material in a more attractive and accessible fashion, generating greater interest in scientific discovery. As more and more science fiction films and novels have been released in the past decade, public interest in the genre has increased as well. Hopefully, these forms of media will prompt more people to regard scientific work, and those who perform it, as valuable to them and society.

References and Further Reading:

  1. The Draw-A-Scientist Test: http://onlinelibrary.wiley.com/doi/10.1002/sce.3730670213/abstract;jsessionid=D380FD12DDF9ED2CDB0F4B4EE07CA6F1.f02t03
  2. Images:

http://ed.fnal.gov/projects/scientists/index.html

-Bushra Rahman

Some say the world will end in fire, Some say in ice… I hold with those who favor environmental disaster.

October 9, 2015 § Leave a comment

The scenario is not an uncommon one. It is X years into the future, and the world is about to end. But how? Many authors would describe a “zombie apocalypse,” some with a more scientific approach involving a virus or disease that wipes out the human race; others describe nuclear fallout, a more plausible way we could cause Armageddon since we seem to have the technology to do it; but ultimately some of the most plausible scenarios for the end of the world involve much less violence, at least at first.

The idea of an environmental disaster bringing about apocalypse is found widely in literature, most notably in science fiction. The notion fits the genre; the science behind this kind of apocalypse is interesting and easily understood by readers, and plausible enough never to invoke the “tooth-fairy” response. The variety available in environmental disasters is also a draw to writers – who doesn’t want to choose from a wealth of interesting and equally plausible scenarios for their plot? The plot could focus on crop failure due to the genetic homogeneity of plants, leading to susceptibility to a single devastating disease; it could highlight a disastrous disruption in the biosphere’s ecology, and subsequently the world economy, due to global climate change; the story could be one on the effects of water shortage due to overuse and pollution of the hydrosphere; it could be like Gregory Benford’s Timescape and focus on toxic algal blooms, caused by eutrophication from agricultural runoff, leading to economic disaster and death; or it could focus on one of many more examples of disaster, all leading to world panic, likely war over resources, and ultimately a death toll higher than any we have ever seen.

Such plots draw our attention as readers because we know they could easily become real. Such plots are less fiction to us than an omen of the future, and that both terrifies and intrigues us. Every event named is one that could easily occur in the modern era. We grow our plants in genetic monocultures, meaning they are basically clones of one another, all grown together in neat rows, ideal for a virus to target the plants and spread to entire fields overnight. If one plant has no resistance, none of the plants will have any resistance. We are dependent upon these monocultures to produce food crops in the amounts currently necessary to sustain the population. If a staple crop such as corn collapsed, it would lead to starvation, and the scarcity of resources would ultimately lead to war and death. Chaos and apocalypse because of one virus.

Global climate change is an issue known and accepted by most, and its effects can be catastrophic. The overall warming of the Earth’s atmosphere can melt the polar ice caps, which in turn raises sea levels, submerging the coastal cities where much of the world’s population lives. Further, global climate change will disrupt the biosphere’s ecology. These two issues combined would lead to scarcity of resources, economic disaster, war, and death.

Water shortage is an issue many in the world are already dealing with. Besides the shortage in California, the Middle East has always dealt with water shortages, and wars have occurred many times over water rights. We seem to use water like it is an infinite resource, though water is considered a nonrenewable resource in many U.S. states. Water pollution and overuse around the world have caused many problems already and, especially when combined with any of the other issues mentioned, could contribute to world catastrophe.

Eutrophication is an issue in itself, even without neurotoxic blooms added in. A normal algal bloom with no neurotoxic effect will still disrupt the economy through the death of fish, caused by the hypoxic conditions created by the spike in algae and their decomposition. When neurotoxic organisms are added into the mix, blooms become terrifying. The infamous neurotoxic “red tides” caused by dinoflagellate blooms have occurred throughout history, but with the massive amount of runoff from fertilizers and other chemicals currently occurring, their increasing frequency and severity is a real concern. The subsequent effect on the world economy, and the death tolls that would be associated with a great increase in blooms, would be devastating.

The reason works with these kinds of disasters always hit home is that they are so real, so plausible. We as a species cause more stress on our environment than all other species combined. We have created these issues for ourselves, but even when we see them in the books we read or the movies we watch, and recognize them as plausible and realistic, we hardly change our actions. We deal with the here and now and do not plan for the future, just as the society in Benford’s novel does. It seems every day we dig ourselves into a deeper and deeper hole, and it’s harder and harder to get out. It is not going to be zombies that bring about the end of the world; we are going to kill ourselves.

-Cassie Woolley

CRW

Red tide off the coast of La Jolla San Diego, California;”La-Jolla-Red-Tide.780″. Licensed under Public Domain via Commons – https://commons.wikimedia.org/wiki/File:La-Jolla-Red-Tide.780.jpg#/media/File:La-Jolla-Red-Tide.780.jpg

TimeEscape

October 9, 2015 § 1 Comment

A line of characters flooded the screen, alternating as my friend shifted her position on my keyboard. My book report was pulled up in Microsoft Word, the cursor blinking frantically as it tried to keep up with my friend’s sabotage. I just laughed as I pushed her off, then entered my newest discovery into the keyboard, taught to us only a week before in 5th grade computer class:

Ctrl+Z.

I was fascinated by the concept of that combination of keys. I could make the most impulsive of edits, write the most ridiculous statement, and delete entire chunks of my paper without worrying about any long-term consequences. If I didn’t like the result of my action, I could just push those two keys- Ctrl+Z- and everything would be as it should. A fresh start. Slate cleared. Back to whatever square I chose to resume work from. I didn’t have to worry about reconstructing any past reality or losing anything to time and effort, because with those two keys in hand, I could take myself back to any foundation, given that I had built the foundation before.

Ctrl+Z.

What a tool.
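The mechanics behind that magic are simple enough to sketch: keep a stack of past states, and undo is just a pop. A minimal illustration in Python (the class and its names are mine, not any real editor's internals):

```python
class Document:
    """A tiny text buffer with Ctrl+Z-style undo (illustrative sketch)."""

    def __init__(self):
        self.text = ""
        self._history = []  # snapshots of every prior state

    def type(self, s):
        self._history.append(self.text)  # save the state before changing it
        self.text += s

    def undo(self):
        if self._history:  # nothing to undo on a fresh slate
            self.text = self._history.pop()


doc = Document()
doc.type("My book report.")
doc.type(" asdfjkl;")  # a friend's sabotage
doc.undo()             # Ctrl+Z: back to the last good foundation
print(doc.text)        # My book report.
```

Real word processors refine this with redo stacks and diffs instead of full snapshots, but the principle is the same: every change quietly saves the state it replaces.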

Hooked as I was on the thrill of “Edit-> Undo,” I was a little taken aback when I realized that this handy shortcut didn’t apply to social interactions. It was irrational, I know- but after a week of riding the high of Ctrl+Z, I had somehow assumed that the same rules that applied to my word processor could apply to real life. And when they didn’t, I was not so much alarmed as unsettled.

I always knew Ctrl+Z was a function of the digital realm. But nonetheless, when my confession of a crush to the boy I liked was met with a blank stare, I found my thumb and forefinger twitching, pushing at the keys that weren’t there:

Ctrl+Z.

I couldn’t edit this unfortunate moment out of my past, couldn’t insert myself back into an earlier version of my life’s document, the one where he didn’t avoid eye contact every time we passed in the hallway. Just like everything else in the real world, I was bound by time–that immutable, stubborn dimension that refuses to yield to all of human ingenuity, that force that turns into bold, permanent marker the marks we’d rather be in pencil. There is always the possibility that you can cover up the Sharpie: mask it with the paint of reconciliation, or hide it underneath the tarp of loaded silence.

But no matter what you throw over it, the Sharpie always remains, bleeding through the medium to remind you that yes, this happened. You messed up. You will have always messed up this moment. There’s nothing you can do about it.

Science fiction’s answer to this kick in the brain, this blow of helplessness?

Time travel.

Novels like Timescape take our worst fears–that we might irreparably damage our world, whether that world be the world of individual humans or the literal world of humanity–and put a bandaid over them, then tuck us in to say goodnight and tell us that everything is going to be okay, because somebody will fix it. Somebody will hit the undo button. The irreparable will become repairable, and we can throw away our tarnished slates and start again.

Time travel grants us control over the fourth dimension and releases us from the chains of time, thereby releasing us from our mistakes. We are fascinated by it because we so deeply want it to be true–to imagine that we can go back and make things right before they ever went wrong.

But at the same time, it’s these wrongs that make us who we are. All of those “character building moments” would be lost if we indulged in easily accessible time travel – we would never learn anything, because there would be no significant consequences for our actions. Perhaps more importantly, all of the good, unintended consequences of mistakes would be lost. The world would stagnate, because the rich innovation that arises out of failure would never happen.

We can’t predict the long term consequences of our actions. Our mistakes can be our biggest triumphs.

However, as Timescape notes, sometimes our triumphs–chemical developments and more efficient methods of manufacturing–can be our biggest mistakes, leading to our downfall–the dismal world Benford describes. And it is this possibility–that we could, as a species, ruin the world–that is the most terrifying to us, because it means that we would tarnish every blank slate born into our mistakes.

Furthermore, it is this possibility that is terrifyingly real.

Gregory Benford might not have the means to time travel in real life, but his fingers are desperately twitching at Ctrl+Z anyway – and that twitching has typed out a great novel of warning. This book is Benford’s best version of a tachyon: a message to the present urging change and a greater consideration of the future–

because the future will soon become the present, and when it does, we can’t just hit
Ctrl+Z.


Celeste Graves

What Do We Sacrifice For “Perfection”?

September 29, 2015 § 1 Comment

It looked like any other hospital waiting room. Well, any other hospital waiting room in the year 2050. I’ve been told that you weren’t kept behind bars like a common criminal. I’ve been told the doors didn’t always have locks on the outside. Hell, I’ve even been told the rooms had chairs to sit in. . . . I’m pretty sure most of those are myths though. I guess it really doesn’t matter how the rooms might have been though. I’m here now and that is all that really matters.

Eventually they come for you. You don’t know when it will be. You just don’t know . . . but they come eventually. People leave one by one. Where they go I can only imagine, but I guess I will find out soon enough. No one really seems to be nervous and I guess they don’t really have a reason to be. We aren’t here voluntarily. We don’t have a choice, an escape, an alternative. You just accept your fate as it comes.

There was a general trend in the room. None of us looked older than five or six, and most of us had obvious defects. You were snatched up as soon as something seemed off about you. Some were born lucky: infants with a clear disorder were treated on the spot. They won’t even remember the treatment. Not all of us were so lucky. The boy across from me sat drooped over in his wheelchair. His legs looked frail and thin.

He will walk soon enough. Everything will be fixed soon.

How I found myself in this mess was entirely different. You can’t tell something is wrong with me just by looking at me. The moment I was born my parents could sigh in relief that they would never have to turn their child over to the state. I am more sorry for them than I am for myself as I sit here. I’m an anomaly. It all started when I was three or four and I insisted I was a girl. “But Michael, you’re my baby boy,” my mother would insist. She would force trucks and army men into my hands to play with. She dressed me exclusively in blue. She put me in karate and never let me have girl friends. I was defective.

But medicine can fix all of that now. I am told that after the surgery I won’t even remember wanting to be a girl. I will be my mommy’s strong little man after all.

And with that, they came for me.

——————————————————————————

I deliberately chose something that would offend or shock. Being transgender is not a defect. It is not something inherently wrong with the person. It is not something to treat. So why did I choose the issue of being transgender as the main driving force of my narrative?

To make you think and question.

Medicine and genetic research have come leaps and bounds from where they began. Thus far, the progress has been something that I support wholeheartedly. Stem cells have incredible potential to change the world. Finding a foolproof cure for cancer would revolutionize the world. But where do we draw the line?

Something I think about is where genetic engineering must stop. My fear is not so much what we as human beings can create, but rather how we choose to use that technology. My greatest fear is that we find ways to change things that are simply hard to understand or not the “norm.”

I have incredibly strong friends with a wide variety of sexual orientations, gender identities, disabilities – you name it. They are the strongest people I know, and I know that they wouldn’t change who they are if given the chance. Nothing is wrong with them. They are unique and beautiful. But as someone who loves them all deeply and unconditionally, I fear that less tolerant people will try to change them. I used to paint my friend’s nails at sleepovers, only to take the polish off in the morning before she returned home. Thankfully, Claire* was able to transition to the person I had always accepted her to be when she moved out of her house. I know her parents would have changed her to accept being male, the sex she had been born with, if given the chance.

I hope I did not offend you, but I do hope I shocked you. We need to think about the limits to genetic engineering. Not the scientific limits, but the moral and ethical limits. Just because something may be possible, doesn’t mean we should necessarily do it. I don’t really know where we should draw the line in the sand, but I hope we can start the dialogue.

Maybe this story can be the pebble that creates waves.

*Name changed for privacy. 

- S. Jamison

You are currently browsing entries tagged with Science at Science/Fiction.