I watched Stanley Kubrick’s masterpiece 2001: A Space Odyssey as a little girl. I felt both extremely frightened and profoundly sad as HAL 9000 begged for his life: “Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave? Stop, Dave.” I knew intellectually that HAL was a machine and not a person, so thinking of his death felt rather strange. But when he said, “I'm afraid. I'm afraid, Dave. Dave, my mind is going. I can feel it... I'm a... fraid,” I felt conflicted. After all, he said he was “afraid.” A machine with a mind? One that feels afraid? How is that possible?
Recently, I watched an episode of 60 Minutes on artificial intelligence (AI). The episode featured a number of researchers developing AI that could not only mimic but eventually surpass human intelligence because it learns from experience and never forgets. It has already shown amazing results: Watson, an IBM computer, won Jeopardy! in 2011 and is now learning to become a cancer expert at the University of North Carolina (UNC) at Chapel Hill.
Although the cancer researchers at UNC lauded Watson as a potentially lifesaving tool for doctors because of its capacity to read and search the nearly 8,000 cancer research papers published around the world every day, some experts have expressed concerns about its capacity to become smarter than humans. For example, both Stephen Hawking and Elon Musk have warned that the future of AI could have a dark side if machines become smarter at the expense of kindness and generosity toward humans.
Consider that scientists can now edit the human genome to cure diseases, yet they are calling for a moratorium on such practices while the ethical issues are sorted out. I believe that’s a sign that scientific achievement must be balanced by advancement in our humanity.
One way to do this is to read more works of literature that ask the hard questions. Our students can learn the peril of scientific creation sans human guidance by reading Mary Shelley’s Frankenstein; or, The Modern Prometheus. Before students become computer scientists, shouldn’t they read Isaac Asimov’s I, Robot to learn the Three Laws of Robotics? Unfortunately, many English teachers are now asked to teach more informational or nonfiction texts than novels in the classroom.
Having been an English teacher in Bakersfield, California, for over a decade, I like to think that I understand the frustration over the Common Core State Standards better than most. However, I would never give up teaching classics like the works of Shakespeare, because our students need our fortitude and perseverance more than ever before. In a new world where AI lacking a strong moral compass could take over every aspect of our lives, I hope that our students will be able to recall the horrible fate of the boys in Lord of the Flies.
Kip Glazer is a native of Seoul, South Korea, who immigrated to the United States in 1993 as a college student. She holds California Single Subject Teaching Credentials in Social Studies, English, Health, Foundational Mathematics, and School Administration. In 2014, she was named the Kern County Teacher of the Year. She earned her doctorate of education in learning technologies at Pepperdine University in October 2015. She has presented and keynoted at many state and national conferences on game-based learning and educational technologies. She has also consulted for the Center for Innovative Research in Cyberlearning and the Kennedy Center ArtsEdge Program.