Digital Week lecture examines ethical use of robots
Chris Carbonaro | Tuesday, September 22, 2015
At a lecture last night, philosophy professor Don Howard was introduced by a robot.
To kick off Notre Dame’s second annual Digital Week, Howard delivered a lecture discussing the ethical issues surrounding the widespread implementation of robots. After a screen attached to a mobile stand projected the face of a man named Elliot, who introduced Howard, the professor said something that quickly proved difficult to refute.
“There is a revolution underway that is going to transform our world more rapidly and more radically than even the Internet and information revolution did. This is the robotics revolution,” Howard said.
This revolution will be even more widespread than the industrial revolution, he said. Mechanization threatens both service-industry jobs and those requiring higher education, and Howard said this could lead to the unemployment of hundreds of thousands of people.
“Already we’re seeing the almost total displacement of human drivers by wholly automated transport,” he said. “Personally, I think this is the single biggest ethical problem facing us today in connection with robotics.”
Given the growing prevalence and capability of robots, Howard stressed that caution is of the utmost importance when implementing these machines.
“In the past, we have made some really huge mistakes with technology,” Howard said. “We failed to anticipate what the downstream, long-term consequences of a carbon-fueled economy were going to be, and now we pay the price for that failure.”
However, Howard said the ethical implications of this sort of replacement are not all negative.
“Driver error is the ultimate cause behind most fatal accidents,” Howard said. “In theory, we could save 30,000 lives in the U.S. alone and 1.2 million lives globally every year if we replaced human drivers with self-driving cars.”
This benefit is impossible to discount, Howard said. Another similar, near-future use for robots could be telepresence devices like the one used to introduce Howard, which could actively engage bedridden students in the classroom, he said. Howard said he expected schools and universities like Notre Dame to begin implementing similar systems soon.
“What is a robot?” he asked. “Not all robots have humanoid features. … We cannot let uncertainty about the consequences of new technologies simply stifle technological development because, as we all know, there are many examples of new technologies which are, for the most part, for the good of human kind.”
Howard urged those attending the lecture to rethink what they consider to be robots. Doing so, he said, might reveal their greater capacity for good. Rather than creating robots and then discussing their ethical implications, Howard said the two processes should be intertwined.
“I think that we need to build a world in which engagement with ethics is an everyday part of the world,” he said.
Howard also said it is ultimately humans who determine whether robots are implemented ethically; nobody else is going to ensure this is done fairly.
“Why are most humanistic robots white or Asian?” asked Howard. “And why do so many of those robots have attractive female features? Have you ever seen an African-American robot?”
According to Howard, humans need to watch themselves carefully to ensure the robotics revolution happens in an ethical manner; it is not the machines that we need to fear. Any concerns about an emotionally complex or sentient robot should remain distant thoughts, Howard said.
“Whatever you do, don’t turn to Hollywood for advice,” he said. “There is no robot apocalypse in the offing.”