The Observer

Daisy, daisy

"Uh, tell Jon I'm running late!" "Here's your message to Jon, are you ready to send it?" "Yeah, that's perfect. Also, I have to meet Sarah at 5 p.m. tomorrow and again the week after." "Okay, I've scheduled your meeting, note that you have another meeting that overlaps. Do you want me to schedule it anyway?" "Oh, ****, move dinner with Joshua to 6:30." "Watch your language, Blake … I've moved dinner with Joshua to 6:30 p.m." "Thank you, Siri." "Your wish is my command."

On Monday, Oct. 24, John McCarthy, the father of Artificial Intelligence, died. McCarthy coined the term in 1956, defining it as "the science and engineering of making intelligent machines." At that point, the idea was purely speculative; nonetheless, science and fear-inspired science fiction split apart to create distinct public impressions of conscious machines. In a 1950 paper titled "Computing Machinery and Intelligence," Alan Turing posed the question: "Can machines think?"

He answered that quintessential question by devising a test in which a human judge blindly converses with two participants, one human and one computer, and must determine which is which. If the machine can convince the judge that it is the human, it is deemed intelligent.

In 1968, Stanley Kubrick portrayed humanity's impending irrelevance in the cult classic, "2001: A Space Odyssey." HAL 9000, the onboard computer of Discovery One, a ship running an interplanetary mission to Jupiter, makes the decision to terminate all humans aboard to preserve the integrity of the mission — homicidal justice for the betterment of a cause.

When these impressions combined, technology began to terrify people. While computer manufacturers were striving to develop technology to aid and benefit human life, the populace saw cruel, dark, electric machines with the intent to replace humans. It seemed we were doomed to become inconsequential automatons subject to computational will.

The year 2001 came and went, and technology still deferred to humans. Talking, willful robots hadn't run us out yet. But the seeds of such a future had taken root years prior. Mirroring human intelligence and placing it in silicon is a monstrous task.

On Oct. 14, Apple's iPhone 4S launched with a virtual assistant named Siri, an intelligent machine for the mass consumer market. Siri is the culmination of over 40 years of research and development. In 1966, DARPA funded SRI International to develop "computer capabilities for intelligent behavior in complex situations."

Since the 1960s, SRI International's Artificial Intelligence Center has assembled a super-team of the most highly trained professionals in the AI field (including research teams from Carnegie Mellon University and Stanford University). They broke down seemingly insurmountable barriers in the process of creating an entity that could pass the Turing test and fool a human. But the timelines of theory and technology never matched; in previous prototypes, there were too many break points. Now the time is ripe. Processing power, connectivity and AI development have intersected, and Siri was born.

Dictating to devices isn't new, and the barrier of voice-to-text recognition had already been broken by Nuance Communications, but communicating with technology required clean syntax and perfect enunciation. Siri blends voice recognition with a natural language interface, personal context awareness and an ecosystem for service delegation. When prompted, Siri analyzes a request based on location, task, time and dialogue. This allows it to complete tasks, interact and learn without skipping a beat. You talk to it as you would a friend, and it responds just the same.

Verbal dialogue is the fourth, and most human, interface with technology. We've progressed from typing to clicking to touching and now to speaking. In no way do I predict that speaking will supplant the other interfaces; rather, I note it as a step toward a holistic human model of interaction. We write, read, touch, talk and listen to each other. Now our devices can, too.

Siri is a far-from-perfect starting point. In over a week of use, its triumphs and limitations have become clear. While some tasks are made easier with Siri, others are better completed without voice. Much of this has to do with the way the service sends information back and forth to the ecosystem of knowledge from which Siri's answers come. As the technology develops, more and more services will be allowed into Apple's walled garden, and Siri's usefulness will take off at an astronomical rate.

While Artificial Intelligence may not be entirely here yet, an iota of sophisticated machine intelligence is. Now that a droplet of the technology exists in the consumer market, its ripples will quickly proliferate.

After 40 years of research and development, we now have the ability to trick ourselves, if only for a moment, into believing mankind is not the only intelligent being out there. It's alien technology from our imagined future, descended on Earth and rooted in silicon. It's interaction with machines the way we interact with one another. And most importantly, it's just the beginning of a timeline where man and machine are wed on a human playing field.

All technology is singing — terrified or excited, ready or not — "give me your answer, do."

Blake J. Graham is a freshman. He can be reached on Twitter @BlakeGraham or at bgraham2@nd.edu

The views expressed in this column are those of the author and not necessarily those of The Observer.