IBM's Watson edges Harvard students in 'Jeopardy' quiz

October 31, 2011
IBM's Watson supercomputer eked out a victory in a "Jeopardy" quiz-show battle with a trio of Harvard Business School students on Monday, pulling out the win with a higher wager on the Final Jeopardy clue that ends every game.

Both Watson and the HBS students got the final answer, Mount Rushmore, correct, but Watson bet more of its winnings and ended up with US$56,331 to HBS's $42,399. A team from the Massachusetts Institute of Technology's Sloan School of Management also played but couldn't find its rhythm during the contest, and ended up with just $100. No actual money was at stake.

Watson marries an array of software for natural-language processing and other tasks with a hardware cluster containing 2,880 Power processor cores, and finds answers to "Jeopardy" questions from a massive content archive prepared by IBM.

The students didn't compete against Watson head-to-head, however. IBM had Watson answer each question ahead of time, and its answers and the speed with which it replied were loaded into a different computer that was used to play the game. The software used for game strategy, though, ran in real time, said David Ferrucci, principal investigator of the Watson project.

Nor did "Jeopardy" host Alex Trebek make the trek to Boston. Actor Todd Crain stood in for him, lending an animated but polished tone to the proceedings. IBM has hired Crain to host a number of mock "Jeopardy" sessions pitting Watson against former show contestants, he said.

Watson played top real-life "Jeopardy" champions Ken Jennings and Brad Rutter earlier this year and beat both of them.

But for a time on Monday it appeared Watson could fall, as a crowd inside a Harvard campus hall roared its approval.

The Harvard and MIT crews appeared to have an advantage with clues involving various types of wordplay. For example, Harvard buzzed in ahead of Watson when asked to provide the equivalent of "George W.'s rumps." The correct answer: "Bush's tushes."

But in an interview prior to the contest, HBS contestant Jonas Akins described his team's preparations in simple terms. "We watched some shows and talked about our relative strengths," Akins said. One of his teammates, Genevieve Sheehan, was able to lend an insider's edge: She once appeared on the real "Jeopardy" show.

The most successful players must master the show's tricky buzzer, which calls for proper timing after the question is asked, rather than simply hitting the buzzer fastest.

"The buzzers don't get activated until Alex is finished reading each question," . "If you buzz in too early, the system actually locks you out for a fifth of a second or so. But if you're too late, the player next to you is going to get in first."

It appeared during Monday's game that both university teams had some difficulty getting their timing down, though Watson seemed to get locked out now and again too. And while the "Jeopardy" games Jennings played against Watson were run in a "scrupulously fair" manner, he wrote, "without its buzzer edge, Watson isn't yet good enough to beat top human players."

And Watson wasn't always so adept at parsing human language, Ferrucci said during a talk prior to the game. "There was a lot of work that went into developing the algorithms."

He showed a series of "Jeopardy" clues posed to Watson in its earlier days.

The room exploded with laughter when Ferrucci showed Watson's response to a clue referring to "the Father of Bacteriology," or Louis Pasteur: "How tasty was my little Frenchman."

While Watson has been able to defeat human players, the world should take solace in the sprawling amount of technology and research this feat has required, according to Ferrucci. The human brain "fits in a shoebox," he noted, and can run on the energy supplied by "a tuna sandwich."

IDG News Service