Leslie Valiant Wins 'Nobel Prize' of Computing
ACM, the Association for Computing Machinery, today named Leslie G. Valiant of Harvard University the winner of the 2010 ACM A.M. Turing Award for his fundamental contributions to the development of computational learning theory and to the broader theory of computer science. Valiant brought together machine learning and computational complexity, leading to advances in artificial intelligence as well as computing practices such as natural language processing, handwriting recognition, and computer vision. He also launched several subfields of theoretical computer science and developed models for parallel computing. The Turing Award, widely considered the "Nobel Prize in Computing," is named for the British mathematician Alan M. Turing. The award carries a $250,000 prize, with financial support provided by Intel Corporation and Google Inc.
Some of Valiant's biggest contributions concern the mathematical foundations of computer learning, an area of study that has led to breakthroughs such as IBM Corp.'s Watson, the machine built to play "Jeopardy!" In matches aired last month, the computer breezed past two of the game show's top winners in a display of how far computer scientists have come in programming computers to understand the subtleties of human language and make decisions based on the mountains of data the machines are able to store.
Computational Learning Theory
Valiant's "Theory of the Learnable," published in 1984 in Communications of the ACM, is considered one of the seminal contributions to machine learning. It put machine learning on a sound mathematical footing and laid the foundations for a new research area known as Computational Learning Theory. He provided a general framework as well as concrete computational models, and his approach of "Probably Approximately Correct" (PAC) learning has become a standard model for studying the learning process. His work has led to the development of algorithms that adapt their behavior in response to feedback from the environment. Mainstream research in AI has embraced his viewpoint as a critical tool for designing intelligent systems.