Never-Ending Machine Learning

Wednesday, January 29, 2014

3:30 pm - 4:30 pm
Bryan Research 103


Tom M. Mitchell
E. Fredkin University Professor, Machine Learning Department, Carnegie Mellon University

We will never really understand learning until we can build machines that learn many different things, over years, and become better learners over time. This talk describes our research to build a Never-Ending Language Learner (NELL) that runs 24 hours per day, forever, learning to read the web. Each day NELL extracts (reads) more facts from the web and integrates them into its growing knowledge base of beliefs. Each day NELL also learns to read better than it did yesterday, enabling it to revisit the text it read yesterday and extract more facts, more accurately, today. NELL has now been running 24 hours per day for four years. The result so far is a collection of 70 million interconnected beliefs (e.g., servedWith(coffee, applePie), isA(applePie, bakedGood)) that NELL holds at different levels of confidence, along with millions of learned phrasings, morphological features, and web page structures that NELL uses to extract beliefs from the web. Track NELL's progress online, or follow it on Twitter at @CMUNELL.

Bio: Tom M. Mitchell founded and chairs the Machine Learning Department at Carnegie Mellon University, where he is the E. Fredkin University Professor. His research uses machine learning to develop computers that are learning to read the web, and uses brain imaging to study how the human brain understands what it reads. Mitchell is a member of the NAE, a Fellow of the AAAS, and a Fellow and Past President of AAAI.

Reception to follow talk.
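As a rough illustration of the kind of knowledge base the abstract describes, beliefs like servedWith(coffee, applePie) can be modeled as predicate triples carrying a confidence score. This is only a hypothetical sketch, not NELL's actual implementation; the class names, query function, and confidence values below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Belief:
    """A single predicate over entities, held at some confidence level."""
    predicate: str
    args: tuple
    confidence: float  # illustrative value; NELL weighs beliefs by confidence

# A tiny knowledge base using the two example beliefs from the abstract
# (confidence numbers here are made up for demonstration).
kb = [
    Belief("servedWith", ("coffee", "applePie"), 0.92),
    Belief("isA", ("applePie", "bakedGood"), 0.99),
]

def about(entity, kb, threshold=0.5):
    """Return all beliefs mentioning `entity` above a confidence threshold."""
    return [b for b in kb if entity in b.args and b.confidence >= threshold]

print([b.predicate for b in about("applePie", kb)])  # ['servedWith', 'isA']
```

Storing beliefs with explicit confidence values is what lets a never-ending learner revisit and revise them as its reading improves, promoting or demoting candidate facts over time.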


Contact: Kathy Peterson