
AnalogySpace: Reducing the Dimensionality of Common Sense Knowledge
From AAAI 2008
Robert Speer, CSAIL, Massachusetts Institute of Technology
Catherine Havasi, Laboratory for Linguistics and Computation, Brandeis University
Henry Lieberman, Software Agents Group, MIT Media Lab
Advisor: Hsin-Hsi Chen
Reporter: Chi-Hsin Yu
Date: 2009.04.13

Outline
◦ Introduction
◦ Common Sense Computing
◦ AnalogySpace
◦ Evaluation
◦ Related Work
◦ Conclusion

Introduction
AnalogySpace
◦ A component of ConceptNet 3
◦ A dimensionality-reduction technique in the spirit of LSA
◦ A mechanism to "smooth" the interaction between web contributors and the ConceptNet system
(Figure: web users contributing to ConceptNet)

Common Sense Computing
ConceptNet
◦ Built from the Open Mind Common Sense project
◦ > 700k pieces of information
◦ Contributed by thousands of volunteer web users
◦ > 250k assertions, 3.4% with negative polarity

Common Sense Computing (cont.)
    Cat                        Dog
    a cat is a pet             a dog is a pet
    a cat has fur              a dog has fur
    a cat has a tail           a dog has a tail
    a cat has four legs ??     a dog has four legs ??
Learner system (Chklovski 2003)
◦ Reasons about common sense by "cumulative analogy" (see the Python sketch below)
◦ Steps
  ▪ Divide statements into objects and features
  ▪ Hypothesize new knowledge by the analogy step
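The cumulative-analogy step can be illustrated in a few lines of Python. This is only a sketch of the idea, not Learner's actual algorithm; the toy cat/dog knowledge base and the helper name hypothesize_by_analogy are invented for illustration.

```python
# Minimal sketch of Learner-style "cumulative analogy" (Chklovski 2003).
# The toy knowledge base and helper name are illustrative, not from the paper.

knowledge = {
    "cat": {"is a pet", "has fur", "has a tail", "has four legs"},
    "dog": {"is a pet", "has fur", "has a tail"},
}

def hypothesize_by_analogy(target, kb):
    """Propose features for `target` held by objects that share many features with it."""
    target_feats = kb[target]
    hypotheses = {}
    for other, feats in kb.items():
        if other == target:
            continue
        overlap = len(target_feats & feats)      # how analogous the two objects are
        for f in feats - target_feats:           # features the target does not yet have
            hypotheses[f] = hypotheses.get(f, 0) + overlap
    return sorted(hypotheses.items(), key=lambda kv: -kv[1])

print(hypothesize_by_analogy("dog", knowledge))
# -> [('has four legs', 3)] : hypothesized because "dog" shares 3 features with "cat"
```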

AnalogySpace
Singular value decomposition (SVD)
◦ Like LSA in information retrieval
◦ A = U Σ V^T
  ▪ U, V: orthogonal matrices
  ▪ Σ: diagonal matrix of singular values
  ▪ Truncated SVD: keep only the k largest singular values
◦ Rows: concepts
◦ Columns: features
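As a rough sketch (not the authors' own implementation, which runs a sparse SVD on the full ConceptNet matrix), the truncated SVD can be computed with NumPy; the small matrix A below is invented for illustration.

```python
# Minimal truncated-SVD sketch with NumPy; A is a toy concept-by-feature matrix.
import numpy as np

A = np.array([[ 1.0, 1.0, 0.0],     # rows: concepts
              [ 1.0, 0.0, 1.0],     # columns: features
              [ 0.0, 1.0, 1.0],
              [-1.0, 0.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # A = U Σ V^T

k = 2                                              # keep the k largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]        # rank-k approximation of A

# Entries of A_k that are clearly positive but were 0 in A are candidate
# new assertions, inferred from the analogies between similar concepts.
print(np.round(A_k, 2))
```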

AnalogySpace (cont.)
Assertion: "A trunk is part of a car."
◦ Adds the feature (PartOf, "car") to the concept "trunk"
◦ Adds the feature ("trunk", PartOf) to the concept "car"
Only concepts with at least 4 assertions are included
Score and normalization (see the sketch below)
◦ The confidence score is used as the matrix value
  ▪ +n for a positive assertion, -n for a negative assertion, 0 if the confidence score <= 0
◦ Each row of the matrix is normalized, scaling it down by its Euclidean norm
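A minimal sketch of this matrix-building and normalization step, assuming made-up assertions, relation names, and confidence values; the real system works on a large sparse matrix over the full ConceptNet relation set.

```python
# Build a small concept-by-feature matrix from scored assertions (toy data).
import numpy as np

# (concept, feature, confidence, polarity); polarity False marks a negative assertion
assertions = [
    ("trunk", ("PartOf", "car"),   3, True),   # "A trunk is part of a car."
    ("car",   ("trunk", "PartOf"), 3, True),   # the same assertion, seen from the car's side
    ("cat",   ("IsA", "pet"),      4, True),
    ("cat",   ("HasA", "fur"),     2, True),
    ("cat",   ("IsA", "fish"),     2, False),  # a negative assertion
]

concepts = sorted({a[0] for a in assertions})
features = sorted({a[1] for a in assertions})
c_index = {c: i for i, c in enumerate(concepts)}
f_index = {f: j for j, f in enumerate(features)}

A = np.zeros((len(concepts), len(features)))
for concept, feature, conf, positive in assertions:
    if conf <= 0:
        value = 0.0                            # confidence <= 0 contributes nothing
    else:
        value = conf if positive else -conf    # +n for positive, -n for negative assertions
    A[c_index[concept], f_index[feature]] = value

# (In the real pipeline, concepts with fewer than 4 assertions are dropped here.)

# Normalize each row, scaling it down by its Euclidean norm.
norms = np.linalg.norm(A, axis=1, keepdims=True)
A = A / np.where(norms == 0, 1.0, norms)
```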

AnalogySpace (cont.)

Evaluation
AnalogySpace assures users that the system is learning from their input.
Experiment
◦ 40 college students
◦ 60 assertions per student
◦ 4 sources
  ▪ 25%: existing ConceptNet assertions (confidence score > 2)
  ▪ 25%: produced by AnalogySpace
  ▪ 25%: from a modified, within-relation AnalogySpace (Gupta et al., 2004)
  ▪ 25%: random combinations of concepts and features

Evaluation (cont.)

Assigning a score to each assertion (2 = generally true, 1, 0, -1 = not true)
Mean scores by source:
◦ 1.315: existing assertions
◦ 1.025: new assertions produced by AnalogySpace
◦ 0.882: within-relation SVD
◦ : random combinations

Related Work
Extracting general knowledge
◦ Suh, Halpin, & Klein (2006)
  ▪ Mine common sense from Wikipedia
◦ Eslick (2006)
  ▪ Uses data-mining techniques to extract common sense from websites on the Internet
◦ KNEXT project (Schubert 2002)
  ▪ Uses patterns to extract semantic relationships from the Penn Treebank

Conclusion
AnalogySpace
◦ Is able to predict new assertions
◦ Helps give users confidence that the system is learning from their input

Thanks!!

Questions:
Knowledge extraction
◦ Types of commonsense knowledge
◦ Usage of commonsense knowledge in different contexts
Inference
◦ Used for extracting new knowledge
◦ Used for solving specific tasks