Measuring Semantic Similarity by Latent Relational Analysis
- Source :
- Turney, Peter D. (2005) Measuring Semantic Similarity by Latent Relational Analysis. [Conference Paper]
- Publication Year :
- 2005
Abstract
- This paper introduces Latent Relational Analysis (LRA), a method for measuring semantic similarity. LRA measures similarity in the semantic relations between two pairs of words. When two pairs have a high degree of relational similarity, they are analogous. For example, the pair cat:meow is analogous to the pair dog:bark. There is evidence from cognitive science that relational similarity is fundamental to many cognitive and linguistic tasks (e.g., analogical reasoning). In the Vector Space Model (VSM) approach to measuring relational similarity, the similarity between two pairs is calculated by the cosine of the angle between the vectors that represent the two pairs. The elements in the vectors are based on the frequencies of manually constructed patterns in a large corpus. LRA extends the VSM approach in three ways: (1) patterns are derived automatically from the corpus, (2) Singular Value Decomposition is used to smooth the frequency data, and (3) synonyms are used to reformulate word pairs. This paper describes the LRA algorithm and experimentally compares LRA to VSM on two tasks, answering college-level multiple-choice word analogy questions and classifying semantic relations in noun-modifier expressions. LRA achieves state-of-the-art results, reaching human-level performance on the analogy questions and significantly exceeding VSM performance on both tasks.
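- The following is a minimal sketch (not from the paper) of the vector-space idea the abstract describes: each word pair is represented by a vector of pattern frequencies, the pair-by-pattern frequency matrix is smoothed with a truncated SVD (LRA's step 2), and two pairs are compared by the cosine of the angle between their vectors. The word pairs, joining patterns, and counts below are hypothetical illustrations; the actual LRA derives its patterns automatically from a large corpus and also uses synonym-based reformulations.

```python
import numpy as np

# Rows: word pairs; columns: joining patterns (hypothetical examples).
pairs = ["cat:meow", "dog:bark", "mason:stone"]
patterns = ["X makes the sound Y", "the Y of a X", "X works with Y"]
freq = np.array([
    [12.0, 7.0, 0.0],   # cat:meow   (made-up corpus counts)
    [10.0, 6.0, 0.0],   # dog:bark
    [0.0,  1.0, 9.0],   # mason:stone
])

def cosine(u, v):
    """Cosine of the angle between two pattern-frequency vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def svd_smooth(m, k=2):
    """Project pairs into a rank-k latent space to smooth the frequency data."""
    u, s, vt = np.linalg.svd(m, full_matrices=False)
    return u[:, :k] * s[:k]

smoothed = svd_smooth(freq, k=2)
print(cosine(smoothed[0], smoothed[1]))  # cat:meow vs dog:bark -> high (analogous)
print(cosine(smoothed[0], smoothed[2]))  # cat:meow vs mason:stone -> low
```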
Details
- Database :
- CogPrints
- Publication Type :
- Conference
- Accession number :
- edscog.4501
- Document Type :
- Conference Paper