Do Sentence Transformers Learn Quasi-Geospatial Concepts from General Text?
- Publication Year :
- 2024
Abstract
- Sentence transformers are language models designed to perform semantic search. This study investigates the capacity of sentence transformers, fine-tuned on general question-answering datasets for asymmetric semantic search, to associate descriptions of human-generated routes across Great Britain with queries often used to describe hiking experiences. We find that sentence transformers have some zero-shot capability to understand quasi-geospatial concepts, such as route type and difficulty, suggesting their potential utility for routing recommendation systems.
- Comment :
- Presented at the Second International Workshop on Geographic Information Extraction from Texts (GeoExT) at ECIR 2024 (https://geo-ext.github.io/GeoExT2024/program/)
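The abstract describes zero-shot asymmetric semantic search: short hiking-style queries are matched against longer route descriptions by embedding both with a QA-tuned sentence transformer and ranking by similarity. A minimal sketch of that setup, assuming the sentence-transformers library and a generic QA-tuned checkpoint; the paper does not name its models or data, and the queries and route texts below are invented for illustration:

```python
# Illustrative only: checkpoint name, queries, and route descriptions are assumptions,
# not the paper's actual models or data.
from sentence_transformers import SentenceTransformer, util

# A sentence transformer fine-tuned on general question-answering data for
# asymmetric semantic search (short query vs. longer passage).
model = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")

# Hypothetical hiking-style queries and human-generated route descriptions.
queries = [
    "a challenging circular hill walk",
    "an easy flat riverside stroll",
]
routes = [
    "A strenuous 18 km loop over the ridge with 900 m of ascent, returning to the start.",
    "A gentle out-and-back path along the canal towpath, suitable for pushchairs.",
]

query_emb = model.encode(queries, convert_to_tensor=True)
route_emb = model.encode(routes, convert_to_tensor=True)

# Zero-shot association: rank routes for each query by cosine similarity.
scores = util.cos_sim(query_emb, route_emb)
for query, row in zip(queries, scores):
    best = int(row.argmax())
    print(f"{query!r} -> route {best} (score {row[best].item():.2f})")
```

With a model of this kind, a query about difficulty or route type should score higher against route descriptions that express the matching quasi-geospatial concept, which is the behaviour the study probes.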
- Subjects :
- Computer Science - Computation and Language
- Computer Science - Machine Learning
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2404.04169
- Document Type :
- Working Paper