Meeting link: You will receive the meeting link after your registration has been confirmed.
Speaker: Dr. Dagmar Gromann (http://dagmargromann.com/), Assistant Professor at the Centre for Translation Studies at the University of Vienna, Austria.
Abstract: With an ever-increasing amount of textual data being published daily, methods to automatically analyze and systematize knowledge inherent in unstructured text have gained in importance, especially in specialized domains. Ongoing and groundbreaking advances in neural language models have changed approaches to (semi-)automatically extracting information from texts across natural languages. One major change is the ability to cover a wide range of natural languages without the availability of large text corpora in the respective language, known as zero-shot or transfer learning. This talk will focus on the ability of large pre-trained neural language models to automatically extract domain-specific terms and systematize them in concept systems across natural languages, as well as major issues that can be observed in this task.
Short bio: Dr. Dagmar Gromann (http://dagmargromann.com/) is Assistant Professor at the Centre for Translation Studies at the University of Vienna, Austria. Prior to that, she worked as a post-doc at IIIA-CSIC in Barcelona and at TU Dresden. Her main research interests are knowledge extraction from natural language with a cognitive or terminological focus, e.g. Text2TCS, and the socio-technical impacts of language technology, e.g. gender-fair machine translation (GenderFairMT). She is on the editorial boards of the Semantic Web journal and the Journal of Applied Ontologies, vice chair of the COST Action NexusLinguarum, local chair of LDK 2023, and responsible for the curriculum of the new master's program Multilingual Technologies.