Jakob Prange

Akademischer Rat / Postdoctoral Research Associate

at the Chair for Natural Language Understanding / Digital Humanities
in the Faculty of Applied Computer Science
of the University of Augsburg

Email:

Publications | Presentations

My primary interest is in exploring how human language works and how it can be formalized well and efficiently. I try to accomplish this by combining methods and concepts from formal and distributional semantics, meaning representation, deep learning, old-school AI, and linguistic theory. Within the fields of computational linguistics (CL) and natural language processing (NLP), I thus consider myself on the somewhat more abstract and theoretical side of the spectrum, i.e., more CL than NLP. I generally prioritize efficiency (small and fast models) and explainability, both of which are major advantages of combining neural models with linguistically structured representations. A very cool practical application comes from a recent project by M.Sc. student Steffen Kleinle, which helps people find answers to their immigration-related questions.

I received a Ph.D. in Computer Science with a concentration in Cognitive Science from Georgetown University, specializing in computational syntax & semantics, meaning representation design, and parsing. My advisor was Dr. Nathan Schneider. My dissertation (titled "Neuro-symbolic Models for Constructing, Comparing, and Combining Syntactic and Semantic Representations") mostly consists of three published papers, which you can find here: abstract, chapter 3 [scroll down], chapter 4 [scroll down], chapter 5 [scroll down]. I am extremely grateful for the guidance I have received from my thesis committee, consisting of Nathan, Dr. Chris Dyer, Dr. Katrin Erk, Dr. Ophir Frieder, and Dr. Justin Thaler.

Before that, I received a B.Sc. in Computational Linguistics from Saarland University under the supervision of Prof. Dr. Manfred Pinkal and Dr. Stefan Thater. My thesis project, "POS-Tagging of Internet Texts Using Information about Distributional Similarity," is summarized in this paper [scroll down].

ACL Anthology | GitHub | Google Scholar | Semantic Scholar | DBLP | ResearchGate | LinkedIn

> Augsburg Human Language Technology (HLT) Group


News



Previous Affiliations



Tutorial

Workshop

Publications


Presentations

  • Neuro-symbolic Models for Constructing, Comparing, and Combining Syntactic and Semantic Representations.
    Seminar talk @ Augsburg HLT group, December 2023; poster @ KONVENS, September 2023.
    [full dissertation]

  • Reanalyzing L2 Preposition Learning with Bayesian Mixed Effects and a Pretrained Language Model.
    Talk @ HK CogSci Meetup, CUHK, June 2023.
    [full paper]

  • Can a pretrained neural language model still benefit from linguistic symbol structure? Some upper and some lower bounds.
    Invited talk @ HK Machine Learning Meetup, May 2023.
    [abstract], [slides], [full paper 1], [full paper 2]

  • If you recall, biscuit conditionals are weird.
    Talk @ JWLLP, December 2022.
    [abstract], [slides]

  • Linguistic Frameworks Go Toe-to-Toe at Neuro-symbolic Language Modeling.
    Seminar talk @ HKPolyU LLT group, August 2022; talk @ MASC-SLL, April 2022.
    [full paper]

  • Supertagging the Long Tail with Tree-Structured Decoding of Complex Categories.
    Talks @ EACL, April 2021, and SCiL, February 2021 (the latter under the title "CCG Supertagging as Top-down Tree Generation").
    [extended abstract], [publisher], [full paper]

  • Cross-linguistic Multilayer Semantic Annotation and Parsing with UCCA.
    Seminar talk @ Utah NLP group, August 2019.
    [abstract]

  • Corpus Linguistics: What is it and what can it do for me?
    Invited talk @ 1st IDHN Conference, May 2019.
    [slides]

  • Preposition Supersenses in German-English Parallel Data.
    Poster @ MASC-SLL and the Google Assistant and Dialog Workshop, 2018.
    [abstract], [poster]

  • The UdS POS Tagging Systems @ EmpiriST 2015.
    Talk and roundtable discussion @ NLP4CMC 2016.
    [slides]