Jakob Prange

Postdoctoral Fellow

in the Department of Chinese and Bilingual Studies
at Hong Kong Polytechnic University



My primary interest is in exploring how human language works and how it can be formalized accurately and efficiently. I try to accomplish this by combining methods and concepts from formal and distributional semantics, meaning and knowledge representation, deep learning, and old-school AI. Within the fields of computational linguistics (CL) and natural language processing (NLP), I thus consider myself to be on the more abstract and theoretical side of the spectrum, but I hope that my foundational research can at some point be applied to something useful in the real world (see, e.g., section 5 of our KI article for some hypothetical applications of case and adposition supersense classification).

I received a Ph.D. in Computer Science with a concentration in Cognitive Science from Georgetown University, specializing in computational syntax & semantics, meaning representation design, and parsing. My advisor was Dr. Nathan Schneider. My dissertation (titled "Neuro-symbolic Models for Constructing, Comparing, and Combining Syntactic and Semantic Representations") mostly consists of three published papers, which you can find here: abstract, chapter 3 [scroll down], chapter 4 [scroll down], chapter 5 [scroll down]. I am extremely grateful for the guidance I have received from my thesis committee, consisting of Nathan, Dr. Chris Dyer, Dr. Katrin Erk, Dr. Ophir Frieder, and Dr. Justin Thaler.

Before that, I received a B.S. in Computational Linguistics from Saarland University under the supervision of Prof. Dr. Manfred Pinkal and Dr. Stefan Thater. My thesis project, "POS-Tagging of Internet Texts Using Information about Distributional Similarity", is summarized in this paper [scroll down].

ACL Anthology | GitHub | Google Scholar | Semantic Scholar | DBLP | ResearchGate | LinkedIn

Presentations

  • If you recall, biscuit conditionals are weird.
    Talk @ JWLLP, December 2022.

  • Linguistic Frameworks Go Toe-to-Toe at Neuro-symbolic Language Modeling.
    Invited talk @ HKPolyU LLT group, August 2022; talk @ MASC-SLL, April 2022.
    [full paper]

  • Supertagging the Long Tail with Tree-Structured Decoding of Complex Categories.
    Talks @ EACL, April 2021, and @ SCiL, February 2021 (there titled "CCG Supertagging as Top-down Tree Generation").
    [extended abstract], [publisher], [full paper]

  • Cross-linguistic Multilayer Semantic Annotation and Parsing with UCCA.
    Invited talk @ Utah NLP group, August 2019.

  • Corpus Linguistics: What is it and what can it do for me?
    Invited talk @ 1st IDHN Conference, May 2019.

  • Preposition Supersenses in German-English Parallel Data.
    Poster @ MASC-SLL 2018 and Google Assistant and Dialog Workshop 2018.
    [abstract], [poster]

  • The UdS POS Tagging Systems @ EmpiriST 2015.
    Talk and roundtable discussion @ NLP4CMC 2016.